Why is the speed of an algorithm measured in the number of operations, not in seconds?

I wonder:
Why is the speed of an algorithm measured in the number of operations, not in seconds?
Isn't it possible that some small program runs faster than a two-line built-in function?
April 3rd 20 at 18:29
2 answers
April 3rd 20 at 18:31
And what would seconds tell you?
Say you have two programs that process the same data and produce the same results using different algorithms. The first program does it in 10 seconds, the second in 20. How do you estimate how much the processing time will grow when the volume of data increases 10 times?
You can't, because we do not know the complexity of the algorithms. If the first has a complexity of O(n²) and the second O(n), the time of the first will grow 100 times, to 1000 seconds, while the second grows only 10 times (to 200 seconds). That is, a program that was faster on a small dataset suddenly becomes much slower on a large one. And the most important parameter here is the computational complexity of the algorithm.
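A minimal sketch of this idea (not from the thread; the problem and function names are my own): two algorithms that solve the same task, checking a list for duplicates, with their operations counted. Increasing the data 10 times multiplies the O(n²) count roughly 100 times, but the O(n) count only 10 times.

```python
def has_duplicate_quadratic(items):
    """O(n^2): compare every pair of elements."""
    ops = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            ops += 1  # one comparison per pair
            if items[i] == items[j]:
                return True, ops
    return False, ops

def has_duplicate_linear(items):
    """O(n): one pass, remembering values already seen."""
    ops = 0
    seen = set()
    for x in items:
        ops += 1  # one lookup per element
        if x in seen:
            return True, ops
        seen.add(x)
    return False, ops

for n in (100, 1000):
    data = list(range(n))  # worst case: no duplicates at all
    _, quad_ops = has_duplicate_quadratic(data)
    _, lin_ops = has_duplicate_linear(data)
    print(n, quad_ops, lin_ops)
```

For n = 100 the quadratic version does 4950 comparisons versus 100 lookups; for n = 1000 it is 499500 versus 1000, so 10× more data cost it about 100× more work.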
But could you give some examples? - lonny_Pagac79 commented on April 3rd 20 at 18:34
@lonny_Pagac79, the calculation of Fibonacci numbers
is explained a bit there.
It turns out that while the complexity of the second, non-recursive algorithm is obviously O(n), where n is the index of the Fibonacci number, the complexity of the recursive algorithm is not obvious: it is O(2^n).
- Hayden commented on April 3rd 20 at 18:37
April 3rd 20 at 18:33
Because execution speed depends on the clock frequency and type of the processor, and these days also on the number of cores/processors (threads).
