What does O(log n) mean exactly?

Posted 2018-12-31 06:05

I am currently learning about Big O notation running times and amortized times. I understand the notion of O(n) linear time, meaning that the running time grows in proportion to the size of the input, and the same goes for, say, quadratic time O(n²), and even for algorithms, such as permutation generators, whose O(n!) running times grow by factorials.

For example, the following function is O(n) because its running time grows in proportion to its input n:

#include <stdio.h>

void f(int n) {
  int i;
  for (i = 0; i < n; ++i)   /* the loop body runs n times */
    printf("%d", i);
}

Similarly, if there were a nested loop, the time would be O(n²).
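
For instance, a minimal sketch of such a nested loop (the function name g is just for illustration, mirroring f above):

#include <stdio.h>

void g(int n) {
  int i, j;
  for (i = 0; i < n; ++i)
    for (j = 0; j < n; ++j)     /* the inner body runs n * n times -> O(n^2) */
      printf("%d %d\n", i, j);
}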

But what exactly is O(log n)? For example, what does it mean to say that the height of a complete binary tree is O(log n)?

I do know (maybe not in great detail) what a logarithm is, in the sense that log₁₀ 100 = 2, but I cannot understand how to identify a function with logarithmic time.

30 answers
妖精总统
#2 · 2018-12-31 06:23

You can think of O(log N) intuitively by saying the time is proportional to the number of digits in N.

If an operation performs constant time work on each digit or bit of an input, the whole operation will take time proportional to the number of digits or bits in the input, not the magnitude of the input; thus, O(log N) rather than O(N).
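
For instance, a small sketch of my own (not from the answer itself): counting the decimal digits of N does constant work per digit, so it runs in time proportional to log₁₀ N:

/* counts the decimal digits of n (assumes n >= 0) */
int count_digits(int n) {
  int digits = 1;
  while (n >= 10) {      /* one constant-time step per decimal digit */
    n /= 10;
    ++digits;
  }
  return digits;
}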

If an operation makes a series of constant-time decisions, each of which halves (or reduces by a factor of 3, 4, 5, ...) the size of the input to be considered, the whole will take time proportional to log base 2 (base 3, base 4, base 5, ...) of the size N of the input, rather than being O(N).
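
And a sketch of that halving pattern (a hypothetical helper; a base-3 or base-4 version would simply divide by 3 or 4 instead):

/* counts how many times n can be halved before reaching 1 */
int halving_steps(int n) {
  int steps = 0;
  while (n > 1) {        /* each constant-time decision discards half the input */
    n /= 2;
    ++steps;
  }
  return steps;          /* roughly log2(n) iterations */
}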

And so on.

几人难应
#3 · 2018-12-31 06:25

O(log N) basically means time goes up linearly while n goes up exponentially. So if it takes 1 second to compute 10 elements, it will take 2 seconds to compute 100 elements, 3 seconds to compute 1,000 elements, and so on.

It is O(log n) when we use divide-and-conquer algorithms, e.g. binary search. Another example is quicksort, where each time we divide the array into two parts and it takes O(N) work to partition around the pivot element. Hence the total is N · O(log N), i.e. O(N log N).
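
A minimal sketch of binary search over a sorted int array (assumed sorted in ascending order); each iteration halves the remaining range, which is where the O(log N) comes from:

/* returns the index of key in a[0..n-1] (sorted ascending), or -1 if absent */
int binary_search(const int a[], int n, int key) {
  int lo = 0, hi = n - 1;
  while (lo <= hi) {                /* each pass halves the remaining range */
    int mid = lo + (hi - lo) / 2;
    if (a[mid] == key)
      return mid;
    else if (a[mid] < key)
      lo = mid + 1;
    else
      hi = mid - 1;
  }
  return -1;                        /* at most about log2(n) iterations */
}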

看风景的人
#4 · 2018-12-31 06:25

The explanation below uses the case of a fully balanced binary tree to help you understand how we get logarithmic time complexity.

A binary tree is a case where a problem of size n is repeatedly divided into sub-problems of size n/2 until we reach a problem of size 1:

[figure: height of a binary tree]

And that's how you get O(log n), which is the amount of work that needs to be done on the above tree to reach a solution.

A common algorithm with O(log n) time complexity is binary search, whose recurrence relation is T(n) = T(n/2) + O(1), i.e. at every subsequent level of the tree you divide the problem in half and do a constant amount of additional work.
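
A recursive sketch that mirrors that recurrence T(n) = T(n/2) + O(1) (the name bsearch_rec and its signature are just illustrative):

/* searches a[lo..hi] (sorted ascending) for key; returns its index or -1 */
int bsearch_rec(const int a[], int lo, int hi, int key) {
  if (lo > hi)                      /* base case: T(1) = O(1) */
    return -1;
  int mid = lo + (hi - lo) / 2;     /* O(1) work at this level */
  if (a[mid] == key)
    return mid;
  else if (a[mid] < key)            /* recurse on one half: the T(n/2) term */
    return bsearch_rec(a, mid + 1, hi, key);
  else
    return bsearch_rec(a, lo, mid - 1, key);
}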

墨雨无痕
#5 · 2018-12-31 06:25

First, I recommend you read the following book:

Algorithms (4th Edition)

Here are some functions and their expected complexities. The numbers indicate statement execution frequencies.

[figure: functions and their expected complexities]

The following Big-O complexity chart is also taken from bigocheatsheet: [figure: Big-O complexity chart]

Lastly, here is a very simple example that shows how the running time is calculated:

[figure: anatomy of a program's statement execution frequencies]

[figure: analyzing the running time of a program (example)]
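
Since those images do not show up here, a small hand-annotated sketch of counting statement execution frequencies (my own example, in the spirit of the book's):

/* counts pairs (i, j) with i < j and a[i] + a[j] == 0 */
int count_zero_pairs(const int a[], int n) {
  int count = 0;                       /* executed 1 time                  */
  for (int i = 0; i < n; ++i)          /* outer loop test: n + 1 times     */
    for (int j = i + 1; j < n; ++j)    /* inner body: n(n-1)/2 times       */
      if (a[i] + a[j] == 0)
        ++count;                       /* 0 to n(n-1)/2 times              */
  return count;                        /* 1 time -> order of growth is n^2 */
}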

查无此人
#6 · 2018-12-31 06:26

I would like to add that the height of the tree is the length of the longest path from the root to a leaf, and that the height of a node is the length of the longest path from that node to a leaf. Here, path length means the number of nodes we encounter while traversing the tree between the two nodes.

In order to achieve O(log n) time complexity, the tree should be balanced, meaning that the difference in height between the children of any node should be at most 1. Therefore, trees do not always guarantee O(log n) time complexity unless they are balanced; in the worst case, searching in a tree can take O(n) time, for example in a degenerate tree where every node has only one child, which is effectively a linked list.

You can take a look at balanced trees such as the AVL tree. It rebalances the tree while inserting data in order to keep the search time complexity at O(log n).
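
As a sketch (the node struct here is an assumption for illustration), searching a binary search tree costs time proportional to the tree's height, which is O(log n) only when the tree stays balanced:

#include <stddef.h>

struct node {
  int key;
  struct node *left, *right;
};

/* O(height): O(log n) for a balanced tree, up to O(n) for a degenerate one */
struct node *bst_search(struct node *root, int key) {
  while (root != NULL && root->key != key)
    root = (key < root->key) ? root->left : root->right;
  return root;
}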

ら面具成の殇う
#7 · 2018-12-31 06:28

But what exactly is O(log n)? For example, what does it mean to say that the height of a complete binary tree is O(log n)?

I would rephrase this as 'the height of a complete binary tree is log n'. Figuring out the height of a complete binary tree would take O(log n) time, if you were traversing down step by step.

I cannot understand how to identify a function with logarithmic time.

A logarithm is essentially the inverse of exponentiation. So, if each 'step' of your function eliminates a constant factor of the elements from the original item set, that is a logarithmic-time algorithm.

For the tree example, you can easily see that each level you step down rules out a constant fraction of the remaining elements, so the number of candidates shrinks exponentially as you continue traversing. The popular example of looking through a name-sorted phone book is essentially equivalent to traversing down a binary search tree (the middle page is the root element, and at each step you can deduce whether to go left or right).
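
A sketch of that phone-book lookup, assuming the names are already stored in a lexicographically sorted array of strings (the setup is made up for illustration):

#include <string.h>

/* returns the index of target in names[0..n-1] (sorted), or -1 if absent */
int find_name(const char *names[], int n, const char *target) {
  int lo = 0, hi = n - 1;
  while (lo <= hi) {
    int mid = lo + (hi - lo) / 2;          /* "open the book in the middle" */
    int cmp = strcmp(target, names[mid]);
    if (cmp == 0)
      return mid;
    else if (cmp < 0)
      hi = mid - 1;                        /* name comes earlier: go left */
    else
      lo = mid + 1;                        /* name comes later: go right */
  }
  return -1;
}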
