What is the time complexity of inorder, postorder, and preorder traversal of a binary tree? Is it O(n), O(log n), or O(n^2)?
Tags:
time-complexity
O(n), because you traverse each node once. Or rather, the amount of work you do for each node is constant (it does not depend on the rest of the nodes). Traversal is O(n) for any order, because you are hitting each node once. Lookup is where it can be less than O(n), if the tree has some sort of organizing scheme (i.e. a binary search tree).
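A minimal sketch of the point above, using a hypothetical `Node` class (not from the original answer): the traversal does constant work per node and touches each node exactly once, so the total work is O(n).

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node, visit):
    """In-order traversal: left subtree, then the node, then right subtree."""
    if node is None:
        return
    inorder(node.left, visit)   # recurse into the left subtree
    visit(node.value)           # constant work per node
    inorder(node.right, visit)  # recurse into the right subtree

# Usage: a small binary search tree; in-order yields values in sorted order.
root = Node(2, Node(1), Node(3))
out = []
inorder(root, out.append)
# out == [1, 2, 3]
```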
Introduction
Hi,
I was asked this question today in class, and it is a good question! I will explain here and hopefully get my more formal answer reviewed or corrected where it is wrong. :)
Previous Answers
The observation by @Assaf is correct: binary tree traversal recursively visits each node exactly once.
But since it is a recursive algorithm, you often have to use more advanced methods to analyze its run time. When dealing with a sequential algorithm or one that uses for-loops, summations will often be enough. So, what follows is a more detailed explanation of this analysis for those who are curious.
The Recurrence
As previously stated,
T(n) = 2*T(n/2) + 1
where T(n) is the number of operations executed by your traversal algorithm (in-order, pre-order, or post-order makes no difference).
Explanation of the Recurrence
There are two T(n/2) terms because inorder, preorder, and postorder traversals all call themselves on both the left and the right child node. So, think of each recursive call as a T(n/2). In other words, **left T(n/2) + right T(n/2) = 2*T(n/2)**. The "1" comes from any other constant-time operations within the function, like printing the node value, et cetera. (It could honestly be 1 or any other constant, and the asymptotic run time still computes to the same value. Explanation follows.)
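The recurrence can be made concrete with a small counting sketch (a hypothetical helper, not from the original answer): count one constant-time operation per call on a perfect binary tree and watch the total grow linearly with the node count.

```python
def count_ops(height):
    """Operations performed by a traversal of a perfect binary tree of the
    given height, counting one constant-time op per call: the base-case
    check on an empty subtree, or the per-node work plus two recursive calls.
    This mirrors T(n) = 2*T(n/2) + 1."""
    if height < 0:          # empty subtree: the base-case check is one op
        return 1
    return 1 + count_ops(height - 1) + count_ops(height - 1)

# A perfect tree of height h has n = 2**(h + 1) - 1 nodes, and the
# operation count works out to 2*n + 1, i.e. linear in n.
for h in range(4):
    n = 2 ** (h + 1) - 1
    print(h, n, count_ops(h))
```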
Analysis
This recurrence can actually be analyzed in terms of big theta using the Master Theorem. So, I will apply it here.
T(n) = 2*T(n/2) + constant
where constant is some constant (could be 1 or any other constant).
Using the Master Theorem, we have T(n) = a*T(n/b) + f(n).
Here, a = 2, b = 2, and f(n) is a constant. Writing f(n) = n^c, a constant f(n) means c = 0.
From here, we can see that a = 2 and b^c = 2^0 = 1. So a > b^c, i.e. 2 > 2^0. Equivalently, c < logb(a), since 0 < log2(2) = 1.
This is the first case of the Master Theorem, so T(n) = BigTheta(n^(logb(a))) = BigTheta(n^1) = BigTheta(n).
If you're not familiar with BigTheta(n), it is "similar" (please bear with me :) ) to O(n), but it is a "tighter bound", or tighter approximation, of the run time. BigTheta(n) means the run time is bounded both above, O(n), and below, BigOmega(n).
I hope this helps. Take care.
In-order, Pre-order, and Post-order traversals are Depth-First traversals.
For a Graph, the complexity of a Depth First Traversal is O(n + m), where n is the number of nodes, and m is the number of edges.
Since a Binary Tree is also a Graph, the same applies here. The complexity of each of these Depth-first traversals is O(n+m).
In a tree, every node except the root has exactly one edge to its parent, so the total number of edges in a Binary Tree is n - 1, where n is the total number of nodes.
The complexity then becomes O(n + n-1), which is O(n).
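The counting argument above can be checked with a short sketch (the `Node` class and helper are hypothetical, not from the original answer): traverse the tree as a graph and tally nodes and edges along the way.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def count_nodes_and_edges(node):
    """Return (n, m): the number of nodes and edges reachable from `node`."""
    if node is None:
        return 0, 0
    n = m = 0
    for child in (node.left, node.right):
        cn, cm = count_nodes_and_edges(child)
        n += cn
        m += cm + (1 if child is not None else 0)  # edge to an existing child
    return n + 1, m

# Usage: a 5-node tree has 4 edges, so O(n + m) collapses to O(n).
root = Node(1, Node(2, Node(4), Node(5)), Node(3))
n, m = count_nodes_and_edges(root)
# (n, m) == (5, 4)
```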
T(n) = 2T(n/2) + c
T(n/2) = 2T(n/4) + c  =>  T(n) = 4T(n/4) + 2c + c
similarly T(n) = 8T(n/8) + 4c + 2c + c
....
....
last step, after h expansions (where h = log2(n) is the height of the tree):
T(n) = nT(1) + c(2^(h-1) + ... + 2 + 1) = nT(1) + c(2^h - 1)
but 2^h = n
so T(n) = nT(1) + c(n - 1), which is O(n)
O(n), I would say. I will derive it for a balanced tree, but the result applies to all trees. Assuming that you use recursion,
T(n) = 2*T(n/2) + 1 ----------> (1)
T(n/2) for the left sub-tree, T(n/2) for the right sub-tree, and '1' for checking the base case.
On simplifying (1), you can prove that the traversal (whether inorder, preorder, or postorder) is of order O(n).
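As a final sanity check, the three orders make exactly the same recursive calls and differ only in where the visit happens, so they all do the same O(n) work. A small sketch (hypothetical `Node` class and `traverse` helper, not from the original answers):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def traverse(node, order, visit):
    """Visit every node once; `order` only changes where the visit happens."""
    if node is None:
        return
    if order == "pre":
        visit(node.value)
    traverse(node.left, order, visit)
    if order == "in":
        visit(node.value)
    traverse(node.right, order, visit)
    if order == "post":
        visit(node.value)

# Each order touches each of the 5 nodes exactly once.
root = Node(1, Node(2, Node(4), Node(5)), Node(3))
for order in ("pre", "in", "post"):
    seen = []
    traverse(root, order, seen.append)
    assert len(seen) == 5
```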