1-) For a sorted array I have used binary search. We know that the worst-case complexity of the SEARCH operation in a sorted array is O(lg N) if we use binary search, where N is the number of items in the array. What is the worst-case complexity of the search operation if the array contains duplicate values, still using binary search? Will it be the same O(lg N)? Please correct me if I am wrong!
Also, what is the worst case for the INSERT operation in a sorted array, using binary search to find the position? My guess is O(N). Is that right?
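To make the question concrete, here is a rough sketch of the kind of binary search I have in mind (Python; the helper names are just illustrative). It finds the leftmost match, so duplicates are handled, and the same index computation gives the insertion point:

```python
def lower_bound(a, target):
    """Index of the first element >= target in sorted list a.
    Runs in O(lg N) even when a contains duplicates."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo

def search_sorted(a, target):
    """Return an index of target in sorted list a, or -1 if absent."""
    i = lower_bound(a, target)
    return i if i < len(a) and a[i] == target else -1

def insert_sorted(a, value):
    """Finding the spot is O(lg N), but shifting the tail to make
    room is O(N) in the worst case."""
    a.insert(lower_bound(a, value), value)
```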
2-) For an unsorted array I have used linear search. Now suppose the unsorted array also accepts duplicate elements/values. What is the best achievable worst-case complexity for the SEARCH and INSERT operations? I think we can use linear search, which gives O(N) worst-case time for both search and delete. Can we do better than this for an unsorted array, and does the complexity change if we accept duplicates in the array?
Yes.
The best case is uninteresting. (Think about why that might be.) The worst case is O(N), except for inserts. Inserts into an unsorted array are fast, one operation. (Again, think about it if you don't see it.)
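For instance, a rough sketch of both operations on an unsorted array (Python; the names are my own):

```python
def search_unsorted(a, target):
    """Linear search: O(N) in the worst case, duplicates or not."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1

def insert_unsorted(a, value):
    """Insert into an unsorted array: just append at the end, O(1)."""
    a.append(value)
```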
In general, duplicates make little difference, except for extremely pathological distributions.
Some help along the way, but not the entire solution.
We can do better at deleting from an unordered array! Since order doesn't matter in this case, we can overwrite the element to be deleted with the last element and shrink the array by one, which avoids the unnecessary shifting of elements. Thus the removal itself takes O(1) time (locating the element still costs O(N) with linear search).
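A small sketch of that trick (Python; `delete_at` is just an illustrative name):

```python
def delete_at(a, i):
    """Delete a[i] from an unordered list by copying the last element
    over it and popping the tail; no shifting, so this step is O(1)."""
    a[i] = a[-1]
    a.pop()
```

This works even when `i` is the last index, since the element is simply overwritten with itself before the pop.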