So when a dynamic array is doubled in size each time it runs out of space, I understand how the time complexity for expanding is O(n), with n being the number of elements. What about if the array is instead copied and moved to a new array that is only 1 size bigger when it is full (instead of doubling)? When we resize by some constant C, is the time complexity always O(n)?
If you grow by some fixed constant C, then no, the runtime will not be O(n). Instead, it will be Θ(n^2).
To see this, think about what happens if you do a sequence of C consecutive operations. Of those operations, C - 1 of them will take time O(1) because space already exists. The last operation will take time O(n) because it needs to reallocate the array, add space, and copy everything over. Therefore, any sequence of C operations will take time O(n + C).
So now consider what happens if you perform a sequence of n operations. Break those operations up into blocks of size C; there will be n / C of them. The copy done at the end of the k-th block touches roughly kC elements, so the total work required to perform those operations will be roughly

C + 2C + 3C + ... + (n / C)C = C(1 + 2 + 3 + ... + n / C) = Θ(n^2 / C) = Θ(n^2),

since C is a constant.
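You can also see this empirically. The sketch below is my own illustrative harness (not code from the question): it simulates a dynamic array that grows by a fixed constant C and counts how many element copies the appends trigger.

```python
def copies_with_constant_growth(n, C):
    """Count element copies made by a dynamic array that grows by a
    fixed constant C whenever it runs out of space."""
    capacity = 0
    size = 0
    copies = 0
    for _ in range(n):
        if size == capacity:
            copies += size      # copy every existing element to the new array
            capacity += C       # grow by only a constant amount
        size += 1
    return copies

# The totals grow roughly like n^2 / (2C): quadrupling when n doubles.
for n in (1_000, 2_000, 4_000):
    print(n, copies_with_constant_growth(n, C=4))
```

Doubling n roughly quadruples the copy count, which is exactly the Θ(n^2) behavior derived above.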
Contrast this with the math for when you double the array size whenever you need more space: the copy done at each reallocation touches 1, 2, 4, 8, ... elements, so over n insertions the total copying work is

1 + 2 + 4 + 8 + ... + n,

which (by the sum of powers of 2 below) is less than 2n. The total work is therefore Θ(n), or O(1) amortized per insertion.
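The same kind of counting harness (again, just an illustrative sketch of my own) shows the linear total for doubling:

```python
def copies_with_doubling(n):
    """Count element copies made by a dynamic array that doubles its
    capacity whenever it runs out of space."""
    capacity = 0
    size = 0
    copies = 0
    for _ in range(n):
        if size == capacity:
            copies += size                    # copy everything over
            capacity = max(1, capacity * 2)   # double the capacity (start at 1)
        size += 1
    return copies

# The totals stay below 2n and only double when n doubles: Θ(n) overall.
for n in (1_000, 2_000, 4_000):
    print(n, copies_with_doubling(n))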
Transplanted from SO Documentation.
Sums of powers of 2 — 1 + 2 + 4 + 8 + 16 + …
The sum

2^0 + 2^1 + 2^2 + ... + 2^(n-1)

simplifies to 2^n - 1. This explains why the maximum value that can be stored in an unsigned 32-bit integer is 2^32 - 1.
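A quick numeric check of that identity (a throwaway Python snippet, not part of the original excerpt):

```python
# 2^0 + 2^1 + ... + 2^(n-1) equals 2^n - 1.
for n in (4, 8, 32):
    assert sum(2**k for k in range(n)) == 2**n - 1

# For n = 32 this is the largest value an unsigned 32-bit integer can hold.
print(2**32 - 1)   # 4294967295
```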