By now many of you must have heard about HashDoS. The researchers who found it claim in their video that the worst-case complexity of a hashtable is O(n^2). How can this be?
The question is worded incorrectly. The researchers do not claim that "the worst-case complexity of hashtables is O(n^2)".

What they claim is that "the [...] complexity of inserting n elements into the table [...] goes to O(n^2)." So the complexity of a single operation is O(n), and it is the total cost of n insertions that is O(n^2). This makes sense: if all keys have the same hash, they all land in the same bucket, which is just an array or a linked list, so each operation degrades to a linear scan of that bucket.
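You can see the quadratic blow-up directly. Here is a minimal sketch (mine, not the researchers' code) using a Python dict and a key class with a deliberately constant hash, which simulates attacker-chosen colliding keys. Counting equality checks instead of wall-clock time makes the effect deterministic:

```python
class CollidingKey:
    """A key whose hash is constant: every instance lands in the same bucket."""
    comparisons = 0  # class-wide counter of equality checks

    def __init__(self, x):
        self.x = x

    def __hash__(self):
        return 42  # degenerate hash, simulating HashDoS-style collisions

    def __eq__(self, other):
        CollidingKey.comparisons += 1
        return isinstance(other, CollidingKey) and self.x == other.x


def comparisons_for(n):
    """Count equality checks made while inserting n colliding keys into a dict."""
    CollidingKey.comparisons = 0
    d = {}
    for i in range(n):
        d[CollidingKey(i)] = i
    return CollidingKey.comparisons


# Each insert must scan every key already in the bucket, so inserting n keys
# costs roughly n*(n-1)/2 comparisons in total: quadratic, not linear.
print(comparisons_for(100))
print(comparisons_for(200))  # roughly 4x the count for n=100
```

Doubling n roughly quadruples the comparison count, which is exactly the O(n^2) total cost the researchers describe; with a well-distributed hash the same n insertions would take O(n) comparisons in total.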