Can anyone give me a reference to a web site containing a summary of the main Java data structures and their respective time complexity for common operations (like add, find, remove)? E.g. Hashtables are O(1) for finding, while LinkedLists are O(n). Some details like memory usage would be nice too.
This would be really helpful for thinking about data structures when designing algorithms.
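To make the difference in the question concrete, here is a minimal sketch (the collection choices and sizes are my own, not taken from any reference) contrasting an expected-constant-time hash lookup with a linear scan of a linked list:

```java
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

public class LookupSketch {
    public static void main(String[] args) {
        // HashMap: containsKey is expected O(1) on average, because the
        // key's hash code points straight at a bucket.
        Map<Integer, String> map = new HashMap<>();
        for (int i = 0; i < 1_000_000; i++) {
            map.put(i, "value-" + i);
        }
        boolean inMap = map.containsKey(999_999);

        // LinkedList: contains is O(n), because nodes are visited one by
        // one until the element (or the end of the list) is reached.
        LinkedList<Integer> list = new LinkedList<>();
        for (int i = 0; i < 1_000_000; i++) {
            list.add(i);
        }
        boolean inList = list.contains(999_999);

        System.out.println(inMap + " " + inList);
    }
}
```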
Is there a reason to think that Java's implementations are different (in terms of complexity) from generic, language-agnostic implementations? In other words, why not just refer to a general reference on the complexity of various data structures:
NIST Dictionary of Algorithms and Data Structures
But, if you insist on Java-specific:
Java standard data structures Big O notation
Java Collections cheatsheet V2 (dead link, but this is the first version of the cheatsheet)
The most comprehensive Java Collections overview is here:
http://en.wikiversity.org/wiki/Java_Collections_Overview
I found The Collections Framework page very useful, especially the Outline of the Collections Framework, where every interface/class is briefly described. Unfortunately there's no big-O information.
I couldn't see this particular resource mentioned here; I've found it of great use in the past. Know Thy Complexities!
http://bigocheatsheet.com/
Time and space complexities for the main collection classes should correspond to the known complexities of the underlying data structures. I don't think there's anything Java-specific about it; e.g. (as you say) a hash lookup should be O(1). You could look here or here.
I don't believe there is any single website outlining this (sounds like a good idea for a project though). I think part of the problem is that an understanding of how each algorithm runs is very important. For the most part, it sounds like you understand Big-O, so I would use that for your best guesses. Follow it up with some benchmarking/profiling to see what runs faster or slower.
And, yes, the Java docs should have much of this information in java.util.
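As a starting point for that kind of benchmarking, here is a minimal sketch (the collection choices, sizes, and timing approach are my own assumptions; System.nanoTime is only a rough measure compared to a proper harness like JMH):

```java
import java.util.HashSet;
import java.util.LinkedList;
import java.util.Set;

public class ContainsBenchmark {
    public static void main(String[] args) {
        int n = 200_000;

        Set<Integer> hashSet = new HashSet<>();
        LinkedList<Integer> linkedList = new LinkedList<>();
        for (int i = 0; i < n; i++) {
            hashSet.add(i);
            linkedList.add(i);
        }

        // Look up an element near the end so the linked list has to
        // walk almost all of its nodes.
        int target = n - 1;

        long start = System.nanoTime();
        boolean a = hashSet.contains(target);      // expected O(1)
        long hashNanos = System.nanoTime() - start;

        start = System.nanoTime();
        boolean b = linkedList.contains(target);   // O(n) scan
        long listNanos = System.nanoTime() - start;

        System.out.printf("HashSet:    %b in %d ns%n", a, hashNanos);
        System.out.printf("LinkedList: %b in %d ns%n", b, listNanos);
    }
}
```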