Why does the iteration cost of a HashSet also depend on the capacity of the backing map?

From the JavaDocs of HashSet:

This class offers constant time performance for the basic operations (add, remove, contains and size), assuming the hash function disperses the elements properly among the buckets. Iterating over this set requires time proportional to the sum of the HashSet instance's size (the number of elements) plus the "capacity" of the backing HashMap instance (the number of buckets). Thus, it's very important not to set the initial capacity too high (or the load factor too low) if iteration performance is important.

Why does iteration take time proportional to the sum (number of elements in the set + capacity of the backing map) and not just to the number of elements in the set itself?
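For concreteness, here is a quick, unscientific sketch that makes the capacity cost visible. The 1 << 22 capacity and the element counts are arbitrary values chosen purely for illustration:

    import java.util.HashSet;
    import java.util.Set;

    public class CapacityIterationDemo {
        public static void main(String[] args) {
            Set<Integer> small = new HashSet<>();        // default capacity (16)
            Set<Integer> huge = new HashSet<>(1 << 22);  // ~4M buckets for the same 100 elements

            for (int i = 0; i < 100; i++) {
                small.add(i);
                huge.add(i);
            }

            // Both sets hold 100 elements, but iterating `huge` has to scan
            // ~4 million mostly-empty buckets, so it is noticeably slower.
            long sum = 0;
            long t0 = System.nanoTime();
            for (int x : small) sum += x;
            long t1 = System.nanoTime();
            for (int x : huge) sum += x;
            long t2 = System.nanoTime();

            System.out.printf("small: %d us, huge: %d us (sum=%d)%n",
                    (t1 - t0) / 1_000, (t2 - t1) / 1_000, sum);
        }
    }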



ANSWERS:


HashSet is implemented using a HashMap in which the elements are the map keys. Since the map has a fixed number of buckets, each of which may hold zero or more entries, iteration has to check each bucket, whether it contains elements or not.
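To see why, here is a heavily simplified sketch of the bucket scan the iterator performs. This is not the actual java.util.HashMap source, just the shape of the data structure:

    // Simplified chained hash table; NOT the real HashMap code.
    public class BucketScanSketch {
        static class Node {
            final int value;
            final Node next;  // next entry chained in the same bucket
            Node(int value, Node next) { this.value = value; this.next = next; }
        }

        public static void main(String[] args) {
            Node[] table = new Node[8];  // capacity 8, only 2 buckets occupied
            table[1] = new Node(10, new Node(20, null));
            table[5] = new Node(30, null);

            // The iterator must visit all 8 buckets to find the 3 elements,
            // which is why the cost is O(capacity + size), not O(size).
            for (int bucket = 0; bucket < table.length; bucket++) {
                for (Node n = table[bucket]; n != null; n = n.next) {
                    System.out.println("bucket " + bucket + " -> " + n.value);
                }
            }
        }
    }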


Iterating a LinkedHashSet follows its "linked" list of entries, so the number of empty buckets doesn't matter. Normally you wouldn't have a HashSet whose capacity is much more than double the number of elements actually stored. Even if you do, scanning a million buckets, mostly null, doesn't take much time (milliseconds).
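A quick way to check this claim is an informal micro-benchmark like the one below; the 1 << 22 capacity is an arbitrary, deliberately oversized value:

    import java.util.HashSet;
    import java.util.LinkedHashSet;
    import java.util.Set;

    public class LinkedVsHashIteration {
        public static void main(String[] args) {
            // Same oversized capacity, same 1000 elements in each set.
            Set<Integer> hash = new HashSet<>(1 << 22);
            Set<Integer> linked = new LinkedHashSet<>(1 << 22);
            for (int i = 0; i < 1000; i++) {
                hash.add(i);
                linked.add(i);
            }

            long sum = 0;
            long t0 = System.nanoTime();
            for (int x : hash) sum += x;    // scans ~4M buckets
            long t1 = System.nanoTime();
            for (int x : linked) sum += x;  // follows the linked entry list
            long t2 = System.nanoTime();

            System.out.printf("HashSet: %d us, LinkedHashSet: %d us (sum=%d)%n",
                    (t1 - t0) / 1_000, (t2 - t1) / 1_000, sum);
        }
    }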


Why does iteration take time proportional to the sum (number of elements in the set + capacity of the backing map) and not just to the number of elements in the set itself?

The elements are dispersed across the buckets of the underlying HashMap, which is backed by an array.
The iterator does not know which buckets are occupied (only how many elements exist in total), so to reach every element it must check every bucket.


If your concern is the time it takes to iterate over the set, and you are using Java 6 or greater, take a look at this beauty:

ConcurrentSkipListSet
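For example (note that a ConcurrentSkipListSet keeps its elements sorted, so they must be Comparable or you must supply a Comparator, and its add/remove/contains operations are O(log n) rather than expected O(1)):

    import java.util.concurrent.ConcurrentSkipListSet;

    public class SkipListSetDemo {
        public static void main(String[] args) {
            ConcurrentSkipListSet<String> set = new ConcurrentSkipListSet<>();
            set.add("banana");
            set.add("apple");
            set.add("cherry");

            // Iteration follows the skip list's bottom-level links, touching
            // only the elements themselves: there is no bucket array to scan.
            // Elements come out in sorted order.
            for (String s : set) {
                System.out.println(s);  // apple, banana, cherry
            }
        }
    }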


