I have a vector with 1000 "nodes".
if (count + 1 > m_listItems.capacity())
    m_listItems.reserve(count + 100);
The problem is I also clear it out when I'm about to refill it.
m_listItems.clear();
The capacity doesn't change.
I've tried resize(1); but that doesn't seem to alter the capacity either.
So how does one shrink the reserved capacity?
vector<Item>(m_listItems).swap(m_listItems);
will shrink m_listItems to fit its current contents.
Again, see http://www.gotw.ca/gotw/054.htm (Herb Sutter).
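For illustration, here's a minimal self-contained sketch (Item is just a stand-in type and the numbers are made up) showing the capacity before and after the copy-and-swap:

#include <iostream>
#include <vector>

struct Item { int value; };

int main() {
    std::vector<Item> m_listItems;
    m_listItems.reserve(1100);        // plenty of spare capacity
    m_listItems.resize(50);           // but only 50 live elements

    std::cout << m_listItems.capacity() << '\n';   // e.g. 1100

    // Copy-construct a temporary sized to the current contents, then swap.
    std::vector<Item>(m_listItems).swap(m_listItems);

    std::cout << m_listItems.size() << '\n';       // still 50
    std::cout << m_listItems.capacity() << '\n';   // typically close to 50
}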
If you want to clear it anyway, swap with an empty vector:
vector<Item>().swap(m_listItems);
which of course is far more efficient. (Note that swapping vectors basically just swaps a couple of internal pointers; nothing time-consuming is going on.)
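A minimal sketch of the empty-vector swap (again with a stand-in Item type), showing that both size and capacity drop:

#include <iostream>
#include <vector>

struct Item { int value; };

int main() {
    std::vector<Item> m_listItems(1000);            // 1000 elements
    std::cout << m_listItems.capacity() << '\n';    // at least 1000

    std::vector<Item>().swap(m_listItems);          // swap with a temporary empty vector

    std::cout << m_listItems.size() << '\n';        // 0
    std::cout << m_listItems.capacity() << '\n';    // typically 0
}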
You can swap the vector as others have suggested, and as described in http://www.gotw.ca/gotw/054.htm, but be aware that it is not free: you're performing a copy of every element, because the vector has to allocate a new, smaller chunk of memory and copy all the old contents over. (The swap operation itself is essentially free, but you're swapping with a temporary that was initialized with a copy of the original vector's data, and that copy is not free.)
If you know in advance how big the vector is, you should allocate the right size to begin with, so no resizing is necessary:
std::vector<foo> v(1000); // Creates a vector with 1000 default-constructed elements (capacity is at least 1000)
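If what you actually want is just the capacity, without 1000 default-constructed elements, reserve() is the closer fit; a quick sketch of the difference (foo is a placeholder type):

#include <vector>

struct foo { int x; };

int main() {
    std::vector<foo> a(1000);   // size() == 1000, elements default-constructed
    std::vector<foo> b;
    b.reserve(1000);            // size() == 0, capacity() >= 1000, nothing constructed yet
}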
And if you don't know the capacity in advance, why does it matter whether it wastes a bit of space? Is it worth the time spent copying every element to a new, smaller vector (which is what std::vector<foo>(v).swap(v) does), just to save a few kilobytes of memory?
Similarly, when you clear the vector, if you intend to refill it anyway, setting its capacity to zero seems to be an impressive waste of time.
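To illustrate that point, here's a small sketch (the 1000-element count is arbitrary): clear() keeps the capacity, so refilling up to the old size doesn't reallocate at all.

#include <cassert>
#include <cstddef>
#include <vector>

int main() {
    std::vector<int> v;
    for (int i = 0; i < 1000; ++i)
        v.push_back(i);

    const std::size_t oldCapacity = v.capacity();
    v.clear();                        // size() == 0, capacity is retained

    for (int i = 0; i < 1000; ++i)    // refill: no reallocation needed
        v.push_back(i);

    assert(v.capacity() == oldCapacity);
}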
Edit:
baash05: "What if you had 1000000 items and 10 megs of RAM? Would you say reducing the amount of overhead is important?"
No. Resizing the vector requires more memory, temporarily, so if you're memory-limited, that might break your app. (You have to hold both the original vector and the temporary in memory before you can swap them, so at that point you're using up to twice as much RAM.) Afterwards, you might save a small amount of memory (up to a couple of MB), but that hardly matters: the excess capacity in the vector is never accessed, so it just gets pushed to the pagefile and doesn't count towards your RAM limit in the first place.
If you have 1000000 items, then you should initialize the vector to the correct size in the first place.
And if you can't do that, then you'll typically be better off leaving the capacity alone. Especially since you stated that you're going to refill the vector, you should definitely reuse the capacity that has already been allocated, rather than allocating, reallocating, copying and freeing everything constantly.
You have two possible cases: either you know how many elements you need to store, or you don't. If you know, create the vector with the correct size in the first place, and it never needs to resize. If you don't know, you might as well keep the excess capacity, so that at least it won't have to grow again when you refill the vector.
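A sketch of those two cases in code; the helper names and the 1000 count are made up for illustration:

#include <cstddef>
#include <vector>

// Case 1: the element count is known up front, so reserve once and fill.
void fillWithKnownCount(std::vector<int>& v, std::size_t knownCount) {
    v.clear();
    v.reserve(knownCount);            // one allocation, no growth during the fill
    for (std::size_t i = 0; i < knownCount; ++i)
        v.push_back(static_cast<int>(i));
}

// Case 2: the count is unknown; clear() keeps whatever capacity the
// previous fill built up, so growth only happens past the old high-water mark.
void refillWithUnknownCount(std::vector<int>& v) {
    v.clear();
    // ... push_back as elements arrive ...
}

int main() {
    std::vector<int> v;
    fillWithKnownCount(v, 1000);
    refillWithUnknownCount(v);
}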
You could try this technique:
std::vector< int > v;
// ... fill v with stuff...
std::vector< int >().swap( v );
You can swap it with a new vector that has the desired capacity.
vector< int > tmp;
old.swap( tmp );
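If you want the replacement to keep some specific (non-zero) capacity rather than none at all, you can reserve on the temporary before swapping; a small sketch with a made-up target of 100:

#include <vector>

int main() {
    std::vector<int> old(1000000);    // large allocation we want to shed

    std::vector<int> tmp;
    tmp.reserve(100);                 // the capacity we actually want to keep
    old.swap(tmp);                    // old: size() == 0, capacity() >= 100
}                                     // tmp, which now owns the big buffer, is freed here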
As far as I can tell, you can't ask a vector to reallocate itself down to a lower capacity; reserve() only ever grows the allocation. There are good reasons for this; among them is that reallocation is computationally expensive. If you really need a smaller vector, free the old one and create a new, smaller one (which is effectively what the swap idiom above does); that's computationally much simpler than having the vector resize itself smaller in place.