Today I ran into some strange behavior of NuGet when installing a package.
A brief description: my build script produces a NuGet package. I don't change the version each time, so each and every build produces `MyPackage.1.0.0.nupkg`. As the final step of the build, I push the package to a NuGet server deployed inside the local network.
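To be concrete, the pack-and-push step boils down to roughly the following (the `.nuspec` name and the API key variable are placeholders, not my exact script):

```
# Pack the same 1.0.0 version on every build (MyPackage.nuspec is a placeholder name).
nuget pack .\MyPackage.nuspec -Version 1.0.0

# Push it to the internal feed; the API key handling here is illustrative only.
nuget push .\MyPackage.1.0.0.nupkg -Source http://myserver/nuget -ApiKey $env:NUGET_API_KEY
```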
Now, on a different machine, I run `nuget install MyPackage -Source http://myserver/nuget`, which installs the package as expected.
The problem comes into play when I push another update of `MyPackage`, still versioned 1.0.0. When I try to re-install it on the client machine, I get the previous version of the package.
I found out that the local cache is to blame: once a package is installed, it goes into the local cache, and the next time a package of the same version is installed, it is taken from the cache. Fair enough!
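If it helps to confirm this, the cache can be listed and cleared by hand. As far as I know, NuGet.exe 3.3+ exposes a `nuget locals` command for this; older clients keep the cache under `%LocalAppData%\NuGet\Cache`:

```
# Assuming NuGet.exe 3.3 or newer: inspect and clear the local caches.
nuget locals all -list
nuget locals all -clear
```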
But, on the other hand, there's a `-NoCache` option for the `nuget install` command, and I expect it to ignore the local cache.
However, this is not true. The first time I run it with `-NoCache`, it updates the cache and installs the actual latest version. But the next time, the package is still taken from the cache, even with the `-NoCache` option.
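To make the sequence concrete, this is roughly what happens on the client machine (with another push of 1.0.0 to the server in between the two installs):

```
# First run after a fresh push: -NoCache refreshes the cache and installs the latest bits.
nuget install MyPackage -Source http://myserver/nuget -NoCache

# ...another MyPackage.1.0.0.nupkg is pushed to the server here...

# Second run: despite -NoCache, the previously cached 1.0.0 is installed again.
nuget install MyPackage -Source http://myserver/nuget -NoCache
```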
Is this expected? Is it because the version is not being changed?
Just in case: all NuGet operations are done with `NuGet.exe` from a PowerShell session.
UPDATE: I'm observing behavior I can only explain by cache expiration. Once the package is cached, all subsequent calls to `nuget install` pull it from the cache until some time passes. I haven't pinned down the exact period, but it's definitely more than an hour. After that, `nuget install` updates the package in the cache, and the cycle repeats...
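For now, a workaround that seems to force a fresh download, assuming the default cache location of older NuGet.exe clients, is to delete the cached copy before each install:

```
# Sketch of a workaround: drop the cached .nupkg, then install with -NoCache.
# The cache path below is the usual default; adjust it if your client differs.
$cached = Join-Path $env:LOCALAPPDATA "NuGet\Cache\MyPackage.1.0.0.nupkg"
if (Test-Path $cached) { Remove-Item $cached }
nuget install MyPackage -Source http://myserver/nuget -NoCache
```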