I'm wondering if we should be tracking node_modules in our repo or doing an npm install when checking out the code?
node_modules is not required to be checked in if the dependencies are listed in package.json. Any other programmer can get them by simply running npm install, and npm is smart enough to create node_modules in the working directory of the project.
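For example, a minimal package.json is all a teammate needs (the name, version, and dependency below are illustrative placeholders):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.0"
  }
}
```

Running npm install in the directory containing this file recreates node_modules from scratch.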
I agree with ivoszz that it's sometimes useful to check in the node_modules folder, but...
scenario 1:
One scenario: you use a package that gets removed from npm. If you have all the modules in the node_modules folder, it won't be a problem for you. If you only have the package name in package.json, you can't get it anymore. If a package is less than 24 hours old, its author can easily remove it from npm; if it's older than that, they need to contact npm support. So the chances of this happening are low, but there is scenario 2...
scenario 2:
Another scenario where this matters: you develop an enterprise version of your software, or a very important piece of software, and your package.json contains:

"studpid-package": "^1.0.1"

You use the method function1(x) of that package. Now the developers of studpid-package rename function1(x) to function2(x), and they make a mistake: they bump their package's version from 1.0.1 to 1.1.0 instead of 2.0.0. That's a problem, because the next time you run npm install you will accept version 1.1.0, since the caret range "^1.0.1" matches any 1.x release at or above 1.0.1. Calling function1(x) can now cause errors and problems. But:
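A hypothetical sketch of that breaking rename (the package name and function names come from the text above; the behaviour of the functions is invented for illustration):

```javascript
// Two pretend versions of "studpid-package":
const studpidV101 = { function1: (x) => x + 1 }; // the API your app was written against
const studpidV110 = { function2: (x) => x + 1 }; // function1 renamed, but only a minor version bump

// Your application code, which still calls function1:
function runApp(studpidPackage) {
  return studpidPackage.function1(41);
}

console.log(runApp(studpidV101)); // 42

try {
  runApp(studpidV110); // function1 is undefined here
} catch (err) {
  console.log(err.name); // "TypeError"
}
```

The app keeps working only as long as the resolved version still exposes function1; a committed node_modules folder (or a lockfile) pins that version.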
Pushing the whole node_modules folder (often more than 100 MB) to your repository will cost you storage space. A few kB (package.json only) compared with hundreds of MB (package.json & node_modules)... think about it.
You could do it / should think about it if:
- the software is very important.
- it costs you money when something fails.
- you don't trust the npm registry. npm is centralized and could theoretically be shut down.

You don't need to publish the node_modules folder in 99.9% of cases if:
- you develop software just for yourself.
- you've built something and just want to publish the result on GitHub because someone else might be interested in it.
If you don't want node_modules to be in your repository, just create a .gitignore file and add the line node_modules.
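A quick sketch of that setup (run from the repository root; the git command is shown as a comment because it only applies if node_modules was committed earlier):

```shell
# Append node_modules to .gitignore (creates the file if it doesn't exist yet)
printf 'node_modules/\n' >> .gitignore

# If node_modules was already committed, also untrack it without deleting it from disk:
#   git rm -r --cached node_modules

cat .gitignore
```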
I would recommend against checking in node_modules because of packages like PhantomJS and node-sass, which install the appropriate binary for the current system.
This means that if one dev runs npm install on Linux and checks in node_modules, it won't work for another dev who clones the repo on Windows.

It's better to check in the tarballs that npm install downloads and point npm-shrinkwrap.json at them. You can automate this process using shrinkpack.

This topic is pretty old, I see, but I'm missing some updates to the arguments provided here due to the changed situation in npm's ecosystem.
I'd always advise against putting node_modules under version control. Nearly all the benefits of doing so, as listed in the context of the accepted answer, are pretty outdated by now.
Published packages can't be revoked from the npm registry that easily anymore, so you don't have to fear losing dependencies your project has relied on before.
Putting the package-lock.json file in VCS helps with frequently updated dependencies that might otherwise result in different setups despite relying on the same package.json file.
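As an illustration, each entry in package-lock.json pins an exact resolved version; the package name, URL, and truncated hash below are placeholders:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "lockfileVersion": 3,
  "packages": {
    "node_modules/some-dependency": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/some-dependency/-/some-dependency-1.0.1.tgz",
      "integrity": "sha512-..."
    }
  }
}
```

With this file committed, every npm install resolves to the same 1.0.1 tarball, even though package.json only specifies a range.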
So, offline build tooling might be considered the only eligible use case left for putting node_modules into VCS. However, node_modules usually grows pretty fast, and any update will change a lot of files. This affects repositories in different ways, and if you consider the long-term effects it might be an impediment as well.
Centralized VCSs like svn require transferring committed and checked-out files over the network, which is going to be slow as hell when it comes to checking out or updating a node_modules folder.
When it comes to git, this high number of additional files will instantly pollute the repository. Keep in mind that git doesn't track differences between versions of a file; it stores a full copy of each version of a file as soon as a single character has changed (delta compression in packfiles reduces this on disk, but the object count still grows with every change). Every update to any dependency will result in another large changeset, so your git repository will quickly grow huge, which affects backups and remote synchronization. If you decide to remove node_modules from the git repository later, it's still part of the history. And if you have distributed your git repository to some remote server (e.g. for backup), cleaning it up there is another painful and error-prone task you'd be running into.
Thus, if you care about efficient processes and like to keep things "small", I'd rather use a separate artifact repository such as Nexus Repository (or just some HTTP server with ZIP archives) that provides previously fetched sets of dependencies for download.
One more thing to consider: checking in node_modules makes it harder or impossible to use the distinction between dependencies and devDependencies. On the other hand, one could say it's reassuring to push to production the exact same code that went through tests, devDependencies included.
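For illustration, a package.json might split runtime and development dependencies like this (package names and versions are placeholders):

```json
{
  "dependencies": {
    "express": "^4.18.0"
  },
  "devDependencies": {
    "mocha": "^10.0.0"
  }
}
```

With node_modules ignored, npm install --omit=dev (or the older --production flag) produces a production tree without the dev tooling; with node_modules checked in, both sets are always present.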
Module details are stored in package.json; that is enough. There's no need to check in node_modules.
People used to store node_modules in version control to lock the dependencies of modules, but with npm shrinkwrap that's not needed anymore.

Another justification for this point, as @ChrisCM wrote in the comments: