I have legacy code and a module definition like this:
define(["a", "b", "c", "d"], function(a, b, c) {
...
});
You can see that there are more dependencies than parameters actually used by the module.
Does that make any sense? I think module "d" is redundant.
It only means that module "d" (or, to be more precise, the result of calling "d"'s factory function) won't be passed as a parameter (e.g. d) to the function, so it won't be available inside it.
It's possible that module "d" executes some code with side effects, so removing it could change how your application behaves (i.e. break something).
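To illustrate the side-effect point, here is a minimal sketch using a toy stand-in for an AMD loader (the module names and the patched flag are hypothetical, purely for demonstration). Even though "main" declares no parameter for "d", the loader still executes "d"'s factory, and its side effect is observable:

```javascript
// Toy stand-in for an AMD loader, just for illustration:
const registry = {};
function define(name, deps, factory) {
  registry[name] = { deps, factory, value: undefined, loaded: false };
}
function req(name) {
  const mod = registry[name];
  if (!mod.loaded) {
    // Every listed dependency's factory runs, used as a parameter or not:
    mod.value = mod.factory(...mod.deps.map(req));
    mod.loaded = true;
  }
  return mod.value;
}

define("d", [], function () {
  globalThis.patched = true; // side effect: runs even if nobody uses the return value
  return undefined;          // no useful module value
});
define("main", ["d"], function () { // "d" listed, but no parameter for it
  return globalThis.patched;        // the side effect is still visible here
});

console.log(req("main")); // true
```

Dropping "d" from the dependency list would mean its factory never runs, which is exactly how removing an "unused" dependency can break an application.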
Having a define call with a dependency list longer than the parameter list declared on the factory function is a common occurrence with RequireJS, and not a sign that anything is wrong with the code.
For instance, when using jQuery with plugins, it is completely normal to have something like:
define(['jquery', 'jquery.foo'], function ($) {
    // Use the foo plugin.
    $('p').foo(...);
});
where 'jquery.foo' is a module implementing a jQuery plugin. jQuery plugins install themselves as methods on the jQuery object (which we get as $ in the code above). When such plugins are loaded with RequireJS, they quite often have no useful module value. If you changed the function above to function ($, foo), the foo parameter would most likely be undefined.
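The mapping is purely positional, just like ordinary function arguments in JavaScript. A small sketch (plain functions standing in for factory functions) shows both directions of the mismatch:

```javascript
// Dependencies map positionally onto factory parameters.
// A factory with three parameters given four "dependency values":
function factory(a, b, c) { return [a, b, c]; }
console.log(factory(1, 2, 3, 4)); // [ 1, 2, 3 ] -- the fourth is simply dropped

// A declared parameter with no useful module value behind it is undefined,
// which is what happens with a plugin like the hypothetical 'jquery.foo':
function factory2($, foo) { return foo; }
console.log(factory2({})); // undefined
```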
Note that none of this entails leaking globals. If jquery.foo is written so that it registers as a proper AMD module, then it does not depend on a jQuery (or $) global, and thus jQuery.noConflict(true) could have been called before it loads to remove the globals that jQuery creates by default.
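A sketch of that AMD-aware plugin pattern, again with toy stand-ins (the modules object and the stub jQuery object are hypothetical): the plugin receives jQuery from the loader instead of reading a $ global, so it keeps working even after the globals are removed.

```javascript
// Stand-in for the jQuery object the loader would provide:
const jQueryModule = { fn: {} };
// Toy stand-in for an AMD define: deps resolved from a local table, no globals:
const modules = { jquery: jQueryModule };
function define(deps, factory) {
  factory(...deps.map(function (name) { return modules[name]; }));
}

// 'jquery.foo' written as a proper AMD module -- it depends on 'jquery'
// explicitly and installs itself on the $ it is handed:
define(['jquery'], function ($) {
  $.fn.foo = function () { return 'foo called'; };
});

console.log(jQueryModule.fn.foo()); // 'foo called'
```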