I am starting a long-term project based on Node.js, and so I am looking for a solid DI system. Although Node.js at its core implies wiring components with plain require() calls, I find this approach unsuitable for a big project (for example, hard-coding the required modules in each file makes them hard to maintain, test, or swap dynamically).
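For illustration, this is the kind of hard-wired requiring I mean (car.js and the fuel module's burn() API are just hypothetical names; lib/benzine is the module I use in the examples below):

// car.js - the dependency path is fixed at author time, so swapping
// lib/benzine for a mock in tests means patching the module cache
var fuel = require('./lib/benzine');

module.exports = {
    go: function () {
        fuel.burn(); // hypothetical API on the fuel module
    }
};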
Now, I did my research before posting this question, and I found some interesting DI libraries for Node.js (see wire.js, dependable.js).
However, for maximum simplicity and minimal repetition, I came up with my own proposal for implementing DI:
Have a di.js module that acts as the container and is initialized by pointing it to a JSON file storing a map of dependency names and their corresponding .js files. This already gives the DI a dynamic character, since you can easily switch test/dev dependencies. The container returns dependencies through an inject() function, which looks up the dependency mapping and calls require() on it.
For simplicity, the container is assigned to a global variable, i.e. global.$di, so that any file in the project can use the container/injector by calling $di.inject().
Here is the gist of the implementation:
di.js:
module.exports = function (path) {
    // Load the name -> module path map from the given JSON file
    var deps = require(path);
    return {
        // Look up the dependency by name and require() the mapped module
        inject: function (name) {
            if (!deps[name])
                throw new Error('dependency "' + name + '" isn\'t registered');
            return require(deps[name]);
        }
    };
};
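Note that because require() caches modules, inject() effectively hands out singletons: every call with the same name returns the same module instance.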
The JSON dependency configuration file (e.g. dep-map-development.json):
{
    "vehicle": "lib/jetpack",
    "fuel": "lib/benzine",
    "octane": "lib/octane98"
}
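For tests, a parallel map can point the same names at stubs; as a sketch (the test/mocks paths are hypothetical), dep-map-test.json could look like:

{
    "vehicle": "test/mocks/jetpack",
    "fuel": "test/mocks/benzine",
    "octane": "test/mocks/octane98"
}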
Initialize $di in the main js file according to dev/test mode:
var path = 'dep-map-' + process.env.NODE_ENV + '.json';
global.$di = require('./di')(path);
Use it in some file:
var vehicle = $di.inject('vehicle');
vehicle.go();
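To sketch how a registered module could consume its own dependencies through the container (lib/jetpack is the name from the map above; its internals, including the burn() call, are my assumption):

// lib/jetpack.js - resolves its collaborators via the global container
var fuel = $di.inject('fuel');

module.exports = {
    go: function () {
        fuel.burn(); // hypothetical API on the injected fuel module
        console.log('jetpack is going');
    }
};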
I'm aware, of course, of the usual objections to a global variable like $di; still, for wiring like this it seems an acceptable trade-off, since it saves requiring the container in every single file.

So, what do you think: is this a reasonable approach, or does it have flaws I'm not seeing?

Thanks!