Do I need dependency injection in NodeJS, or how to deal with ...?

node.js, Dependency Injection, Inversion of Control

node.js Problem Overview


I am currently creating some experimental projects with Node.js. I have programmed a lot of Java EE web applications with Spring and appreciated the ease of dependency injection there.

Now I am curious: how do I do dependency injection with Node? Or: do I even need it? Is there a replacement concept, because the programming style is different?

So far I am talking about simple things, like sharing a database connection object, but I have not found a solution that satisfies me.

node.js Solutions


Solution 1 - node.js

In short, you don't need a dependency injection container or service locator like you would in C#/Java. Since Node.js leverages the module pattern, it's not necessary to perform constructor or property injection, although you still can.

The great thing about JS is that you can modify just about anything to achieve what you want. This comes in handy when it comes to testing.

Behold my very lame contrived example.

MyClass.js:

var fs = require('fs');

function MyClass() {}

MyClass.prototype.errorFileExists = function(dir) {
    var dirsOrFiles = fs.readdirSync(dir);
    for (var d of dirsOrFiles) {
        if (d === 'error.txt') return true;
    }
    return false;
};

module.exports = MyClass;

MyClass.test.js:

var assert = require('assert');
var MyClass = require('./MyClass');

describe('MyClass', function(){
    it('should return true if error.txt is found in the directory', function(){
        var mc = new MyClass();
        assert(mc.errorFileExists('/tmp/mydir')); // true
    });
});

Notice how MyClass depends upon the fs module? As @ShatyemShekhar mentioned, you can indeed do constructor or property injection as in other languages. But it's not necessary in Javascript.

In this case, you can do two things.

You can stub the fs.readdirSync method or you can return an entirely different module when you call require.

Method 1:

var oldmethod = fs.readdirSync;
fs.readdirSync = function(dir) {
    return ['somefile.txt', 'error.txt', 'anotherfile.txt'];
};

// *** PERFORM TEST ***

// *** RESTORE METHOD AFTER TEST ***
fs.readdirSync = oldmethod;

Method 2:

var oldrequire = require;
require = function(module) {
    if (module === 'fs') {
        return {
            readdirSync: function(dir) {
                return ['somefile.txt', 'error.txt', 'anotherfile.txt'];
            }
        };
    } else {
        return oldrequire(module);
    }
};

The key is to leverage the power of Node.js and Javascript. Note, I'm a CoffeeScript guy, so my JS syntax might be incorrect somewhere. Also, I'm not saying that this is the best way, but it is a way. Javascript gurus might be able to chime in with other solutions.

Update:

This should address your specific question regarding database connections. I'd create a separate module to encapsulate your database connection logic. Something like this:

MyDbConnection.js: (be sure to choose a better name)

var db = require('whichever_db_vendor_i_use');

module.exports.fetchConnection = function() {
    //logic to test connection

    //do I want connection pooling?

    //do I need only one connection throughout the lifecycle of my application?

    return db.createConnection(port, host, databasename); //<--- values typically from a config file
};

Then, any module that needs a database connection would just require your MyDbConnection module.

SuperCoolWebApp.js:

var dbCon = require('./lib/mydbconnection'); //wherever the file is stored

//now do something with the connection
var connection = dbCon.fetchConnection(); //mydbconnection.js is responsible for pooling, reusing, whatever your app use case is

//come TEST time of SuperCoolWebApp, you can set the require or return whatever you want, or, like I said, use an actual connection to a TEST database. 

Do not follow this example verbatim. It's a lame attempt at communicating that you should leverage the module pattern to manage your dependencies. Hopefully this helps a bit more.

Solution 2 - node.js

require() and, more recently, ES Modules (import) are THE way of managing dependencies in Node.js, and they are certainly intuitive and effective, but they also have their limitations.

My advice is to take a look at some of the Dependency Injection containers available today for Node.js to get an idea of their pros and cons. Some of them are:

Just to name a few.

Now the real question is, what can you achieve with a Node.js DI container, compared to a simple require() or import?

Pros:

  • Better testability: modules accept their dependencies as input
  • Inversion of Control: decide how to wire your modules without touching the main code of your application.
  • A customizable algorithm for resolving modules: dependencies have "virtual" identifiers; usually they are not bound to a path on the filesystem.
  • Better extensibility: enabled by IoC and "virtual" identifiers.
  • Other fancy stuff possible:
    • Async initialization
    • Module lifecycle management
    • Extensibility of the DI container itself
    • Can easily implement higher level abstractions (e.g. AOP)

Cons:

  • Different from the Node.js "experience": using DI definitely feels like you are deviating from the Node way of thinking.
  • The relationship between a dependency and its implementation is not always explicit. A dependency may be resolved at runtime and influenced by various parameters. The code becomes more difficult to understand and debug.
  • Slower startup time
  • Most DI containers will not play well with module bundlers like Browserify and Webpack.

As with anything related to software development, choosing between DI or require()/import depends on your requirements, your system complexity, and your programming style.
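
To make the "virtual identifier" point from the pros above concrete, here is a minimal sketch of the idea; the tiny container below is illustrative only, not any particular library's API:

var registry = {};
var container = {
    register: function (name, factory) { registry[name] = factory; },
    resolve: function (name) { return registry[name](); }
};

// Wiring, done once at application startup; the identifier 'db' is not a file path.
container.register('db', function () {
    return { query: function (sql) { /* run the query somehow */ } };
});

// Anywhere else in the app, consumers ask for the identifier, never the path.
var db = container.resolve('db');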

Solution 3 - node.js

I know this thread is fairly old at this point, but I figured I'd chime in with my thoughts on this. The TL;DR is that due to the untyped, dynamic nature of JavaScript, you can actually do quite a lot without resorting to the dependency injection (DI) pattern or using a DI framework. However, as an application grows larger and more complex, DI can definitely help the maintainability of your code.

DI in C#

To understand why DI isn't as big of a need in JavaScript, it's helpful to look at a strongly typed language like C#. (Apologies to those who don't know C#, but it should be easy enough to follow.) Say we have an app that describes a car and its horn. You would define two classes:

class Horn
{
    public void Honk()
    {
        Console.WriteLine("beep!");
    }
}

class Car
{
    private Horn horn;

    public Car()
    {
        this.horn = new Horn();
    }

    public void HonkHorn()
    {
        this.horn.Honk();
    }
}

class Program
{
    static void Main()
    {
        var car = new Car();
        car.HonkHorn();
    }
}

There are a few issues with writing the code this way.

  1. The Car class is tightly coupled to the particular implementation of the horn in the Horn class. If we want to change the type of horn used by the car, we have to modify the Car class even though its usage of the horn doesn't change. This also makes testing difficult because we can't test the Car class in isolation from its dependency, the Horn class.
  2. The Car class is responsible for the lifecycle of the Horn class. In a simple example like this it's not a big issue, but in real applications dependencies will have dependencies, which will have dependencies, etc. The Car class would need to be responsible for creating the entire tree of its dependencies. This is not only complicated and repetitive, but it violates the "single responsibility" of the class. It should focus on being a car, not creating instances.
  3. There is no way to reuse the same dependency instances. Again, this isn't important in this toy application, but consider a database connection. You would typically have a single instance that is shared across your application.

Now, let's refactor this to use a dependency injection pattern.

interface IHorn
{
    void Honk();
}

class Horn : IHorn
{
    public void Honk()
    {
        Console.WriteLine("beep!");
    }
}

class Car
{
    private IHorn horn;

    public Car(IHorn horn)
    {
        this.horn = horn;
    }

    public void HonkHorn()
    {
        this.horn.Honk();
    }
}

class Program
{
    static void Main()
    {
        var horn = new Horn();
        var car = new Car(horn);
        car.HonkHorn();
    }
}

We've done two key things here. First, we've introduced an interface that our Horn class implements. This lets us code the Car class to the interface instead of the particular implementation. Now the code could take anything that implements IHorn. Second, we've taken the horn instantiation out of Car and passed it in instead. This resolves the issues above and leaves it to the application's main function to manage the specific instances and their lifecycles.

What this means is that we could introduce a new type of horn for the car to use without touching the Car class:

class FrenchHorn : IHorn
{
    public void Honk()
    {
        Console.WriteLine("le beep!");
    }
}

The Main method could just inject an instance of the FrenchHorn class instead. This also dramatically simplifies testing. You could create a MockHorn class to inject into the Car constructor to ensure you are testing just the Car class in isolation.

The example above shows manual dependency injection. Typically DI is done with a framework (e.g. Unity or Ninject in the C# world). These frameworks will do all of the dependency wiring for you by walking your dependency graph and creating instances as needed.

The Standard Node.js Way

Now let's look at the same example in Node.js. We would probably break our code into 3 modules:

// horn.js
module.exports = {
    honk: function () {
        console.log("beep!");
    }
};

// car.js
var horn = require("./horn");
module.exports = {
    honkHorn: function () {
        horn.honk();
    }
};

// index.js
var car = require("./car");
car.honkHorn();

Because JavaScript is untyped, we don't have quite the same tight coupling that we had before. There is no need for interfaces (nor do they exist) as the car module will just attempt to call the honk method on whatever the horn module exports.

Additionally, because Node's require caches everything, modules are essentially singletons stored in a container. Any other module that performs a require on the horn module will get the exact same instance. This makes sharing singleton objects like database connections very easy.
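
A tiny illustration of that caching behavior (the file names are hypothetical):

// counter.js
var count = 0;
module.exports = {
    increment: function () { return ++count; }
};

// a.js
console.log(require("./counter").increment()); // 1

// b.js (loaded later in the same process)
console.log(require("./counter").increment()); // 2 -- the same cached instance, not a fresh copy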

Now there is still the issue that the car module is responsible for fetching its own dependency horn. If you wanted the car to use a different module for its horn, you'd have to change the require statement in the car module. This is not a very common thing to do, but it does cause issues with testing.

The usual way people handle the testing problem is with proxyquire. Owing to the dynamic nature of JavaScript, proxyquire intercepts calls to require and returns any stubs/mocks you provide instead.

var proxyquire = require('proxyquire');
var hornStub = {
    honk: function () {
        console.log("test beep!");
    }
};

var car = proxyquire('./car', { './horn': hornStub });

// Now make test assertions on car...

This is more than enough for most applications. If it works for your app then go with it. However, in my experience as applications grow larger and more complex, maintaining code like this becomes harder.

DI in JavaScript

Node.js is very flexible. If you aren't satisfied with the method above, you can write your modules using the dependency injection pattern. In this pattern, every module exports a factory function (or a class constructor).

// horn.js
module.exports = function () {
    return {
        honk: function () {
            console.log("beep!");
        }
    };
};

// car.js
module.exports = function (horn) {
    return {
        honkHorn: function () {
            horn.honk();
        }
    };
};

// index.js
var horn = require("./horn")();
var car = require("./car")(horn);
car.honkHorn();

This is very much analogous to the C# method earlier in that the index.js module is responsible for instance lifecycles and wiring. Unit testing is quite simple as you can just pass in mocks/stubs to the functions. Again, if this is good enough for your application go with it.
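
For example, a unit test for car.js in this style might look like the following (the test file and stub are hypothetical):

// car.test.js
var carFactory = require("./car");

var hornStub = {
    honk: function () { console.log("test beep!"); }
};

// Pass the stub straight into the factory -- no module interception required.
var car = carFactory(hornStub);
car.honkHorn(); // exercises car.js in isolation and prints "test beep!"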

Bolus DI Framework

Unlike C#, there are no established standard DI frameworks to help with your dependency management. There are a number of frameworks in the npm registry but none have widespread adoption. Many of these options have been cited already in the other answers.

I wasn't particularly happy with any of the options available so I wrote my own called bolus. Bolus is designed to work with code written in the DI style above and tries to be very DRY and very simple. Using the exact same car.js and horn.js modules above, you can rewrite the index.js module with bolus as:

// index.js
var Injector = require("bolus");
var injector = new Injector();
injector.registerPath("**/*.js");

var car = injector.resolve("car");
car.honkHorn();

The basic idea is that you create an injector. You register all of your modules in the injector. Then you simply resolve what you need. Bolus will walk the dependency graph and create and inject dependencies as needed. You don't save much in a toy example like this, but in large applications with complicated dependency trees the savings are huge.

Bolus supports a bunch of nifty features like optional dependencies and test globals, but there are two key benefits I've seen relative to the standard Node.js approach. First, if you have a lot of similar applications, you can create a private npm module for your base that creates an injector and registers useful objects on it. Then your specific apps can add, override, and resolve as needed much like how AngularJS's injector works. Second, you can use bolus to manage various contexts of dependencies. For example, you could use middleware to create a child injector per request, register the user id, session id, logger, etc. on the injector along with any modules depending on those. Then resolve what you need to serve requests. This gives you instances of your modules per request and prevents having to pass the logger, etc. along to every module function call.

Solution 4 - node.js

I've also written a module to accomplish this, it's called rewire. Just use npm install rewire and then:

var rewire = require("rewire"),
    myModule = rewire("./path/to/myModule.js"); // exactly like require()

// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123


// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
    readFile: function (path, encoding, cb) {
        cb(null, "Success!");
    }
});
myModule.readSomethingFromFileSystem(function (err, data) {
    console.log(data); // = Success!
});
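
For context, the module under test above isn't shown; it might look something like this (a hypothetical sketch; rewire's __set__ swaps out the module-level fs and myPrivateVar variables):

// path/to/myModule.js
var fs = require("fs");

var myPrivateVar = 42;

exports.readSomethingFromFileSystem = function (cb) {
    fs.readFile("/some/file", "utf8", cb);
};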

I was inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly as if you had used require() (except for your modifications). Debugging is also fully supported.

Solution 5 - node.js

I built Electrolyte for just this purpose. The other dependency injection solutions out there were too invasive for my tastes, and messing with the global require is a particular grievance of mine.

Electrolyte embraces modules, specifically those that export a "setup" function like you see in Connect/Express middleware. Essentially, these types of modules are just factories for some object they return.

For example, a module that creates a database connection:

var mysql = require('mysql');

exports = module.exports = function(settings) {
  var connection = mysql.createConnection({
    host: settings.dbHost,
    port: settings.dbPort
  });

  connection.connect(function(err) {
    if (err) { throw err; }
  });

  return connection;
}

exports['@singleton'] = true;
exports['@require'] = [ 'settings' ];

What you see at the bottom are annotations, an extra bit of metadata that Electrolyte uses to instantiate and inject dependencies, automatically wiring your application's components together.

To create a database connection:

var db = electrolyte.create('database');

Electrolyte transitively traverses the @require'd dependencies, and injects instances as arguments to the exported function.
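
For example, the 'settings' dependency referenced by @require above could itself be a component written in the same style (a hypothetical sketch, not part of Electrolyte itself):

// settings.js
exports = module.exports = function() {
  return {
    dbHost: process.env.DB_HOST || 'localhost',
    dbPort: process.env.DB_PORT || 3306
  };
};

exports['@singleton'] = true;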

The key is that this is minimally invasive. This module is completely usable, independent of Electrolyte itself. That means your unit tests can test just the module under test, passing in mock objects without need for additional dependencies to rewire internals.

When running the full application, Electrolyte steps in at the inter-module level, wiring things together without the need for globals, singletons or excessive plumbing.

Solution 6 - node.js

I looked into this myself. I dislike introducing magic dependency-utility libraries that provide mechanisms to hijack my module imports. Instead, I came up with a "design guideline" for my team: explicitly state which dependencies can be mocked by introducing a factory function export within my modules.

I make extensive use of ES6 default parameters and destructuring in order to avoid some boilerplate and provide a named dependency-override mechanism.

Here is an example:

import foo from './utils/foo';
import bob from './utils/bob';

// We export a factory which accepts our dependencies.
export const factory = (dependencies = {}) => {
  const {
    // The 'bob' dependency.  We default to the standard 'bob' imp if not provided.
    $bob = bob, 
    // Instead of exposing the whole 'foo' api, we only provide a mechanism
    // with which to override the specific part of foo we care about.
    $doSomething = foo.doSomething // defaults to standard imp if none provided.
  } = dependencies;  

  return function bar() {
    return $bob($doSomething());
  }
}

// The default implementation, which would end up using default deps.
export default factory();

And here is an example of its usage:

import { factory } from './bar';

const underTest = factory({ $bob: () => 'BOB!' }); // only override bob!
const result = underTest();

Excuse the ES6 syntax for those unfamiliar with it.

Solution 7 - node.js

I recently checked this thread for much the same reason as the OP - most of the libs I've encountered temporarily rewrite the require statement. I've had mixed degrees of success with this method, and so I ended up using the following approach.

In the context of an express application - I wrap app.js in a bootstrap.js file:

var path = require('path');
var myapp = require('./app.js');

var loader = require('./server/services/loader.js');

// give the loader the root directory
// and an object mapping module names 
// to paths relative to that root
loader.init(path.normalize(__dirname), require('./server/config/loader.js')); 

myapp.start();

The object map passed to the loader looks like this:

// live loader config
module.exports = {
    'dataBaseService': '/lib/dataBaseService.js'
}

// test loader config
module.exports = {
    'dataBaseService': '/mocks/dataBaseService.js',
    'otherService': { other: 'service' } // takes objects too...
};

Then, rather than directly calling require...

var myDatabaseService = loader.load('dataBaseService');

If no alias is found in the loader, it just falls back to a regular require. This has two benefits: I can swap in any version of the class, and it removes the need to use relative path names throughout the application (so if I need a custom lib below or above the current file, I don't need to traverse directories, and require will cache the module against the same key). It also allows me to specify mocks at any point in the app, rather than only in the immediate test suite.
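
The loader itself isn't shown above; a minimal sketch of what it could look like, assuming the init/load API used in the examples, is:

// server/services/loader.js (hypothetical implementation)
var path = require('path');

var root = null;
var aliases = {};

exports.init = function (rootDir, aliasMap) {
    root = rootDir;
    aliases = aliasMap || {};
};

exports.load = function (name) {
    var alias = aliases[name];
    if (alias === undefined) {
        return require(name);                   // no alias: fall back to a regular require
    }
    if (typeof alias === 'string') {
        return require(path.join(root, alias)); // alias is a path relative to the root
    }
    return alias;                               // alias is a plain object (e.g. a mock)
};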

I've just published a little npm module for convenience:

https://npmjs.org/package/nodejs-simple-loader

Solution 8 - node.js

The reality is that you can test your Node.js application without an IoC container because JavaScript is a really dynamic programming language and you can modify almost everything at run-time.

Consider the following:

import UserRepository from "./dal/user_repository";

class UserController {
    constructor() {
        this._repository = new UserRepository();
    }
    getUsers() {
        this._repository.getAll();
    }
}

export default UserController;

So you can override the coupling between components at run-time. However, I like to think that we should aim to decouple our JavaScript modules.

The only way to achieve real decoupling is by removing the reference to the UserRepository:

class UserController {
    constructor(userRepository) {
        this._repository = userRepository;
    }
    getUsers() {
        this._repository.getAll();
    }
}

export default UserController;

This means that somewhere else you will need to do the object composition:

import UserRepository from "./dal/user_repository";
import UserController from "./dal/user_controller";

export default new UserController(new UserRepository());

I like the idea of delegating the object composition to an IoC container. You can learn more about this idea in the article The current state of dependency inversion in JavaScript. The article tries to debunk some "JavaScript IoC container myths":

> Myth 1: There is no place for IoC containers in JavaScript
>
> Myth 2: We don't need IoC containers, we already have module loaders!
>
> Myth 3: Dependency inversion === injecting dependencies

If you also like the idea of using an IoC container, you could take a look at InversifyJS. The latest release (2.0.0) supports many use cases:

  • Kernel modules
  • Kernel middleware
  • Use classes, string literals or Symbols as dependency identifiers
  • Injection of constant values
  • Injection of class constructors
  • Injection of factories
  • Auto factory
  • Injection of providers (async factory)
  • Activation handlers (used to inject proxies)
  • Multi injections
  • Tagged bindings
  • Custom tag decorators
  • Named bindings
  • Contextual bindings
  • Friendly exceptions (e.g. Circular dependencies)

You can learn more about it at InversifyJS.

Solution 9 - node.js

For ES6 I developed this container https://github.com/zazoomauro/node-dependency-injection

import {ContainerBuilder} from 'node-dependency-injection'

let container = new ContainerBuilder()
container.register('mailer', 'Mailer')

Then you can set, for example, the choice of transport in the container:

import {ContainerBuilder} from 'node-dependency-injection'

let container = new ContainerBuilder()
container
  .register('mailer', 'Mailer')
  .addArgument('sendmail')

This class is now much more flexible as you have separated the choice of transport out of the implementation and into the container.
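
The Mailer class itself isn't shown in the answer; it might look something like this (a hypothetical sketch):

// Mailer.js
class Mailer {
  constructor (transport) {
    this._transport = transport // e.g. 'sendmail', supplied by the container via addArgument
  }

  send (message) {
    // deliver the message using this._transport
  }
}

export default Mailer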

Now that the mailer service is in the container you can inject it as a dependency of other classes. If you have a NewsletterManager class like this:

class NewsletterManager {
    constructor (mailer, fs) {
        this._mailer = mailer
        this._fs = fs
    }
}

export default NewsletterManager

When defining the newsletter_manager service, the mailer service does not exist yet. Use the Reference class to tell the container to inject the mailer service when it initializes the newsletter manager:

import {ContainerBuilder, Reference, PackageReference} from 'node-dependency-injection'
import Mailer from './Mailer'
import NewsletterManager from './NewsletterManager'

let container = new ContainerBuilder()

container
  .register('mailer', Mailer)
  .addArgument('sendmail')

container
  .register('newsletter_manager', NewsletterManager)
  .addArgument(new Reference('mailer'))
  .addArgument(new PackageReference('fs-extra'))

You can also set up the container with configuration files such as YAML, JSON or JS files.

The service container can be compiled for various reasons. These reasons include checking for any potential issues such as circular references and making the container more efficient.

container.compile()

Solution 10 - node.js

It depends on the design of your application. You can obviously do Java-like injection, where you create an instance of a class with the dependency passed into the constructor, like this:

function Cache(store) {
   this._store = store;
}

var cache = new Cache(mysqlStore);

If you are not doing OOP in javascript, you can make an init function that sets everything up.

However, there is another approach that you can take, which is more common in an event-based system such as node.js. If you can model your application to (most of the time) only act on events, then all you need to do is set everything up (which I usually do by calling an init function) and emit events from a stub. This makes testing considerably easier and more readable.
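
A minimal sketch of that event-driven approach (module and event names are hypothetical):

var EventEmitter = require('events').EventEmitter;
var bus = new EventEmitter();

// init() wires everything up once at startup, using whatever dependencies you pass in.
function init(store) {
    bus.on('user:created', function (user) {
        store.save(user);
    });
}

// In a test, call init() with a stub and simply emit events.
var saved = [];
init({ save: function (u) { saved.push(u); } });
bus.emit('user:created', { name: 'Erik' });
// saved now contains the emitted user, ready to be asserted on.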

Solution 11 - node.js

I always liked the simplicity of the IoC concept: "You don't have to know anything about the environment, you'll be called by someone when needed."

But all IoC implementations I saw did exactly the opposite - they cluttered the code with even more things than without it. So, I created my own IoC that works the way I'd like it to - it stays hidden and invisible 90% of the time.

It's used in MonoJS web framework http://monojs.org

> I am talking about simple things, like sharing a database connection object, so far, but I have not found a solution that satisfies me.

It's done like this - register component once in config.

app.register 'db', -> 
  require('mongodb').connect config.dbPath

And use it anywhere

app.db.findSomething()

You can see the full component definition code (with DB Connection and other Components) here https://github.com/sinizinairina/mono/blob/master/mono.coffee

This is the only place where you have to tell the IoC what to do; after that, all those components will be created and wired automatically and you don't have to see IoC-specific code in your application anymore.

The IoC itself https://github.com/alexeypetrushin/miconjs

Solution 12 - node.js

I think we still need dependency injection in Node.js because it loosens the dependencies between services and makes the application clearer.

Inspired by the Spring Framework, I also implemented my own module to support dependency injection in Node.js. My module is also able to detect code changes and automatically reload the services without restarting your application.

Visit my project at: Buncha - IoC container

Thank you!

Solution 13 - node.js

Node.js requires DI as much as any other platform. If you are building something big, DI will make it easier to mock the dependencies of your code and test your code thoroughly.

Your database-layer modules, for example, shouldn't just get required by your business-code modules, because when unit testing those business-code modules, the DAOs will load and connect to the database.

One solution would be to pass the dependencies as module parameters:

module.exports = function (dep1, dep2) {
    // private methods

    return {
        // public methods
        test: function () { ... }
    };
};

This way dependencies can be mocked easily and naturally and you can stay focused on testing your code, without using any tricky 3rd party library.
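
For example, wiring such a module in application code versus in a test could look like this (the file and dependency names are hypothetical):

// application code: userService.js exports a factory like the one above
var db = require('./db');
var mailer = require('./mailer');
var userService = require('./userService')(db, mailer);

// unit test: pass simple mocks instead, so no database is ever touched
var fakeDb = { query: function () { return []; } };
var fakeMailer = { send: function () {} };
var serviceUnderTest = require('./userService')(fakeDb, fakeMailer);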

There are other solutions out there (broadway, architect, etc.) which can help you with this, although they may do more than what you want or add more clutter.

Solution 14 - node.js

I discovered this question while answering an issue on my own DI module asking why one would ever need a DI system for Node.js programming.

The answer was clearly tending towards the ones given in this thread: it depends. There are trade-offs for both approaches, and reading this question's answers gives a good picture of them.

So, the real answer to this question, should be that in some situations, you would use a DI system, in others not.

That said, what you want as a developer is to not repeat yourself and reuse your services across your various applications.

This means that we should write services that are ready to be used in a DI system but not tied to DI libraries. To me, it means that we should write services like this:

module.exports = initDBService;

// Tells any DI lib what it expects to find in its context object.
// The $inject prop is the de facto standard for DI imo.
initDBService.$inject = ['ENV'];

// Note the context object; imo, a DI tool should bring
// services in a single context object.
function initDBService({ ENV }) {
    // actual service code
}

That way your service works no matter whether you use it with or without a DI tool.
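
For instance, consuming such a service without any DI tool is just a function call, while a container (hypothetical API sketched in the comments) would read $inject and build the context object for you:

// without a DI tool: call the factory yourself (file name hypothetical)
var initDBService = require('./dbService');
var dbService = initDBService({ ENV: process.env });

// with a hypothetical DI container: it reads initDBService.$inject
// container.register('ENV', process.env);
// container.register('dbService', initDBService);
// var dbService = container.resolve('dbService');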

Solution 15 - node.js

I worked with .NET, PHP and Java for a long time, so I wanted to have convenient dependency injection in Node.js too. People said the built-in DI in Node.js is enough, as we can get it with modules, but that didn't satisfy me. I wanted to keep a module to no more than a class. Additionally, I wanted the DI to have full support for module lifecycle management (singleton modules, transient modules, etc.), but with Node modules I had to write manual code very often. Lastly, I wanted to make unit testing easier. That's why I created a dependency injection container for myself.

If you are looking for a DI container, give it a try. It can be found here: https://github.com/robo-creative/nodejs-robo-container. It's fully documented. It also addresses some common problems with DI and how to solve them in an OOP way. Hope it helps.

Solution 16 - node.js

TypeDI is the sweetest of all those mentioned here. Look at this code using TypeDI:

import "reflect-metadata";
import {Service, Container} from "typedi";

@Service()
class SomeClass {

    someMethod() {
    }

}

let someClass = Container.get(SomeClass);
someClass.someMethod();

Look at this code too:

import {Container, Service, Inject} from "typedi";

// somewhere in your global app parameters
Container.set("authorization-token", "RVT9rVjSVN");

@Service()
class UserRepository {

    @Inject("authorization-token")
    authorizationToken: string;

}

Solution 17 - node.js

Have a look at dips (A simple yet powerful dependency injection and entity (file) management framework for Node.js)

https://github.com/devcrust/node-dips

Solution 18 - node.js

I think other posts have done a great job of arguing for using DI. For me, the reasons are:

  1. Inject dependencies without knowing their paths. This means that if you change a module location on disk or swap it with another, you don't need to touch every file that depends on it.

  2. It makes it a lot easier to mock dependencies for testing without the pain of overriding the global require function in a way that works without problems.

  3. It helps you organize and reason about your application as loosely coupled modules.

But I had a really hard time finding a DI framework that my team and I could easily adopt. So I recently built a framework called deppie based on these features:

  • Minimal API that can be learned in a few minutes

  • No extra code/config/annotations required

  • One to one direct mapping to require modules

  • Can be adopted partially to work with existing code

Solution 19 - node.js

It should be flexible and simple like this:

var MyClass1 = function () {}
var MyClass2 = function (myService1) {
    // myService1.should.be.instanceof(MyClass1); 
}
 

container.register('myService1', MyClass1);
container.register('myService2', MyClass2, ['myService1']);

I have written an article about dependency injection in node.js.

I hope it can help you with this.

Solution 20 - node.js

I have developed a library that handles dependency injection in a simple way that decreases boilerplate code. Each module is defined by a unique name and a controller function. The parameters of the controller reflect the module's dependencies.

Read more on KlarkJS

Brief example:

KlarkModule(module, 'myModuleName1', function($nodeModule1, myModuleName2) {
    return {
        log: function() { console.log('Hello from module myModuleName1') }
    };
});
  • myModuleName1 is the name of the module.

  • $nodeModule1 is an external library from node_module. The name resolves to node-module1. The prefix $ indicates that it is an external module.

  • myModuleName2 is the name of an internal module.

  • The return value of the controller is used from the other internal modules, when they define the parameter myModuleName1.

Solution 21 - node.js

I recently created a library called circuitbox which allows you to use dependency-injection with node.js. It does true dependency-injection vs. many of the dependency-lookup based libraries I have seen. Circuitbox also supports asynchronous creation and initialization routines. Below is an example:

Assume the following code is in a file called consoleMessagePrinter.js

'use strict';

// Our console message printer
// deps is injected by circuitbox with the dependencies
function ConsoleMessagePrinter(deps) {
  return {
    print: function () {
      console.log(deps.messageSource.message());
    }
  };
}

module.exports = ConsoleMessagePrinter;

Assume the following is in the file main.js

'use strict';

// our simple message source
// deps is injected by circuitbox with the dependencies
var simpleMessageSource = function (deps) {
  return {
    message: function () {
      return deps.message;
    }
  };
};

// require circuitbox
var circuitbox = require('../lib');

// create a circuitbox
circuitbox.create({
  modules: [
    function (registry) {
      // the message to be used
      registry.for('message').use('This is the message');

      // define the message source
      registry.for('messageSource').use(simpleMessageSource)
        .dependsOn('message');

      // define the message printer - does a module.require internally
      registry.for('messagePrinter').requires('./consoleMessagePrinter')
        .dependsOn('messageSource');
    }
  ]
}).done(function (cbx) {
  
  // get the message printer and print a message
  cbx.get('messagePrinter').done(function (printer) {
    printer.print();
  }, function (err) {
    console.log('Could not receive a printer');
    return;
  });

}, function (err) {
  console.log('Could not create circuitbox');
});

Circuitbox lets you define your components and declare their dependencies as modules. Once it's initialized, it allows you to retrieve a component. Circuitbox automatically injects all the components the target component requires and gives it to you for use.

The project is in alpha version. Your comments, ideas and feedback are welcome.

Hope it helps!

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Erik | View Question on Stackoverflow
Solution 1 - node.js | JP Richardson | View Answer on Stackoverflow
Solution 2 - node.js | Mario | View Answer on Stackoverflow
Solution 3 - node.js | Dave Johnson | View Answer on Stackoverflow
Solution 4 - node.js | Johannes Ewald | View Answer on Stackoverflow
Solution 5 - node.js | Jared Hanson | View Answer on Stackoverflow
Solution 6 - node.js | ctrlplusb | View Answer on Stackoverflow
Solution 7 - node.js | sunwukung | View Answer on Stackoverflow
Solution 8 - node.js | Remo H. Jansen | View Answer on Stackoverflow
Solution 9 - node.js | Mauro | View Answer on Stackoverflow
Solution 10 - node.js | Satyam Shekhar | View Answer on Stackoverflow
Solution 11 - node.js | Alex Craft | View Answer on Stackoverflow
Solution 12 - node.js | Tho | View Answer on Stackoverflow
Solution 13 - node.js | user2468170 | View Answer on Stackoverflow
Solution 14 - node.js | nfroidure | View Answer on Stackoverflow
Solution 15 - node.js | Robo | View Answer on Stackoverflow
Solution 16 - node.js | ahmadalibaloch | View Answer on Stackoverflow
Solution 17 - node.js | Mario | View Answer on Stackoverflow
Solution 18 - node.js | gafi | View Answer on Stackoverflow
Solution 19 - node.js | slava | View Answer on Stackoverflow
Solution 20 - node.js | Apostolidis | View Answer on Stackoverflow
Solution 21 - node.js | oddjobsman | View Answer on Stackoverflow