Understanding Node.js modules: multiple requires return the same object?


node.js Problem Overview


I have a question related to the node.js documentation on module caching:

> Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
>
> Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

What is meant by "may"?

I want to know whether require will always return the same object. Suppose I require a module A in app.js and change the exports object that require returns. If I then require a module B in app.js, and B itself requires module A, will I always get the modified version of that object, or a new one?

// app.js

var a = require('./a');
a.b = 2;
console.log(a.b); //2

var b = require('./b');
console.log(b.b); //2

// a.js

exports.a = 1;

// b.js

module.exports = require('./a');

node.js Solutions


Solution 1 - node.js

If both app.js and b.js reside in the same project (and in the same directory) then both of them will receive the same instance of A. From the node.js documentation:

> ... every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.


The situation is different when a.js, b.js and app.js are in different npm modules. For example:

[APP] --> [A], [B]
[B]   --> [A]

In that case the require('a') in app.js would resolve to a different copy of a.js than require('a') in b.js and therefore return a different instance of A. There is a blog post describing this behavior in more detail.

Solution 2 - node.js

node.js implements module caching, which keeps it from re-reading the same files thousands of times while running a large server project.

This cache is exposed through the require.cache object. Note that this object is readable and writable, which makes it possible to delete entries from the cache without killing the process.

http://nodejs.org/docs/latest/api/globals.html#require.cache

Oh, and to answer the question: require returns a reference to the cached exports object, not a fresh copy, so modifying that object is visible to every subsequent require of the same file. Editing the file and deleting its cache entry does change what is exported on the next load.

After doing some tests: node.js does cache module.exports, and modifying require.cache[{module}].exports causes subsequent require calls to return the modified object.

Solution 3 - node.js

Since the question was posted, the documentation has been updated to make clear why "may" was originally used. It now answers the question itself by making things explicit (my emphasis to show what's changed):

> Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
>
> **Provided require.cache is not modified**, multiple calls to require('foo') **will not** cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

Solution 4 - node.js

From what I have seen, if the module name resolves to a previously loaded file, the cached module is returned; otherwise the new file is loaded separately.

That is, caching is based on the actual file name that gets resolved. This is because, in general, there can be different versions of the same package that are installed at different levels of the file hierarchy and that must be loaded accordingly.

What I am not sure about is whether there are cases of cache invalidation outside the programmer's control or awareness that might accidentally cause the very same package file to be reloaded multiple times.

Solution 5 - node.js

If the reason you want require(x) to return a fresh object every time is simply that you modify that object directly (which is the case I ran into), just clone it, then modify and use only the clone, like this:

var a = require('./a');
a = JSON.parse(JSON.stringify(a));
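One caveat worth knowing about this trick: a JSON round-trip only preserves plain data, so any functions (or Dates, Maps, etc.) on the exported object are lost in the clone. A small sketch with an invented object:

```javascript
// Sketch: JSON.parse(JSON.stringify(...)) drops non-JSON values
// such as functions, so it only suits plain-data exports.
const original = { n: 1, greet: () => 'hi' };
const clone = JSON.parse(JSON.stringify(original));

console.log(clone.n);            // 1
console.log(typeof clone.greet); // 'undefined' -- the function was lost
```

If the exported object carries methods, a different cloning strategy is needed; the JSON round-trip is only a quick fix for configuration-style data.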

Solution 6 - node.js

try drex: https://github.com/yuryb/drex

> drex is watching a module for updates and cleanly re-requires the module after the update. New code is being require()d as if the new code is a totally different module, so require.cache is not a problem.

Solution 7 - node.js

When you require an object, you get a reference to it, and requiring it twice returns the same reference. To work with independent copies of the same object, you should clone it:

var obj = require('./obj');

a = JSON.parse(JSON.stringify(obj));
b = JSON.parse(JSON.stringify(obj));
c = JSON.parse(JSON.stringify(obj));

Cloning can be done in multiple ways; see this for further information.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | Xomby | View Question on Stackoverflow |
| Solution 1 - node.js | Petr Stodulka | View Answer on Stackoverflow |
| Solution 2 - node.js | moe | View Answer on Stackoverflow |
| Solution 3 - node.js | Reg Edit | View Answer on Stackoverflow |
| Solution 4 - node.js | Simone C. | View Answer on Stackoverflow |
| Solution 5 - node.js | Evgeniy Berezovsky | View Answer on Stackoverflow |
| Solution 6 - node.js | user2030657 | View Answer on Stackoverflow |
| Solution 7 - node.js | Amir Fo | View Answer on Stackoverflow |