Node.js Design Patterns

The Module System

In Chapter 1, The Node.js Platform, we briefly introduced the importance of modules in Node.js. We discussed how modules play a fundamental role in defining some of the pillars of the Node.js philosophy and its programming experience. But what do we actually mean when we talk about modules and why are they so important?

In generic terms, modules are the bricks for structuring non-trivial applications. Modules allow you to divide the codebase into small units that can be developed and tested independently. Modules are also the main mechanism to enforce information hiding by keeping private all the functions and variables that are not explicitly marked to be exported.

If you come from other languages, you have probably seen similar concepts being referred to with different names: package (Java, Go, PHP, Rust, or Dart), assembly (.NET), library (Ruby), or unit (Pascal dialects). The terminology is not perfectly interchangeable because every language or ecosystem comes with its own unique characteristics, but there is a significant overlap between these concepts.

Interestingly enough, Node.js currently comes with two different module systems: CommonJS (CJS) and ECMAScript modules (ESM or ES modules). In this chapter, we will discuss why there are two alternatives, we will learn about their pros and cons, and, finally, we will analyze several common patterns that are relevant when using or writing Node.js modules. By the end of this chapter, you should be able to make pragmatic choices about how to use modules effectively and how to write your own custom modules.

Getting a good grasp of Node.js' module systems and module patterns is very important as we will rely on this knowledge in all the other chapters of this book.

In short, these are the main topics we will be discussing throughout this chapter:

  • Why modules are necessary and the different module systems available in Node.js
  • CommonJS internals and module patterns
  • ES modules (ESM) in Node.js
  • Differences and interoperability between CommonJS and ESM

Let's begin with why we need modules.

The need for modules

A good module system should help with addressing some fundamental needs of software engineering:

  • Having a way to split the codebase into multiple files. This helps with keeping the code more organized and easier to understand, and it also helps with developing and testing various pieces of functionality independently from each other.
  • Allowing code reuse across different projects. A module can, in fact, implement a generic feature that can be useful for different projects. Organizing such functionality within a module can make it easier to bring it into the different projects that may want to use it.
  • Encapsulation (or information hiding). It is generally a good idea to hide implementation complexity and only expose simple interfaces with clear responsibilities. Most module systems allow you to selectively keep the private part of the code hidden, while exposing a public interface, such as functions, classes, or objects that are meant to be used by the consumers of the module.
  • Managing dependencies. A good module system should make it easy for module developers to build on top of existing modules, including third-party ones. A module system should also make it easy for module users to import the chain of dependencies that are necessary for a given module to run (transitive dependencies).

It is important to clarify the distinction between a module and a module system. We can define a module as the actual unit of software, while a module system is the syntax and the tooling that allows us to define modules and to use them within our projects.

Module systems in JavaScript and Node.js

Not all programming languages come with a built-in module system, and JavaScript had been lacking this feature for a long time.

In the browser landscape, it is possible to split the codebase into multiple files and then import them by using different <script> tags. For many years, this approach was good enough to build simple interactive websites, and JavaScript developers managed to get things done without having a fully-fledged module system.

Only when JavaScript browser applications became more complicated and frameworks like jQuery, Backbone, and AngularJS took over the ecosystem did the JavaScript community come up with several initiatives aimed at defining a module system that could be effectively adopted within JavaScript projects. The most successful ones were asynchronous module definition (AMD), popularized by RequireJS (nodejsdp.link/requirejs), and later Universal Module Definition (UMD, nodejsdp.link/umd).

When Node.js was created, it was conceived as a server runtime for JavaScript with direct access to the underlying filesystem, so there was a unique opportunity to introduce a different way to manage modules. The idea was not to rely on HTML <script> tags and resources accessible through URLs. Instead, the choice was to rely purely on JavaScript files available on the local filesystem. For its module system, Node.js came up with an implementation of the CommonJS specification (sometimes also referred to as CJS, nodejsdp.link/commonjs), which was designed to provide a module system for JavaScript in browserless environments.

CommonJS has been the dominant module system in Node.js since its inception, and it has become very prominent also in the browser landscape thanks to module bundlers like Browserify (nodejsdp.link/browserify) and webpack (nodejsdp.link/webpack).

In 2015, with the release of ECMAScript 6 (also called ECMAScript 2015 or ES2015), there was finally an official proposal for a standard module system: ESM or ECMAScript modules. ESM brings a lot of innovation to the JavaScript ecosystem and, among other things, it tries to bridge the gap between how modules are managed on browsers and servers.

ECMAScript 6 defined only the formal specification for ESM in terms of syntax and semantics, but it didn't provide any implementation details. It took different browser companies and the Node.js community several years to come up with solid implementations of the specification. Node.js ships with stable support for ESM starting from version 13.2.

At the time of writing, the general feeling is that ESM is going to become the de facto way to manage JavaScript modules in both the browser and the server landscape. The reality today, though, is that the majority of projects are still heavily relying on CommonJS and it will take some time for ESM to catch up and eventually become the dominant standard.

To provide a comprehensive overview of module-related patterns in Node.js, in the first part of this chapter, we will discuss them in the context of CommonJS, and then, in the second part of the chapter, we will revisit our learnings using ESM.

The goal of this chapter is to make you comfortable with both module systems, but in the rest of the book, we will only be using ESM for our code examples. The idea is to encourage you to leverage ESM as much as possible so that your code will be more future-proof.

If you are reading this chapter a few years after its publication, you are probably not too worried about CommonJS, and you might want to jump straight into the ESM part. This is probably fine, but we still encourage you to go through the entire chapter, because understanding CommonJS and its characteristics will certainly be beneficial in helping you to understand ESM and its strengths in much more depth.

The module system and its patterns

As we said, modules are the bricks for structuring non-trivial applications and the main mechanism to enforce information hiding by keeping private all the functions and variables that are not explicitly marked to be exported.

Before getting into the specifics of CommonJS, let's discuss a generic pattern that helps with information hiding and that we will be using for building a simple module system, which is the revealing module pattern.

The revealing module pattern

One of the bigger problems with JavaScript in the browser is the lack of namespacing. Every script runs in the global scope; therefore, internal application code or third-party dependencies can pollute the scope while exposing their own pieces of functionality. This can be extremely harmful. Imagine, for instance, that a third-party library instantiates a global variable called utils. If any other library, or the application code itself, accidentally overrides or alters utils, the code that relies on it will likely crash in some unpredictable way. Unpredictable side effects can also happen if other libraries or the application code accidentally invoke a function of another library meant for internal use only.

In short, relying on the global scope is a very risky business, especially as your application grows and you have to rely more and more on functionality implemented by other individuals.

A popular technique to solve this class of problems is called the revealing module pattern, and it looks like this:

const myModule = (() => {
  const privateFoo = () => {}
  const privateBar = []

  const exported = {
    publicFoo: () => {},
    publicBar: () => {}
  }

  return exported
})() // once the parenthesis here are parsed, the function
     // will be invoked

console.log(myModule)
console.log(myModule.privateFoo, myModule.privateBar)

This pattern leverages a self-invoking function. This type of function is sometimes also referred to as an Immediately Invoked Function Expression (IIFE), and it is used to create a private scope, exporting only the parts that are meant to be public.

In JavaScript, variables created inside a function are not accessible from the outer scope (outside the function). Functions can use the return statement to selectively propagate information to the outer scope.

This pattern is essentially exploiting these properties to keep the private information hidden and export only a public-facing API.

In the preceding code, the myModule variable contains only the exported API, while the rest of the module content is practically inaccessible from outside.

The log statement is going to print something like this:

{ publicFoo: [Function: publicFoo],
  publicBar: [Function: publicBar] }
undefined undefined

This demonstrates that only the exported properties are directly accessible from myModule.

As we will see in a moment, the idea behind this pattern is used as a base for the CommonJS module system.

CommonJS modules

CommonJS is the first module system originally built into Node.js. Node.js' CommonJS implementation respects the CommonJS specification, with the addition of some custom extensions.

Let's summarize two of the main concepts of the CommonJS specification:

  • require is a function that allows you to import a module from the local filesystem
  • exports and module.exports are special variables that can be used to export public functionality from the current module

This information is sufficient for now; we will learn more details and some of the nuances of the CommonJS specification in the next few sections.

A homemade module loader

To understand how CommonJS works in Node.js, let's build a similar system from scratch. The code that follows creates a function that mimics a subset of the functionality of the original require() function of Node.js.

Let's start by creating a function that loads the content of a module, wraps it into a private scope, and evaluates it:

const fs = require('fs')

function loadModule (filename, module, require) {
  const wrappedSrc = `(function (module, exports, require) {
    ${fs.readFileSync(filename, 'utf8')}
  })(module, module.exports, require)`
  eval(wrappedSrc)
}

The source code of a module is essentially wrapped into a function, as it was for the revealing module pattern. The difference here is that we pass a list of variables to the module, in particular, module, exports, and require. Make a note of how the exports argument of the wrapping function is initialized with the content of module.exports, as we will talk about this later.

Another important detail to mention is that we are using readFileSync to read the module's content. While it is generally not recommended to use the synchronous version of the filesystem APIs, here it makes sense to do so. The reason for that is that loading modules in CommonJS is a deliberately synchronous operation. This approach makes sure that, if we are importing multiple modules, they (and their dependencies) are loaded in the right order. We will talk more about this aspect later in the chapter.

Bear in mind that this is only an example, and you will rarely need to evaluate some source code in a real application. Features such as eval() or the functions of the vm module (nodejsdp.link/vm) can be easily used in the wrong way or with the wrong input, thus opening a system to code injection attacks. They should always be used with extreme care or avoided altogether.

Let's now implement the require() function:

function require (moduleName) {
  console.log(`Require invoked for module: ${moduleName}`)
  const id = require.resolve(moduleName)          // (1)
  if (require.cache[id]) {                        // (2)
    return require.cache[id].exports
  }

  // module metadata
  const module = {                                // (3)
    exports: {},
    id
  }
  // Update the cache
  require.cache[id] = module                      // (4)

  // load the module
  loadModule(id, module, require)                 // (5)

  // return exported variables
  return module.exports                           // (6)
}
require.cache = {}
require.resolve = (moduleName) => {
  /* resolve a full module id from the moduleName */
}

The previous function simulates the behavior of the original require() function of Node.js, which is used to load a module. Of course, this is just for educational purposes and does not accurately or completely reflect the internal behavior of the real require() function, but it's great to understand the internals of the Node.js module system, including how a module is defined and loaded.

What our homemade module system does is explained as follows:

  1. A module name is accepted as input, and the very first thing that we do is resolve the full path of the module, which we call id. This task is delegated to require.resolve(), which implements a specific resolving algorithm (we will talk about it later).
  2. If the module has already been loaded in the past, it should be available in the cache. If this is the case, we just return it immediately.
  3. If the module has never been loaded before, we set up the environment for the first load. In particular, we create a module object that contains an exports property initialized with an empty object literal. This object will be populated by the code of the module to export its public API.
  4. After the first load, the module object is cached.
  5. The module source code is read from its file and the code is evaluated, as we saw before. We provide the module with the module object that we just created, and a reference to the require() function. The module exports its public API by manipulating or replacing the module.exports object.
  6. Finally, the content of module.exports, which represents the public API of the module, is returned to the caller.

As we can see, there is nothing magical behind the workings of the Node.js module system. The trick is all in the wrapper we create around a module's source code and the artificial environment in which we run it.
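For reference, the wrapper used by the real Node.js runtime is conceptually very close to the one we just built; it simply passes two extra convenience variables to the module, the absolute path of the module file and its directory:

(function (exports, require, module, __filename, __dirname) {
  // the module's source code is injected here by the module loader
})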

Defining a module

By looking at how our custom require() function works, we should now be able to understand how to define a module. The following code gives us an example:

// load another dependency
const dependency = require('./anotherModule')

// a private function
function log () {
  console.log(`Well done ${dependency.username}`)
}

// the API to be exported for public use
module.exports.run = () => {
  log()
}

The essential concept to remember is that everything inside a module is private unless it's assigned to the module.exports variable. The content of this variable is then cached and returned when the module is loaded using require().

module.exports versus exports

For many developers who are not yet familiar with Node.js, a common source of confusion is the difference between using exports and module.exports to expose a public API. The code of our custom require() function should again clear any doubt. The exports variable is just a reference to the initial value of module.exports. We have seen that such a value is essentially a simple object literal created before the module is loaded.

This means that we can only attach new properties to the object referenced by the exports variable, as shown in the following code:

exports.hello = () => {
  console.log('Hello')
}

Reassigning the exports variable doesn't have any effect, because it doesn't change the content of module.exports. It will only reassign the variable itself. The following code is therefore wrong:

exports = () => {
  console.log('Hello')
}

If we want to export something other than an object literal, such as a function, an instance, or even a string, we have to reassign module.exports as follows:

module.exports = () => {
  console.log('Hello')
}

The require function is synchronous

A very important detail that we should take into account is that our homemade require() function is synchronous. In fact, it returns the module contents using a simple direct style, and no callback is required. This is true for the original Node.js require() function too. As a consequence, any assignment to module.exports must be synchronous as well. For example, the following code is incorrect and it will cause trouble:

setTimeout(() => {
  module.exports = function () { ... }
}, 100)

The synchronous nature of require() has important repercussions on the way we define modules, as it limits us to mostly using synchronous code during the definition of a module. This is one of the most important reasons why the core Node.js libraries offer synchronous APIs as an alternative to most of the asynchronous ones.

If we need some asynchronous initialization steps for a module, we can always define and export an uninitialized module that is initialized asynchronously at a later time. The problem with this approach, though, is that loading such a module using require() does not guarantee that it's ready to be used. In Chapter 11, Advanced Recipes, we will analyze this problem in detail and present some patterns to solve this issue elegantly.
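As a minimal sketch of this idea (the module content and the delay are invented for illustration), we could export an object right away and populate it asynchronously afterwards:

// asyncModule.js
const asyncModule = {
  initialized: false,
  data: null
}

// simulate an asynchronous initialization step (e.g. opening a connection)
setTimeout(() => {
  asyncModule.data = 'some data loaded asynchronously'
  asyncModule.initialized = true
}, 100)

module.exports = asyncModule

A consumer that requires this module right after startup will observe initialized === false, which is exactly the problem described above.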

For the sake of curiosity, you might want to know that in its early days, Node.js used to have an asynchronous version of require(), but it was soon removed because it was overcomplicating a functionality that was actually only meant to be used at initialization time and where asynchronous I/O brings more complexities than advantages.

The resolving algorithm

The term dependency hell describes a situation whereby two or more dependencies of a program in turn depend on a shared dependency, but require different incompatible versions. Node.js solves this problem elegantly by loading a different version of a module depending on where the module is loaded from. All the merits of this feature go to the way Node.js package managers (such as npm or yarn) organize the dependencies of the application, and also to the resolving algorithm used in the require() function.

Let's now give a quick overview of this algorithm. As we saw, the resolve() function takes a module name (which we will call moduleName) as input and it returns the full path of the module. This path is then used to load its code and also to identify the module uniquely. The resolving algorithm can be divided into the following three major branches:

  • File modules: If moduleName starts with /, it is already considered an absolute path to the module and it's returned as it is. If it starts with ./, then moduleName is considered a relative path, which is calculated starting from the directory of the requiring module.
  • Core modules: If moduleName is not prefixed with / or ./, the algorithm will first try to search within the core Node.js modules.
  • Package modules: If no core module is found matching moduleName, then the search continues by looking for a matching module in the first node_modules directory that is found navigating up in the directory structure starting from the requiring module. The algorithm continues to search for a match by looking into the next node_modules directory up in the directory tree, until it reaches the root of the filesystem.

For file and package modules, both files and directories can match moduleName. In particular, the algorithm will try to match the following:

  • <moduleName>.js
  • <moduleName>/index.js
  • The directory/file specified in the main property of <moduleName>/package.json

The complete, formal documentation of the resolving algorithm can be found at nodejsdp.link/resolve.

The node_modules directory is actually where the package managers install the dependencies of each package. This means that, based on the algorithm we just described, each package can have its own private dependencies. For example, consider the following directory structure:

myApp
├── foo.js
└── node_modules
    ├── depA
    │   └── index.js
    ├── depB
    │   ├── bar.js
    │   └── node_modules
    │       └── depA
    │           └── index.js
    └── depC
        ├── foobar.js
        └── node_modules
            └── depA
                └── index.js

In the previous example, myApp, depB, and depC all depend on depA. However, they all have their own private version of the dependency! Following the rules of the resolving algorithm, using require('depA') will load a different file depending on the module that requires it, for example:

  • Calling require('depA') from /myApp/foo.js will load /myApp/node_modules/depA/index.js
  • Calling require('depA') from /myApp/node_modules/depB/bar.js will load /myApp/node_modules/depB/node_modules/depA/index.js
  • Calling require('depA') from /myApp/node_modules/depC/foobar.js will load /myApp/node_modules/depC/node_modules/depA/index.js

The resolving algorithm is the core part behind the robustness of the Node.js dependency management, and it makes it possible to have hundreds or even thousands of packages in an application without having collisions or problems of version compatibility.

The resolving algorithm is applied transparently for us when we invoke require(). However, if needed, it can still be used directly by any module by simply invoking require.resolve().
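For instance, a quick sketch of calling it directly (the printed paths are illustrative and depend on your project layout):

console.log(require.resolve('./logger'))
// e.g. /myApp/logger.js

console.log(require.resolve('depA'))
// e.g. /myApp/node_modules/depA/index.js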

The module cache

Each module is only loaded and evaluated the first time it is required, since any subsequent call of require() will simply return the cached version. This should be clear by looking at the code of our homemade require() function. Caching is crucial for performance, but it also has some important functional implications:

  • It makes it possible to have cycles within module dependencies
  • It guarantees, to some extent, that the same instance is always returned when requiring the same module from within a given package

The module cache is exposed via the require.cache variable, so it is possible to directly access it if needed. A common use case is to invalidate any cached module by deleting the relative key in the require.cache variable, a practice that can be useful during testing but very dangerous if applied in normal circumstances.
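For example, a minimal sketch of cache invalidation (again, something to reserve for tests):

// force './logger' to be re-evaluated on the next require
const logger = require('./logger')
delete require.cache[require.resolve('./logger')]
const freshLogger = require('./logger')
console.log(logger === freshLogger) // false: a brand new instance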

Circular dependencies

Many consider circular dependencies an intrinsic design issue, but it is something that might actually happen in a real project, so it's useful for us to know at least how this works with CommonJS. If we look again at our homemade require() function, we immediately get a glimpse of how this might work and what its caveats are.

But let's walk together through an example to see how CommonJS behaves when dealing with circular dependencies. Let's suppose we have the scenario represented in Figure 2.1:


Figure 2.1: An example of circular dependency

A module called main.js requires a.js and b.js. In turn, a.js requires b.js. But b.js relies on a.js as well! It's obvious that we have a circular dependency here as module a.js requires module b.js and module b.js requires module a.js. Let's have a look at the code of these two modules:

  • Module a.js:

    exports.loaded = false
    const b = require('./b')
    module.exports = {
      b,
      loaded: true // overrides the previous export
    }

  • Module b.js:

    exports.loaded = false
    const a = require('./a')
    module.exports = {
      a,
      loaded: true
    }

Now, let's see how these modules are required by main.js:

const a = require('./a')
const b = require('./b')
console.log('a ->', JSON.stringify(a, null, 2))
console.log('b ->', JSON.stringify(b, null, 2))

If we run main.js, we will see the following output:

a -> {
  "b": {
    "a": {
      "loaded": false
    },
    "loaded": true
  },
  "loaded": true
}
b -> {
  "a": {
    "loaded": false
  },
  "loaded": true
}

This result reveals the caveats of circular dependencies with CommonJS, that is, different parts of our application will have a different view of what is exported by module a.js and module b.js, depending on the order in which those dependencies are loaded. While both the modules are completely initialized as soon as they are required from the module main.js, the a.js module will be incomplete when it is loaded from b.js. In particular, its state will be the one that it reached the moment b.js was required.

In order to understand in more detail what happens behind the scenes, let's analyze step by step how the different modules are interpreted and how their local scope changes along the way:

Figure 2.2: A visual representation of how a dependency loop is managed in Node.js

The steps are as follows:

  1. The processing starts in main.js, which immediately requires a.js
  2. The first thing that module a.js does is set an exported value called loaded to false
  3. At this point, module a.js requires module b.js
  4. Like a.js, the first thing that module b.js does is set an exported value called loaded to false
  5. Now, b.js requires a.js (cycle)
  6. Since a.js has already been traversed, its currently exported value is immediately copied into the scope of module b.js
  7. Module b.js finally changes the loaded value to true
  8. Now that b.js has been fully executed, the control returns to a.js, which now holds a copy of the current state of module b.js in its own scope
  9. The last step of module a.js is to set its loaded value to true
  10. Module a.js is now completely executed and the control returns to main.js, which now has a copy of the current state of module a.js in its internal scope
  11. main.js requires b.js, which is immediately loaded from cache
  12. The current state of module b.js is copied into the scope of module main.js where we can finally see the complete picture of what the state of every module is

As we said, the issue here is that module b.js has a partial view of module a.js, and this partial view gets propagated over when b.js is required in main.js. This behavior should spark an intuition which can be confirmed if we swap the order in which the two modules are required in main.js. If you actually try this, you will see that this time it will be the a.js module that will receive an incomplete version of b.js.
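To verify this, it is enough to flip the first two lines of main.js:

// main.js — with the requires swapped
const b = require('./b')
const a = require('./a')
// now a.js is the module that sees an incomplete view of b.js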

We understand now that this can become quite a fuzzy business if we lose control of which module is loaded first, which can happen quite easily if the project is big enough.

Later in this chapter, we will see how ESM can deal with circular dependencies in a more effective way. Meanwhile, if you are using CommonJS, be very careful about this behavior and the way it can affect your application.

In the next section, we will discuss some patterns to define modules in Node.js.

Module definition patterns

The module system, besides being a mechanism for loading dependencies, is also a tool for defining APIs. Like any other problem related to API design, the main factor to consider is the balance between private and public functionality. The aim is to maximize information hiding and API usability, while balancing these with other software qualities, such as extensibility and code reuse.

In this section, we will analyze some of the most popular patterns for defining modules in Node.js, such as named exports, exporting functions, classes and instances, and monkey patching. Each one has its own balance of information hiding, extensibility, and code reuse.

Named exports

The most basic method for exposing a public API is using named exports, which involves assigning the values we want to make public to properties of the object referenced by exports (or module.exports). In this way, the resulting exported object becomes a container or namespace for a set of related functionalities.

The following code shows a module implementing this pattern:

// file logger.js
exports.info = (message) => {
  console.log(`info: ${message}`)
}
exports.verbose = (message) => {
  console.log(`verbose: ${message}`)
}

The exported functions are then available as properties of the loaded module, as shown in the following code:

// file main.js
const logger = require('./logger')
logger.info('This is an informational message')
logger.verbose('This is a verbose message')

Most of the Node.js core modules use this pattern. However, the CommonJS specification only allows the use of the exports variable to expose public members. Therefore, the named exports pattern is the only one that is really compatible with the CommonJS specification. The use of module.exports is an extension provided by Node.js to support a broader range of module definition patterns, which we are going to see next.

Exporting a function

One of the most popular module definition patterns consists of reassigning the whole module.exports variable to a function. The main strength of this pattern is the fact that it allows you to expose only a single functionality, which provides a clear entry point for the module, making it simpler to understand and use; it also honors the principle of small surface area very well. This way of defining modules is also known in the community as the substack pattern, after one of its most prolific adopters, James Halliday (nickname substack – https://github.com/substack). Have a look at this pattern in the following example:

// file logger.js
module.exports = (message) => {
  console.log(`info: ${message}`)
}

A possible extension of this pattern is using the exported function as a namespace for other public APIs. This is a very powerful combination because it still gives the module the clarity of a single entry point (the main exported function) and at the same time it allows us to expose other functionalities that have secondary or more advanced use cases. The following code shows us how to extend the module we defined previously by using the exported function as a namespace:

module.exports.verbose = (message) => {
  console.log(`verbose: ${message}`)
}

This code demonstrates how to use the module that we just defined:

// file main.js
const logger = require('./logger')
logger('This is an informational message')
logger.verbose('This is a verbose message')

Even though exporting just a function might seem like a limitation, in reality, it's a perfect way to put the emphasis on a single functionality, the most important one for the module, while giving less visibility to secondary or internal aspects, which are instead exposed as properties of the exported function itself. The modularity of Node.js heavily encourages the adoption of the single-responsibility principle (SRP): every module should have responsibility over a single functionality and that responsibility should be entirely encapsulated by the module.

Exporting a class

A module that exports a class is a specialization of a module that exports a function. The difference is that with this new pattern we allow the user to create new instances using the constructor, but we also give them the ability to extend its prototype and forge new classes. The following is an example of this pattern:

class Logger {
  constructor (name) {
    this.name = name
  }

  log (message) {
    console.log(`[${this.name}] ${message}`)
  }

  info (message) {
    this.log(`info: ${message}`)
  }

  verbose (message) {
    this.log(`verbose: ${message}`)
  }
}
module.exports = Logger

And, we can use the preceding module as follows:

// file main.js
const Logger = require('./logger')
const dbLogger = new Logger('DB')
dbLogger.info('This is an informational message')
const accessLogger = new Logger('ACCESS')
accessLogger.verbose('This is a verbose message')

Exporting a class still provides a single entry point for the module, but compared to the substack pattern, it exposes a lot more of the module internals. On the other hand, it allows much more power when it comes to extending its functionality.

Exporting an instance

We can leverage the caching mechanism of require() to easily define stateful instances created from a constructor or a factory, which can be shared across different modules. The following code shows an example of this pattern:

// file logger.js
class Logger {
  constructor (name) {
    this.count = 0
    this.name = name
  }

  log (message) {
    this.count++
    console.log('[' + this.name + '] ' + message)
  }
}
module.exports = new Logger('DEFAULT')

This newly defined module can then be used as follows:

// main.js
const logger = require('./logger')
logger.log('This is an informational message')

Because the module is cached, every module that requires the logger module will actually always retrieve the same instance of the object, thus sharing its state. This pattern is very much like creating a singleton. However, it does not guarantee the uniqueness of the instance across the entire application, as it happens in the traditional singleton pattern. When analyzing the resolving algorithm, we have seen that a module might be installed multiple times inside the dependency tree of an application. This results in multiple instances of the same logical module, all running in the context of the same Node.js application. We will analyze the Singleton pattern and its caveats in more detail in Chapter 7, Creational Design Patterns.

One interesting detail of this pattern is that it does not preclude the opportunity to create new instances, even if we are not explicitly exporting the class. In fact, we can rely on the constructor property of the exported instance to construct a new instance of the same type:

const customLogger = new logger.constructor('CUSTOM')
customLogger.log('This is an informational message')

As you can see, by using logger.constructor(), we can instantiate new Logger objects. Note that this technique must be used with caution or avoided altogether. Consider that, if the module author decided not to export the class explicitly, they probably wanted to keep this class private.

Modifying other modules or the global scope

A module can even export nothing. This can seem a bit out of place; however, we should not forget that a module can modify the global scope and any object in it, including other modules in the cache. Please note that these are in general considered bad practices, but since this pattern can be useful and safe under some circumstances (for example, for testing) and it's sometimes used in real-life projects, it's worth knowing.

We said that a module can modify other modules or objects in the global scope; well, this is called monkey patching. It generally refers to the practice of modifying the existing objects at runtime to change or extend their behavior or to apply temporary fixes.

The following example shows us how we can add a new function to another module:

// file patcher.js
// ./logger is another module
require('./logger').customMessage = function () {
  console.log('This is a new functionality')
}

Using our new patcher module is as easy as writing the following code:

// file main.js
require('./patcher')
const logger = require('./logger')
logger.customMessage()

The technique described here can be very dangerous to use. The main concern is that having a module that modifies the global namespace or other modules is an operation with side effects. In other words, it affects the state of entities outside their scope, which can have consequences that aren't easily predictable, especially when multiple modules interact with the same entities. Imagine having two different modules trying to set the same global variable, or modifying the same property of the same module. The effects can be unpredictable (which module wins?), but most importantly it would have repercussions on the entire application.

So, again, use this technique with care and make sure you understand all the possible side effects while doing so.

If you want a real-life example of how this can be useful, have a look at nock (nodejsdp.link/nock), a module that allows you to mock HTTP responses in your tests. The way nock works is by monkey patching the Node.js http module and by changing its behavior so that it will provide the mocked response rather than issuing a real HTTP request. This allows our unit test to run without hitting the actual production HTTP endpoints, something that's very convenient when writing tests for code that relies on third-party APIs.
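To give a feel for it, here is a minimal sketch of a typical nock setup (the host, path, and payload are invented for illustration, and nock is assumed to be installed):

const nock = require('nock')

// from now on, a GET to this host/path is intercepted by nock, which
// has monkey patched the core http module under the hood
nock('http://api.example.test')
  .get('/users/1')
  .reply(200, { id: 1, name: 'Alice' })

Any code in the test that performs that request through the http module will receive the canned response instead of hitting the network.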

At this point, we should have a quite complete understanding of CommonJS and some of the patterns that are generally used with it. In the next section, we will explore ECMAScript modules, also known as ESM.

ESM: ECMAScript modules

ECMAScript modules (also known as ES modules or ESM) were introduced as part of the ECMAScript 2015 specification with the goal of giving JavaScript an official module system suitable for different execution environments. The ESM specification tries to retain some good ideas from previously existing module systems like CommonJS and AMD. The syntax is very simple and compact. There is support for cyclic dependencies and the possibility to load modules asynchronously.

The most important differentiator between ESM and CommonJS is that ES modules are static, which means that imports are described at the top level of every module and outside any control flow statement. Also, the name of the imported modules cannot be dynamically generated at runtime using expressions; only constant strings are allowed.

For instance, the following code wouldn't be valid when using ES modules:

if (condition) {
  import module1 from 'module1'
} else {
  import module2 from 'module2'
}

While in CommonJS, it is perfectly fine to write something like this:

let module = null
if (condition) {
  module = require('module1')
} else {
  module = require('module2')
}

At first glance, this characteristic of ESM might seem an unnecessary limitation, but in reality, having static imports opens up a number of interesting scenarios that are not practical with the dynamic nature of CommonJS. For instance, static imports allow the static analysis of the dependency tree, which allows optimizations such as dead code elimination (tree shaking) and more.

Using ESM in Node.js

Node.js will consider every .js file to be written using the CommonJS syntax by default; therefore, if we use the ESM syntax inside a .js file, the interpreter will simply throw an error.

There are several ways to tell the Node.js interpreter to consider a given module as an ES module rather than a CommonJS module:

  • Give the module file the extension .mjs
  • Add to the nearest parent package.json a field called "type" with a value of "module"

Throughout the rest of this book and in the code examples provided, we will keep using the .js extension to keep the code more easily accessible to most text editors, so if you are copying and pasting examples straight from the book, make sure that you also create a package.json file with the "type": "module" entry.
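For instance, a minimal package.json that marks every .js file in the package as an ES module looks like this:

{
  "type": "module"
}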

Let's now have a look at the ESM syntax.

Named exports and imports

ESM allows us to export functionality from a module through the export keyword.

Note that ESM uses the singular word export as opposed to the plural (exports and module.exports) used by CommonJS.

In an ES module, everything is private by default and only exported entities are publicly accessible from other modules.

The export keyword can be used in front of the entities that we want to make available to the module users. Let's see an example:

// logger.js
// exports a function as `log`
export function log (message) {
  console.log(message)
}

// exports a constant as `DEFAULT_LEVEL`
export const DEFAULT_LEVEL = 'info'

// exports an object as `LEVELS`
export const LEVELS = {
  error: 0,
  debug: 1,
  warn: 2,
  data: 3,
  info: 4,
  verbose: 5
}

// exports a class as `Logger`
export class Logger {
  constructor (name) {
    this.name = name
  }

  log (message) {
    console.log(`[${this.name}] ${message}`)
  }
}

If we want to import entities from a module we can use the import keyword. The syntax is quite flexible, and it allows us to import one or more entities and even to rename imports. Let's see some examples:

import * as loggerModule from './logger.js'
console.log(loggerModule)

In this example, we are using the * syntax (also called namespace import) to import all the members of the module and assign them to the local loggerModule variable. This example will output something like this:

[Module] {
  DEFAULT_LEVEL: 'info',
  LEVELS: { error: 0, debug: 1, warn: 2, data: 3, info: 4,
    verbose: 5 },
  Logger: [Function: Logger],
  log: [Function: log]
}

As we can see, all the entities exported in our module are now accessible in the loggerModule namespace. For instance, we could refer to the log() function through loggerModule.log.

It's very important to note that, as opposed to CommonJS, with ESM we have to specify the file extension of the imported modules. With CommonJS we can use either ./logger or ./logger.js, while with ESM we are forced to use ./logger.js.

If we are using a large module, most often we don't want to import all of its functionality, but only one or a few entities from it:

import { log } from './logger.js'
log('Hello World')

If we want to import more than one entity, this is how we would do that:

import { log, Logger } from './logger.js'

log('Hello World')
const logger = new Logger('DEFAULT')
logger.log('Hello world')

When we use this type of import statement, the entities are imported into the current scope, so there is a risk of a name clash. The following code, for example, would not work:

import { log } from './logger.js'
const log = console.log

If we try to execute the preceding snippet, the interpreter fails with the following error:

SyntaxError: Identifier 'log' has already been declared

In situations like this one, we can resolve the clash by renaming the imported entity with the as keyword:

import { log as log2 } from './logger.js'
const log = console.log

log('message from log')
log2('message from log2')

This approach can be particularly useful when the clash is generated by importing two entities with the same name from different modules, and therefore changing the original names is outside the consumer's control.
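For example, a short sketch with two hypothetical modules that both export a log function (both module paths are invented for illustration):

import { log as fileLog } from './file-logger.js'
import { log as consoleLog } from './console-logger.js'

fileLog('written to a file')
consoleLog('written to the console')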

Default exports and imports

One widely used feature of CommonJS is the ability to export a single unnamed entity through the assignment of module.exports. We saw that this is very convenient as it encourages module developers to follow the single-responsibility principle and expose only one clear interface. With ESM, we can do something similar through what's called a default export. A default export makes use of the export default keywords and it looks like this:

// logger.js
export default class Logger {
  constructor (name) {
    this.name = name
  }

  log (message) {
    console.log(`[${this.name}] ${message}`)
  }
}

In this case, the name Logger is ignored, and the entity exported is registered under the name default. This exported name is handled in a special way, and it can be imported as follows:

// main.js
import MyLogger from './logger.js'
const logger = new MyLogger('info')
logger.log('Hello World')

The difference with named ESM imports is that here, since the default export is considered unnamed, we can import it and at the same time assign it a local name of our choice. In this example, we can replace MyLogger with anything else that makes sense in our context. This is very similar to what we do with CommonJS modules. Note also that we don't have to wrap the import name in curly brackets or use the as keyword when renaming.

Internally, a default export is equivalent to a named export with default as the name. We can easily verify this statement by running the following snippet of code:

// showDefault.js
import * as loggerModule from './logger.js'
console.log(loggerModule)

When executed, the previous code will print something like this:

[Module] { default: [Function: Logger] }

One thing that we cannot do, though, is import the default entity explicitly. In fact, something like the following will fail:

import { default } from './logger.js'

The execution will fail with a SyntaxError: Unexpected reserved word error. This happens because the default keyword cannot be used as a variable name. It is valid as an object attribute, so in the previous example, it is okay to use loggerModule.default, but we can't have a variable named default directly in the scope.
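The following sketch shows the working alternative based on the namespace import:

// accessing the default export through the namespace object is fine,
// because here `default` is used as a property name, not a variable
import * as loggerModule from './logger.js'

const Logger = loggerModule.default
const logger = new Logger('info')
logger.log('Hello World')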

Mixed exports

It is possible to mix named exports and a default export within an ES module. Let's have a look at an example:

// logger.js
export default function log (message) {
  console.log(message)
}

export function info (message) {
  log(`info: ${message}`)
}

The preceding code is exporting the log() function as a default export and a named export for a function called info(). Note that info() can reference log() internally. It would not be possible to replace the call to log() with default() to do that, as it would be a syntax error (Unexpected token default).

If we want to import both the default export and one or more named exports, we can do it using the following format:

import mylog, { info } from './logger.js'

In the preceding example, we are importing the default export from logger.js as mylog and also the named export info.

Let's now discuss some key details and differences between the default export and named exports:

  • Named exports are explicit. Having predetermined names allows IDEs to support the developer with automatic imports, autocomplete, and refactoring tools. For instance, if we type writeFileSync, the editor might automatically add import { writeFileSync } from 'fs' at the beginning of the current file. Default exports, on the contrary, make all these things more complicated as a given functionality could have different names in different files, so it's harder to make inferences on which module might provide a given functionality based only on a given name.
  • The default export is a convenient mechanism to communicate what is the single most important functionality for a module. Also, from the perspective of the user, it can be easier to import the obvious piece of functionality without having to know the exact name of the binding.
  • In some circumstances, default exports might make it harder to apply dead code elimination (tree shaking). For example, a module could provide only a default export, which is an object where all the functionality is exposed as properties of such an object. When we import this default object, most module bundlers will consider the entire object being used and they won't be able to eliminate any unused code from the exported functionality.

For these reasons, it is generally considered good practice to stick with named exports, especially when you want to expose more than one functionality, and only use default exports if it's one clear functionality you want to export.

This is not a hard rule and there are notable exceptions to this suggestion. For instance, all Node.js core modules have both a default export and a number of named exports. Also, React (nodejsdp.link/react) uses mixed exports.

Consider carefully what the best approach for your specific module is and what you want the developer experience to be for the users of your module.

Module identifiers

Module identifiers (also called module specifiers) are the different types of values that we can use in our import statements to specify the location of the module we want to load.

So far, we have seen only relative paths, but there are several other possibilities and some nuances to keep in mind. Let's list all the possibilities:

  • Relative specifiers like ./logger.js or ../logger.js. They are used to refer to a path relative to the location of the importing file.
  • Absolute specifiers like file:///opt/nodejs/config.js. They refer directly and explicitly to a full path. Note that this is the only way with ESM to refer to an absolute path for a module, using a / or a // prefix won't work. This is a significant difference with CommonJS.
  • Bare specifiers are identifiers like fastify or http, and they represent modules available in the node_modules folder and generally installed through a package manager (such as npm) or available as core Node.js modules.
  • Deep import specifiers like fastify/lib/logger.js, which refer to a path within a package in node_modules (fastify, in this case).

In browser environments, it is possible to import modules directly by specifying the module URL, for instance, https://unpkg.com/lodash. This feature is not supported by Node.js.

Async imports

As we have seen in the previous section, the import statement is static and therefore subject to two important limitations:

  • A module identifier cannot be constructed at runtime
  • Module imports are declared at the top level of every file and they cannot be nested within control flow statements

There are some use cases when these limitations can become a little bit too restrictive. Imagine, for instance, if we have to import a specific translation module for the current user language, or a variation of a module that depends on the user's operating system.

Also, what if we want to load a given module, which might be particularly heavy, only if the user is accessing the piece of functionality that requires that module?

To allow us to overcome these limitations, ES modules provide async imports (also called dynamic imports).

Async imports can be performed at runtime using the special import() operator.

The import() operator is syntactically equivalent to a function that takes a module identifier as an argument and it returns a promise that resolves to a module object.

We will learn more about promises in Chapter 5, Asynchronous Control Flow Patterns with Promises and Async/Await, so don't worry too much about understanding all the nuances of the specific promise syntax for now.

The module identifier can be any module identifier supported by static imports as discussed in the previous section. Now, let's see how to use dynamic imports with a simple example.

We want to build a command line application that can print "Hello World" in different languages. In the future, we will probably want to support many more phrases and languages, so it makes sense to have one file with the translations of all the user-facing strings for each supported language.

Let's create some example modules for some of the languages we want to support:

// strings-el.js
export const HELLO = 'Γεια σου κόσμε'

// strings-en.js
export const HELLO = 'Hello World'

// strings-es.js
export const HELLO = 'Hola mundo'

// strings-it.js
export const HELLO = 'Ciao mondo'

// strings-pl.js
export const HELLO = 'Witaj świecie'

Now let's create the main script that takes a language code from the command line and prints "Hello World" in the selected language:

// main.js
const SUPPORTED_LANGUAGES = ['el', 'en', 'es', 'it', 'pl']   // (1)
const selectedLanguage = process.argv[2]                     // (2)
if (!SUPPORTED_LANGUAGES.includes(selectedLanguage)) {       // (3)
  console.error('The specified language is not supported')
  process.exit(1)
}
const translationModule = `./strings-${selectedLanguage}.js` // (4)
import(translationModule)                                    // (5)
  .then((strings) => {                                       // (6)
    console.log(strings.HELLO)
  })

The first part of the script is quite simple. What we do there is:

  1. Define a list of supported languages.
  2. Read the selected language from the first argument passed in the command line.
  3. Finally, we handle the case where the selected language is not supported.

The second part of the code is where we actually use dynamic imports:

  4. First of all, we dynamically build the name of the module we want to import based on the selected language. Note that the module name needs to be a relative path to the module file, that's why we are prepending ./ to the filename.
  5. We use the import() operator to trigger the dynamic import of the module.
  6. The dynamic import happens asynchronously, so we can use the .then() hook on the returned promise to get notified when the module is ready to be used. The function passed to then() will be executed when the module is fully loaded and strings will be the module namespace imported dynamically. After that, we can access strings.HELLO and print its value to the console.

Now we can execute this script like this:

node main.js it

And we should see Ciao mondo being printed to our console.
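As a side note, recent versions of Node.js also support top-level await in ES modules, so the same logic could be written without .then() (a minimal sketch):

// equivalent ending of main.js using top-level await
const strings = await import(translationModule)
console.log(strings.HELLO)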

Module loading in depth

To understand how ESM actually works and how it can deal effectively with circular dependencies, we have to deep dive a little bit more into how JavaScript code is parsed and evaluated when using ES modules.

In this section, we will learn how ECMAScript modules are loaded, we will present the idea of read-only live bindings, and, finally, we will discuss an example with circular dependencies.

Loading phases

The goal of the interpreter is to build a graph of all the necessary modules (a dependency graph).

In generic terms, a dependency graph can be defined as a directed graph (nodejsdp.link/directed-graph) representing the dependencies of a group of objects. In the context of this section, when we refer to a dependency graph, we want to indicate the dependency relationship between ECMAScript modules. As we will see, using a dependency graph allows us to determine the order in which all the necessary modules should be loaded in a given project.

Essentially, the dependency graph is needed by the interpreter to figure out how modules depend on each other and in what order the code needs to be executed. When the node interpreter is launched, it gets passed some code to execute, generally in the form of a JavaScript file. This file is the starting point for the dependency resolution, and it is called the entry point. From the entry point, the interpreter will find and follow all the import statements recursively in a depth-first fashion, until all the necessary code is explored and then evaluated.

More specifically, this process happens in three separate phases:

  • Phase 1 - Construction (or parsing): Find all the imports and recursively load the content of every module from the respective file.
  • Phase 2 - Instantiation: For every exported entity, keep a named reference in memory, but don't assign any value just yet. Also, references are created for all the import and export statements tracking the dependency relationship between them (linking). No JavaScript code has been executed at this stage.
  • Phase 3 - Evaluation: Node.js finally executes the code so that all the previously instantiated entities can get an actual value. Now running the code from the entry point is possible because all the blanks have been filled.

In simple terms, we could say that Phase 1 is about finding all the dots, Phase 2 connects those dots to create paths, and, finally, Phase 3 walks through the paths in the right order.

At first glance, this approach doesn't seem very different from what CommonJS does, but there's a fundamental difference. Due to its dynamic nature, CommonJS will execute all the files while the dependency graph is explored. We have seen that every time a new require statement is found, all the previous code has already been executed. This is why you can use require even within if statements or loops, and construct module identifiers from variables.

In ESM, these three phases are totally separate from each other, no code can be executed until the dependency graph has been fully built, and therefore module imports and exports have to be static.
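To make this contrast concrete, here is a sketch (using hypothetical logger modules) of something CommonJS allows but ESM forbids:

// CommonJS: legal, because the identifier is resolved at runtime
let logger
if (process.env.VERBOSE) {
  logger = require('./verbose-logger.js')
} else {
  logger = require('./quiet-logger.js')
}

// ESM: static imports must sit at the top level of the module and use
// a constant specifier, so the equivalent of the above is a SyntaxError:
// if (process.env.VERBOSE) {
//   import logger from './verbose-logger.js' // SyntaxError
// }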

Read-only live bindings

Another fundamental characteristic of ES modules, which helps with cyclic dependencies, is the idea that imported modules are effectively read-only live bindings to their exported values.

Let's clarify what this means with a simple example:

// counter.js
export let count = 0
export function increment () {
  count++
}

This module exports two values: a simple integer counter called count and an increment function that increases the counter by one.

Let's now write some code that uses this module:

// main.js
import { count, increment } from './counter.js'
console.log(count) // prints 0
increment()
console.log(count) // prints 1
count++ // TypeError: Assignment to constant variable!

What we can see in this code is that we can read the value of count at any time and change it using the increment() function, but as soon as we try to mutate the count variable directly, we get an error as if we were trying to mutate a const binding.

This proves that when an entity is imported into the scope, the binding to its original value cannot be changed (read-only binding) unless the bound value changes within the scope of the original module itself (live binding), which is outside the direct control of the consumer code.

This approach is fundamentally different from CommonJS. In fact, in CommonJS, the entire exports object is copied (shallow copy) when required from a module. This means that, if the value of primitive variables like numbers or strings is changed at a later time, the requiring module won't be able to see those changes.
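To appreciate the difference, consider a minimal CommonJS version of the same counter (a sketch using hypothetical .cjs files):

// counter.cjs
let count = 0
function increment () {
  count++
}
module.exports = { count, increment }

// main.cjs
const counter = require('./counter.cjs')
console.log(counter.count) // prints 0
counter.increment()
console.log(counter.count) // still prints 0: count was copied at require time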

Circular dependency resolution

Now, to close the circle, let's reimplement the circular dependency example we saw in the CommonJS modules section using the ESM syntax:


Figure 2.3: An example scenario with circular dependencies

Let's have a look at the modules a.js and b.js first:

// a.js
import * as bModule from './b.js'
export let loaded = false
export const b = bModule
loaded = true

// b.js
import * as aModule from './a.js'
export let loaded = false
export const a = aModule
loaded = true

And now let's see how to import those two modules in our main.js file (the entry point):

// main.js
import * as a from './a.js'
import * as b from './b.js'
console.log('a ->', a)
console.log('b ->', b)

Note that this time we are not using JSON.stringify because that will fail with a TypeError: Converting circular structure to JSON, since there's an actual circular reference between a.js and b.js.

When we run main.js, we will see the following output:

a -> <ref *1> [Module] {
  b: [Module] { a: [Circular *1], loaded: true },
  loaded: true
}
b -> <ref *1> [Module] {
  a: [Module] { b: [Circular *1], loaded: true },
  loaded: true
}

The interesting bit here is that the modules a.js and b.js have a complete picture of each other, unlike what would happen with CommonJS, where they would only hold partial information about each other. We can see that because all the loaded values are set to true. Also, b within a is an actual reference to the same b instance available in the current scope, and the same goes for a within b. That's the reason why we cannot use JSON.stringify() to serialize these modules. Finally, if we swap the order of the imports for the modules a.js and b.js, the final outcome does not change, which is another important difference in comparison with how CommonJS works.

It's worth spending some more time observing what happens in the three phases of the module resolution (parsing, instantiation, and evaluation) for this specific example.

Phase 1: Parsing

During the parsing phase, the code is explored starting from the entry point (main.js). The interpreter looks only for import statements to find all the necessary modules and to load the source code from the module files. The dependency graph is explored in a depth-first fashion, and every module is visited only once. This way, the interpreter builds a view of the dependencies that looks like a tree structure, as shown in Figure 2.4:


Figure 2.4: Parsing of cyclic dependencies with ESM

Given the example in Figure 2.4, let's discuss the various steps of the parsing phase:

  1. From main.js, the first import found leads us straight into a.js.
  2. In a.js, we find an import pointing to b.js.
  3. In b.js, we also have an import back to a.js (our cycle), but since a.js has already been visited, this path is not explored again.
  4. At this point, the exploration starts to wind back: b.js doesn't have other imports, so we go back to a.js; a.js doesn't have other import statements, so we go back to main.js. Here we find another import pointing to b.js, but again, this module has been explored already, so this path is ignored.

At this point, our depth-first visit of the dependency graph has been completed and we have a linear view of the modules, as shown in Figure 2.5:


Figure 2.5: A linear view of the module graph where cycles have been removed

This particular view is quite simple. In more realistic scenarios with a lot more modules, the view will look more like a tree structure.

Phase 2: Instantiation

In the instantiation phase, the interpreter walks the tree view obtained from the previous phase from the bottom to the top. For every module, the interpreter will look for all the exported properties first and build out a map of the exported names in memory:


Figure 2.6: A visual representation of the instantiation phase

Figure 2.6 describes the order in which every module is instantiated:

  1. The interpreter starts from b.js and discovers that the module exports loaded and a.
  2. Then, the interpreter moves to a.js, which exports loaded and b.
  3. Finally, it moves to main.js, which does not export any functionality.
  4. Note that, in this phase, the exports map keeps track of the exported names only; their associated values are considered uninitialized for now.

After this sequence of steps, the interpreter will do another pass to link the exported names to the modules importing them, as shown in Figure 2.7:


Figure 2.7: Linking exports with imports across modules

We can describe what we see in Figure 2.7 through the following steps:

  1. Module b.js will link the exports from a.js, referring to them as aModule.
  2. In turn, a.js will link to all the exports from b.js, referring to them as bModule.
  3. Finally, main.js will import all the exports in b.js, referring to them as b; similarly, it will import everything from a.js, referring to them as a.
  4. Again, it's important to note that all the values are still uninitialized. In this phase, we are only linking references to values that will be available at the end of the next phase.

Phase 3: Evaluation

The last step is the evaluation phase. In this phase, all the code in every file is finally executed. The execution order is again bottom-up, respecting the post-order depth-first visit of our original dependency graph. With this approach, main.js is the last file to be executed. This way, we can be sure that all the exported values have been initialized before we start executing our main business logic:


Figure 2.8: A visual representation of the evaluation phase

Following along from the diagram in Figure 2.8, this is what happens:

  1. The execution starts from b.js and the first line to be evaluated initializes the loaded export to false for the module.
  2. Similarly, here the exported property a gets evaluated. This time, it will be evaluated to a reference to the module object representing module a.js.
  3. The value of the loaded property gets changed to true. At this point, we have fully evaluated the state of the exports for module b.js.
  4. Now the execution moves to a.js. Again, we start by setting loaded to false.
  5. At this point, the b export is evaluated to a reference to module b.js.
  6. Finally, the loaded property is changed to true. Now we have finally evaluated all the exports for a.js as well.

After all these steps, the code in main.js can be executed, and at this point, all the exported properties are fully evaluated. Since imported modules are tracked as references, we can be sure every module has an up-to-date picture of the other modules, even in the presence of circular dependencies.

Modifying other modules

We saw that entities imported through ES modules are read-only live bindings, and therefore we cannot reassign them from an external module.

There's a caveat, though. It is true that we can't change the bindings of the default export or named exports of an existing module from another module, but, if one of these bindings is an object, we can still mutate the object itself by reassigning some of the object's properties.

This caveat can give us enough freedom to alter the behavior of other modules. To demonstrate this idea, let's write a module that can alter the behavior of the core fs module so that it prevents the module from accessing the filesystem and returns mocked data instead. This kind of module is something that could be useful while writing tests for a component that relies on the filesystem:

// mock-read-file.js
import fs from 'fs' // (1)

const originalReadFile = fs.readFile // (2)
let mockedResponse = null

function mockedReadFile (path, cb) { // (3)
  setImmediate(() => {
    cb(null, mockedResponse)
  })
}

export function mockEnable (respondWith) { // (4)
  mockedResponse = respondWith
  fs.readFile = mockedReadFile
}

export function mockDisable () { // (5)
  fs.readFile = originalReadFile
}

Let's review the preceding code:

  1. The first thing we do is import the default export of the fs module. We will get back to this in a second; for now, just keep in mind that the default export of the fs module is an object that contains a collection of functions that allow us to interact with the filesystem.
  2. We want to replace the readFile() function with a mock implementation. Before doing that, we save a reference to the original implementation. We also declare a mockedResponse value that we will be using later.
  3. The function mockedReadFile() is the actual mocked implementation that we want to use to replace the original implementation. This function invokes the callback with the current value of mockedResponse. Note that this is a simplified implementation; the real function accepts an optional options argument before the callback argument and is able to handle different types of encoding.
  4. The exported mockEnable() function can be used to activate the mocked functionality. The original implementation will be swapped with the mocked one. The mocked implementation will return the same value passed here through the respondWith argument.
  5. Finally, the exported mockDisable() function can be used to restore the original implementation of the fs.readFile() function.

Now let's see a simple example that uses this module:

// main.js
import fs from 'fs' // (1)
import { mockEnable, mockDisable } from './mock-read-file.js'

mockEnable(Buffer.from('Hello World')) // (2)

fs.readFile('fake-path', (err, data) => { // (3)
  if (err) {
    console.error(err)
    process.exit(1)
  }
  console.log(data.toString()) // 'Hello World'
})

mockDisable()

Let's discuss step by step what happens in this example:

  1. The first thing that we do is import the default export of the fs module. Again, note that we are importing specifically the default export exactly as we did in our mock-read-file.js module, but more on this later.
  2. Here we enable the mock functionality. We want, for every file read, to simulate that the file contains the string "Hello World."
  3. Finally, we read a file using a fake path. This code will print "Hello World" as it will be using the mocked version of the readFile() function. Note that, after calling this function, we restore the original implementation by calling mockDisable().

This approach works, but it is very fragile. In fact, there are a number of ways in which this may not work.

On the mock-read-file.js side, we could have tried the two following imports for the fs module:

import * as fs from 'fs' // then use fs.readFile

or

import { readFile } from 'fs'

Both of them are valid imports because the fs module exports all the filesystem functions as named exports (in addition to a default export, which is an object with the same collection of functions as attributes).

There are certain issues with the preceding two import statements:

  • We would get a read-only live binding to the readFile() function and, therefore, we would be unable to mutate it from an external module. If we try these approaches, we will get an error when trying to reassign readFile().
  • Another issue is on the consumer side within our main.js, where we could use these two alternative import styles as well. In this case, we won't end up using the mocked functionality, and therefore the code will trigger an error while trying to read a nonexistent file.

The reason why using one of the two import statements mentioned above would not work is that our mocking utility alters only the copy of the readFile() function registered inside the object exported as the default export, not the one available as a named export at the top level of the module.
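To see the first failure mode in code, here is a sketch of what happens if the mocking module itself uses a named import:

// mock-read-file.js (broken variant - a sketch of the failure)
import { readFile } from 'fs'

function mockedReadFile (path, cb) {
  setImmediate(() => cb(null, Buffer.from('mocked')))
}

export function mockEnable () {
  // readFile is a read-only live binding, so this throws at runtime:
  readFile = mockedReadFile // TypeError: Assignment to constant variable
}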

This particular example shows us how monkey patching could be much more complicated and unreliable in the context of ESM. For this reason, testing frameworks such as Jest (nodejsdp.link/jest) provide special functionalities to be able to mock ES modules more reliably (nodejsdp.link/jest-mock).

Another approach that can be used to mock modules is to rely on the hooks available in a special Node.js core module called module (nodejsdp.link/module-doc). One simple library that takes advantage of this module is mocku (nodejsdp.link/mocku). Check out its source code if you are curious.

We could also use the syncBuiltinESMExports() function from the module package. When this function is invoked, the value of the properties in the default exports object gets mapped again into the equivalent named exports, effectively allowing us to propagate any external change applied to the module functionality even to named exports:

import fs, { readFileSync } from 'fs'
import { syncBuiltinESMExports } from 'module'

fs.readFileSync = () => Buffer.from('Hello, ESM')
syncBuiltinESMExports()

console.log(fs.readFileSync === readFileSync) // true

We could use this to make our small filesystem mocking utility a little bit more flexible by invoking the syncBuiltinESMExports() function after we enable the mock or after we restore the original functionality.
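For instance, a revised sketch of our mocking module could look like this:

// mock-read-file.js (revised sketch)
import fs from 'fs'
import { syncBuiltinESMExports } from 'module'

const originalReadFile = fs.readFile
let mockedResponse = null

function mockedReadFile (path, cb) {
  setImmediate(() => {
    cb(null, mockedResponse)
  })
}

export function mockEnable (respondWith) {
  mockedResponse = respondWith
  fs.readFile = mockedReadFile
  syncBuiltinESMExports() // propagate the change to the named readFile export
}

export function mockDisable () {
  fs.readFile = originalReadFile
  syncBuiltinESMExports() // restore the named readFile export as well
}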

Note that syncBuiltinESMExports() works only for built-in Node.js modules like the fs module in our example.

This concludes our exploration of ESM. At this point, we should be able to appreciate how ESM works, how it loads modules, and how it deals with cyclic dependencies. To close this chapter, we are now ready to discuss some key differences and some interesting interoperability techniques between CommonJS and ECMAScript modules.

ESM and CommonJS differences and interoperability

We already mentioned several important differences between ESM and CommonJS, such as having to explicitly specify file extensions in imports with ESM, while file extensions are totally optional with the CommonJS require function.

Let's close this chapter by discussing some other important differences between ESM and CommonJS and how the two module systems can work together when necessary.

ESM runs in strict mode

ES modules run implicitly in strict mode. This means that we don't have to explicitly add the "use strict" statement at the beginning of every file. Strict mode cannot be disabled; therefore, we cannot use undeclared variables or the with statement, or use other features that are only available in non-strict mode, but this is definitely a good thing, as strict mode is a safer execution mode.
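For instance, a hypothetical module like the following fails as soon as it runs, because assigning to an undeclared variable is forbidden in strict mode:

// strict-example.js - an ES module, so strict mode is implicit
count = 1 // ReferenceError: count is not defined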

If you are curious to find out more about the differences between the two modes, you can check out a very detailed article on MDN Web Docs (https://nodejsdp.link/strict-mode).

Missing references in ESM

In ESM, some important CommonJS references are not defined. These include require, exports, module.exports, __filename, and __dirname. If we try to use any of them within an ES module, since it also runs in strict mode, we will get a ReferenceError:

console.log(exports) // ReferenceError: exports is not defined
console.log(module) // ReferenceError: module is not defined
console.log(__filename) // ReferenceError: __filename is not defined
console.log(__dirname) // ReferenceError: __dirname is not defined

We already discussed at length the meaning of exports and module in CommonJS; __filename and __dirname represent the absolute path to the current module file and the absolute path to its parent folder. Those special variables can be very useful when we need to build a path relative to the current file.

In ESM, it is possible to get a reference to the current file URL by using the special object import.meta. Specifically, import.meta.url is a reference to the current module file in a format similar to file:///path/to/current_module.js. This value can be used to reconstruct __filename and __dirname in the form of absolute paths:

import { fileURLToPath } from 'url'
import { dirname } from 'path'

const __filename = fileURLToPath(import.meta.url)
const __dirname = dirname(__filename)

It is also possible to recreate the require() function as follows:

import { createRequire } from 'module'
const require = createRequire(import.meta.url)

Now we can use require() to import functionality coming from CommonJS modules in the context of ES modules.
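For instance, assuming a hypothetical CommonJS file legacy.cjs sitting next to the current module, we could load it like this:

import { createRequire } from 'module'
const require = createRequire(import.meta.url)

const legacy = require('./legacy.cjs') // a hypothetical CommonJS module
console.log(legacy)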

Another interesting difference is the behavior of thethis keyword.

In the global scope of an ES module, this is undefined, while in CommonJS, this is a reference to exports:

// this.js - ESM
console.log(this) // undefined

// this.cjs - CommonJS
console.log(this === exports) // true

Interoperability

We discussed in the previous section how to import CommonJS modules in ESM by using the module.createRequire function. It is also possible to import CommonJS modules from ESM by using the standard import syntax. This is limited to default exports, though:

import packageMain from 'commonjs-package' // Works
import { method } from 'commonjs-package' // Errors
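If we need named-style access, a common workaround is to import the default export and destructure it afterward:

import packageMain from 'commonjs-package'
const { method } = packageMain // destructuring happens at runtime, so it works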

Unfortunately, it is not possible to import ES modules from CommonJS modules.
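For instance, at the time of writing, trying to require() an ES module from a CommonJS module fails with a dedicated error (a sketch, assuming a hypothetical file my-module.mjs):

// consumer.cjs - CommonJS
const esmModule = require('./my-module.mjs')
// Error [ERR_REQUIRE_ESM]: Must use import to load ES Module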

Also, ESM cannot import JSON files directly as modules, a feature that is used quite frequently with CommonJS. The following import statement will fail:

import data from './data.json'

It will produce a TypeError (Unknown file extension: .json).

To overcome this limitation, we can once again use the module.createRequire utility:

import { createRequire } from 'module'
const require = createRequire(import.meta.url)

const data = require('./data.json')
console.log(data)

There is ongoing work to support JSON modules natively even in ESM, so we may not need to rely on createRequire() in the near future for this functionality.

Summary

In this chapter, we explored in depth what modules are, why they are useful, and why we need a module system. We also learned about the history of modules in JavaScript and about the two module systems available today in Node.js, namely CommonJS and ESM. Finally, we explored some common patterns that are useful when creating modules or when using third-party modules.

You should now be comfortable with understanding and writing code that takes advantage of the features of both CommonJS and ESM.

In the rest of the book, we will rely mostly on ES modules, but you should now be equipped to be flexible with your choices and be able to deal with CommonJS effectively if necessary.

In the next chapter, we will start to explore the idea of asynchronous programming with JavaScript, and we will examine callbacks, events, and their patterns in depth.
