The webpack compiler can understand modules written as ES2015 modules, CommonJS, or AMD. However, some third-party libraries may expect global dependencies (e.g. `$` for `jQuery`). The libraries might also create globals which need to be exported. These "broken modules" are one instance where _shimming_ comes into play.
We don't recommend using globals! The whole concept behind webpack is to allow more modular front-end development. This means writing isolated modules that are well contained and do not rely on hidden dependencies (e.g. globals). Please use these features only when necessary.
Another instance where shimming can be useful is when you want to polyfill browser functionality to support more users. In this case, you may only want to deliver those polyfills to the browsers that need patching (i.e. load them on demand).
The following article will walk through both of these use cases.
For simplicity, this guide stems from the examples in Getting Started. Please make sure you are familiar with the setup there before moving on.
Let's start with the first use case of shimming global variables. Before we do anything let's take another look at our project:
project
```
webpack-demo
|- package.json
|- package-lock.json
|- webpack.config.js
|- /dist
  |- index.html
|- /src
  |- index.js
|- /node_modules
```

Remember that `lodash` package we were using? For demonstration purposes, let's say we wanted to instead provide this as a global throughout our application. To do this, we can use the `ProvidePlugin`.
The `ProvidePlugin` makes a package available as a variable in every module compiled through webpack. If webpack sees that variable used, it will include the given package in the final bundle. Let's go ahead by removing the `import` statement for `lodash` and instead provide it via the plugin:
src/index.js
```diff
-import _ from 'lodash';
-
 function component() {
   const element = document.createElement('div');

-  // Lodash, now imported by this script
   element.innerHTML = _.join(['Hello', 'webpack'], ' ');

   return element;
 }

 document.body.appendChild(component());
```

webpack.config.js
```diff
 const path = require('path');
+const webpack = require('webpack');

 module.exports = {
   entry: './src/index.js',
   output: {
     filename: 'main.js',
     path: path.resolve(__dirname, 'dist'),
   },
+  plugins: [
+    new webpack.ProvidePlugin({
+      _: 'lodash',
+    }),
+  ],
 };
```

What we've essentially done here is tell webpack...
If you encounter at least one instance of the variable `_`, include the `lodash` package and provide it to the modules that need it.
If we run a build, we should still see the same output:
```bash
$ npm run build
...

[webpack-cli] Compilation finished
asset main.js 69.1 KiB [emitted] [minimized] (name: main) 1 related asset
runtime modules 344 bytes 2 modules
cacheable modules 530 KiB
  ./src/index.js 191 bytes [built] [code generated]
  ./node_modules/lodash/lodash.js 530 KiB [built] [code generated]
webpack 5.4.0 compiled successfully in 2910 ms
```

We can also use the `ProvidePlugin` to expose a single export of a module by configuring it with an "array path" (e.g. `[module, child, ...children?]`). So let's imagine we only wanted to provide the `join` method from `lodash` wherever it's invoked:
src/index.js
```diff
 function component() {
   const element = document.createElement('div');

-  element.innerHTML = _.join(['Hello', 'webpack'], ' ');
+  element.innerHTML = join(['Hello', 'webpack'], ' ');

   return element;
 }

 document.body.appendChild(component());
```

webpack.config.js
```diff
 const path = require('path');
 const webpack = require('webpack');

 module.exports = {
   entry: './src/index.js',
   output: {
     filename: 'main.js',
     path: path.resolve(__dirname, 'dist'),
   },
   plugins: [
     new webpack.ProvidePlugin({
-      _: 'lodash',
+      join: ['lodash', 'join'],
     }),
   ],
 };
```

This would go nicely with Tree Shaking as the rest of the `lodash` library should get dropped.
Some legacy modules rely on `this` being the `window` object. Let's update our `index.js` so this is the case:
```diff
 function component() {
   const element = document.createElement('div');

   element.innerHTML = join(['Hello', 'webpack'], ' ');

+  // Assume we are in the context of `window`
+  this.alert("Hmmm, this probably isn't a great idea...");
+
   return element;
 }

 document.body.appendChild(component());
```

This becomes a problem when the module is executed in a CommonJS context where `this` is equal to `module.exports`. In this case you can override `this` using the `imports-loader`:
webpack.config.js
```diff
 const path = require('path');
 const webpack = require('webpack');

 module.exports = {
   entry: './src/index.js',
   output: {
     filename: 'main.js',
     path: path.resolve(__dirname, 'dist'),
   },
+  module: {
+    rules: [
+      {
+        test: require.resolve('./src/index.js'),
+        use: 'imports-loader?wrapper=window',
+      },
+    ],
+  },
   plugins: [
     new webpack.ProvidePlugin({
       join: ['lodash', 'join'],
     }),
   ],
 };
```

Let's say a library creates a global variable that it expects its consumers to use. We can add a small module to our setup to demonstrate this:
project
```diff
  webpack-demo
  |- package.json
  |- package-lock.json
  |- webpack.config.js
  |- /dist
  |- /src
    |- index.js
+   |- globals.js
  |- /node_modules
```

src/globals.js
```js
const file = 'blah.txt';
const helpers = {
  test: function () {
    console.log('test something');
  },
  parse: function () {
    console.log('parse something');
  },
};
```

Now, while you'd likely never do this in your own source code, you may encounter a dated library you'd like to use that contains similar code to what's shown above. In this case, we can use `exports-loader` to export that global variable as a normal module export. For instance, in order to export `file` as `file` and `helpers.parse` as `parse`:
webpack.config.js
```diff
 const path = require('path');
 const webpack = require('webpack');

 module.exports = {
   entry: './src/index.js',
   output: {
     filename: 'main.js',
     path: path.resolve(__dirname, 'dist'),
   },
   module: {
     rules: [
       {
         test: require.resolve('./src/index.js'),
         use: 'imports-loader?wrapper=window',
       },
+      {
+        test: require.resolve('./src/globals.js'),
+        use:
+          'exports-loader?type=commonjs&exports=file,multiple|helpers.parse|parse',
+      },
     ],
   },
   plugins: [
     new webpack.ProvidePlugin({
       join: ['lodash', 'join'],
     }),
   ],
 };
```

Now from within our entry script (i.e. `src/index.js`), we could use `const { file, parse } = require('./globals.js');` and all should work smoothly.
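For illustration, here is a minimal sketch of that usage in `src/index.js`, assuming the `exports-loader` rule above is in place; the logged values mirror what `src/globals.js` defines:

```js
// src/index.js (excerpt): consume the globals re-exported by exports-loader.
const { file, parse } = require('./globals.js');

console.log(file); // 'blah.txt'
parse(); // logs 'parse something'
```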
Almost everything we've discussed thus far has been in relation to handling legacy packages. Let's move on to our second topic: polyfills.
There are many ways to load polyfills. For example, to include `babel-polyfill` we might:
```bash
npm install --save babel-polyfill
```

and `import` it so as to include it in our main bundle:
src/index.js
```diff
+import 'babel-polyfill';
+
 function component() {
   const element = document.createElement('div');

   element.innerHTML = join(['Hello', 'webpack'], ' ');

   // Assume we are in the context of `window`
   this.alert("Hmmm, this probably isn't a great idea...");

   return element;
 }

 document.body.appendChild(component());
```

Note that we aren't binding the `import` to a variable. This is because polyfills run on their own, prior to the rest of the code base, allowing us to then assume certain native functionality exists.
Note that this approach prioritizes correctness over bundle size. To be safe and robust, polyfills/shims must run _before_ all other code, and thus either need to load synchronously, or all app code needs to load after all polyfills/shims load. There are many misconceptions in the community, as well, that modern browsers "don't need" polyfills, or that polyfills/shims merely serve to add missing features; in fact, they often _repair broken implementations_, even in the most modern of browsers. The best practice thus remains to unconditionally and synchronously load all polyfills/shims, despite the bundle size cost this incurs.
If you feel that you have mitigated these concerns and wish to incur the risk of brokenness, here's one way you might do it:

Let's move our `import` to a new file and add the `whatwg-fetch` polyfill:
```bash
npm install --save whatwg-fetch
```

src/index.js
```diff
-import 'babel-polyfill';
-
 function component() {
   const element = document.createElement('div');

   element.innerHTML = join(['Hello', 'webpack'], ' ');

   // Assume we are in the context of `window`
   this.alert("Hmmm, this probably isn't a great idea...");

   return element;
 }

 document.body.appendChild(component());
```

project
```diff
  webpack-demo
  |- package.json
  |- package-lock.json
  |- webpack.config.js
  |- /dist
  |- /src
    |- index.js
    |- globals.js
+   |- polyfills.js
  |- /node_modules
```

src/polyfills.js
```js
import 'babel-polyfill';
import 'whatwg-fetch';
```

webpack.config.js
```diff
 const path = require('path');
 const webpack = require('webpack');

 module.exports = {
-  entry: './src/index.js',
+  entry: {
+    polyfills: './src/polyfills',
+    index: './src/index.js',
+  },
   output: {
-    filename: 'main.js',
+    filename: '[name].bundle.js',
     path: path.resolve(__dirname, 'dist'),
   },
   module: {
     rules: [
       {
         test: require.resolve('./src/index.js'),
         use: 'imports-loader?wrapper=window',
       },
       {
         test: require.resolve('./src/globals.js'),
         use: 'exports-loader?type=commonjs&exports[]=file&exports[]=multiple|helpers.parse|parse',
       },
     ],
   },
   plugins: [
     new webpack.ProvidePlugin({
       join: ['lodash', 'join'],
     }),
   ],
 };
```

With that in place, we can add the logic to conditionally load our new `polyfills.bundle.js` file. How you make this decision depends on the technologies and browsers you need to support. We'll do some testing to determine whether our polyfills are needed:
dist/index.html
```diff
 <!DOCTYPE html>
 <html>
   <head>
     <meta charset="utf-8" />
     <title>Getting Started</title>
+    <script>
+      const modernBrowser = 'fetch' in window && 'assign' in Object;
+
+      if (!modernBrowser) {
+        const scriptElement = document.createElement('script');
+
+        scriptElement.async = false;
+        scriptElement.src = '/polyfills.bundle.js';
+        document.head.appendChild(scriptElement);
+      }
+    </script>
   </head>
   <body>
-    <script src="main.js"></script>
+    <script src="index.bundle.js"></script>
   </body>
 </html>
```

Now we can `fetch` some data within our entry script:
src/index.js
```diff
 function component() {
   const element = document.createElement('div');

   element.innerHTML = join(['Hello', 'webpack'], ' ');

   // Assume we are in the context of `window`
   this.alert("Hmmm, this probably isn't a great idea...");

   return element;
 }

 document.body.appendChild(component());
+
+fetch('https://jsonplaceholder.typicode.com/users')
+  .then((response) => response.json())
+  .then((json) => {
+    console.log(
+      "We retrieved some data! AND we're confident it will work on a variety of browser distributions."
+    );
+    console.log(json);
+  })
+  .catch((error) =>
+    console.error('Something went wrong when fetching this data: ', error)
+  );
```

If we run our build, another `polyfills.bundle.js` file will be emitted and everything should still run smoothly in the browser. Note that this setup could likely be improved upon, but it should give you a good idea of how you can provide polyfills only to the users that actually need them.
The `babel-preset-env` package uses browserslist to transpile only what is not supported in your browsers matrix. This preset comes with the `useBuiltIns` option, `false` by default, which converts your global `babel-polyfill` import to a more granular, feature-by-feature `import` pattern:
```js
import 'core-js/modules/es7.string.pad-start';
import 'core-js/modules/es7.string.pad-end';
import 'core-js/modules/web.timers';
import 'core-js/modules/web.immediate';
import 'core-js/modules/web.dom.iterable';
```

See the babel-preset-env documentation for more information.
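As a hedged sketch (not part of this guide's setup), the option could be enabled through a `babel-loader` rule along these lines, assuming the Babel 6-era `babel-preset-env` and `babel-loader` packages; with the newer `@babel/preset-env`, the option takes `'entry'` or `'usage'` instead of `true`:

```js
// webpack.config.js (excerpt): hypothetical babel-loader rule enabling useBuiltIns.
module.exports = {
  // ...
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            // With `useBuiltIns: true`, the single `import 'babel-polyfill'`
            // is rewritten into granular core-js imports like those shown
            // above, based on your browserslist targets.
            presets: [['env', { useBuiltIns: true }]],
          },
        },
      },
    ],
  },
};
```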
Node built-ins, like `process`, can be polyfilled directly from your configuration file without the use of any special loaders or plugins. See the node configuration page for more information and examples.
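Keep in mind that webpack 5 no longer polyfills Node core modules automatically. If a dependency expects a global `process` object, one commonly used alternative (shown here only as a sketch, not something this guide prescribes; it assumes the `process` package has been installed from npm) is to supply it through `ProvidePlugin`:

```js
// webpack.config.js (excerpt): hypothetical sketch, assumes `npm install process`.
const webpack = require('webpack');

module.exports = {
  // ...
  plugins: [
    // Expose a `process` global backed by the browser shim from the
    // `process` npm package wherever a module references it.
    new webpack.ProvidePlugin({
      process: 'process/browser',
    }),
  ],
};
```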
There are a few other tools that can help when dealing with legacy modules.
When there is no AMD/CommonJS version of the module and you want to include the `dist`, you can flag this module in `noParse`. This will cause webpack to include the module without parsing it or resolving `require()` and `import` statements. This practice is also used to improve build performance.
Any feature requiring the AST, like the `ProvidePlugin`, will not work.
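As an illustration only (the regular expression below is a placeholder, not part of this guide's setup), a `noParse` entry might look like this:

```js
// webpack.config.js (excerpt): skip parsing prebuilt, standalone bundles.
module.exports = {
  // ...
  module: {
    // Matching files are included as-is; webpack will not resolve require()
    // or import statements inside them, so they must be self-contained.
    noParse: /jquery|lodash/,
  },
};
```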
Lastly, there are some modules that support multiple module styles, e.g. a combination of AMD, CommonJS, and legacy. In most of these cases, they first check for `define` and then use some quirky code to export properties. In these cases, it could help to force the CommonJS path by setting `additionalCode=var%20define%20=%20false;` via the `imports-loader`.
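As a hedged sketch (the `test` pattern and module name below are placeholders for whichever UMD module you need to patch), such a rule could look like this:

```js
// webpack.config.js (excerpt): force the CommonJS branch of a UMD wrapper by
// shadowing `define` before the module's own code runs.
module.exports = {
  // ...
  module: {
    rules: [
      {
        test: /node_modules[\\/]some-umd-library/, // hypothetical package
        use: 'imports-loader?additionalCode=var%20define%20=%20false;',
      },
    ],
  },
};
```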