by Wes Higbee
What Is Webpack?
Welcome to this course about webpack. As I prepared for this course, I wasn't quite sure where I wanted to start. For example, maybe we could start with a definition of what webpack is to describe it on a high level. And, of course, it's often referred to as a bundler. But I really like to think of webpack more as a compiler, a web compiler that can compile all sorts of web artifacts. While this is a concise way to describe webpack, I don't think it's very helpful. Instead, what I'd like to do is take a look at the ways in which you can benefit from webpack on a very high level. I want to show some of the really neat things you can do, and then throughout the rest of this course, we're going to dive into these areas, and we're going to find out how it's actually possible, how webpack works behind the scenes, and how you can take advantage of these things to improve your web applications.
Interactive Coding with Hot Module Replacement
Modularity is another benefit of webpack. First off, we have this one file that's blown apart into three separate files, which makes it convenient to edit these related components together. But it's possible we want to do the opposite and have individual files instead. For example, we have the main.js file, right, if I go back to my graph. Over here is our entry point to the application. It looks like it points at two separate files or modules, the App.vue and the index.js file. If we look over here at our code, and expand out our imports at the top, yep, indeed, that's what we have. We have Vue, App, and router up here at the top. So it's kind of hard to see that last piece there, but there's a line running in the middle here that comes down for the vue dependency. Now it's important to keep in mind that not every browser can support these import statements. Module loading is not yet widely implemented. And it's a problem both for current browser versions and for legacy versions that you need to support. I can drill into some of these modules like the App module. And here you can see we have three separate sections again for a different component in our application. And if I go back to the graph here, you'll see that dependency here with App.vue. And again, we have three separate parts that are blown out. So modularity is another big benefit of webpack. You can write your code using a nice modular format, using imports and exports, without requiring support for them in the browsers you deploy to.
Benefit - Bundling for Performance
So bundling is a benefit of webpack, and I would actually go so far as to say it's sophisticated bundling that's the benefit. It's not just putting all the files together. There are many tools that you can use to do that. Notably, you could just concatenate all the files together, or you could just write all your code in one file to begin with. And assuming you want a performant application, then one bundle probably won't cut it. As an example of sophisticated bundling, let's say we're a gaming company, and we don't just have the game of Solitaire, we have three separate games. And, of course, instead of one bundle to contain all the code for these games, we could have three separate bundles then. And when a user comes along, and they want to play the game of Solitaire, that's all they get. Another user might just want to play bridge. And then maybe later on, that same user wants to play poker, and so they pull down just the code that they need. While bundling is a great thing, there are some drawbacks. Can you think of one? One that comes to my mind: all of these are card games, so there's some logic to display cards, to work with cards, that's probably shared between all of these games. That means that we have to download that logic three separate times if we're bundling our games separately. But maybe what we want to do is build a fourth, separate bundle, extract out all that common logic, and put it inside of a deck bundle. This would subsequently make our game bundles smaller, which wouldn't be helpful for the first game you play. You're still going to pull down the same amount of content, but when you're ready to play the next game, you only need to download that game, and you don't need to download the deck logic again. So you can save space, bandwidth, and even speed in working with your application if you start to split your bundles apart intelligently. And webpack will help with this. In fact, code splitting was the impetus for the webpack project.
npm Install, Import, Go!
One of my favorite benefits, one I originally wanted to lead with, is the simplicity with which you can pull down libraries and other dependencies from npm and include them in your project. If you're familiar with the node workflow to install a package and then require it to use it, we now have the exact same workflow with front-end applications: an npm install, for example, with lodash, followed by a simple import statement, which is the Harmony equivalent, the new module format equivalent, of the require in Node.js. Long story short, you can see in our Vue application here, obviously, we need to use Vue itself, which is a third-party framework, and so we have a series of npm packages that are included in our project. For example, you can see the Vue package here that's referenced by our main.js file. Part of what makes this possible is that webpack supports any module format. So CommonJS is really important for Node.js packages. Webpack also understands Harmony modules, which is the format that we will use long term, but for right now, we still have a lot of legacy packages that use CommonJS. There also are quite a few libraries out there that use AMD or UMD. Webpack even supports libraries that still use globals, for example, if you're using jQuery in the global scope to access jQuery's APIs. This interop allows for a gradual migration to modules. Part of what makes it possible to load npm packages is that webpack is a module loader; in other words, it can resolve the location of modules for you. This is customizable as well, but by default, it will look in the node_modules folder for requested modules, for example, if you request lodash. You don't have to worry about pathing to the location where you downloaded the lodash file. And you don't have to worry about pathing in production anymore, or worrying about whether you changed the folder structure at all. You specify relationships between modules. Webpack takes care of the rest.
Consequently, you also don't have to worry about dumping a bunch of script tags into some index page. And you definitely don't have to worry about the order of these anymore because each module will request dependencies, thus forming a natural ordering that webpack will enforce for you.
Help with Caching
Webpack will also help with caching, that annoying problem where, in development, you don't want caching. Because when you make changes, you can't tell the difference between a change that isn't working and a page that just didn't reload. And, of course, then in production, where you definitely want caching to improve the user experience, but when you do a deployment, you also want to make sure that users get the latest version the next time they come back to your site. And, of course, that can be a problem without the proper caching setup. So back in this overall graph, if you look through here at some of the assets that are generated, the manifest bundle, the app bundle, and we'll get into what this means later on, you'll notice that there are hashes inside of the file name. You have complete control over how that's formatted. In this example, though, we're injecting a hash of the contents, which means, well, you tell me. What does that mean? Well, it means that in terms of caching, if we change the code, the contents of the bundle will be different, and thus, the file name will be different. So when a file changes, we'll get a new file name. If a file hasn't changed, we'll keep the old name. That's going to be great for both development and for production. You can even see on the vendor bundle, over on the right-hand side here where our third-party vendor dependencies are, that we also have hashing there. That's really important because our vendor dependencies like vue.js, for example, are much less likely to change than our application code. And, of course, this caching is customizable, and, in fact, everything is customizable. Dev doesn't have to be production. Sometimes you want different things to happen in different environments, and so you can customize your build process, and we'll see that throughout this course. 
For example, in development, you could imagine that if one of these modules provided some data to our application, we might want to have some fake data in development and real data in production. You could swap out those modules.
Source Maps Through Any Number of Transformations
A Compiler Platform
Course Series and Updates
Cloning and Starting the Solitaire App
Now that you've got an understanding of some of the major benefits of webpack, let's actually step into an example and start to put it to use to see these benefits in action. What I'd like to do first is set up the sample code for this course. So we're going to pull down the source code for that game of Solitaire that we saw a moment ago. So you can hop out to this GitHub repository and clone or download this. And by the way, make sure you check out the master branch. I'll be pushing updates to this repository throughout the course so you can see all the steps that we took. So I'll do a git clone here, paste in the URL, and, in my case, I'll put this all into a folder called course. And you can see we've got the files here locally. So as a first step, just to get your mind wrapped around things, if you run npm start, this will kick off an npm install, Bower install, and then start up a web server to host the game of Solitaire. Once that tab opens up in your browser, go into the app folder, and the game of Solitaire will load from the index.html page. If you want, you can play a little bit of the game of Solitaire. I always like to play a few rounds from time to time. And then once you get comfortable with the game, at least inside of the browser, let's move on and talk about the structure of this and how that structure would benefit from having webpack.
The Legacy Solitaire App Structure
Performance Problems in the Solitaire App
Now if you pop open the dev tools in your browser and refresh the page while looking at the network tab, you can take a look at the requests that happened to load the page, and there are quite a few of them. We've got 30 requests, and we've almost got a megabyte of content. Things seem to load pretty quickly, but that's because we're running locally. If you want, you can pop open a new tab, make sure you open up the dev tools, and then set the network mode here to Slow 3G, and then go ahead and load that page. You can see here it's taking quite a while to work through things, to make all the requests to load this application. In fact, we're already at about 11 seconds, and we still don't even have the app running. We can't even see anything yet but the green background. We've sort of got part of the piles here. Okay, there we go. Looks like we've got most of the game now. So that took at least 20 seconds just to download the content that we needed, the initial content for this page, and then additional time was necessary to pull down some of the image files. So this is the experience we would have on a slower connection. This is another drawback of this app. As it's architected right now, it does not perform very well on a slower connection. That may or may not matter, depending on what you're developing, but if it does, well, then bundling can definitely help there. So part of the issue here is the latency of sending all of these requests. Additionally, it would probably help to minify some of our content, running it through a minification process or some sort of optimization to reduce the amount that we actually have to transfer.
Inspecting and Using the App Bundle
Let's open up the bundle file. Now I've got a challenge for you. I want you to look through this file and see if you can find something familiar. Go ahead and pause the recording, and work on that, and then come back. Okay, so if you scroll through this file, you'll find more code than we have in our app.js file, and somewhere along the way, you'll see what looks like our app.js module. So these are the lines of code that came out of our app.js module. If I split horizontally here, you can see both of these things side by side. So webpack has taken our one entry-point module, and it's bundled it up. Now there's some additional code inside of here. This is the webpack runtime. It's necessary to run our application, and it performs a lot of the functionality necessary for our application to be bundled up and yet still execute when we have lots and lots of modules, so it'll become more important as we go. Right now, it probably seems wasteful to have any of this code. The most important thing you'll see, at the bottom of this code here that webpack generates, is that it's calling __webpack_require__, which is going to execute a module. It's looking up the module that has an ID of 0, and you can see the 0 here. That's the module for our app.js file. So after webpack bootstraps itself, it executes our code. So let's go ahead and plug in the bundle instead of our module. So we can come over to index.html here. Now, what should I change in here to take out the old module and put in our bundle? Well, we have our app.js script right here. Let's yank that out, and actually save this file, and let's confirm things don't load in the browser. All right, over in the browser, refresh, and hey, at first glance it looks like things aren't broken. But if I pop open the dev tools and go to the Console tab, you can see the issue we have. Yep, Angular failed to bootstrap itself. It can't find the Solitaire module. So now what do I put in here? Well, just drop in a script tag. 
Set the src on this equal to dist/app.bundle.js. By the way, this is one of the reasons why I like working with WebStorm or some sort of nice IDE. I get this completion here to help me find the file so I don't have to type everything out, fat finger something, and then have to spend extra time troubleshooting. So let's go ahead and save that. And let's go ahead and refresh in the browser. And our application is still working. So we indeed know that our new bundle, which just has one of our files in it, and we can add more to it in a minute; we know that that is working now, while we leave the rest of the modules in our application as is. So this is why I said it's better to start with the entry points because they have fewer dependencies that will need to be updated.
IIFEs Are No Longer Necessary
I've got a challenge for you. Take a look at the bundle that webpack generated and the app.js module. There's something unnecessary in our app.js module. Try to figure that out. Pause the recording, work on that, and come back, and I'll talk through this. If you're stuck, here's a hint. Why do we have this function, this immediately-invoked function expression, an IIFE, wrapping our code? Well, historically, if you didn't want to modify the global scope, you could use an IIFE to create a local scope for the code inside of this module. So it was a poor man's way of isolating scope so you weren't polluting the global scope. You can still use global dependencies, but you don't accidentally leak something into the global scope. So historically, we had to add this boilerplate. Now we don't need to, because webpack wraps all of our module code inside of a function already. So this part is unnecessary here, now that we've got our code inside of a webpack bundle. So make sure you save that change to your app.js file. And if you run webpack again, exact same parameters, that will recreate the bundle. This time, you can see our app.js file was a little bit smaller. That's helpful to confirm that things were actually changed. And if I come back over to WebStorm, you can see the code has been transformed to remove the boilerplate. So that's not being copied over now.
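Here's the before-and-after in miniature, with a made-up appName variable standing in for the module's private state:

```javascript
// Before webpack: an IIFE gives the module a private scope so that
// `appName` doesn't leak onto the global object.
var fromIife = (function () {
  var appName = 'solitaire';
  return appName;
})();

// With webpack, each module body is already executed inside a
// function by the runtime, so the same top-level declaration is
// module-local without any wrapper:
var appName = 'solitaire';

console.log(fromIife === appName); // both resolve to the same value
```

The behavior is identical; the only thing removed is boilerplate that webpack's wrapping now provides for free.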
Migrating a Second Module to the Bundle
Webpack Polyfills Module Loading
Webpack Polyfills Use Strict for Harmony Modules
Learning from Webpack Source Code
Generating an Interactive Graph of Modules
Webpack prints out a lot of helpful information in the console. You can even ask for more with the verbose flag if you'd like to know more about what's going on with your bundle. For example, by turning verbose on, you can see that the reason scoring was included is because of a Harmony import in app.js. Now while this information is helpful and detailed, it can be abstruse when you're trying to reason about, on a high level, what's going on. And that's why I really like these visualizations with a graph of all your modules, especially one where you can click around and see dependencies. So I want to show you how to set this up right now so that as you make changes and build out your bundle, you'll be able to generate this graph over and over again and see the changes to it, which can help you understand whether things are set up properly or not. To do this, I've set up an npm package that has a script that I use to generate the graph. It's the webpack-stats-graph package. First you need to install Graphviz, and I have instructions for Mac and Windows here. Then you'll also need to install this package using npm install, with -g if you'd like it globally. After you get those installed, we need to generate a stats.json file. Instead of outputting information about the build in a human-readable format, or at least a partially human-readable one, webpack can print out the information in a JSON format. So if I run this again, I get JSON out. So what I want to do then is capture that into a stats.json file; that's a typical name for this. You can look through that file if you'd like. It's just JSON inside. One thing to watch out for: make sure there's no additional output somehow, like maybe a rogue webpack plugin has printed out some information as well. It might be captured in this file, and you might need to remove it. Once you have this in the current directory, you can run webpack-stats-graph. 
It'll print out any warnings or errors if something goes wrong. Otherwise, it'll write the files out to a statsgraph folder. So it creates a graph for you. Actually, there are a couple of files that are written out. It generates a .dot file, which is a textual representation that's fed to the Graphviz dot CLI program to generate an SVG; in the middle, you can see graph.svg, and then we also have our interactive HTML page, which has a copy of the SVG embedded inside of it, all standalone. So that's what you can actually open up: open the statsgraph folder and the interactive page. On Windows, you'll need to, perhaps, open up File Explorer to this location because the open command is not there. Anyways, open that up and boom, there's the graph. And we've got our two modules, app.js and scoring.js, and we have that relationship between the two. And then down at the bottom, you can see that we have our app.bundle.js file. So you can start to learn about what I've got set up in this graphing tool. The gray border here is essentially a bundle. In webpack parlance, this gray box is actually a chunk, which is just a way that webpack groups together modules internally. Relationships between modules are red. Additionally, you can see the chunk name up here at the top of the gray box. Up at the top is the entire compilation's hash. You'll notice that matches what you have in the command-line output. So when you run webpack, which has a compiler inside of it to perform all the transformations we've talked about, and also bundle things up, and optimizations, and yada yada yada, every time you run it, there's a compilation. It's an instance of running the compiler against your code base, and out of that comes a hash to uniquely identify the result, as well as some information like the timing for the compilation. That compilation then generates assets, which you can see listed here, or you can see visually in blue. 
And there are modules included, and reasons for why those modules were included, as well as some attributes like sizing information. Now there are some options to the webpack-stats-graph tool. Grab the help for it if you'd like to see what those are. For example, you can show the size of modules. If I refresh now, you can see sizing information, 0.8 KB here for our scoring module.
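Since stats.json is just JSON, you can also mine it yourself with a few lines of Node. Here is a sketch using a tiny hand-made stats object (real stats files have many more fields, and the module names here are assumptions):

```javascript
// Each module entry records the "reasons" it was included, which is
// the same information the verbose output shows.
const stats = {
  modules: [
    { name: './app.js', reasons: [] },
    {
      name: './scoring.js',
      reasons: [{ type: 'harmony import', moduleName: './app.js' }],
    },
  ],
};

for (const m of stats.modules) {
  const why = m.reasons
    .map((r) => `${r.type} in ${r.moduleName}`)
    .join(', ');
  console.log(`${m.name}: ${why || 'entry point'}`);
}
```

In a real project you'd read the file with fs.readFileSync and JSON.parse before looping; tools like webpack-stats-graph do essentially this, then render the relationships visually.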
I think we've done enough in this course module. What I'd like you to do is go ahead and take a stab at getting the other three app.js dependencies wired in, and update the app to work, as well as get things in your bundle regenerated, and make sure your graph image is regenerating for you. Take a stab at that, and come back, and we'll walk through it. All right, so looking at our application, it's these three scripts that we want to get rid of. And a tip: when you're refactoring an application into Harmony modules, if you're not certain about the order of things, and what dependencies there might be, try to preserve the script order here. So we have klondike, board, and game. So I'll get rid of that, and then paste those in here, switch the script tag to an import, and, of course, drop the closing script tag. And then one key piece: make sure you're using relative imports here. This ./ tells webpack to look in the current directory inside the klondike folder, and then look inside of there for game.js, or board.js, or whatever the module might be. This is one type of import, and this portion of the import is the module specifier. And as far as Harmony modules are concerned, you can only have URLs in here. However, webpack allows you to use a different type of import, for example lodash. We'll see this later on. This isn't an absolute or relative path, so it's not a valid URL. Sometimes this is referred to as a bare import or a module path import. That basically means this is something that has to be resolved, for example, with a node_modules folder. We'll see more of that later on. Webpack will help you with all that, but in this case, these are files on disk, so we just need to point at those. Save all that, including the changes to our index.html page. Over at the command line, I can rebundle the application. While I'm here, I'll also dump out my stats.json. 
Back over in the browser, one of two things: sometimes I like to open up a new tab and paste in the address to take a look at the new graph that I'm going to generate. Nothing's changed. Why is that? I have yet to generate the new graph. So run this again, refresh, and now I have the new relationships. And everything looks okay. The reason I like to open a new tab is that now you can see the old version and the new version side by side. You can see what's changed. And we can see we now have a substantial subset of this high-level overview of everything. Okay, maybe not a substantial subset yet, but we're getting there. And, of course, we want to make sure that our application still reloads, looks like it's okay, and maybe even play a few cards. Yes, all right, things look okay. Keep in mind all those benefits that we discussed. For example, if I pop over to the browser, and open up the network tools, and reload the page, you can see we now only have 26 requests. So we've cut out four requests. We did add a little overhead in that webpack runtime, but that's going to pay dividends later on when we look at more advanced features and optimizations. So throughout the rest of this course, we're going to take these benefits that we talked about, and continue to apply them to this application, and gradually refactor it into something much better than it is right now.
Benefits of Watch and WDS
Time for one of my favorite aspects of webpack, interactive coding. In other words, how can we accelerate our development process? So a few times now, we've gone through the process of compiling our application. Whenever we make a change, we come over, hit webpack at the command line, and out comes a new bundle that we can then test in the browser. So we write code, we compile that code, generate a bundle, refresh our application. That works great while you're starting out, but if you're like me at all, you prefer not to have to invoke webpack explicitly. That's what we're going to take a look at in this module. So I'll start out by taking a look at the watch mode in webpack. This will simply watch for changes and regenerate your bundle for you. I like to think of this as a first tier of making your coding process more interactive with webpack. Next, we can take a look at hot module replacement, or hot reload. That's the feature where we can literally push code into a running application. For this, we'll be using the webpack-dev-server. This literally is a development server, which means we no longer need to run a web server to host our game of Solitaire. It can do that as well, so it can kill two birds with one stone. Webpack-dev-server also supports a live reload-type functionality to just refresh the browser. We'll see how that works. In fact, that'll be a stepping stone before we get to hot replacement. And again, the difference there, browser refresh versus updating a running application. So the whole focus of this module is to make it easier to write code and see the impact. And I'm doing this at the start of the course so we can benefit throughout.
So the first step on our interactivity journey is the webpack watch mode. So back at the command line, when we run webpack, it runs, produces a bundle, prints out some helpful information, and then it stops. And if we want to run this again, it goes through the whole process again, taking as much time as before, assuming you don't have any caching set up. As you can imagine, if the build process continues to grow in complexity, the duration is going to increase. In fact, it's easy to get to a point where you have a 3- or 4-second, maybe even 10- or 15-second build the first time. However, it doesn't have to be that way on subsequent runs. If we run the compiler again, we only need to recompile the files that actually changed. So, for example, if I were to come in and modify the app.js file and exclude one of my dependencies, if I run webpack again, it should be able to reuse the scoring, klondike, and board module compilations from before, whatever is involved in that. It should be able to reuse that because those files haven't changed. But instead, you can see, it's taking the same amount of time. If, however, I just take the webpack command and append --watch, or -w for short, an initial compilation of everything runs, and you can see the output. But then webpack stays running. I don't get my prompt back. And you can see at the top here in the output that webpack is in watch mode. It's watching for the files to change. So now, if I split the screen, come down into my app.js, and uncomment right here, what do you think we'll see in the output up above? Let's actually give ourselves some more room here. All right, I'll save this, and there you go. We have the output of a new compilation. And in this case, the only two modules that are listed are the app and game modules. And that's because I changed the app module, and I brought in a new game module. It was commented out before. We don't need to run through the three previous modules. 
Those have not changed, so there's no reason to load them up, and parse them, and perform whatever transformations we might be doing. We can reuse the in-memory cached versions of these. And, of course, we spit out a new bundle when all is said and done with the changes inside of it. So simply by adding --watch here, we've now cut out one step. We don't have to come to the command line and hit webpack every time we make a change. So if, for example, I pull up my code and the browser, I'll come into the code, comment that line out again, and if I refresh the browser, you can see the application is broken. And if I take a look at the console, you can see what's missing. It's because of that file that we commented out. So if I just come in here, remove the commenting, save this, come over to the browser, and refresh, our application is working again. And behind the scenes, we have additional output for the subsequent compilations that took place each time we changed and saved our files. How do you think webpack knows which files to watch? Webpack figures out what it needs to watch by taking a look at the files in the graph of dependencies.
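A sketch of that idea: walking the dependency graph from the entry module collects exactly the set of files worth watching. The graph and the filesToWatch helper here are hand-made for illustration, not webpack's internal representation:

```javascript
// A toy dependency graph: each module maps to the modules it imports.
const graph = {
  './app.js': ['./scoring.js', './klondike/game.js'],
  './scoring.js': [],
  './klondike/game.js': ['./klondike/board.js'],
  './klondike/board.js': [],
};

// Depth-first walk from the entry; the visited set doubles as the
// watch list, and also guards against circular dependencies.
function filesToWatch(entry, seen = new Set()) {
  if (seen.has(entry)) return seen;
  seen.add(entry);
  for (const dep of graph[entry]) filesToWatch(dep, seen);
  return seen;
}

console.log([...filesToWatch('./app.js')]);
// every file reachable from the entry ends up being watched
```

A file that nothing imports never enters the graph, so changing it triggers no rebuild, which is exactly the behavior you see with watch mode.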
So the next best thing would be to have the browser refresh automatically when we make a change after the bundle's created. Just go ahead and reload the browser for me so when I switch over to the browser, or if I've got a split screen up, I can see the changes in action, so a live reload-type functionality. For this, we need to install the webpack-dev-server package. Save this as a development dependency. We won't need it in production. Let me say that again. This is not a production web server. No matter how cool it sounds to be editing your code and having it automatically deployed into your running production application, believe me, that's probably not a good idea in many situations. Once that's installed, that will provide a new command for the webpack-dev-server. What do you think we type in here to run that? Well, it's tempting to think that we can just run webpack-dev-server. But we didn't install it globally, so instead, we need to put npx on the front here, or path into where the executable is, in the node_modules/.bin folder. If I run that, grab some help, you'll see the various options that are available, and then up arrow again, we can actually execute this now. We can even append --open to go ahead and launch the webpack-dev-server and open up a browser window. However, in the process, we get a warning. We don't have a configuration file, and so webpack-dev-server doesn't know what to do. So we need to set up a config file before we can use this.
npm run-scripts for Documenting Webpack Commands
So technically speaking, we can go ahead and provide the same arguments that we passed to the webpack CLI, so our entry point and then the bundle file. But this is a good opportunity to talk about config files. So this here would actually work. However, I don't want to follow that approach. I instead want to set up a webpack config file, which you've probably heard about if you've ever looked into webpack yourself, and it's usually something that people start with. But the reality is, I'd like you to see that webpack is a CLI tool, so that's why I didn't start with the config file. And I do that because the config file can sometimes be off-putting, but not if you look at it from the perspective of what we've been doing thus far. So thus far, we've been running webpack and passing arguments to it. And right now, a few arguments aren't a problem, though we did pass that JSON flag at one point to generate stats. And as you could imagine, over time, you're going to want to configure webpack, and you're going to have more arguments to pass. And naturally, you're going to want to put this somewhere, at least in a readme so people know what to run for this particular application, and probably in something more automated. For example, it might be nice to have an npm run script, maybe npm run build, to kick off the webpack process. So we can come over to the package.json file for our application, and we can add in a new build script. And we can paste in what we've been typing at the command line. Note when you do this, you don't need to put npx on the front, because when an npm run script executes, it will have that .bin folder inside of the node_modules folder available. In other words, it's going to have access to webpack and other CLI tools that you've installed as a part of your project. So now that this is here, we can save this file, come over to the command line, and instead of running this, what would we type in instead? 
So here, just npm run, and then build is the name of the script that we created. You can see the webpack process kicks off, and out comes the results. You might also want some sort of watch script. So we can duplicate our build script, flip this over to watch, and then all we need to do here, well, you tell me. What do we need to change inside of this command? We just need to come over and add the watch flag. Now, over at the command line, we might choose to run watch instead. So that's one convenient way of automating the process of running webpack without needing to remember what the arguments are or copy and paste some sort of long command out of a readme file. And this is a pretty good first step, but it can be a lot better, and that's where config files come in.
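A sketch of the scripts block at this point (the entry and output paths here are illustrative; use whatever you've been passing to webpack):

```json
{
  "scripts": {
    "build": "webpack app/app.js dist/app.bundle.js",
    "watch": "webpack app/app.js dist/app.bundle.js --watch"
  }
}
```

Note there's no npx prefix: npm run scripts automatically get node_modules/.bin on the PATH.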
Composing npm Scripts
So we have two new commands for webpack documented inside of our npm scripts. Looking at these two, what might be problematic? Well, we have the same arguments passed to each. And, of course, if we want to write a script for our webpack-dev-server, well, we're going to need to duplicate the arguments here. So that's the third time. Oh, and by the way, we have this old start script that's no longer valid because we won't be using HTTP server anymore. We'll be using the webpack-dev-server. Well, the problem is we've now duplicated our arguments three times, so this is another reason why having a configuration file, or some sort of file, that can put all of these common parameters together, would be a huge time saver. Then, of course, we can try to make these scripts composable. For example, watch does everything that build does, so we could have npm run build here, and then just pass one more argument, the watch flag. And that cuts down some of the duplication. That's not going to work for start, though, where we're calling the webpack-dev-server. So let's take a look at a config file.
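Sketched as composed npm scripts (paths illustrative; note that npm forwards everything after the extra -- to the underlying command, which is how --watch reaches webpack):

```json
{
  "scripts": {
    "build": "webpack app/app.js dist/app.bundle.js",
    "watch": "npm run build -- --watch",
    "start": "webpack-dev-server app/app.js dist/app.bundle.js --open"
  }
}
```

The start script still duplicates the arguments, which is exactly the duplication a config file removes.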
Adding a webpack.config.js File
devServer.contentBase - Set Location of Static Content on Disk
All right, with our config file in place, we can use webpack-dev-server. Before you start that up though, it might be helpful to kill off the HTTP server that we had started up at the beginning of the course. That way, there's no confusion about which version of the application you're testing. All right, so what do I type in here to start up the server? So we set up npm start to execute the webpack-dev-server, so let's use that, and you can see our application opens up. By the way, if you want to speed that up, you can come over to the package.json file, and you can comment out npm install before we start our application. That's not necessary. As you're learning about webpack-dev-server and maybe starting and stopping it, you might not want to have to wait to run the install every time, even if it takes just a few seconds. All right, back at the command line, we have a lot of output that we'll work through, not all right now. The most important, though, is this URL that you can go to to hit the development server. I'll close one of these. So what do we do here to load up our game of Solitaire? Well, it's not loaded right now because we're seeing a listing of files. And I just want to start asking you questions to start reasoning about what we need to do to get our app working, because from time to time, you'll run into issues like this. Maybe the paths aren't quite set up right, and, in this case, you have to click on app to load up the index.html page. Now you might consider that annoying, and you'd like to load the app folder at the root of the development server; you can see right now we're actually browsing to /app and then the index.
If you'd like to just have index be at the root of the website, well then, hop over to the config file, add in a section called devServer, inside of which there is a contentBase setting (this is one or more locations, so it can be a single value or an array of values) for folders that the devServer serves static content out of, for example, our index.html page. Remember, that's not part of our bundle. So we can path.resolve here, grab the directory this config file lives in, and in this case, just do app. So I'm saying, hey, I want the contentBase to be set to this app folder. By default, it's set to whatever folder you're inside of when you run it. Let's change that. Once I've done that and saved all those changes, come back to the command line, kill off the server, and start it back up. All right, now you can see that time when it started back up, it pulled up our index.html right at the root of the site. And by the way, if you come back to the command line, I want to show you something that can help you figure out what's going wrong. If you set the contentBase explicitly, you're going to get additional output here to tell you where the files are being served from. You can see right here, Content not from webpack is served from, and then there's our application folder. If you're familiar with Express to develop web applications in Node.js, well, we're using that behind the scenes here with webpack-dev-server, so you can transfer that existing knowledge. You're even free to extend the devServer, if you so choose. I've got a high-level overview of what we just set up that we'll use throughout the rest of this module to understand the different things that we're configuring with webpack-dev-server. So first up, we have our files that are part of our project.
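A sketch of the config at this point (webpack 3 / webpack-dev-server 2-era options, which is what this course uses; entry and output names follow the course's project layout):

```javascript
// webpack.config.js
const path = require("path");

module.exports = {
  entry: "./app/app.js",
  output: {
    path: path.resolve(__dirname, "app/dist"),
    filename: "app.bundle.js",
  },
  devServer: {
    // Serve static files (e.g. index.html) from the app folder
    // instead of the default, the current working directory.
    // Can also be an array of folders.
    contentBase: path.resolve(__dirname, "app"),
  },
};
```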
For example, we have our webpack config file, and we also have that app folder that has our source code inside of it, for example, the app.js file that is the entry point to our app, and then the folders, cards and klondike, that have many other files inside of them. In that app folder, we also have that index.html page, and we have the dist folder with the build output for webpack. Speaking of which, webpack pulls in our app.js file, loads all the dependencies, processes everything, and spits out our bundle for us into that dist folder. So we've been doing this with the webpack CLI. And then to access the app in a browser, previously we had http-server set up, just a lightweight web server that serves up static files on disk, then we had it pointed at the root of our project so that browsers could make requests, for example, to the index page inside the app folder, and those files would be served up, including the bundle. So it can be helpful to distinguish that we have two separate processes here right now. We have the webpack process to produce our bundle, and that's completely separate at this point from the http-server, which is serving up the files, including the bundle. Even if we turn on watch mode, that's still a separate webpack process, and in that case, a long-running process. And we continue to regenerate that bundle as changes are made to our source code. And now in this clip, we took out http-server and put in the webpack-dev-server, so it's now serving up our files. We've also seen how we can change the contentBase, which defaults to the current working directory, which was the root of our application. We saw how we could change that to something else, in this case, to our app folder. And that simplified the request then that we make to our website. So if you're ever confused about the devServer, separate out the idea of serving static content from disk from what we have with webpack to produce our bundle.
And I say that because webpack-dev-server can actually do a lot more than just serve up static content. It also takes care of running webpack, thus obviating the need to have two separate processes running. It can do everything we've done in watch mode with webpack, as well as serve static content, and that's what we're going to see next.
devServer.publicPath - Set Base URL to Serve Webpack Output
All right, I've got another challenge for you. So technically, things are not working with our webpack-dev-server. And depending on the order of what you're doing to follow along, you might've already stumbled upon this. So I want to show you what's wrong, and I want you to take a stab at figuring out why. So right now, our website loads. Everything looks okay. In fact, it's operational. If, however, we go ahead and remove the file that's inside of our app folder, in the dist folder, and get rid of our bundle, if we remove this, refresh the browser, you'll see the application is broken. Now you might be thinking, well, you just deleted the file that we had to create. So go ahead and kill off the devServer, if that's the case, and start it back up, and you'll find the application still doesn't work. What's going on here? Take a few minutes and see if you can figure out what's wrong, and then come back, and we'll walk through this. So if you take a look at what's going on in the browser here, if I click on the Network tab and refresh here, you can see that we are missing our app bundle. Makes sense. That's the file we deleted, and then everything stopped working. And it's not at all because we deleted the file on disk. Webpack-dev-server serves the webpack output from an in-memory file system. So it's serving up our bundle, and yet we're not pulling it down in the browser. And so what's the question that I need to ask to figure out why we're not pulling down the bundle, when, in fact, you can see in the output here that the asset's been generated? What's off here? Well, the problem is the location where the webpack output is being served from. You might have noticed this right here that the output is being served at the root of the website. So the devServer starts up, it produces the bundle, and then it serves it from that in-memory location at /, and then app.bundle.js. 
And just to confirm that, let's go over to the browser, and I want you to type in this /webpack-dev-server endpoint. You could also type in app.bundle.js, which is exactly what's linked to right here. I'll open that in a new tab. So you can see, there's our app.bundle.js being served at the root of the website, so that's the problem. If you go back to this endpoint, though, I wanted to show you this page as well. This is a great troubleshooting point. So if you can't figure out what's going on, come here, and you'll see where the webpack output is at, and you'll have links to it as well, which is really nice. All right, so our app bundle needs to be served somewhere else. We're asking for it in the dist folder. And we could change our HTML page, but then that would break what we are doing when we don't use the devServer. Instead, we just need to configure the devServer to say, hey, we want to serve the webpack output from this dist folder. So how do you think I go about tweaking the devServer? How about we use the config file? So the config file will probably be the answer to a lot of my questions in this course. You might've guessed this from the slide. There's an option called publicPath. This is the location where the output from webpack is served. In this case, use /dist for the root, and put a slash on the end. If you don't put a slash on the end, you'll have trouble. Make sure you save the change. And pop over to the command line and restart the devServer. And it looks like everything's okay now. We're getting our bundle down.
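The devServer block then grows a publicPath setting, roughly like this (webpack-dev-server 2-era API; the /dist/ value matches the course's HTML, which requests dist/app.bundle.js):

```javascript
// webpack.config.js -- devServer section
devServer: {
  contentBase: path.resolve(__dirname, "app"),
  // Serve the in-memory webpack output under /dist/ so the page's
  // existing <script src="dist/app.bundle.js"> keeps working.
  // The trailing slash matters.
  publicPath: "/dist/",
}
```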
devServer.watchContentBase - Reload on Static File Changes
Now for the best part of the devServer, refreshing the browser on changes. So I've split the screen here, and I will open up the scoring module. And under the newGame function, how about we change the initial score to 10? I haven't saved this yet. Watch over in the browser in the upper-left corner, I'm going to save this, and there you go. You can see that the browser refreshed. So this is that live reload-type functionality. It is not the hot updating, we're not there yet, but it is a pretty good place to be. If you're having problems with change detection, you might want to see if your IDE has some special save mode enabled. For example, WebStorm has this safe write mode that I had to disable. With this enabled, you could have problems both with watch mode and with the devServer. Now because we're in the middle of a migration, which you might encounter in your own work, we have some code, like our scoring.js module that's a part of our bundle, and if we change it, the browser's going to refresh. We have other code, though, board.html. This is being served as a static asset. So what do you think is going to happen if I make a change to this? Let's save this, and disappointingly, nothing happens. Can you take a guess why that is? Well, the scoring module's a part of our bundle, but this board.html file is not, at least not right now. Now, later on, if it was, then everything would be fine here. The browser would refresh. But because it's separate, it's being served from the contentBase so it's a static asset. And by default, the webpack dev server will not watch these files for changes. You can change that, though. Where do I do that at? So, of course, that's going to be inside of the config file. Under the devServer block, come in here, and it's simply watchContentBase, and you can set this to true. After changing this config file, what do I need to do? Well, I need to restart that webpack-dev-server so it picks up the new configuration. 
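The setting described above is a one-liner in the devServer block (again, the webpack-dev-server 2-era option name this course uses):

```javascript
// webpack.config.js -- devServer section
devServer: {
  contentBase: path.resolve(__dirname, "app"),
  publicPath: "/dist/",
  // Also watch the static files served from contentBase
  // (e.g. board.html) and trigger a full reload when they change.
  watchContentBase: true,
}
```

Because this watches an extra set of files, it costs some resources; it's mainly useful mid-migration while some assets still live outside the bundle.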
And you can see test up here in the top, so let's go ahead and remove that and see if we get the refresh we want. So come over here, delete this out, save that, and there you go. We've got the refresh over on the left-hand side. So that's an option you might want to set, again, while you're migrating your application. If everything's over in webpack land, though, in the webpack output, you don't need this set. And this is an additional set of files that you need to watch, so it's going to consume extra resources on your computer. It's something you might want to toggle on and off as needed.
Webpack Configuration Documentation
Now if you're curious where you can learn more about the options for the devServer, hop out to the docs at webpack.js.org, pull up the devServer configuration section. There's an extensive list of options you can set, as well as explanations. In the process of looking for the docs, when you're out googling for the docs, you might stumble upon the old 1.X documentation. Just be aware of that. There is a nice warning here calling this out. You'll probably want to avoid this, though I've found at times there's some old documentation that's better than the new documentation, and it's somewhat relevant. You just have to be careful because some things are out of date. If you see the dark-blue header, you're on the right site, at least until they change the styling.
Looking Under the Hood of the DevServer
Let's take a look under the hood to see what webpack-dev-server's doing to refresh the browser when we have changes. So I've got the dev tools opened up here, and I've got the source code opened up over on the right. And I'm going to change the value here and save this, and watch the console on the left. Looks like there were a few messages printed, but they're gone now. If you want to see those, come in and check to preserve the log. You might even need to check your filters, if you have any. And then go change the value again and save it, and now you'll be able to keep your console output. So when the page reloads, this is normally cleared out to avoid any confusion about where these messages came from. So in here, you can see a couple of messages about the content changed. And we're reloading, we're recompiling, and now we're navigating back to the same location to refresh. And over on the right-hand side, you can see that these messages are coming from our app bundle. Let's go over and take a look at the resources we have, so over on Network tab. You might also want to click the Preserve log, otherwise the requests will be cleared out every time you reload. And then down below, we have quite a few requests here, so let's pare this down a bit. I'm going to remove anything inside of the cards folder, bower_components, klondike, and images folders. Now we're down to a few core requests. And I'll turn on the large request rows view so I can see the path here. So we've got two requests, one for WebSocket and one for info that really aren't a part of our application. We also have localhost, which is our HTML page, and then we have our app.bundle. We're familiar with those last two, but these other two, those are not our code. If you look at the WebSocket, take a look at the Frames tab, you can see some of the messages that are coming through.
So we have a log level set to info, so some settings coming down from the server, we have the hash of the compilation, which, if you want, you can hop over to the console and compare, and then we have something about type is ok. And now make sure you've set Preserve log, come back to the code, and change things again. Save that. And now we're going to have a similar set of requests, and I'd like to sort these by start time. You can come over and right-click, go to Waterfall here, and make sure you have Start Time selected, and then go ahead and click on the Waterfall column. So here's our first request, localhost, it's also blue, and our second one down below. It's a way to kind of chunk together these different requests. You might also drop the size down. If you click on the last WebSocket, you can see similar messages with a new hash, and that matches the new hash we have at the command line. Go back to the old WebSocket. You'll see some additional messages show up. We have content-changed, two invalids, a new hash, and then the ok message. So somewhere, there's some code handling all of this. If you had to guess, where do you think this is at? The code that's refreshing is coming down inside of our bundle. If you come up into app.bundle.js, take a look at the response here, if you were to scroll through, maybe search for content and then changed, that's one of the messages, you can see right here we're logging out that the content changed, just what we saw over in the console a moment ago, so there's our content changed, and then over on the Network tab, you can see the reload. If for some reason you don't see this code, it might be that you're looking in the wrong location. Remember that devServer serves this file from in memory, not from the disk. So if you're looking at this file on disk in that dist folder, that's from a previous run of webpack, not from the devServer.
Now imagine, instead of reloading the site, we could have code in here that simply updates parts of the application in memory in the browser, and that's what's known as hot module replacement.
Hot Module Replacement Overview
devServer.hot and the HotModuleReplacementPlugin
Just a heads up, webpack 3.8 was released, so I bumped up to that from 3.7. So we need to make some changes to our config file, so let's kill off the devServer. So over in the config file, I'm going to add an array of plugins. This is one way we can extend webpack, in addition to loaders. In fact, if you look at the core of webpack, it's composed of a lot of plugins. Most of what we set in the config file maps to these internal plugins, and webpack comes with a bunch of plugins that we can use. So I'll bring in the webpack module, and I'll use the webpack module to new up an instance of the hot module replacement plugin. So this tells the compiler, hey, we want to know about updates. We don't just want an entirely new bundle. Next up, we change the devServer. There's a hot flag that we can set to true. And once we've set that, hop over to the command line, npm start, and over in the browser, you can see our application is running. I'll open up the dev tools so we can confirm what's going on. And you can see in the console this is your confirmation that HMR is working, or hot module replacement. You'll see a few messages. And all of it, code-wise, is coming out of our app.bundle. So now if I change my scoring module, when I save this, what do you think will happen? I've somewhat led you to believe that we should have module replacement, but when I save this, you can see the browser refreshes. Let me do this again. The console messages are the same that we saw a moment ago, so it seems like the console was cleared out. And network request-wise, it also looks like we have a full set of new requests made. Just to confirm, I'll put that filter in from before. Yep, make a change again, save that. Yeah, the whole page is reloading. Why is that?
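The two pieces just described look roughly like this in the config (webpack 3 API; both the plugin and the flag are needed, one on the compiler side and one on the server side):

```javascript
// webpack.config.js
const webpack = require("webpack");

module.exports = {
  // ...entry, output, etc.
  plugins: [
    // Compiler side: emit hot-update chunks, not just whole bundles.
    new webpack.HotModuleReplacementPlugin(),
  ],
  devServer: {
    // Server side: push those updates to the browser.
    hot: true,
  },
};
```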
What could I look at if I wanted to figure out why we're reloading? Well, one thing that would be helpful is to keep the log of messages and the log of network requests. Right now, they're being cleared out, so it's hard to see what's going on. So I might come back in here and preserve the log in both of these places. Now when I make a change, save that, now we can keep the requests around. Now what should I look at? Well, how about we sort these? So here's details about the first time we loaded the page. Here's the second load. And the first time, you can see we've got our content changed. Looks like we've got a type of hot now. The rest of this looks the same as before. And if you come into the WebSocket down here, nothing interesting here. So this doesn't seem so helpful. Come over to the console, and here we go. We've got some interesting messages. You can see messages that seem to indicate that we simply reloaded like before. And the reason for this, well, we don't have any code client-side that can handle swapping out the old scoring service for the new one. And when that's the case, the devServer is going to fall back to reloading. If you want, you can turn that feature off, if it's annoying you. For example, if you're missing the error messages in the console because the page reloaded, you can come set this to hotOnly, and the reload fallback will be disabled. It'll only be hot updates.
Automatic Restart After Changing webpack.config.js
So let's see what this hotOnly is like. I've saved this, and, of course, I have to come over to the command line and kill things off to then restart the devServer. That's getting a bit annoying, actually, and as you're learning to make changes to the webpack configuration, I actually recommend that you have something set up to watch the config file and reload the devServer if you change the config file so you don't have to do this all the time. And my preference is to come over, open up the package.json file, come down to start here, and add in nodemon. I'll watch the webpack config, and I'll then execute the webpack-dev-server. And then before my arguments to the devServer, I'll add an extra -- that says, hey, nodemon, these aren't your arguments. We'll go ahead and save that. I also need to install nodemon. You don't see it up above in the dev dependencies. So over at the command line, install that package. And now npm start here. You can see it's watching the webpack config file. Starting up our devServer with the open flag, so that looks okay. And the best part, split the screen, come over to the config file, if I set this to hot, look on the left. We restart, and we pop open the browser again. Now that might get annoying, in which case, you might want to take off the open flag and just manually open things up yourself.
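One way to write that start script (a sketch; exact nodemon flag handling can vary by version, so check nodemon's docs if the -- separator misbehaves):

```json
{
  "scripts": {
    "start": "nodemon --watch webpack.config.js --exec webpack-dev-server -- --open"
  }
}
```

Now any save to webpack.config.js restarts the devServer automatically, which is handy while you're experimenting with configuration.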
Disable Reload Fallback
So now if I set hotOnly, we can see what happens here. On the left, we restart the devServer. Oh, and the browser opened up. Why is that? Well, we changed the arguments to nodemon in our run script, but we didn't stop nodemon and start it back up, so make sure you do that. All right, over in the browser, I've turned off Preserve log, both the Console and the Network tab, so we're not seeing old messages. And I've reloaded here, so we have the new version of the app. We do have hot module replacement enabled. If we come over to our code, change that scoring module, what do you think's going to happen? Well, I told you that this would disable reloading, but as you can see here, this actually reloads. This one caught me off guard because of an additional setting we have set. Do you want to take a guess what's wrong here? So we turned on this watchContentBase for those static files, so it's picking up the change, and it's reloading in that case. And this makes sense. We can't perform replacement on static content that's not managed by webpack at all, so we do need to reload in that case, so a second flag to consider. So you'll want to set that to false. We'll reload the browser here, make sure everything's cleared out. Now we'll come make a change. Isn't it nice to have the devServer just restart when you change the config file? Save that. And there you go. Now we don't have a reload, finally. You can see in the output that we have a hot update. Hot module replacement plugin checks for changes on the server. We go ahead and pull down the hot update, but we can't find it, and so we're being warned that we need to reload the page. The good news is, we did not fall back to a reload. However, the hot module replacement itself did fail. If we look at this request, so this last request here, what do you think is going on? What's wrong here?
If you look at the URL here that we're requesting the hot update from, this has the same problem that we ran into just loading our application bundle. Remember it didn't load up? Why was that? Well, it's not being requested from the right location. So the client, for some reason, is requesting from the root of the website even though we changed to that dist folder for the publicPath, so the file is going to be available at /dist/, and then that long hot update file name. Just like setting the publicPath on the devServer, up above here in the output section for webpack, we also have to tell webpack where the publicPath is. And in this case, we'll use /dist/ as well. It's now redundant to have it down here below in the devServer block because the devServer will pull it from the output.publicPath as well. It doesn't hurt to leave it, though, to be explicit. Back in the browser you can see the app attempted to pull down another update. That didn't work because we need to reload to have the new update code that looks in the right location. Look at the console, no errors, at least on startup. But we need to come make a change and not see an error when we update. So 10 here, save that. Yay, no error now. What do you think these three yellow messages mean?
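The output section then gains its own publicPath, roughly like this (webpack 3 API; the devServer copy becomes redundant but harmless):

```javascript
// webpack.config.js
module.exports = {
  // ...
  output: {
    path: path.resolve(__dirname, "app/dist"),
    filename: "app.bundle.js",
    // The client-side HMR runtime uses this to build the URLs for
    // hot-update files, so it must match where the output is served.
    publicPath: "/dist/",
  },
  devServer: {
    // Redundant now -- webpack-dev-server falls back to
    // output.publicPath -- but explicit doesn't hurt.
    publicPath: "/dist/",
  },
};
```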
Identifying Modules by Name with the NamedModulesPlugin
So the three yellow messages are what we get to indicate that there was an update that wasn't accepted because we don't have code in our application to perform the replacement, so for that scoring component. And this is somewhat obtuse because by default, webpack uses numbers to identify modules. One thing you can do to make this a little bit easier to understand is to change the module identification strategy. And if you had to guess, how do you think we might do that? Well, if you guess configuration, you're absolutely right. Specifically, we have another plugin to add. This is another one provided by webpack. It's the NamedModulesPlugin. Now over in the browser, I'll reload here, clean the slate. Preserve log is not enabled. Come back and make a change to our code. Now we can see a little more about what's going on. So our scoring module, that's what we updated, bubbled all the way to the top of our application, our entry point. So neither of these did anything to accept the update, to perform the replacement. And that makes sense; we don't have any code for that right now. And we also have hotOnly on, so we didn't reload the page, and so now we do need to reload things ourselves. That's why we have this recommendation to reload. Of course, the reason to use this hotOnly mode is because now you can see what exactly happened before you reload the page. If you don't reload the page, things could end up wonky in your application, or maybe not.
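The plugins array then looks roughly like this (both plugins ship with webpack 3 itself, so there's nothing extra to install):

```javascript
// webpack.config.js -- plugins section
plugins: [
  new webpack.HotModuleReplacementPlugin(),
  // Log module names (e.g. ./klondike/scoring.js) instead of
  // numeric ids in HMR console messages, so the bubbling path
  // of an unaccepted update is readable.
  new webpack.NamedModulesPlugin(),
]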
Hot Swapping Methods in a Live Application
Anybody can provide the code to perform replacement; you can write it yourself, and to see that, I'm going to paste some code in here and make a slight change to our scoring module so that it can take care of updating itself. First up, I'll extract the scoring function. If you're confused at all, don't worry, you can follow along with the Git repo for the course. And I'll paste in a little bit of code that I precanned here. The gist of this: when we make a change to our service code up above here for scoring, any of the functions that we change will be updated on the scoring service. If you look at the code down below, you'll see that behavior. We're looping through and overriding just the functions. Obviously, we could do things with properties as well. Maybe we want to pull over some of the data for our application, but I'm not going to do that right now. Just an example. So I'll save this here. And you'll see an if module.hot. This checks if we're hot replacing so that this code can be stripped out, for example, in production or when we're not using hot replacement. If we are, then this line right here says that this module can update itself, so notifying webpack so it doesn't bubble anything up to a module above. And then the code to do that is right here. Also, I'm going to come up and dump out a log message so you can see when we evaluate the module, so when the scoring module is loaded we'll print out that information. Okay, over in the browser we'll refresh here. Any of the scoring output here is from that code I pasted in, for example, evaluating that module for the first time. And now if I come up to the top, make a new game, you can see each time I make a new game, we shuffle the cards, and the score is set to 1. So I notice that bug, I want to change things, I hop over to my code, scroll up to that section for the new game. Let's just say I set it to 2. Save that.
So over on the left-hand side, if you look in the output, it looks like we updated things, so we hot swapped the scoring module. And if that's truly the case, we should be able to click on New Game, and take a look at that. We've got 2. Let's do that again so you can look at everything. Take a look at the browser. It's not reloading. Save again. No reload there. We just have more console output down below. And when I click New Game, you can see we're using a value of 3. I can put it back to 0, no problem, change it again. Down in the console, we've got lots of output for each of the updates. So we're replacing the methods on our scoring service with new code, which is great because a lot of times, we're just tweaking some existing code to see the behavior of it, so no problem just to swap in a new function, assuming it has no side effects. And so now I can hit New Game, and we get 0, as we should.
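The heart of that pasted handler is just a loop that copies function properties from the freshly evaluated module onto the live instance. Here's a self-contained sketch of that idea in plain JavaScript, with no webpack runtime involved; makeScoring and the method names are hypothetical stand-ins for the course's scoring service, and in the real module this loop runs inside the if (module.hot) block when an update arrives:

```javascript
// Hypothetical scoring service factory (stands in for the course's module).
function makeScoring(initial) {
  return {
    score: initial,
    newGame() { this.score = 5; }, // the "buggy" version we want to replace
  };
}

// Copy every function property from the freshly loaded module onto the
// existing instance, leaving its data (state) untouched.
function hotSwapMethods(instance, freshModule) {
  for (const key of Object.keys(freshModule)) {
    if (typeof freshModule[key] === "function") {
      instance[key] = freshModule[key]; // swap functions only
    }
  }
}

const scoring = makeScoring(5);
scoring.score = 42; // mid-game state we want to keep across the update

// A "new version" of the module with the corrected method:
const updated = { newGame() { this.score = 0; } };
hotSwapMethods(scoring, updated);

console.log(scoring.score); // → 42, state preserved
scoring.newGame();
console.log(scoring.score); // → 0, new logic in effect
```

This only works cleanly for methods without captured state or side effects, which is exactly the caveat mentioned above.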
Hot Swapping Can Maintain State
So I'll admit, changing the initial score might not be that demonstrative of the value of this, in terms of the state of your application. So how about we change something in the middle of a game, something that doesn't require us to reset the game to see it working? So this logic here when you flip over a tableau card, right now we increment by 5. So, for example, if I take this 5 right here and drag it over to the 6, that tableau card in the middle pile is going to flip over and give me 5 points. You can see the score up above sets to 5. What if I notice that that's wrong, and I'm in the middle of a rather complicated game? For example, maybe I've got the ace up above giving me a score of 15 now. And I'm ready now to move this 4 over to this 5, but I want to make sure I update my code first so I can use that to test the new logic. So over in the code, let's say that the correct value is supposed to be 3. I've changed that. I'll save this. Notice the browser doesn't refresh. We get our hot swap log output down below, so it looks like things are okay now. And if I move the 4 over to the 5, the score should go up to 18. If it goes to 20, that's the wrong value. Ta-da. Isn't that awesome? Right in the middle of testing an application, we can update it without losing our state. That's really, really valuable. In fact, there are quite a few libraries developed around organizing the state of your application so it's easy to reload much of the application itself without needing to wipe out the state and perhaps reset it.
Inspecting Hot Updates
Do you remember where we can go to see what pages, or bundles, the devServer is serving? So keep this in mind. You can go to /webpack-dev-server and see your bundles. And that can be helpful in the case of hot updates. As you can see here, we have a list of the updates that were applied, and this might help you learn about what's going on. And if you click on one of these hot updates, you can see what exactly was updated. And in this case, we're just updating one of our modules, the scoring component, so that's really all we see inside of this update chunk. If I go back here and open up the next hot update, and if I toggle between this one and the previous one, and you can see the difference here, we've extracted the scoring function, and our hot replacement code is down below. Take note that instead of module.hot in our if expression, we instead have true. And that's because webpack at compile time evaluates that expression and replaces its value with either true or false. If it's true, obviously the code will operate when we run the site. If it's false, the code won't operate, and chances are, you'll have some sort of minification that will remove the code because it's not used. That would be, for example, in a production environment.
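As a rough sketch of the accept code behind those updates — the setupHot wrapper and the './scoring' module path here are just illustrative names, not the course's actual code:

```javascript
// sketch of typical hot-accept plumbing; setupHot and './scoring' are illustrative
function setupHot(mod, onUpdate) {
  // in the emitted bundle, webpack replaces the mod.hot test with a literal true or false
  if (mod.hot) {
    mod.hot.accept('./scoring', onUpdate); // re-run onUpdate when ./scoring is swapped
    return true;  // wired up (development build)
  }
  return false;   // dead branch in production, so minification can strip it
}
```

In a production build, the whole if body becomes unreachable, which is exactly why the minifier can drop it.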
So we've taken a journey here toward a more interactive approach to writing our code. We even hit my fourth tier, which is watching the config file and using nodemon to restart our devServer so that when we're learning and changing that config file, we can see things reload automatically. I consider that the top tier on the right. But even if you don't get that far, and you just have the watch mode enabled with the webpack compiler, that can be immensely helpful to not wait for an entire build to execute again the next time you make a change. Adding on that live reload-type experience with webpack-dev-server is wonderful, even if you're not using the hot replacement that we saw. Of course, if you can get some hot replacement in, I would encourage you to do that. We saw in this module how to write our own code. Later on, we'll see the style loader and possibly some other ways that you can use existing libraries so that you don't have to write that plumbing yourself. Next up, let's talk about customizing our configuration per environment.
Dev Isn't Prod
It's not that common to have an application that has the exact same code in development as it does in production. So when we're ready to go to production, we need some way to tweak the application to take some of that code out or put some other code in, and that's where configuring webpack per environment can become very beneficial. For example, we saw in the last module how we can inject in hot replacement code. That's something we want in dev but we don't want in prod. We'll see that, as well as other things throughout this module. Now I have a question for you. How do you think we will make our application configurable per environment?
HMR Plugin Bloats Production Builds
Conditionally Adding Plugins with NODE_ENV Environment Variable
So do you have any ideas for how we can make this hot plugin configurable to turn it on and off? Well, one common approach is to use an environment variable. And a common convention when working with Node.js is to use the NODE_ENV environment variable. It's accessible via the process global in Node.js: process.env, and then dot, whatever your environment variable is named, so process.env.NODE_ENV. It's pretty typical. You could always set up your own if you want. So how about we set up some sort of check, isDevelopment, and have this be true when NODE_ENV is equal to development. Otherwise, we'll assume it's production. And with that, we just need to conditionally plug in this plugin. There are many ways to do that. I like to extract out a variable here for the base configuration. That way we can modify this subsequently with logic. So you could imagine a basic configuration that applies in every scenario, and then we're going to layer on our environment-specific behavior or functionality. So what do I do here? Well, if we're in development, then let's take the base configuration and modify it. Let's go to our plugins and push in a new plugin. Actually, we can push in multiple if we want. So add in that HotModuleReplacementPlugin, and how about we grab this NamedModulesPlugin and mention that that's meant for development as well. So now we can clean things up above. Over at the command line then, now when we run our build process, what's going to happen? Will we or won't we include the hot replacement plugin? Let's run it and find out. So judging by the size of our bundle at 8.83 KB, we're closer to the latter case here, where we didn't have that plugin. It looks like it wasn't added in. Why is that? Well, because we didn't do anything to set that environment variable to specify that we are in development, and so the default is going to be production, that lean and mean build.
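A minimal sketch of that shape — the plugin entries here are string stand-ins; a real config would push new webpack.HotModuleReplacementPlugin() and new webpack.NamedModulesPlugin() instances:

```javascript
// webpack.config.js sketch; plugin entries are stand-ins for real webpack instances
const isDevelopment = process.env.NODE_ENV === 'development';

const baseConfig = {
  plugins: [] // configuration that applies in every environment
};

if (isDevelopment) {
  // layer on the development-only plugins
  baseConfig.plugins.push('HotModuleReplacementPlugin', 'NamedModulesPlugin');
}

module.exports = baseConfig;
```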
Now to set an environment variable, that's going to be operating system-specific, and more specifically, it's going to be shell-specific. For example, if you're using PowerShell, it's going to be different, as opposed to the Z shell that I'm using right here, and some of the other Linux shells, which can also differ. So you're going to have to look up what you need for your environment if you want to follow along. For now, I'll go ahead and just run things by setting this just for the scope of this single webpack command. So I can set NODE_ENV=, and then development. And if I do that, and then right after, I run my command, npm run build, you can see we now have our 34 KB bundle, hinting at the fact that we have that hot module replacement turned on. Now be careful. You'll need to set that value to exactly what we're checking for. For example, if I do dev here and not development, I'm back to the production build. So if this is how you're intending to use this environment variable, you might want to add something to log out whether we're using a development build or a production build. Now over in the output, that's clear. You can see right here, This is a production build, so I don't have to go off the size alone, which is going to change as we add and remove things.
cross-env and dotenv Help with Cross-platform Env Vars
I've got a question for you. If we run npm start here to launch the devServer, which build will we use? Let's run it and find out. You can see here, we're using the production build, and that's because we didn't specify the environment variable to change that. We can come over to our package.json file and do that, but what am I going to type in here? Well, I might be tempted to type in NODE_ENV=development just like I was doing a moment ago. Save that. And then back at the command line, run again, and you can see we're now running a development build, and that works fine, until, of course, you have an environment maybe like PowerShell. So if I hop over to PowerShell here, pull down the latest changes to the project, type in npm start, and we get an error back. PowerShell's trying to treat this NODE_ENV as an internal or external command, and that's not working. One way to fix that, the cross-env npm package can be used. If you scroll down, you'll see some examples, but basically you can just add cross-env in front of the command that you're going to run in your build script, and then you can use this terse syntax to specify your environment variables. Another approach you could take, you can use a tool called dotenv. There's an npm package here that you can use directly. This allows you to read environment variables right from a text file. Down below, you can see an example of this, so the same terse syntax, but in a file, which definitely will work across platforms. And along with this, there's a plugin for webpack. The gist of this, it's going to read those files, parse them, and then export the values into process.env, and conveniently, externally set environment variables can override the values from the file, so you have some flexibility.
Using a CLI Argument to Set the Environment
So yes, environment variables work, but when we start pasting them into these npm run scripts, they're really no different than just passing arguments. So, for example, up here on the build, we could have this set up to NODE_ENV=production, and so now it's like we're just toggling a command line argument that we're passing to webpack. If that's the case, why use an environment variable for something that we could simply use an argument for? For example, we could set an environment flag to production, and then down below, instead of setting NODE_ENV, we can come to the end of this, tell nodemon we're done passing arguments to it, and then say hey, env, and in this case, development. This is all possible in webpack 2 and beyond, which uses yargs for parsing arguments. You have this environment argument all to yourself, whatever you want to do to it. In this case, I'm setting the environment to a string, so over in my code then, I need to bring in this environment arg somehow. And the way to do that, instead of exporting a configuration object, webpack supports exporting a function. So if you use a function here, you can receive the environment argument, and now we can use that environment argument to determine which environment we're inside of. In this case, we'll set isDevelopment =, and then we'll check environment for development. And then I just need to bring this modification code inside of the function so I can conditionally execute it, come up to the top here, get rid of that check, get rid of this, and bring the logging down below. So now we'll check, when this function is called, modify our base configuration, and then return it back from this function. And that function is what we're exporting to webpack. So webpack is going to read that env argument from the command line, it's going to invoke our function, pass in the value, a string, in this case, we then build up our config object, and return it to webpack. 
So with that saved, and with our package.json modified, we should trigger production on a build, and development if we run start. So over to the command line, npm run build, there's our production build, npm start, you can see we have our development build. And now the best part, on Windows, if I pull the latest and do an npm run build, there's the production build, npm start. There you go. There's our development build. So this works nicely across platforms.
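Putting that together, the function-style config might look roughly like this sketch (the plugin entry is again a string stand-in for a real webpack plugin instance):

```javascript
// webpack.config.js sketch: webpack 2+ invokes the exported function with the --env value
function buildConfig(env) {
  const isDevelopment = env === 'development';
  const config = {
    plugins: []
  };
  if (isDevelopment) {
    config.plugins.push('HotModuleReplacementPlugin'); // stand-in for the real instance
  }
  console.log(isDevelopment ? 'This is a development build' : 'This is a production build');
  return config;
}

module.exports = buildConfig;
```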
Environment Option Uses yargs
While we simply passed a string like this first example to specify the environment, there are other choices for what you can pass that allow you to create multiple different flags that you can turn on and off. So these are the different styles. For example, you might have a flag that enables or disables minification. You could have a flag that produces that graph that we looked at with the modules in our application. Maybe you want to turn that on conditionally in development. The sky's the limit. And if you use the dotted style, so --env. followed by a flag name, you'll get an object for the environment instead, and it'll have properties on it for the various different flags that you want. So this is referred to as the environment option approach. I would like to point out, in the docs, there is one more page that talks about this and references this as environment variables. I've found that somewhat confusing. I like to think of this, --env, as an environment option to disambiguate it from what we just did a moment ago with environment variables, like system environment variables.
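For example, with flags like --env.minify --env.graph, the exported function receives an object instead of a string; the flag names here are made up for illustration:

```javascript
// with --env.minify --env.graph, webpack passes something like { minify: true, graph: true }
function readFlags(env = {}) {
  return {
    minify: !!env.minify, // hypothetical flag: turn minification on
    graph: !!env.graph    // hypothetical flag: emit the module graph
  };
}
```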
Multiple and Named Configurations
I want to touch on a few other options you have for how you can provide dynamic configuration, and one of those is the ability to export multiple configurations for multiple builds. So you can see here, I've modified to export an array, the square bracket there, and I've passed an object first, so like we were using before, a configuration object. And then I've also passed the function that we were working with. And then I close off the array down here. Inside of the function, I set the name of the base configuration object to base. And then on the first configuration object, I set the name here to other. So first off, you can export multiple configurations, and second off, you can name those. And then from the command line, if you run webpack directly, both will execute as if you had two separate builds that kicked off. And if you look at the output folders, so if I dump out the contents, you can see we've got app/other and app/dist. You can also then pass --config-name and specify that you want to build the base versus other, which is the name I used for the other one. So this is another way that you could flexibly configure a development versus a production build, or, actually, any aspect you want. Maybe you want a special debugging build when you're in development that has a lot of source maps inside of it. So this is another approach you could take using a configuration name and then select the config name you want at the command line. This ability to name configurations is a recent addition in 3.4, and here's the commit, if you're curious about the changes that were made. It's kind of fun to read through this sometimes. It's not that much to reason about, and it even has some test cases down below. Also, if you take a look at the docs for webpack under Configuration, come down to Configuration Types. You'll see a little bit more about what we've been working with here in exporting a function. We can also export a Promise.
And then here's the multiple configurations that we just stepped through. So this is the other key takeaway from this clip. You can have multiple configurations exported from a single config file. I assume at some point we will see some documentation for naming your configurations on this page, or somewhere on the site, but I haven't seen that yet, so if you go looking for it, you might not find it.
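The array export we just stepped through might look roughly like this (the output paths are abbreviated; webpack actually wants absolute paths there):

```javascript
// sketch: exporting an array of named configurations
const configs = [
  {
    name: 'other',                  // selectable with --config-name other
    output: { path: 'app/other' }   // abbreviated; use an absolute path in practice
  },
  function (env) {
    return {
      name: 'base',                 // selectable with --config-name base
      output: { path: 'app/dist' }
    };
  }
];

module.exports = configs;
```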
Modularizing Config Files
Another route to go is to extract out your configuration into multiple files or Node.js modules. So that baseConfig that we have, that could be brought out maybe into a base configuration file. And then we could create a devConfig and a prodConfig file completely separate to modify and extend that base configuration and then export the configuration just for that particular environment. And if you come out to create-react-app, you'll see something like that here. You can see we have a config.dev and a config.prod, and if I click into one of these, scroll down a bit here, this is rather verbose, and actually has some good documentation, if you're curious, you'll find that this example has some common code for pathing and reading the environment. This is used in the devConfig file. And then way toward the bottom, well, actually right here, we're exporting the configuration object then for development. And it's got all of this configuration in here. If I go back and take a look at prod, you'll see the two modules, again, that are shared between the dev and prod configuration, and maybe some of these others as well. So if you like having separate files, that's another viable option. One word of caution, I wouldn't prematurely split up the config file. I would keep basic configurations inside of a single file, and only at the point where it makes sense that things are getting unruly would I move the configuration into separate files. It's just easier to reason about a simple configuration in a single file, and that's why I'll keep a single file, probably for all of this course. If I do change, it'll be because of some of the advanced configurations we're doing later on. And at that point in time, you'll see why I'm doing it.
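As a sketch of that layout — the filenames here are assumptions, and create-react-app uses its own names like config/webpack.config.dev.js:

```javascript
// webpack.base.js (assumed filename): shared configuration
module.exports = {
  entry: './src/main.js', // assumed entry point
  plugins: []
};

// webpack.dev.js (assumed filename): extend the base for development
const base = require('./webpack.base');
base.plugins.push('HotModuleReplacementPlugin'); // stand-in for the real instance
module.exports = base;
```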
webpack-merge to Merge Configuration Objects
Another tool you might like to try is called webpack-merge. You can bring this in as an npm package, just webpack-merge. Make sure you npm install that. And then down below, instead of modifying the baseConfig, comment this out, we can create a new object by calling merge and passing to it, first off, the baseConfig, and then second off, a development configuration object. And inside of here, we can use the typical structure, just like with our baseConfig up above. I can specify those two plugins inside of here, which will then be merged into the base configuration. Let me show you what that looks like. So we have our base configuration, and we're creating this second object with our development-specific configuration. As you can see, it's convenient to be able to use the same webpack configuration structure. This is a nice alternative to imperative code that modifies our base configuration instead. And then when we call merge with these two objects, webpack-merge will produce a new object with the merged result. Over at the command line, run webpack without any arguments. That'll be a production build. You can see it's still small. If I set the environment to development, there you go, we've got our 34 KB development build. Now I've got a question for you. What happens if I take this NamedModulesPlugin, come up above here, and paste it into the base configuration? What do you think will happen now when I run the development build, and, for that matter, the production build? Let's find out. So let's run production first, and you can see we have a smidge bigger bundle. That's because we've invoked that new plugin, the NamedModulesPlugin. If I run development now, you can see we have both our plugins because we have our bundle size of 34.4, just like we had before. So this is a big benefit of this merge tool as well. It's intelligent about arrays, like plugins. It'll concatenate the two together. It doesn't just replace the second one over the top of the first one.
And there are a bunch of features that you can configure, as far as this merge tool's concerned, with regards to webpack. This was built specifically for webpack, so come through here if you like this strategy, and look at some of the different options that you might want to use to make it possible to just merge together configuration objects, if that's the style you like.
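As a toy model of the array behavior described above — the real webpack-merge handles nested objects, loaders, and configurable strategies, so this is only an illustration:

```javascript
// toy merge: later keys win, except arrays (like plugins), which are concatenated
function merge(base, overrides) {
  const result = Object.assign({}, base);
  for (const key of Object.keys(overrides)) {
    if (Array.isArray(base[key]) && Array.isArray(overrides[key])) {
      result[key] = base[key].concat(overrides[key]);
    } else {
      result[key] = overrides[key];
    }
  }
  return result;
}
```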
Inspecting the Merged Configuration and Config Defaults
Now if you're not certain what this merge function will produce, I'll quickly show you here how you can create a plugin of your own to print out the configuration, the final configuration, once everything's merged together. So I'll just paste this in. And all I've done is added a new object here that has an apply function on it, to which the compiler will be passed. So when webpack starts up, it'll call apply on this plugin, passing the compiler, after which I can use the compiler to register another plugin, my own, for when the compilation is done. And when the compilation is done, I'm going to print out the compiler options object. Before I run this plugin, I do want to point out that there's a new syntax for working with plugins that's coming in a future version of webpack. It sounds like webpack 4 will start with this new syntax. It also looks like there will be backwards compatibility for what I just typed out a moment ago. I can't use the new syntax right now, but I do want you to know that the style might change for how we create a plugin. It's probably going to look something more like what we have here. We're going to do something like compiler.hooks, and then we want one on done, so it'll be .done, and then we'll tap in and add in our custom function to print out the options at that point in time. Behaviorally, you'll still be able to do the exact same thing with the plugin. It'll just be a little bit different syntax to set it up. Okay, now hop over and run a development build, and up above, you'll see quite a bit is printed out here. So we have the final result of merging objects, for example, both our NamedModulesPlugin and the hot replacement plugin, and actually, down here is our custom plugin. So we have the net effect of merging, as well as all the defaults that webpack is applying behind the scenes, which is actually quite helpful if you want to learn what some of these defaults are, short of reading the code itself. 
For example, part of the default configuration points at the node_modules folder, as far as resolving modules, so that's how we can bring down an npm package and use it in our project. Of course, you can learn the same thing from the docs. For example, here are the docs for resolve.modules. And down below, you can see that this defaults to node_modules, which is what we just saw a moment ago. While the docs are great, sometimes I like to just look at objects running in a program to understand what's going on.
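The inline plugin from this clip amounts to something like this sketch, in the webpack 3 style (the webpack 4 hooks style is noted in a comment):

```javascript
// sketch of an inline plugin that prints the final, merged configuration
const printConfigPlugin = {
  apply(compiler) {
    // webpack <= 3 API; webpack 4 moves toward compiler.hooks.done.tap('print', ...)
    compiler.plugin('done', () => {
      console.log(compiler.options);
    });
  }
};
```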
Conditional Code with the DefinePlugin
All right, let's clean things up a little bit here. I'll get rid of some of this commented out code, and I'll bring that NamedModulesPlugin down into the development build only. I also got rid of that custom plugin so we're not polluting the output. So thus far, we've seen how we can control the build from the config file. It's also possible that you want to control the build from the code itself. We saw this a bit ago with this module.hot check. This is an example of something that webpack provides to give you flexibility in your code. It'll inject true or false here depending on the environment that you're running inside of and whether or not you need this code. It's possible that you might want to inject your own customizations from the code itself. For example, in a development environment, it can be nice to print out extra debug information, like we have here logging out that the scoring module is evaluating. But in production, we may not want to see these messages. So it might be nice to wrap this and only execute it if the environment is development. Of course, if I save this right now, and assuming I've started up my devServer, I can then come over to the browser, and you can see that we have an error. And, of course, that's because this variable is not defined. So somehow, we need to define some sort of global constant and then turn that on or off from our config file so we can enable or disable this code. And we can do exactly that with what is known as the DefinePlugin. This, again, is available from webpack; create a new instance of it, and then you'll pass your definitions to it. You could have multiple constants that you define. In this case, we'll do ENV_IS_, and DEVELOPMENT. And then we need to give a value for this, and we'd like this to be a Boolean. So we could set it to isDevelopment. That means we're going to have to move it down below inside of the function where we have that available. It's okay. 
I can pick this up and move this down, paste this in here. That way, we have this in both production and development. Now I want to save this. With nodemon running, my devServer should've restarted with the new config, which means over in the browser, if I refresh now, we get our log message output that we're evaluating. And if I pull open the code here, you can see that we have the value of true substituted in our bundle, just like with the code below for module.hot.
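In the config, the registration looks roughly like this fragment (assuming webpack is required at the top of the file and isDevelopment is in scope inside the exported function):

```javascript
// config fragment: define a compile-time constant the application code can test
plugins: [
  new webpack.DefinePlugin({
    ENV_IS_DEVELOPMENT: isDevelopment // injected as the literal true or false
  })
]
```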
Careful to Quote String Constants
Another thing we might want: in the browser, we might like to print out whether this is a development or a production build. So I can come down to the DefinePlugin here. Maybe I want to set up ENV_IS, and if isDevelopment, then we'll set that to development, otherwise, production. Now we could put this anywhere, probably the app module would make the most sense, but how about we just come here and put in a console log, and in this case, I'll just dump out ENV_IS. Save that. So we're just going to print out which environment it is. Over in the browser, you can see we already have a problem. Even if I refresh, we still have the problem. Can you take a guess why? So we have this error here, development is not defined. And if we take a look at the bundle, so there's the console log statement. What's wrong with that? Well, we don't have quotes around it. So an important aspect of this DefinePlugin, it literally injects whatever you tell it to inject into the code. It's a simple text replacement. So in this case, we need to come into the DefinePlugin, and if we need quotes around this, then we have to put the quotes around it. So I could come and put quotes around each of these strings, but another common thing to do is to use JSON.stringify, which will wrap this in a string for us. That's why you'll see this in a lot of examples. This is somewhat obtuse, versus just putting the extra quotes in, but this is so common in webpack configs that we might as well use it. So if I come back over, I'll refresh the bundle code first. You can see we have our quotes around development now. And if I pop over to the app and reload, our app now works, and we have development printed out.
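The plugin's raw replacement can be modeled with a toy function, which makes it clear why a string value needs its own quotes via JSON.stringify:

```javascript
// toy model of DefinePlugin: plain text replacement, no quoting added for you
function substitute(source, definitions) {
  let result = source;
  for (const name of Object.keys(definitions)) {
    result = result.split(name).join(String(definitions[name]));
  }
  return result;
}
```

Substituting the bare value yields console.log(development); — an undefined identifier, hence the ReferenceError — while substituting JSON.stringify('development') yields console.log("development");.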
Passing Env Options or Variables Through to Code Constants
Now one simplification that's pretty common, instead of a ternary here to determine the environment and then map that back to development or production or whatever, you might want to pass along the flag that you had passed in, so the env flag, and so in that case, you can just dump env right here and pass that through to this constant. Over in the browser, you can see we have development in the console output now. So we can just pass that string from our command line argument right through to our code. And if you're using environment variables, you might decide to pass the environment variable for NODE_ENV, if that's the one you're using. And if that's what you're doing, usually it's pretty typical to not use a custom constant, but to literally use the same thing in your code, so then we'd have this over in our scoring module. And now this can be somewhat weird, because the browser code here, the scoring module, isn't running in a Node.js environment. And so it can be confusing to be using a Node.js API, but this is common. If you come across this, just be aware that there's probably some sort of code replacement happening, and it's typically from something like the DefinePlugin. Or it might be coming from the EnvironmentPlugin, which is just shorthand, a simpler way to specify these constants. So in this case, you use the EnvironmentPlugin, specify the environment variable that you want to map through to constants that you replace in your code, instead of using the more verbose DefinePlugin you can see below. They do the same thing. For now, I'm going to roll back those changes and just go back to the environment option, not the environment variables. I just wanted you to know that that's possible if you like environment variables.
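Side by side, the shorthand and its verbose equivalent look roughly like this (both fragments assume webpack is in scope):

```javascript
// config fragment: these two are roughly equivalent
new webpack.EnvironmentPlugin(['NODE_ENV'])

new webpack.DefinePlugin({
  'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV)
})
```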
Touch-ups and Recap
In the spirit of differentiating builds, how about we move one last thing before we finish this module? You don't have to move the devServer. It's not going to hurt to leave it up above, but we can now bring it down below pretty easily into our config object specific to development and make it really clear that we're not using that outside of development. From the command line, we can start up again, make sure our app loads okay in the browser, yep. And there's development in the console. Kill that off. We could do a build here. In this case, I can run http-server using that app folder, so this is a production build. App loads up, and you can see in the console we've got production here in the output. Throughout this course, we've laid a nice foundation of flexibility. And we have a few differences now between a development build and a production build. We'll use this foundation as we go throughout the rest of the course to change some other aspects between different builds. For example, in production, we'll want some optimizations like minification, maybe even dead code elimination. In development, we might want source maps. We may also want those in production, but we probably want different source maps that make sense for production, that don't bog down our end users. You can also use this flexibility to produce multiple builds of your application. Say you want to build the game of Solitaire to run in Electron, instead of just in the browser. So that's a multitargeting example, and, of course, you're going to have some different configuration in both of those cases. Now, let's move on and take a look at transpiling our code so we can use the future today.
Transpiling: Using the Future Now
Installing Relevant Babel Packages
Using ES6 Class Syntax
To have something realistic to compile, I've converted the scoring module away from a constructor function pattern on the left, over to using an ES6 class on the right. This update's available inside of the GitHub repository for this course. Go ahead and pull down the new scoring.js file. Now if you hop out to the compat-table website that I was talking about and look for class in here, you'll notice that for the most part, browsers have support. You will notice, though, for example, with IE 11, that we don't have support. So this is kind of a fringe case, but it's possible you need to support older versions of IE. And while we have support for a lot of modern browsers listed here, you might need to be supporting versions before these for Firefox and Chrome and even Edge. So we want to write what we have on the right here, but we want it to be transformed into what we have on the left, if even one of the browsers that we need to support doesn't support classes. So let's assume that. Now I've got a question for you. Where could I go to look at what code is actually executing in the browser for my scoring module? Well, how about we pull up the app.bundle? And to do that, how about we pull up the webpack-dev-server page and click this app.bundle link? That way, you know where to go to find all of your bundled files. And then in here, I'll look for klondike and scoring. And there you go, we've got our module. So you can see right now the class is flowing through to the code that's executing in the browser. And right now, I have a modern version of Chrome, version 62, so this is not a problem. Nonetheless, let's get this converted over, and we'll know we're successful if the code that's here in our bundle is not using the ES6 class syntax.
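The conversion in question looks roughly like this — a simplified stand-in for the scoring module, not the course's actual code:

```javascript
// before: constructor function pattern (runs in ES5 browsers as-is)
function ScoringFn() {
  this.score = 0;
}
ScoringFn.prototype.add = function (points) {
  this.score += points;
};

// after: ES6 class syntax (needs transpiling for IE 11 and other older browsers)
class ScoringClass {
  constructor() {
    this.score = 0;
  }
  add(points) {
    this.score += points;
  }
}
```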
Adding a babel-loader Rule
webpack-dev-server Adds Modules to the Bundle
Now you might be wondering what in the world am I talking about? What are these Node modules that we're including in our bundle? After all, this is the structure of our application, right? Well, we're using the webpack-dev-server, as I mentioned, and it can inject its own code, not only into that runtime where webpack has its bootstrapping application logic, but also, it can inject its own modules, and it's doing just that. And I've gone ahead and rendered a high-level overview of our application when we're using the devServer versus when we're not. So production's on the left; the devServer's on the right. Clearly, you can see there are quite a few more modules added. And the ones that are dark gray, those are npm packages that have been included. So it's all these extra packages that are polluting our diff and making it hard to understand what's going on when we add in the babel-loader. It's also slowing down the webpack compiler in general, and we'll see that in a minute. And if you're curious about generating this graph yourself, well, then inside of the repository, you'll find a new StatsGraphPlugin. This does two things. Once a compilation is done, it takes the stats object that it gets back and writes it out to a stats.json file. After it's done that, then it runs WebpackStatsGraph, assuming that that is installed globally. So this is somewhat of a hack. I just wanted to throw this together. This is a nice way, when you're using the devServer, to automatically have the stats graph regenerated for you based on the stats.json file. So this plugin takes care of everything. All you have to do is add it to your list of plugins. So it's these extra modules on the right that we don't want to process. That's what we're going to exclude here in a moment.
Excluding node_modules from babel-loader
So to get rid of these unnecessary transformations, we could come back to our rule. We could do something like, maybe put scoring on the front of here, so it'll only apply to our scoring file. But maybe we do want to apply this to all of our application files, in which case, we can use an exclusion instead with exclude. And again, we can pass a regex here, and we can type in a couple of things, node_modules, and I also have bower_components. Instead of excluding, we could include, so whitelist instead of blacklist; that's another choice, if you just use the include property. Behind the scenes, if you look at the webpack source code, in the RuleSet.js file, you'll actually find out that test and include do the exact same thing. Now when I save this and come back to the browser, refresh here, we've got our new bundle. Copy this. I'll come paste this on the right-hand side so I have the original on the left before we applied the babel-loader. And this looks much better over in the mini-map. Obviously, the hash of the compilation is going to be different, and if I step through here, now we can see that just some of our application code has been reformatted slightly, so some white space has been removed. Not a big deal. And most importantly, come down to our scoring module, you can see we still have our scoring class, so that's not been transformed. Before we talk about transformation, though, hop over to the command line, and take a look at some of the timing from your webpack builds. Right now, this last one was 538 ms, which is not bad. I think we were at about 300 ms before we put the babel-loader in. But if I scroll up here to the build before, where everything ran through Babel, we were already over 1 second for a build time. So another big reason why you want to make sure you set up proper exclusions is that Babel is slow. And, of course, we have incremental builds in here as well.
These are long because we had to run everything through the babel-loader for the first time. Nonetheless, having a first time build that's 1 second for no reason, not a good idea. So now I've got a question for you. Why is our scoring class not transformed?
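Before moving on, here's what the rule with its exclusion might look like in the webpack config. This is a sketch; the exact regex and rule shape are assumptions based on the clip:

```javascript
// webpack.config.js (fragment) — run .js files through babel-loader, but
// skip anything resolved out of package directories
module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules|bower_components/,
        use: 'babel-loader'
      }
    ]
  }
};
```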
Adding @babel/preset-env to Transform Classes
So our class hasn't even been transformed down to something more vanilla, like a constructor function, simply because we haven't configured Babel. Babel, as of version 6, is driven entirely off of plugins, and if you set up no plugins, there are going to be no transformations to your code. And to configure a loader, you simply pass options. It's important to note that in version 1 of webpack, a query string used to be used to pass options. Now the preferred approach is to use an options object. Now if you'd like, you can put these options into a .babelrc file. The babel-loader will pick that up as well. But I'll leave them in-line. So first up, we can either specify plugins or presets. I'll do presets, which is an array, and I'll specify the env preset. Now when I save that, I'll go back to the browser, reload again, grab this all so I can compare it, and drop this in to produce a diff. All right, so we have a few more changes than before. I'll search for our scoring class. And take a look at that. We now have a transformed class on the right-hand side. We're no longer using the class syntax. We're using a constructor function. But we have some extra transformations up above with the addition of the env preset. So the plugins in our env preset are doing more than just compiling our class down to a constructor function.
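In config form, the rule with in-line options might look like this. Again a sketch; the surrounding rule shape is an assumption:

```javascript
// webpack.config.js (fragment) — the same rule, now passing options in-line;
// with no presets or plugins configured, Babel would leave code untouched
const babelRule = {
  test: /\.js$/,
  exclude: /node_modules|bower_components/,
  use: {
    loader: 'babel-loader',
    options: {
      presets: ['@babel/preset-env']
    }
  }
};
```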
Do Not Transform Harmony Imports and Exports
So we have some additional differences here, beyond just transforming our class. And some of those are just white space differences. But one of the notable changes is this difference in how we're importing our modules into our app.js module. If you look closely at the app.js module, those four imports there, those are what we have a difference in here. On the left is the old format. You can see a harmony import, and that's what we used. We used a harmony import in our app.js file. And on the right, you see this different syntax with __webpack_require__. This is actually CommonJS syntax. If you notice, take off the __webpack_ prefix, and you'll have the word require. So this is like a Node.js module now. And we can understand the reason for that. If I come back to the config file, make an array around the env preset, so it's an array within an array, we can pass some options into this preset, one of which is debug, and we can set that to true. Save that, and take a look at the command line output, and go down to the latest output. And if you scroll up here, you'll see which plugins have been enabled by the preset, and that's based on its defaults. And then right above plugins, you'll see this interesting statement, Using modules transform: commonjs. So it's our env preset that is transforming the type of import that we have. Come into your configuration file, and in addition to setting debug on our env preset, add in another setting, and set modules to false. And the reason we're setting this, if you take a look at the options for the env preset, you'll see that the default value is commonjs. We also have choices for other module formats, or we can use false to just turn off module transformation. So imports and exports won't be transformed if we specify false. So if we come back to the browser here, watch this section right here with the four requires. Let's refresh. You can see, we're back to our harmony imports.
This is a topic we will revisit when we talk about optimizing our bundle to eliminate dead code. FYI, if you look at the issues for the babel-loader, it's possible that we will see a change in the future where modules will be set to false by default, or we might actually get a warning message if we don't set this to false. And that probably seems logical, considering that babel-loader is meant for just webpack, but webpack isn't just meant for web development like we're doing in this course. Webpack can also be used for developing Node.js apps or Electron apps, and so setting a default here may not make sense.
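The preset options described above might be written like this (a sketch of the relevant fragment):

```javascript
// Babel options (fragment) — wrapping the preset in an array lets us pass
// options to it
const babelOptions = {
  presets: [
    ['@babel/preset-env', {
      debug: true,    // print chosen targets and plugins at build time
      modules: false  // leave import/export alone so webpack can handle them
    }]
  ]
};
```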
devServer.overlay - Showing Compilation Errors in the Browser
Now while you're working with newer language features, you might make some sort of typo in your code, and if you're not paying attention when you save that and go back over to the browser, you might think things are reloaded, especially if you don't have the DevTools opened. But inside of the DevTools, you can see we've got a problem. So we had a compilation error. It's even pointing out where the problem's at, but if we didn't notice this, we could run into trouble and become very frustrated with trying to understand why the new code we typed out isn't working, especially with hot replacement where the update just won't be applied. You might also have just refreshed the page and your application doesn't work, and if you don't think to open the DevTools, you could be in trouble here too. And maybe at this point, you think to look at the console, and you can find the problem, but sometimes it's nice not to have to go to the console at all. And instead, if you pop over to your settings file for webpack, go down to where the devServer is at, add in a comma here, and then put overlay: true. Let me save this. Look at that on the left-hand side. We now have an overlay that tells us we've got a problem, which is great, especially if we don't have the console open. You can see right where the problem is at. And then once we fix things, save that, we can go back to working with our application. In this case, I did load the page broken to begin with, so I'll need to refresh it, just to clear things out. But if we had a working application and a few hot updates had simply failed, once one works, we can continue on without a refresh. And if I may, take a look at the documentation. There are some other options to control the output that you're getting from the devServer, both in the browser and at the command line. For example, the clientLogLevel can turn up or down the verbosity of error messages that show up in the DevTools console in your browser.
There's also an info setting to control the information that's printed out to the console. A little further down, you'll have noInfo as well. In this case, the bundle output information won't be shown, but you'll still have errors and warnings. There's a quiet option to really turn things down. And down below is stats, which allows you to control what is output with regards to the bundle information and what files you'll see here. So if you want to narrow this down, or if you want to make it more verbose, for example, to show these 30 hidden modules, you'll want to come in and configure this stats setting. And regarding the overlay, you can set it up to show both warnings and errors. If you set overlay to true, it'll just be errors. You can also turn warnings on with this syntax below, and actually turn errors off, if you want.
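Pulling these devServer output options together, a sketch might look like this; the option names follow the webpack-dev-server documentation referenced above:

```javascript
// webpack.config.js (fragment) — devServer output options
module.exports = {
  devServer: {
    // overlay: true shows errors only; the object form can surface warnings too
    overlay: {
      errors: true,
      warnings: true
    },
    clientLogLevel: 'warning' // verbosity of devServer messages in the browser console
  }
};
```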
Understanding Browserslist Queries
Now one thing you might be noticing, we're building up quite a bit of configuration here just for the babel-loader, and specifically, quite a bit for Babel itself. This, to me, is a good opportunity to pull some of this configuration out into a separate file. In fact, there can be an added benefit of extracting the Babel configuration. You can then use Babel from the CLI, in addition to using Babel integrated with webpack. And that can be great for debugging. So let's see what all of this looks like. So first up, I'm going to install the Babel CLI, which is @babel/cli, a scoped package now. What could I type in now to execute the Babel CLI? Well, I can use npx and just pass along babel, which is provided by that Babel CLI package, and I could ask for some help. In the output, you can see the arguments we can pass; notably, we just need to pass along a file that we want transpiled. So I could do npx babel here, and then app/klondike, and then scoring.js as one of our files. And in the output, you can see we have our ES6 class not transpiled. Do you know why that is? Well, remember with Babel, if there are no plugins or presets set up, then there's no transformation to your code. And in this case, now that we're using the CLI and not webpack, we don't have our configuration from our webpack config file. So we can come into our webpack config, and we can lift out the options that we're passing to the babel-loader. Yank that out, make a new file, and I'll call this .babelrc.js. This is a new feature of Babel v7 to have a .js file for your Babel config. Why might we want that over just a .babelrc file? Well, just like with our webpack config, code allows us to flexibly create our configuration. Come over to the command line, and when I run this again, what do you think we'll see in the output?
Is that what you expected? So if I scroll up here, you can see we're now using our configuration because the env preset has kicked in. We can see all the debug output that we're used to, and then down below, we can see our scoring class. And, of course, I wanted that to transpile. Why is the class back to not transpiling, even though we have our Babel configuration set up? Well, a moment ago, we changed the browsers that we want to support. We added in not IE less than 12, and so we no longer need to transform classes. Let me remove that, effectively adding back IE. So if I save this, come back to the command line, you can see in the previous output, IE is not needed up here in our targets, nor do we have the transform classes plugin listed. But if I clear this out now and run this again, scroll up here, quite a few more plugins used, notably transform-classes because we now support IE 11. And if I scroll down here, you can see we've compiled our ES6 class down into functions. A word of caution, because I'm using a relative query based on usage of greater than 1%, at some point, hopefully soon, IE will no longer have more than 1% usage, at which point in time, then IE won't be a browser that we're targeting. So you'll need to explicitly add it to the list if you want to try this with IE or find another browser that doesn't support promises.
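The extracted .babelrc.js might look something like this. It's a sketch: the browsers query is an assumption reconstructed from the clip (usage over 1%, with IE below 12 excluded), not the course's exact file:

```javascript
// .babelrc.js — Babel v7 allows the config to be a JS module; this is the
// configuration lifted out of webpack.config.js
module.exports = {
  presets: [
    ['@babel/preset-env', {
      debug: true,
      modules: false,
      targets: {
        // removing the 'not ie < 12' entry puts IE back in the target list
        browsers: ['> 1%', 'not ie < 12']
      }
    }]
  ]
};
```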
babel-loader Works with .babelrc.js Too
What do you think we need to do here to fix up our webpack config now that we've extracted out the Babel configuration? Well, it turns out we really don't need to do much at all. We just need to get rid of the options object that we no longer have, and the babel-loader, like the Babel CLI, will just look for that babelrc file, or rc.js, in our case. So if we want to see that in action, over at the command line here, I can do an npm start, and you can see we still have our transformation because we've got our output here from our env preset.
Disabling babel-loader in Development Builds
How to Tell babel-loader to Ignore .babelrc
As a word of caution, if, for some reason, you don't want the babelrc file to be used for your webpack build, for example, maybe you have that config file for some other purpose, then you can come into the options object for the babel-loader and set babelrc to false. Before I save this though, if I run my production build, you can see we're using the preset-env above. When I save this now and run my build again, there you go. We're no longer picking up our Babel configuration. There's nothing in the output about preset-env. And so now you can set up your own configuration of the babel-loader, independent of your babelrc file.
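The opt-out described here might look like this in the loader's options (a sketch of the relevant fragment):

```javascript
// webpack.config.js (fragment) — opt this build out of .babelrc(.js) and
// configure babel-loader independently
const babelRule = {
  test: /\.js$/,
  exclude: /node_modules/,
  use: {
    loader: 'babel-loader',
    options: {
      babelrc: false // ignore .babelrc/.babelrc.js; only options here apply
    }
  }
};
```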
Webpack Runtime Uses Promises That Might Need to Be Polyfilled
In the spirit of solving a real problem, if you take a look at the bundle, the development bundle, with the hot module replacement turned on, if you look in here, you're going to find usages of the Promise type. So webpack itself and its runtime expect there to be a built-in Promise type available in the environment that it's executing inside of, if you want to use this hot module replacement. And, of course, that's only going to be in a development environment. Nonetheless, it serves as a great example of something that we might need to polyfill. And promises are something you'll probably use in your own code, and maybe you need to support older browsers, so this is a great example to work with. And, of course, right now in Chrome, the application works because the latest version of Chrome has support for promises. In fact, every major browser has support for promises, of course, short of IE. So let's hop over to IE 11 and try out our devServer so we can break things and then see how we fix things by adding in a polyfill for promises.
devServer.host - Configuring External Access to WDS
All right, I'm over on my Windows machine now connecting back to my Mac, it's time to give the Windows machine some love, but I can't connect to the devServer back on my Mac. I've got the IP right. I've got the port right. I verified those. What do you think's wrong here? Well, it's not that my Windows machine just hates talking to my Mac machine because my Mac machine gets all the love. It's a simple configuration problem. The devServer that I have running is listening by default on localhost only. And that's a good idea from a security perspective. If I want to change this, though, what do you think I need to do? So we just need to hop over to our config file and add in host, and then we'll do 0.0.0.0, and everybody in the world can connect to us now. Quick quiz, if I didn't know what this setting was, where could I go and find it? So keep in mind, you have this nice configuration documentation out on webpack.js.org/configuration, specifically a section for the devServer. And inside of here, you can see the host property that we can set. All right, let's see if this works now. Okay, looks like something is working, but something's off, so let's take a look at the console and see. If I look here, I've got a Syntax error. If I click on that, what do you think's wrong here?
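Backing up to the host fix, the config change described above might look like this (a sketch):

```javascript
// webpack.config.js (fragment) — by default the devServer binds to localhost
// only; 0.0.0.0 listens on every interface so other machines can connect
module.exports = {
  devServer: {
    host: '0.0.0.0'
  }
};
```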
Enable Transpilation in Development as Needed
So the problem is that we're allowing the class syntax to flow through. And, of course, IE 11 doesn't support classes, either. Why aren't we transpiling this class, though? Well, we're set up with a development build right now, and we specifically disabled transpilation in our development environment because we don't need it because, theoretically, we're using a modern browser. So for now, let's change that. I'll hop up to the development configuration and just add that babel-loader rule. So now it's both in development and production. And the only reason I'm adding this to development is because that's where I have the hot module replacement right now. I don't have that in my production build. And if I refresh now, there you go. You can see our application is now operational.
Fix Polyfill Issues by Reproducing the Problem
What could I do here to test whether or not this browser, IE 11, supports the Promise type? So Promises are used by hot module replacement. How about we change our code and see what happens in the console. So I'll set the default score for a new game to 1. Save that. You can see an update is attempted over in the browser, but in the console, you can see Promise is undefined. Now we have a problem we can fix. If we don't have a real problem to reproduce when it comes to using polyfills, we're shooting in the dark to understand whether or not the changes we're making are actually working. And if you do that, I guarantee you'll be very frustrated. Polyfills will always feel magical. If, however, you have a real problem to solve, you're going to find that polyfills aren't that big of a deal.
Webpack Makes Using npm Packages Facile
There are several ways you can tackle polyfills. That's part of what makes them suck. But there are good reasons for the different approaches you can take. What I want to do is start with one of the most general approaches with Babel. We'll look at the problems with it, and then we'll incrementally work toward a better solution. I'm not out for you to become a Babel expert in this course. What I want you to understand is the implication of the different choices that you might make on your application bundle, and then your application's performance, as well as your compilation time. So first up, we're going to use the polyfill that's provided by Babel. This is the @babel/polyfill scoped package. And it used to be named babel-polyfill. So if we take a quick look inside of this polyfill, you can use unpkg to do that easily, just put the package name in here, you can see that this just brings in two other packages. It brings in core-js, and it brings in the regenerator-runtime. And part of the code in core-js is going to provide to us the Promise type. One thing I'm really excited about with this demo, I finally get to show you how easy it can be to install a package, so @babel/polyfill is what we want, I can just install this package with npm as if I were working on a Node.js project, I can come right over to my application code, and I can just import that package. You might recall that I touched on this subject back in the beginning of this course when I talked about benefits of webpack. If you want to use some other library, just npm install it, import it wherever you need it, and go ahead and use it. The days of worrying about where do I download this script at? Which build do I need? Is this site trustworthy? How do I add this script to my application? All these problems are gone. Instead, you just focus on what libraries you want to use, and then you get to work using them. All right, back to the babel/polyfill.
One word of caution, you do need to make sure that you bring this in before you use it, and that's because we're importing global polyfills into the global scope. And in our case, because the webpack runtime needs this after the application loads, for those hot updates, it's okay to bring it in in our app.js file. Now before I save this, I want to pull up the graph of modules. This has been generating in the background, thanks to running that npm start command. Now there's a lot in this graph, and the colors are a little bit different. Earlier in this module, I made some changes to the parameters I'm passing to webpack-stats-graph to color-by-size, and also to show the size of various different modules. You can see the GitHub repo if you want to copy those settings over to your project. For now, what I want to focus on is the fact that down here in the bottom, you can see app.js and the four modules that we've added. And then we have quite a few more modules up above. Do you remember where these come from? This is code from the devServer and the hot module replacement logic. I would turn off the devServer build, but we need it for that Promise, so for right now, I'm just going to leave this in here, and ask you to ignore all this up at the top. And the easiest way to understand what to look at and what not to, take a look at the app.js file itself. That's really all we care about right now, and the dependencies that it asks for. And then, take a look at what else gets added to this picture when we go ahead and save our change to app.js to bring in that new polyfill.
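The change to app.js described here might look like this. The second import path is a hypothetical stand-in for the app's real imports, not the course's actual file:

```javascript
// app.js — a global polyfill import must come before any code that relies
// on it, since it patches the global scope
import '@babel/polyfill';

import './klondike/game'; // hypothetical path standing in for the app's modules
```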
Studying the Impact of @babel/polyfill
So let's go ahead and save this change to app.js. And what do you think's going to happen to our graph? Well, let's find out. What should I do here to study the changes to this graph? Well, I could just refresh the graph, but then I'd lose the previous state. I'd be stuck with my imagination to try to see what's different. Instead, I'll open it up in a new tab. Take a look at that. Just a few modules, eh? Well, it turns out the polyfill, while it's complete in that it provides a lot to you because it has core-js and the regenerator-runtime, has a lot going on. In fact, the core-js package is just crazy. It's a highly modular project, so the code is split out really nicely into hundreds or thousands of modules. So this becomes a bit hard to understand. But if you keep zooming in here, you'll start to see something that looks familiar, something that looks exactly like what we had on the previous tab, just shape-wise. You can see we have our webpack devServer code up here, and here's our app.js file that we need to look for. And right now it has five dependencies, and let's zoom in on those. The Shift key will allow you to zoom in; then with your mouse, scroll up and down. So we've got the four dependencies that we had before, plus our new babel/polyfill. And the index.js in that babel/polyfill refers to the regenerator-runtime like we saw a moment ago. And if you follow this arrow, here's the shim for core-js. In fact, if you come up here, you'll see the word core-js on this package somewhere. You can also click these. Don't forget that. I set this up so you could click on the graph and be taken to the package page if you want to learn more. Anyways, regenerator-runtime, not a huge deal at 1 extra module and only 23 KB, but there's a lot going on with core-js here, a lot of which we don't need. In fact, if you search in here for es6.promise, so right about there, I think, is the 1 module that we need.
Now it probably has some dependencies here, but it doesn't need all of this, so let's talk about how we can reduce this.
Testing the Promise Polyfill
Now that we've got our polyfill in place, let's test to make sure that it's working, and then we can reduce our polyfills down to just what we need. A couple of things you can do to test out polyfills in general: first off, without reloading the site, if you try to look for a Promise type, you won't find anything in completion, and you'll see that Promise is undefined here. Now when I reload the page, we need to test out if the Promise type is available, and the first way to do that, take a look at the completion. You can see we now have a Promise type. So that's one good sign. We can also come over to our scoring code here, change this, save that. Our app updates over on the left. No errors this time. That's a good sign. And let's just test that the new code works. Take a look at that. The score for our new game is 2, so our change worked. In a real app, I would strongly encourage you to have some automated tests that validate your polyfills, so as you evolve your approach, you can instantly validate that you haven't caused a problem. So now we can start to iterate on our solution to this problem, and we can use the presence of the Promise type in the console here as an indication of success.
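The manual console checks above can be captured in a tiny helper, which could also seed the automated smoke test suggested here. The function name is hypothetical:

```javascript
// A quick check for the polyfill's success — usable in the browser console
// or as the basis for an automated smoke test
function hasPromise() {
  return typeof Promise !== 'undefined' &&
         typeof Promise.resolve === 'function';
}

console.log(hasPromise());
```

In IE 11 without a polyfill this logs false; once a Promise polyfill (or a native implementation) is present, it logs true.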
Reducing Polyfills with core-js
So the babel/polyfill is a nice way to get started with polyfills because you don't have to think too much about what you actually need. You can just include the kitchen sink, and everything's just going to work. However, if you want a lean and mean application that doesn't have a long download and startup time for your users, then you'll need to be more specific about what it is you exactly need. And one way you could do that is just to cut out pieces you don't need, for example, that regenerator-runtime that's a part of the babel/polyfill. While it's small, we don't have a need for it right now, so let's get rid of that. Instead, we really only need the Promise polyfill inside of core-js right now, so let's just include only core-js, and let's see what impact that has. Now another way to look at the impact, with the babel/polyfill, we're up to 690 KB in our bundle. That's huge. In fact, that's why webpack is coloring our asset here almost a brown or goldish color, and it's telling us that this is big. You can control the criteria for which webpack decides that a bundle is too big. These defaults, though, are obviously pointing out a problem because the user experience for downloading almost a megabyte is not going to be pleasant, especially on a cellular connection. And it's possible that the size of this bundle is not a big deal. Maybe you're working on an intranet application, and you have plenty of bandwidth internally, a real low-latency network as well. Well, then maybe you don't need to do these optimizations. However, with that said, the point we're going to refactor to will not only give you a more efficient bundle when you need it, it's also just as easy to work with as including the babel/polyfill, so stick with me, even if you don't want this optimization in your applications. So now my question to you, if I want just core-js, what change do I need to make in my code?
Well, let's pull up our app.js file, and instead of importing the babel/polyfill, we can just bring in core-js, and specifically the shim from core-js, just like babel/polyfill does. Okay, and save this. And first up, take a look at our bundle, 665 KB versus 690 before. So we've shaved off about 25 KB. Not too bad. And if I hop over to the browser here, and I'll make a new tab so that we can continue to look at our history here, and at first, this looks exactly the same because core-js is a huge dependency, but there are some differences. For example, you can see the app.js entry point has moved down here, so clearly, something has changed in this graph. We have a few of our modules down here, for whatever reason in the layout engine, we have the core-js shim here, and then we have board.js. And we no longer have the regenerator-runtime, so that's the difference here, and that's what we wanted to remove, this piece right here. And there's about the 24, 25 KB that we eliminated from our bundle. So hey, that's not too bad. That's a little bit better. And now what can I do to quickly verify that things still work? Well, let's hop over to our application in IE and refresh it, and we'll just make sure that the Promise type is available. And take a look at that. We're okay still.
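The swap described above amounts to a one-line change in app.js (a sketch):

```javascript
// app.js — swap the all-in-one @babel/polyfill for core-js's shim alone,
// dropping regenerator-runtime from the bundle
import 'core-js/shim';
```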
Reducing Polyfills to Just Promise
So we've cut out a little bit of what we don't need. If you toggle between the last two graphs, you can see just a small change there to remove the regenerator-runtime, if you squint. But if you were to look through all of this code, you can already see a bunch of red or pinkish boxes; those are all big modules. Remember, I'm coloring by size right now. Let's zoom in a little. So there's the Promise type. Here's an observable. Last I checked, I don't think we're using observables. Here are some collection modules that are a part of the Map type, the built-in that I mentioned. We also have weak-map, weak-set, oh, sets right here. And if you keep scrolling through here, you're going to see a bunch of built-ins that we probably don't need, and that's because we're just including all these standard built-ins that come with the core-js shim. Why do that, though? Why not come over and take a look at core-js, and if you do that, you'll see there are various different ways you can reference the different polyfills that you might want. And if you look at the features approach, so this is basically selecting polyfills by feature, you can see there is an ES6 Promise section, and this shows us how we can just bring in the Promise type that we need. It's all we're using right now, at least as far as we know. So if you had to guess, what do I need to change in the application to get rid of those polyfills that I don't need? Well, I just need to pick up one of these styles of importing Promises. I have two different ways I can do that. Come in here and paste that in, so core-js/es6/promise instead of the entire shim. Now what do you think's going to happen to that graph when we make this change? So this is before, and I've made a new tab here to load the new version, take a look at that. Looks a lot better. We can almost read some of the module names without needing to zoom in. So clearly, we've pulled in a lot less here. And size-wise, well, we can go over to the console.
You can see we're at 421 KB now, so down another 240ish KB. That's pretty good. And I should validate that the application still works, so let's reload the app and make sure we've got that Promise type available. Looks good. Now if you're curious at all, in the graph, we didn't zoom in, so let's do that now. We've got our three dependencies below. We also have the board up above, and then the important one, we have our Promise dependency, which is core-js, and that's all we're bringing in of core-js.
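Narrowing the import down to just the Promise polyfill is again a one-line change in app.js (a sketch; the alternate fn path is the other style the core-js docs show):

```javascript
// app.js — narrow things down to the single built-in we know we need
import 'core-js/es6/promise';
// core-js also exposes the same polyfill under core-js/fn/promise
```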
You Don't Need to Manually Triangulate Polyfills
We've done a good job of cutting back our bundle size while including polyfills. However, something should feel off with what we just did in the last few clips. What doesn't quite seem right here? A hint. Compare what we just did with polyfills to what we were doing with transpilation earlier in this module. Well, the problem is we manually went about the process of determining what polyfills we need. Step 1 was identifying the Promise type as needing a polyfill. So we'd have to continue to look through this code and know what to look for that might need to be polyfilled. And again, that requires going back, perhaps to this compatibility table, and taking a look at the various different built-ins that we might be using and searching our code base for these. Then for each of these, we need to determine if our browsers support the built-in type or not. And then for those built-ins that we need a polyfill for, we'll need to go about the process of finding the most efficient way to bring in a polyfill. This complex, error-prone process has to be repeated whenever we change our target browsers or whenever we add new code that requires a polyfill of some sort. And, of course, when things are difficult, the tendency is just not to do it, and instead, stick with older code styles that can be problematic for a number of reasons. But we have something that should help us out. We have this env preset that's supposed to understand the browsers that we want to support. It's then supposed to figure out what we need to support those browsers. And it does a wonderful job of this with transpilation. It can also do a wonderful job of this with polyfills.
useBuiltIns: 'entry' - Polyfill Based on Target Browsers
Now to understand what this preset-env can bring to the table, come in and comment out all the imports except the original import of the entire babel/polyfill. Once we've put this back, we're back to that large graph with both core-js and, if you zoom in, the regenerator-runtime. And if you take a look at the output from the env preset, right at the bottom of its debug information, you can see this section here called using polyfills, and it says that no polyfills were added because we're not using the useBuiltIns option yet. If I want to use this useBuiltIns, where do you think I go to do that? Well, that's going to be part of our preset-env, which is configured inside of our babelrc file. I can come in and add a useBuiltIns, and I have a couple of choices. By default, this is false. It used to be just a Boolean, true or false, but recently it's been refactored to support a couple of different string values, and one of those is the entry mode. If I save this and then go back to the command line, if you scroll down, it looks like nothing's happened here. At least, we don't have any more output from the preset-env. If I come over and open up the package.json and look at our npm start script, you'll see that we're using nodemon and watching the webpack config file. We're not watching the babelrc file for changes, though. We should be doing that if we want things to restart automatically when we change our babelrc file. For now, you can also just come over and type rs into nodemon, and that will restart the process. And you'll get the new output from the preset-env. So here's our new output. And if you scroll down under the polyfills section with the entry option, you can see that the babel/polyfill was replaced with the following polyfills, a big, long list here. And you'll see the reason why each one of these is included; for the most part, they're included because of IE 11.
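As a sketch, a babelrc along these lines enables the entry mode. The browser query shown here is illustrative, not necessarily this project's exact configuration:

```json
{
  "presets": [
    ["env", {
      "targets": { "browsers": ["last 2 versions", "ie 11"] },
      "useBuiltIns": "entry",
      "debug": true
    }]
  ]
}
```

With useBuiltIns set to entry, the preset rewrites the single babel/polyfill import into the list of individual polyfill imports that the target browsers actually need.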
And then as we move through the rest of the files on our project here, you can see that the import for the babel/polyfill was not found, so what the preset-env is doing is it's looking for this import, and then it's replacing it with what you actually need based on the browsers that you're supporting. And actually, if we scroll up here, you'll see in the app.js file we specifically replaced our import of the babel/polyfill, and that's because we had that in the app.js file. You can put this wherever you need to put it so that the polyfills are loaded at the right time. So the babel/polyfill becomes a marker then, and the env preset does all the heavy lifting to figure out what you actually need. And if I go ahead and load the new graph, still pretty big, but little bit smaller than before. So before, after, before, after.
Changing Browser Query Changes Polyfills
To drive home the point, let's look again at this list of polyfills that's included, which you can see here in what was replaced. You can also see this in the graph: if you zoom in on our app.js file and hover over this, you'll see the source code, or, better, right-click and open this in a new tab. So you can see all of these polyfill imports were added instead of our import for the babel/polyfill. And the rest of this down here, down below, this is what we actually had in our code file before. So it's just a replacement on that marker. This is quite a long list. If I come over and change the browsers that I'm going to support, so we're not going to support IE, what do you think will happen to that list of polyfills? So here's the new output. All of the polyfills that were included only because they were for IE are no longer in this list. So now if we come back to the browser and load the graph, much better. Starting to look like what we had when we were just bringing in the Promise directly. And naturally, if I take a look at the source code for the app.js file, you can see the same list that we saw in the console output. Only about 10 polyfills are added here.
useBuiltIns: 'usage' - Polyfill Based on Target Browsers and Usage
Using the Promise Built-in Adds Another Polyfill for IE 11
Now one thing that's interesting here, given our earlier discussion of polyfills: we had included the Promise type, that's what we were looking for, and now we don't have it. Why is that? Well, we don't have Promises in our code, so there's nothing to detect. The preset-env isn't looking at the webpack runtime. First off, this serves as a good example of what happens when we start to use a new built-in that we need a polyfill for. For example, I could come in here into my code and use the Promise type. You can now see in app.js we require the Promise type because of IE 11. Interestingly, if we load up the new graph, you'll now see a reference to the Promise polyfill, and if you look at the app.js file, you'll see a modular import into core-js for that Promise type. In no way am I advocating adding some code just to trip a polyfill into being added; I just wanted to demonstrate what happens when we add a new built-in type. It will trigger a polyfill. In a real app, maybe you just want to include the polyfill yourself directly for a dependency that lives outside of your application code.
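For instance, a hypothetical snippet like this is enough for usage-based polyfilling to notice the built-in and pull in the core-js Promise module when IE 11 is among the targets:

```javascript
// Any use of the Promise built-in counts as "usage". With useBuiltIns
// detecting it, preset-env adds the matching core-js polyfill import
// for target browsers (like IE 11) that lack Promise natively.
function loadScore() {
  return Promise.resolve(42);
}

loadScore().then(score => console.log('score:', score));
```

The function name and value here are made up for illustration; the point is simply that referencing Promise anywhere in application code triggers the polyfill.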
Someday: @babel/plugin-transform-runtime + @babel/preset-env
Briefly, I want to talk about the transform runtime plugin, which is a different option for polyfills. This plugin also requires the babel/runtime package. Then go into your babelrc file and comment out the useBuiltIns option. We're going to disable what the env preset is doing and use a different approach with this other plugin, configured with just support for helpers right now. Now I've already run through a before and after. Before is just useBuiltIns commented out, so we have no polyfilling. And then here's what the new graph looks like with the helpers option turned on for the transform runtime. What's different here? Well, if you zoom in a little bit on the application modules that we care about, the primary difference is that scoring references two modules from the babel/runtime package. Now to better understand these imports, let's look at our compiled class before we added in the helpers. You can see three functions at the top here. Now imagine that every time you have a class, you have to paste in these three functions again. It's going to bloat your code base. Helpers are a way to reduce that repetition. Now let's switch to the scoring module with the use of helpers. As you can see, we have two imports here that refer to modules that are called helpers. So helpers are just reusable chunks of code. That's it. An import here is much less to type out than the entire chunk of code that you need. A second benefit of the transform runtime plugin is how it handles polyfills. So let's enable that feature, which, by the way, is also based on usage in your code. After these changes, this is what the graph looks like. A little bit more going on here. We have four imports now for the scoring module, and our app.js has an import now for the Promise that it's using. Remember, we left that Promise in, and it, of course, refers back to the Promise type in core-js. If you look at scoring.js, we have two polyfills at the top, followed by two helpers.
And then here is our app.js module. What's different about how we polyfilled this? Previously, we were importing global polyfills. Now we're not. You can see a named import above for the Promise type, and then we're using that down below, so a substitution has happened in our code. And the gist is, at the end of the day, we're not polluting the global scope with our polyfills anymore. I suspect in the future we will see integration so that we maybe can just type runtime as the useBuiltIns option, and we won't need any of this plugin configuration.
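A babelrc along these lines sketches the setup described. The option names here match the older babel-plugin-transform-runtime; newer versions of the plugin renamed and reworked these options, so treat this as an assumption to check against the Babel docs:

```json
{
  "presets": [
    ["env", { "targets": { "browsers": ["ie 11"] } }]
  ],
  "plugins": [
    ["transform-runtime", {
      "helpers": true,
      "polyfill": true
    }]
  ]
}
```

The helpers option deduplicates Babel's injected helper functions into imports, and the polyfill option swaps global built-ins for scoped imports that don't pollute the global scope.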
What Is a Loader?
We've already worked with loaders in this course, so first off, what is a loader? A loader is simply a transformation. You can even think of it as a function, a function you pass source code to and then get something back, usually modified. In the last module, we used the babel-loader. Can you think of some other loaders that you might want to use? If you hop out to webpack.js.org and look at the Loaders section, if you click on the menu item here, you'll see a whole host of different loaders that you could use. And in all of these loaders, you'll find some sort of transformation or operation that's performed on a module's contents. So I thought it'd be a good idea now to understand what a loader is by making a loader. That way, as we encounter more and more loaders, there's nothing mysterious about them, because they really are easy to work with, easy to create, and easy to use. The general idea: starting from our entry point, the app.js file, webpack parses the source code to find the other modules that we're importing, and then for each of those modules it finds their dependencies as well. And as webpack encounters these modules, it gives us the opportunity to manipulate the source code via a loader. And do you remember how we match a loader to a module? As webpack works through the dependencies in your application, it compares the rules that you've specified to each module. For example, scoring.js right here matches our test, and it's not inside of the node_modules or bower_components folders. So this rule is going to apply, such that the babel-loader will be applied to our scoring module. In other words, we're going to transform our scoring module with Babel. That's really it. So let's go ahead and set up our own custom loader.
Designing a tee-loader
In the last module, we spent a lot of time looking at the source code before Babel, after Babel, and then finally in the webpack bundle. One of the ways we did that was looking at our graph here hovering over a given module and taking a look at the source code. In this case, we can establish what Babel did to modify our module. We could also open the source code up here and for example, we could see in this file we've added two polyfills and then we've also compiled our class. While this works, it can be somewhat clunky to click through this graph to look at the output from Babel. So how about we produce our own loader that can take a look at what comes out of Babel. And how about we just print that right to the console? This is what it will look like. We start with our source code, for example our scoring.js module, it's read from disk, and then it's passed through whatever loaders match, for example the babel-loader. And then whatever comes out of the babel-loader ultimately ends up in our webpack bundle. Now these loaders are modular, like Legos. You can stick many of them together to form a pipeline. For example, we'll create what I like to call a tee-loader. Think of this as tapping into our pipeline so we can look at what's going on. So the output from Babel will flow through to the bundle unchanged, but we'll also be able to print it out to the console. So for example, if we have our scoring class in our scoring.js module, when it flows to the babel-loader it's compiled down to a function, maybe some polyfills are added, when that flows to our tee-loader it'll be printed out to the console. And of course, maybe we want to see the source code before it goes into Babel so we could add another tee-loader before Babel runs and print out the file. That way, we can see before and after right next to each other in the console. So this is what we're going to build.
Creating a tee-loader
To simplify things, I'm going to use a production build for this custom loader. Now I've run this build here, and you can see that we have quite a bit of output. How about we get rid of some of this before we add our own? That could get kind of confusing. What could we do here to get rid of this debug output? Well this comes from the env preset. If we hop over to our babel configuration, we can set debug to false on the env preset. And now when I rerun webpack, that looks a lot better. So to create a loader, I've added a new file called tee-loader. You can call this whatever you'd like. It's at the root of the repository. Can you take a guess on a high level what we're going to put into this file? Well first up, this is a Node.js module because webpack runs in the context of Node.js. So we need a module.exports here, and we are going to export a function. A loader is a transformation, so it needs to take input, which would be the source code, and then it needs to return something. In our case, we don't want to make any modifications so we'll just send back the same source code that we received. And let's print out a message that says that this is coming from the tee-loader. What comes next here? Well I need to get this loader added into the pipeline. And to do that, I will go into my babel-loader configuration and in the rule where I set what loader to use, I'll turn this into an array so that we can have multiple loaders. And then I'll come right before the babel-loader, and I'm going to add a string in called tee-loader. I want to show you that there are several ways that you can configure a loader. If you don't need to pass options, you can just use the string syntax. We could do the same thing here with the babel-loader now, and so this is the array that is formed. Now you might be wondering why I put the tee-loader first in the list if I want it to look at the output of the babel-loader. I did that because loaders are run in reverse order. 
One way to think about that, think of these as function calls since they are functions. So you could imagine just typing tee-loader and then babel-loader, and then the original source. So the original source is passed in to the babel-loader function, the result is then passed in to the tee-loader, and the result of that is then passed back to webpack to go into the bundle. So that's all I need to do to form my pipeline. If I hop over to the command line and run my production build, we've got a problem. Can you take a guess what's wrong?
resolveLoader.alias to Resolve a Custom Loader
The problem is webpack cannot find our tee-loader. It's been able to find the babel-loader, though, and that's because we installed it into the node_modules folder, which is a default location where webpack looks for loaders. There are a couple of ways we can fix this. We could modify the set of folders that webpack looks for loaders in, so that's config, .resolveLoader, and then .modules; we can add some folders to this list. Or we can set an alias and specifically point at our individual loader. Either is fine. I will do the latter. So inside of my configuration here, and I'll keep this all bundled with the babel-loader for right now, I need to add a resolveLoader section. And inside of here, I can add the alias section in, which is a mapping of names, for example the tee-loader, and then I need to map to a location where I can find this. It's pretty common to use path.resolve for that. I'll go ahead and import path. And quick quiz, what do you think I pass to path.resolve? We just need to specify the location. One way I could do that is to give the current directory, the directory this webpack config script is inside of, and then specify the tee-loader file. So now whenever I ask for the tee-loader, it'll be resolved from this location here, which is the file that I've created. So if I cross my fingers and run this again from the command line, hey hey, look at that, we've got five outputs for the tee-loader. Makes sense; we have five application modules that should be matching it. So this is using the exact same rule that we're using for the babel-loader; we've just strung together two separate loaders, a chain or pipeline of them, and so whatever files match these conditions will also be fed through our tee-loader after the babel-loader.
Logging Request and Source per Module
So how about we print something more meaningful? If I come over to my loader here, there's a new console.groupCollapsed in the latest version of Node.js. It won't collapse the output in the console, at least not right now, but we're going to take a look at this in the Chrome browser in a minute. And then you can use console.groupEnd here. So this will group together my output so I can correlate what's what, and I can print out the source code, which will be multiple lines of information. It'll all be nested inside of a group here in the console. It might also be nice to see which file we're working with. So how about we take a look at the loader API? This is a set of functions that you can use in the context of a loader. I'll let you peruse this at your leisure. Down below, you'll also see some properties. For example, this.resource. This gives you basically the file that we're looking at, so we could print this out. So in this case, I could add on this.resource to know what file I'm looking at. And over in the console, if I run again, now we can see a lot of output. Here, for example, is the app.js module, and this is our source code after it flows through Babel. We can confirm that this is working by looking at our scoring module next, and here are the two polyfills that are added, as well as our compiled class.
Collapsing Grouped Webpack Console Output with Chrome DevTools
So we have all this output in the console, and this works for some simple situations, but it could become overwhelming. I put in that grouping because if we run our Node.js app and debug it with Chrome, we can get our console output over into Chrome, and it has a much richer console experience that understands the idea of a collapsed group. So let's see that here. First up, you can run node and pass the --inspect-brk flag that says, hey, I want to debug my Node app and I want to break immediately once it starts up. The debugger from Chrome can then connect to this. And from there, we can resume the execution of our program. We could even step through things if we want, but most important, we can see in the console in Chrome, in the debugger in the DevTools, the output from our tee-loader. So I'm running Node directly. I'll go into the node_modules folder and call webpack directly then. And I just need to say, hey, use the production environment here. So the application has paused at the command line waiting for the debugger to attach, and that's because of this flag right here. Over in Chrome, the debugger in the DevTools has opened up, and that's because I have a plugin installed that helps me out with this. It's called the Node.js V8-inspector Manager, and it has an option to automatically open the DevTools, which is great if you want to use this for debugging Node.js apps. So I've added this to Chrome, and that's why I have this nice experience here. All I have to do here is hit Resume. The program will run, a little bit slower sometimes than at the command line, and in the console here, we should see output. And take a look at that, we've got our tee-loader output, and now each section is collapsed. So this is a nice way to look at the information.
Debugging Webpack with Chrome DevTools
While we're on the subject of using Chrome as a debugger, I'll restart the whole process here. You can see I automatically connect. If you'd like, you can step through the webpack process here and look at what's going on behind the scenes. So this is a nice way to learn about webpack. You can even set breakpoints here. Run until you hit the breakpoint. You can step into this call. You could even step in and take a look at what's going on here in webpack's configuration of its argument parser. So this is pretty neat.
Adding the Same Loader Twice
Now we have the output here of Babel, but I'm thinking it might be nice to see what goes into Babel so I can compare them side by side. So let's add another tee-loader right before Babel runs. Do you want to take a guess how we do that? Well, just come over to our configuration and add a tee-loader right after. So the one after is going to run before. And now when I run the application again, you can see we have the before and after. So here is the scoring before with the class. Here's the scoring module after with the compiled class. Now just because these are in order doesn't mean that that's easy to understand. How about we put a special label on here of before and after so we know which one is the input and which one is the output.
Passing and Parsing Options in the tee-loader
All right, so my question to you. If we want to show tee-loader after, tee-loader before over in the console output, what do you think we need to do here as far as our configuration is concerned? Well this to me sounds like a great opportunity to pass some options to my loader. Do you remember how I do that? Well the recommended approach is to use an options object. So I'll turn my loader into an object with a loader property to specify the name of the loader, and then I can pass options using an options object. And how about we specify a label. And what label do I need to pass to this tee-loader? Well this is the before, so let's save that, and then hop over to the tee-loader. And then to access those options, it's pretty common to use the loaderUtils npm package. You can call getOptions and pass this, the options will be parsed for you, and you'll have a nice object you can use. And then with my message, I'll put a dash and we'll do options.label, so you can work with it almost as if it was passed right from that babel-loader configuration. Now I want to save this and when I run things again, now we've got our before, but then we don't get any other output. If you come over to the command line, you can see we've got an error. We can't read the label property. That's because we didn't set an option on our second tee-loader. So one thing to be aware of with this getOptions method, it returns null in the event that you don't set options. So in that case, let's just create a default option object here, and we'll set the label in this case to just empty. Leave that, and now when we run things again, we've got our before and then our after is just a hyphen.
Legacy Option Passing via a Query String
So if we'd like to see after on our second loader, come back to our config here, and we could duplicate the options object. I'd like to also show you that an older approach for setting options was to use a query string. So I could use label here and equals and then after. This still works. It's not the preferred approach. If I run things again, there you go. We've got before and after. So a different way to set options that you might see in some older blog posts. Definitely the preferred approach is to use the options object.
Inline Loaders Are Occasionally Useful
There might be times when you want to home in on a very specific module and run just it through a loader. And of course, you can come over to the configuration and change the rules. For example, if we only wanted to look at the output from the scoring module, we could do that. But then we're no longer running the rest of our modules through the babel-loader. It's possible we just want the tee-loader to take a look at what's going on with Babel with regard to our scoring module, but nothing else. So let's comment out the loaders here, and let's just check the output and make sure they're disabled. So no output from the tee-loader now. And then instead of configuring anything in the webpack configuration, come over to your source code, for example, the app.js module where we request the scoring module. Right at the start of your module specifier, you can tag on your own loaders. For example, we could ask for the tee-loader, and then put an exclamation mark between the loader and the module. And now, this is the only module that will flow through the tee-loader. So look at the output. You can now see we just have the tee-loader, and we can see the output here of the babel-loader. This style is referred to as an inline loader. It's discouraged, but there are times when it's helpful. From time to time, you'll also see this in older examples that might still be applicable, and in some of the inner workings of the webpack source code itself you'll see this at play. In addition to adding a loader, you can control the entire list of loaders with two exclamation points at the start. This will override any externally-configured loaders. So what does that mean for this example right here? What's different now that I have these two exclamation points? Well, on the last run, we added the tee-loader to the babel-loader. Now, we will only have the tee-loader. When I rerun webpack, you can now see the tee-loader is printing out our scoring class.
This is no longer transpiled. Now what do I do here if I want to add the babel-loader inline? What will this module specifier look like? Well, I just come in here, and, like a pipeline, you can visualize these exclamation points as breaking apart the pipeline; I can add in the babel-loader. Remember, these run right to left, and I actually want to go ahead and add in an additional tee-loader so I can see before and after. And how about we put a label on these too, which you can specify with a query string. We'll do label=inline-before, and I'll put a label of inline-after. And as you can see, this is getting a bit unwieldy. Nonetheless, when I run things through again, you can see we now have our before tee-loader with the class, and then after, we have the compiled code. So just a heads up, this is another style. Sometimes it can be helpful, and you should be aware of it when reviewing older examples of webpack online.
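Spelled out, the inline request described above might look roughly like this. This is webpack-specific request syntax, so it only means something inside a webpack build, and the imported binding here is illustrative:

```javascript
// app.js — '!!' disables all configured loaders; the chain then runs
// right to left: tee (before) -> babel -> tee (after). Labels are
// passed in the legacy query-string style.
import scoring from '!!tee-loader?label=inline-after!babel-loader!tee-loader?label=inline-before!./scoring';
```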
Learn More by Building Loaders - Try a Pitching Cache Loader
So loaders, being a simple transformation, become a basis whereby webpack can understand any type of module. The module could have styling in it: CSS, LESS, Sass. It could be a JSON file. It could be an image: PNGs, JPEGs, SVGs. You could optimize those; you could create different sizes or representations of them. It could be Markdown or HTML or a Jade template. It could even be CoffeeScript, or it can be diagnostic, like we saw with the tee-loader. At the end of the day, a loader takes some sort of file or source code and transforms it into a format that webpack understands. So what I'd encourage you to do as you work through the rest of this course and as you learn webpack: when you start to use new loaders, take a stab at implementing some of them. They don't have to be feature-complete, they don't have to be robust, but I found it really helpful to try to implement some of these basic loaders so that I understand what each loader does, as well as better understanding loaders themselves. And while you're doing that, refer to the docs for the loader API and take some time to learn more of the advanced features of writing your own loaders. For example, you can have synchronous versus asynchronous loaders, which means you can do crazy things like calling out to a web service and taking the data that comes back and maybe generating some code from it or inlining that data into your application. There's a lot of information on the loader context properties that are available to you to make decisions at compile time. And if you really want a challenge, you should take a stab at implementing a cache-loader, a loader that could cache the output of Babel, for example, or actually any other loader, so that if a file hasn't changed, even if you're running webpack outside of watch mode, you could be caching to disk, and a cache-loader would speed up that scenario as well. And for that, you'll want to take a look at pitching loaders.
Pitching is a second phase of loading. If you're familiar with events in a browser, there's both a capture and a bubble phase. Same idea with pitching loaders. Pitches are executed before the normal loader functions like the one we created in this module. Normally, when we're processing our source code, we read the file, it flows through the babel-loader, and if we had a cache-loader, the result could be written to disk. But before this process happens, another one happens: the pitching phase, where these loaders are run in reverse if they've provided a pitch function. So the cache-loader gets first crack at a request. If the result isn't cached, it lets the request flow back through the rest of the pitch phase. Once the pitch phase completes, if nothing's stopped the pipeline, the source code will be read, flow through the babel-loader, and then be cached to disk. That means the next time a request comes in, when the cache-loader's pitch runs, it will realize it has the result in the cache, and it will stop the pipeline right there, so the babel-loader never runs. The pitching phase gives us the ability to intercept a request and stop it from ever processing through the pipeline of loaders.
Running Build Tasks
What About Build Tasks?
If we zoom out a bit and start to think about what it'll take to go from our development environment to production, we'll realize that bundling is one of many things we need to consider. For example, maybe we want to clean some files up. That way, previous build artifacts don't accidentally get released. Maybe we want to copy some files, like an HTML page. Maybe we want to include a fingerprint of our application version, a Git SHA for example. Maybe we want that to be injected into the application somehow so that we can show it in the UI, or at a minimum, tag our bundle with the correct version of our application so we can troubleshoot when we have problems. We might also want to take all the files that we need to deploy and zip them up. So how do we do these things with webpack? What do you think? Now the short answer is, webpack is meant as a bundler, and I think a compiler platform, but at the end of the day, we're talking about producing a bundle. Beyond that, you don't need to use webpack to do all of these other things, but you can. So in this module, I want to show you how you can do some of these other build tasks with webpack, and I want to encourage you to also consider just using other tools. For example, we've been using npm run scripts throughout this course. Moving to webpack doesn't mean obviating all of the tools that you already know and love.
Cleaning the Output Folder Before Bundling
Let's start out by looking at how we can clean up files as a part of our webpack build. Again, you could use the command line in a script to do this, or you could use the rimraf npm package, but in our case we will use webpack. And knowing that, can you tell me what we're probably going to change to add cleaning to our webpack build? If you want a hint, what are the two extensibility points inside of webpack? Well, we have loaders and plugins. Loaders operate at the level of modules; plugins operate at the level of the entire compilation. So plugins are where we're going to add the ability to clean files or perform other build tasks. And for our purposes, there is a clean-webpack-plugin. So let's go ahead and install this package, and then let's hop over to our source code. Then in the webpack config, import the plugin, and then what comes next? Well, we just need to hop down and decide where we're going to add this plugin. In this case, I want this to run as a part of all of my builds, so I'll go ahead and add it into my list of plugins here. So it will be the new CleanWebpackPlugin, but I've commented this out, because before I add it, I want to show you its value. If I come over to the command line and go into the app/dist folder, right now we just have our bundle. But in a more complex build, we'll have additional files. For example, I'll copy the bundle and duplicate it here under the main name instead, so we can pretend that we just renamed the bundle in the webpack config and forgot to get rid of the old bundle. Now to prove my point about the problem this causes, let's run a new webpack production build. That'll save a new app bundle. If I look in app/dist now, you can see we still have the two files. Webpack isn't cleaning anything up, and this is where confusion can happen. This is where the clean plugin can be helpful. We can wipe out the output folder before we create our bundle and any other files.
So if I come over here and enable this plugin, and then I need to pass an array of paths to it to clean, and these can be relative, so I'll do app/ and then dist, the same output path I have up above here. If you want, you could extract a common variable, a constant shared between both of these locations. Back at the command line, I can run the production build again. And if you look in the output, you'll see debug information printed from the clean plugin that the dist folder was removed. And when I list out the files in the dist folder, you can see we just have the app bundle. Main.bundle is gone. This is a nice plugin to add in to make sure that you're not accidentally including previous build artifacts. It's especially troubling when you're learning and you're wondering where a file came from and you pull your hair out for half an hour, and it's simply an old file. This clean-webpack-plugin is pretty basic. If you hop out to the GitHub repository and look at the index module, you'll see a single file with about 200 lines of code. That's it. So you can scroll through here to learn more about how plugins work. You could even use this as a basis for your own plugins. In this case, if you peek at the source code, you'll see that it's using the rimraf package behind the scenes. So this plugin is just gluing a separate library into the webpack build process. You don't have to reinvent the wheel when you create your own plugins, or when you go looking for existing ones.
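Put together, the config change described above might look roughly like this. This is a sketch, assuming the v1-era clean-webpack-plugin API that accepts an array of relative paths; the app/dist path matches this course's output folder:

```javascript
// webpack.config.js — sketch, not the course's exact file
const CleanWebpackPlugin = require("clean-webpack-plugin");

module.exports = {
  // ...entry, output: { path: ... }, module rules, etc.
  plugins: [
    // wipe these folders before webpack emits new assets,
    // so stale files like an old main.bundle don't linger
    new CleanWebpackPlugin(["app/dist"])
  ]
};
```

As mentioned, you could also extract "app/dist" into a shared constant used by both output.path and the plugin.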
Not Just Build Tasks: npm-install-webpack-plugin
I started this module with a discussion about build tasks, but don't limit yourself to just tasks that you need to run to deploy your application. You can extend webpack with plugins that will add any sort of functionality that might be helpful. For example, have you ever used a package in your code, an npm package, and forgot to install it? Well, this plugin can take care of that problem for you. It can automatically install npm packages. So let's install this; it might be the last package you have to install manually. And now for a quick example of this, let's start up our dev server. I'll hop over to the scoring component, and I'm going to modify the initial score for a new game to be a large number, 1000. You can see that in the UI now. That's a bit hard to read, so it's pretty common to want to format that number. So maybe we want to hop in and create a formattedScore method, and we'll return the score. And there are a number of ways we can do this, but maybe we like this library called Numeral.js, so we get right to work using it, just assuming we have it available. Come on down here. We're just so used to using this library, we don't even think to install it. Save that, and we go back to the browser to test our application out, and what? Oh yeah, duh, we're missing the numeral package, so technically the numeral module, and we could install it. In fact, WebStorm is recommending that. Instead, let's open up our webpack configuration, and let's bring in this new plugin. And then I'll add in the plugin to the list of plugins for the base config; however, you might just want this in your development builds. It might be a bit scary to be running a production build and automatically installing packages. Now back at the command line, webpack was restarted by nodemon. The new plugin is registered. Do you see anything that looks different here? If you look carefully, you'll see Installing numeral.
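Registering the plugin might look something like this. This is a sketch; treat the dev option as an assumption from the npm-install-webpack-plugin README of this era, where it controls saving auto-installed packages to devDependencies:

```javascript
// webpack.config.js — sketch
const NpmInstallPlugin = require("npm-install-webpack-plugin");

module.exports = {
  // ...
  plugins: [
    // consider registering this only in your development config;
    // auto-installing packages during a production build is risky
    new NpmInstallPlugin({ dev: true })
  ]
};
```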
Now be careful: if you restart webpack and clear out the screen, it won't need to be installed again, so you'll only see this output one time. So the gist is this plugin looks at our source code, and if we're using something that we haven't installed, it'll install that package automatically. Now aside from the output here in the console, what else could I check to see if this package was installed? Well, why don't we check the package.json file? And there you go. The numeral package is now listed as a dependency. Now we just demoed this plugin almost in reverse of how you'd use it in real life, so I want to run through one more demo of it. Now that we have the plugin added, the npm install plugin, let's go about writing some code that uses our new dependency. That way we can see how our application just refreshes and works with the dependency automatically. That's the real workflow and the real value of this plugin. So first up, let's uninstall the numeral package so we can see the install happen automatically again. And over in my scoring class, I will just comment out my usage of this. All right, all the code changes are rolled back, but we still have the plugin added to webpack. Just pretend it's been there for months now and today you're going to make some changes to this application. So you start up your dev server to work on the new feature, and then go over to the browser. The website is operational, so we can make our change. So we want to format that score, so we hop over to our editor. We use that trusty numeral library that we know and love, and then let me split the screen here. Okay, so I cleared the output on the left-hand side. Now you come in and you add in your require, or rather your import, save that. Take a look at that over on the left-hand side, you can see the plugin is installing our package. Normally you wouldn't need to look at the command line. Instead, you'd move on to updating maybe your view logic here to use the new formattedScore method.
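For reference, Numeral.js formatting looks like numeral(1000).format("0,0"). As a dependency-free sketch of the same formattedScore idea, here's the equivalent using the built-in Intl API instead of numeral, so this is not the course's exact code:

```javascript
// Sketch of a formattedScore method, using the built-in Intl API
// rather than Numeral.js so there's nothing extra to install.
function formattedScore(score) {
  // groups digits with commas in the en-US locale, e.g. 1000 -> "1,000"
  return new Intl.NumberFormat("en-US").format(score);
}

console.log(formattedScore(1000)); // "1,000"
```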
And then theoretically if I hop over to the browser, the app should work now. We should have a formatted number. Ah, it's broken. I have a small bug, so let's fix this: this.score. And look over on the left. The application just instantly reloads with the formatting, and this is all really powerful. With hot module replacement and this npm install plugin, you can focus on the code that you want to write and not so much on all the ceremonial things you need to do to get your application to operate. So in a way, this is a development task more than it is a build task. So keep an eye out for other things beyond just build tasks that you might want to plug in to webpack.
Finding Plugins for Common Build Tasks
Troubleshooting with Source Maps
Bundling and Transpiling Make Troubleshooting Difficult
Bundling is a beautiful thing, and so is using a transformation like Babel or maybe TypeScript to write modern code and have it compiled down to something that's more universal. The problem is, all of this becomes a headache when things go wrong, because when you're running in the browser and you don't have your original code, troubleshooting based on machine-generated code from maybe Babel or TypeScript, or even the bundle that webpack generates, well, troubleshooting that is not so simple. There's a mapping at play at a minimum, and in some cases, the code you have in the browser is completely opaque. But don't fret, this is where source maps come in. Source maps give us the ability to reference back to the original code so that when we're troubleshooting in the browser at the end of the day, we can usually see that original code. Let's take a look at this.
Runtime Errors Aren't as Transparent as Compilation Errors
So first up, let's distinguish a few things. I'm going to make a change to the scoring class here that will cause a compilation error. Over in the browser, we get this nice warning because we set up the overlay from the dev server. We even have some help over here in the console. And it's pretty clear because it's pointing out the code where we have the problem, and this is the code that we have inside of our application. This is not bundled code, this is not transpiled code, so it's easy to fix and know exactly what's going on. The same cannot be said of runtime errors. For example, if I throw an error here when I create a new game to simulate the fact that sometimes runtime errors don't happen right away, they can be delayed, so we'll save this, and then back over in the browser if I reload the page here, everything seems fine, but we do have an error behind the scenes. And this error isn't stopping us from using the application, which could actually be really scary. But if we noticed this problem, maybe we have some exception logging in a production environment or maybe we're just developing the application, if we notice the problem we might want to drill into it. Here you can see our exception is thrown. And we might want to click on the link here to jump into the source code where the error occurred. And fortunately right now, things look pretty much the same. This is the line of code I just added, and everything around it is pretty much the same. So this isn't too bad to troubleshoot. One difference: we are sitting inside of the webpack bundle, so all of our modules are merged together; this is not a separate source code file. So if we didn't have this path right here, then it might be a little bit more difficult to troubleshoot and link back to the code in our application if we had not just recently created this problem. Let me show you what I mean.
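A minimal sketch of the difference: the module below loads without complaint, and the error only surfaces later, when the constructor actually runs. The class and message here are made up for illustration:

```javascript
// Loading this module is fine; nothing throws at load time.
class Scoring {
  constructor() {
    // simulates the error injected into new-game creation
    throw new Error("simulated runtime error");
  }
}

let caught = null;
try {
  new Scoring(); // the error fires only when a game is created
} catch (e) {
  caught = e.message;
}
console.log(caught); // "simulated runtime error"
```

Unlike a compilation error, nothing warns you up front; you only find out when (and if) this code path runs.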
So we'll come over to our webpack.config, and in development if I don't have the NamedModulesPlugin, when I reload in the browser and now click on my exception, you can see we've lost the path to the files. So this is a little bit more difficult to troubleshoot. To compound the problem further, if we were in production or using our transpilations even in development, which I can simulate by adding the babel-loader to development, save that. Now when I come to the browser and reload the page and click on the exception, you don't even have the same code anymore. So this is all of the code that comes out of both Babel and webpack creating the bundle, and of course this is not close to what we have in the actual source code for our application. And that's because we've got some major transformations going on here. Of course, you could imagine we could be minifying our content, and that's especially likely in production. And so each of these transformations that we apply that optimize our application can also make it very difficult to troubleshoot. Fortunately, as I mentioned, source maps can take care of this problem for us, so let's take a look at how to do that. Let's troubleshoot this problem right here and turn on source maps to see how they can help.
Enabling Source Maps with devtool: "source-map"
All right, so to see the power of source maps, first take note on the right-hand side that the link here is to our bundle. So the error message, when we click this link, takes us into our bundle where we can see the final code that was output by webpack. If I simply come over to the configuration on the left-hand side here and add in an element called devtool, and I'll add this to the baseConfig for right now, and I'll set this specifically to just source-map. I'll save that, and now make sure that your dev server restarts to reload the webpack.config, and then I'll just reload the page on the right-hand side. And can you tell me what's different? We now have a link that points at scoring.js, which is the code where we actually have the problem. That's our original module that was transformed by Babel and then concatenated into a bundle by webpack. Now we can see in the browser that original file, and if I click on this, you can see we have our class on the right-hand side much like on the left-hand side. Obviously this is much easier to troubleshoot than looking at the bundle, which we can still access here with our transpiled and also concatenated modules. So really, source maps are helping us abstract away and hide the fact that we are delivering our code in an optimized bundle. In a way, a bundle is actually a low-level concern, much like the TCP protocol, that we don't need to look at. We know it's there to optimize the delivery of our application, but we're not going to debug the TCP protocol unless maybe we have some sort of networking trouble, nor do we need to look at the bundle unless we're specifically working on it, perhaps optimizing it. Instead, we can stay higher up in the stack by looking at our original source code. So that's the power of source maps. Now let's talk a little bit about the options that you have and talk about how this works.
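The entire change is one line in the config (a sketch of the relevant fragment):

```javascript
// webpack.config.js — sketch
module.exports = {
  // ...
  // emit a separate .map file alongside the bundle, and append a
  // sourceMappingURL comment to the bundle that points at it
  devtool: "source-map"
};
```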
How devtool: "source-map" Works
We've seen the high-level implications of enabling source maps with webpack's devtool configuration option. Now let's look at what exactly is happening behind the scenes and see how we can tweak this option. So I've got a question for you. What could we look at here to better understand what this devtool config option is doing? Well, somehow we have information in the browser about our original source code, and that's why we can see scoring.js here in the link for our error message, and that links us to what looks like a scoring.js file. And if you look down below, you can see there's a message here that says this was source mapped from the app.bundle.js. So why don't we take a look at what's inside of that bundle, and how about we compare it before and after we set this devtool setting. So let's grab the contents of this file and I'll put that on the right-hand side of my diff tool here, and then I'll come back and I can either comment out the devtool option, or you can change the value to none, same net effect. Then back over in the browser, I can refresh and grab the contents of the new file. And keep in mind, this would be before adding source maps, and I'll paste this on the left here and our diff tool highlights the one difference that we can glean from this app.bundle file. You can see we have an extra line at the end here with a sourceMappingURL pointing at a file called app.bundle.js, and then .map on the end. Let's take a look at this file. So I will come back over to WebStorm and change back to using source-map as the devtool option so we generate the file again. And I will look at the command line output. And in the list of assets you can see here's our map file. It's rather large, which is one of the considerations for using source maps. You have to understand the implications of the slowdown you might experience when developing or running your application. Anyways, we have this new file. What can I do to take a look at the contents?
Well, I'm using the dev server right now so you won't find this on disk. Instead, come over to the application, open up a new tab, and just like loading the app.bundle, we can just add .map on the end here to load the map file. If you look closely, you'll notice that this is JSON. In fact if I take this URL, and I'll use curl followed by piping the output to jq to format the JSON, you can now see a little bit more of what's contained inside of this file. Up at the top, you can see the version, and we have the source files that it includes mappings to. These should look familiar. These are the list of modules that we've included, or that webpack has included. Notably, here are some of our application modules. Further down are the names that are used as a part of our application, and then we have mappings, and if you look at the specification for source maps, you can better parse through and understand what all this file contains. That's beyond the scope of this course though, so I'm not going to dive into that too much. But then down below, one last section that might be nice to see is the sourcesContent. So inside of here, we have access to the original source code to be able to display in the browser. So it's this map file that allows us to retrace our steps both from the concatenation and modifications that webpack makes to produce the bundle on the right-hand side here, backwards through the transformation that Babel applied, all the way back to our original source code, for example to see the Scoring class. So come back to the browser and take a look at that again. We're looking at the class itself, so this is pre-Babel as well. Now it just so happens that the babel-loader is smart enough to talk to webpack itself to understand that you've enabled source maps, and it knows how to provide source maps to webpack so that webpack can get you all the way back to your original source code. That's not true of every single tool.
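Stripped way down, the JSON in that .map file has roughly this shape. The field names come from the source map v3 format; the module names and contents here are made up for illustration:

```javascript
// A miniature source map illustrating the v3 fields discussed above.
const map = {
  version: 3,
  // the original modules the bundle was built from
  sources: ["webpack:///./src/scoring.js"],
  // identifiers referenced by the mappings
  names: ["Scoring", "score"],
  // Base64 VLQ-encoded position data; see the source map spec
  mappings: "AAAA",
  // the original source text, so the browser can display it
  sourcesContent: ["export class Scoring { /* original code */ }"]
};

console.log(map.sources.length); // 1
```

The real file is large mostly because sources, sourcesContent, and mappings cover every module in the bundle.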
Just keep in mind that every time a transformation is involved, could be TypeScript; could be Babel; could be webpack's concatenation of modules to produce a bundle; could be minification to optimize that bundle; could be transformations that you're applying to your CSS, which I'll get to in a subsequent course in this series. Whenever you're applying a transformation, if you really want to take advantage of source maps, make sure you're properly configuring that tool to work with webpack so you can have the best debugging experience possible. Next, let's see what happens when we set up the devtool in a way that it doesn't reverse all of the transformations that have happened.
Fast, Inline, Partial Source Maps - devtool: "eval"
To understand the importance of making sure all transformations are factored into source maps, I'm going to change the devtool configuration value to eval. So there are different values for this setting that can tweak how the source maps are delivered, as well as what is actually mapped and delivered. So if I change this to eval and come back to the browser and reload, you can see we still have our mapping to our scoring.js file. When I click on this though, this doesn't quite look the same. What's different here? Well we still have what looks like a single code file, but this is the output after babel-loader is applied. You can see our class is compiled here. So in this case, we're only getting a mapping that reverses what webpack did to produce the bundle. So if you look at the bundle over here, you'll see something that's drastically different than the bundle we saw before. I'll get into this in a moment, but needless to say, this is our module right here, our scoring module, it's actually all on one line of code with an eval, and so what we have over here in the scoring.js file at least reverses this bundling into an eval function in this case. That's a really good thing because eval code like this is even more difficult to understand than the bundle that we had a moment ago. Now you're probably wondering, why in the world would I use this eval option if it doesn't reverse the entire set of transformations back through Babel as well? Can you take a guess why we might use this? Well, it turns out to be one of the fastest options for producing some sort of reversal or some sort of source mapping. So this might be something that you want to use in a development environment because it's faster and also because the mappings are inline. If you look at the bottom of the bundle here, there's no separate mapping file. You can see that in the output as well. You can see we're not producing the map file anymore. 
This approach to source maps is referred to as inline versus a separate file, and that can be another advantage, especially on incremental rebuilds of your application where maybe you've just changed one or two modules. You can quickly get that mapping information down with the new source code and not need to generate a whole new map file for your entire bundle. So if these tradeoffs are acceptable to you, and getting back to the point where the Babel code was generated is enough, then maybe this is the setting you want. In fact, if you want to look at the output of Babel and debug that, well this could be a very helpful option if you're not interested in going back to the original source code. Perhaps if you suspect some sort of problem with your Babel configuration. So there are different settings that we can apply for source maps, and let me copy the app.bundle here with the eval setting. I want to diff this with the bundle when there are no source maps enabled. So the left side here still has the original bundle with no source maps. And on the right, I'll paste in this new bundle with the eval source maps. You can see the red on the right indicates a change, so let's just step through some of these. Here you can see all the source code in each individual module is wrapped up into an eval, and that's about the only difference here. Let's jump down to our scoring module. So on the left again, this is our module code in the bundle without any source mapping, and then on the right is what that looks like with the eval-type source mapping. Again, the code on the left has just been wrapped up into an eval on the right-hand side. You can even see that in the first line of code here. We have a harmony import on both sides.
And of course, the rest of this eval here is one giant line that goes way off the screen, so I can't really compare any more here. That's why I want to blow out the code in the eval and do a side-by-side diff with the original code to see what is different. Why are we using an eval? Hopefully we're not using it just for eval's sake. So here is that expansion, and aside from some line differences in here and some eval-ification, like escaping of double quotes, the real difference is way down at the bottom of this file. We have this footer now, and notably we have the sourceURL reference to the scoring module. This allows us to give this eval block a name of essentially scoring.js, so we see that over in the browser then when we're looking at the console. We can see scoring.js there. Another way to think of this: each eval wrapper, along with the sourceURL, tells the browser that this bundle really is comprised of multiple files, and here are the names for those files, somewhat like a logical demarcation of the boundaries of multiple files that are stored inside of a single file. Since there's not a lot of work involved in doing this, this is fast and maybe all that you need when you're troubleshooting.
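The trick itself is easy to demo: a //# sourceURL comment inside an eval'd string is enough for dev tools to file that code under the given name. This is a toy sketch, not webpack's actual module wrapper:

```javascript
// Each webpack module becomes roughly eval("<module code>\n//# sourceURL=...").
const api = {};
eval("api.score = 100;\n//# sourceURL=webpack:///./src/scoring.js");

// the code ran like any other JavaScript...
console.log(api.score); // 100
// ...but in browser dev tools, errors and breakpoints inside that eval
// are attributed to "scoring.js" instead of the bundle file.
```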
High Quality Maps with Fast Incremental Rebuild - devtool: "eval-source-map"
So by changing the value of the devtool option, we can produce different source maps. And we've now seen a separate source map with just source-map as the value versus an inline source map with eval. The separate file is slower to generate versus the inline eval option, which is faster; however, the maps are of lower quality when using evals. So there are tradeoffs that have to be made depending on what exactly you're trying to accomplish. So let's talk more about the other options that are available and better understand what we have at our disposal. So I mentioned that eval only gives us a partial reversal of what webpack does and not what loaders do. If, however, we set eval-source-map, combining the two options that we've used so far into one string, when I come back and refresh the application and take a look behind the scenes, we're now back all the way to our class. So we've also reversed Babel by switching to this eval-source-map. With this new option, do you think this is a separate or an inline source map? Well, in this case it happens that we're just adding mapping information to the inline source maps. You can confirm that in the application bundle here. So here is our eval, so we're still dealing with inline source mapping. If I take this eval though and compare it to the previous one, so here is that diff. On the left we have the eval that we looked at in the last clip, now we're looking at the eval on the right for this clip and we're diffing these two. I also turned on word wrapping so that we can compare these two, loosely speaking. I'm not too worried about the specifics of this, but this is enough resolution here in this diff to see that it's this footer at the end that's different. Before we had the name for our scoring.js file with sourceURL, now we have a sourceMappingURL, so we have mapping information. This happens to be a data URL with JSON data. It's Base64 encoded, and here is the data for the data URL. Once again, this is inline information.
I've got a slight curve-ball question for you. What can I do to better understand the data here and what it actually represents? Because looking at it right now, it looks like a bunch of random characters. So this is Base64 encoded, how about we go ahead and decode this. To do that, I'll copy the data part, and then on my Mac I have some commands to help out: I'll use pbpaste to pipe along the clipboard contents, and I can use the base64 command to decode that. When I do that, you'll see some source code, and that source code is contained inside of JSON data. We saw that a moment ago with the data URL. So I will run another command to help us clean up or pretty up the JSON. Does this look familiar at all? Granted this is not the same contents, but the structure should look familiar. We have our sourcesContent here, we have mappings, we have names, and we have sources. So in this case, this one chunk of source mapping is just for our scoring.js module. It's not for the entire bundle like we saw earlier in this module with that separate .map file. Remember we saw one file with all the mappings for the entire bundle? It had a huge list of source files in the sources property in the JSON data. So imagine taking that one source map file, blowing it apart per module, encoding each of those as a data URL, and sticking them inside of an eval. At the end of the day, this provides the same information to the browser, giving us this rich source map experience. So this right here is just for one module. Let me show you the rest of these source mapping URLs. Let's go back to the bundle, and if you search through this app.bundle for sourceMappingURL, you'll find this on the end of each of the evals. And each of these then would be the source mapping just for this one module. So that's how the eval-source-map option works. It's a little bit slower to begin with because we're producing the source map.
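The same decoding can be done in Node instead of with pbpaste and base64. This sketch builds and decodes a made-up, miniature payload rather than the real one from the bundle:

```javascript
// Build a fake inline sourceMappingURL data URL, then decode it the same
// way you would a real one copied from the end of an eval'd module.
const fakeMap = { version: 3, sources: ["scoring.js"] };
const dataUrl =
  "data:application/json;charset=utf-8;base64," +
  Buffer.from(JSON.stringify(fakeMap)).toString("base64");

// grab everything after "base64," and decode it back to JSON
const payload = dataUrl.split("base64,")[1];
const decoded = JSON.parse(Buffer.from(payload, "base64").toString("utf8"));

console.log(decoded.version); // 3
console.log(decoded.sources[0]); // "scoring.js"
```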
So it's slower than just straight-up eval, but it gives us much more precision in our source map going all the way back to our scoring class. And the nice thing, because these are inline and because we're using the dev server, we only need to recreate these when we change an individual module, and we only need to recreate the mapping for that individual module because we have all of these separately. So, it's going to be faster on incremental rebuilds. For example, I just made a modification to the scoring module, set the value to 2, and over in the browser I captured the update with the Hot Module Replacement and you can see that the only thing that's modified is our module 40 here, our scoring module. So this is the only delta that's dropped down with the latest source map on the end here. So this is another good development option.
Let's talk about some of the tradeoffs in the options here for production versus the two that we just saw for development. So I'll first change back to just source-map, which is intended for production source maps. So what again is special about this option versus the two eval options that we just looked at? With this option, we move back to a standalone or separate source map file. You can confirm that in the console output. You can see we now have a map file generated. This is also a high-quality option that gets us all the way back to the source code for our application when we're debugging. So in the console here when I click through, I have the Scoring class, so my original source code. Beyond these two characteristics, what else might you like to see for options when it comes to source maps in a production environment? What other concerns might you have in production? Hopefully the idea of mapping back to your original source code sounds like something that you might want to prohibit if you don't want certain people accessing high-fidelity source code. Of course anybody can decompile code, but why make it easier? So even though you may want to use source maps in production, you may not want the whole world to know about it. And of course you can block that map file that's generated, so that's the file at the end of our bundle here, that sourceMappingURL. We could block this file and then nobody could access it, and that might actually be a good idea if you want to protect against accidental leaking of your source code if somebody misconfigures your build. Another option you have if you want to be able to use source maps is to just have this line disappear, what's known as a hidden-source-map. So we can produce the source map, but not link to it in our bundle. Care to take a guess at what the value might be here for the devtool?
Well I mentioned hidden, that happens to be the extra little string you stick in here that tells webpack not to put that little sourceMappingURL on the end of the actual bundle. Now if we come to the browser and refresh here, we don't have the source map reference; however, we're still generating the map file, so it's still there for us to access. As long as I know where this mapping file is at, I can pull it down from the server. And in Chrome, it's providing information here about how you can load the source map even if it's not linked into the actual files that it belongs to. So instead of Chrome automatically loading it, you can come in here and right-click and add the source map yourself, just type in the name of the file. When I add that, take a look at that. We have a scoring tab that shows up so we have access now back to our original code. And in the console, you can see scoring again. Let me reload the page here so I can show you that. You can see right now, with the hidden-source-map, the console shows app.bundle.js. If I come over to these sources and add the source map again, that tab pops up here, but over in the console that switches to scoring.js. So it is possible to selectively configure this instead of having Chrome automatically load it. And this might be how you'd proceed in production if your code isn't so sensitive that you're worried about it leaking, but at the same time you don't want it to just load by default for people. You don't even want people to know that it exists unless they happen to know your convention for naming your source map file. And speaking of that, you can take a look at the output.sourceMapFilename option if you want to configure the name of that file and pick something that's not a typical convention with .map on the end of it.
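In config form, the hidden approach plus a custom map name might look roughly like this. This is a sketch; [name] is webpack's standard filename substitution token, and the .secret.map name is just an invented example of a non-obvious convention:

```javascript
// webpack.config.js — production sketch
module.exports = {
  // ...
  // still write the .map file, but leave the sourceMappingURL
  // comment out of the bundle so browsers won't auto-load it
  devtool: "hidden-source-map",
  output: {
    // ...path, filename, etc.
    // optionally break the typical .map naming convention
    // so the map's location isn't easily guessable
    sourceMapFilename: "[name].secret.map"
  }
};
```

You'd then attach the map manually in dev tools (or in your exception monitoring service) when you need it.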
Only Map Location and Filename, Not Source Code - devtool: "nosources-source-map"
In production, another way to scale back access is to just not have the source code itself delivered with the mapping. So instead of just hiding the source map, you can use a nosources option. This will produce the mapping and link to it, but there won't be any source code. So back in the browser when I refresh, you can see we have our link to scoring.js. So it's helpful to know where the error occurs, but if I click on it, I get nothing. I can see that this was mapped from app.bundle, and if I look at the app.bundle, you can see we have the sourceMappingURL listed, pointing to the .map file. If I look at that file though, here's the before with the hidden-source-map option, so this is the whole entire file, and it's so big actually that my JSON viewer extension isn't even trying to highlight it. And down at the bottom right now, before reloading, you can see sourcesContent, and this has the original source code. If I reload this, you'll see now my JSON viewer plugin loads and parses this and makes it look pretty because it's a lot smaller. The reason it's smaller, if we collapse down some of these elements: we have our mappings, but we don't have that sourcesContent. And this is why the browser can only map line numbers and original files, but we can't look at that source code. And this can be helpful then if maybe you want to send this information to some sort of logging service that you can use behind the scenes, where you're privy to looking at your source code. So you can marry that up maybe with your IDE linking into your exception monitoring system.
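Again, the change is a single devtool value (sketch):

```javascript
// webpack.config.js — production sketch
module.exports = {
  // ...
  // emit and link a map with file names, line numbers, and mappings,
  // but without sourcesContent (the original code itself)
  devtool: "nosources-source-map"
};
```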
devtool Is Just an Idiosyncratic, String-Based Serialization of Plugin Options
Add the SourceMapDevToolPlugin Directly for Flexibility Instead of Using devtool
So over in my config file, I can comment out the devtool, I can come right down to my list of plugins here, and I can type new webpack., then refer back to the source code if I want here, and grab this SourceMapDevToolPlugin. Since I'm working with source maps right now, and not eval-wrapped source maps, I'll use this plugin, and then I can specify options here. So I'll set the file name, which gives me the ability to control the name explicitly. And if I jump back here to the parser in the webpack code base, I can see noSources is an option. So I could add that, and I think that's good. I don't think I want to set any of the rest of these. So I can come back then, save that, and actually let's comment this out quick, just so we can see that we don't have source maps. Now back over in the browser, I'll try and pull down the map file. Let me see. There is no map file available. Refresh our app, no sourceMappingURL, and in the output we don't have a map file generated. So now if I add in just the plugin, not the devtool option, refresh here, app still loads. We've got our sourceMappingURL, which has the new name for our source map file. So this means our source maps are working, and it's actually a good thing that we have this here. Now we have the new file name, main.map. Paste that in. There's our new source map. And over in the terminal, you can see the main.map that we're creating. So now I have complete control from this plugin that I've registered. For example, if I actually set noSources to true, true will then disable sending the sources. My mind doesn't do so well with the double negatives. Anyways, we can tweak this one setting now by passing a different option to this plugin. Then let's come over to the browser, and it's obvious to me that on the last refresh here with main.map, we have these sources included because the file is too big for my JSON viewer to parse it and make it look pretty. Now that we've inverted this, when I refresh, there we go. 
We have pretty JSON, which means at the end of this file there shouldn't be any source code. So there you have it. We can use a plugin instead of the devtool configuration option. And this underscores a very crucial aspect of understanding webpack: our config file is parsed into settings that are passed to plugins, which have the logic of the webpack compiler and all the various tools that we integrate. Specifically with regards to this plugin, if you like this approach, if you want the flexibility, come out to the docs for webpack and take a look at the options that you can pass to this plugin. All right, now that we've talked a lot about source maps, let's move on to the last module of this course, and let's talk about generating code. This is going to involve creating another loader, and it's a nice exercise to wrap up with to solidify the concepts that we've covered in this course.
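The plugin-based setup just described might be sketched like this; the filename is the one used in the demo, while the surrounding structure is assumed:

```javascript
// webpack.config.js (sketch)
const webpack = require("webpack");

module.exports = {
  // No devtool option; the plugin is registered directly instead.
  plugins: [
    new webpack.SourceMapDevToolPlugin({
      filename: "main.map", // explicit control of the source map file name
      noSources: true,      // drop sourcesContent, keeping only the mappings
    }),
  ],
};
```

Flipping noSources between true and false is the plugin-level equivalent of switching between the nosources-source-map and source-map devtool strings.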
Challenge: Building a codegen-loader to Capture Build Information
Explanation of the Starting Point to My Solution
So if you like the idea of this challenge, but you want a little leg up because it seems somewhat vague or it's just a lot of work, then I'll give you the parts here that I don't think are as pertinent to really understanding webpack. So basically, I'm going to give you everything minus the loader implementation that I would like you to write. Now before I do that, I want to point out some of the things I did in my solution before I started solving this particular problem, to change the code base after the last module where we talked about source maps. So first up, I went ahead and disabled all the source maps. I don't need them at this point in time, so hey, let's speed up the process of running webpack. Next up, I removed the babel-loader in the development environment. I didn't think that was necessary. Again, I'm using a modern browser. I don't need to be transforming my source code. I then added back the NamedModulesPlugin. I had commented that out to show you the value of source maps earlier. And then I moved the babel-loader config file that we had separated out into a configs directory. I'm thinking of this as a location for partial configuration files, and then I just updated the webpack config to read that file instead of reading the old babel-loader file. So I basically just renamed this file and put it in a folder called configs. And then the last thing here, I extracted the dev server configuration into a local variable, mostly for readability purposes. You can now see we have this nice list of the baseConfig and the devServerConfig being merged together. And then there are two commits that I want you to take a look at for a little bit of help. In the first setup exercise commit, I simply create a style and display the build information. So this is just the view for what we saw over in the application to show this build information. 
So you can stop right here if that's enough help, but if you want a little more help, in the second one I then show how you can import that build information into another module. I then take that build information and stick it on the scope so that we can bind it to the view that you just saw in the previous commit. I broke this commit out separately because this import is essential to understanding how webpack is working and how loaders work with webpack. Without this understanding, it wouldn't make sense to import this buildInformation.gen.js file. It would look like we're importing the compile-time code at runtime. So I'm giving you an opportunity to stop now if you think I'm giving too much away in this second bit of help here. That's why I split out the commit. So leave now if you want a little bit more of a challenge with this problem. Okay, so down below, here is the compile-time code to generate our runtime code. You can see here I'm reading the latest Git commit with the git rev-parse command. Out of that, I grab the commit, and then I also grab the current time. Down below then, I build up code with those two pieces of information hard coded in it. So this is my generated code right here, and I return that back. Now resource-wise, don't forget that you have the loader API available to you, and you will need to use this to be able to create this generic codegen-loader. I will stress to you, you do not need to make this a perfect loader. There is a val-loader that already exists that you can use in your production environment. Just take a stab at this and get something that works for our specific scenario. Now notably, there's one special function you will eventually realize that you need, and that's a function to be able to load and execute some code. And there is an old this.exec function. It is deprecated at this point in time, so don't use it in your production code, but you can use it for this exercise. 
If you are interested in the replacement or suggested replacement, you can see this comment, or you can take a look at my solution. And at the bottom of my codegen-loader is a chunk of code that explains how you can get around this deprecated function and implement the functionality of it yourself. And this is from this link right here that actually was just referenced over on that Loader API page.
All right, time to go over my solution to this challenge. And I first want to say, there's really no right or wrong way to do this. As with any coding exercise, there are many ways to get to the right final destination, so don't assume what I have here is how things should be. In fact, as I mentioned, I really don't think you needed to go so far as having a production-quality loader here. Remember, we have the val-loader, an official loader that's a part of webpack-contrib, which provides this functionality. So if you'd like to know what a more robust solution looks like, come up and take a look at that. All right, so first up, I want to look at the code that reads the Git commit off disk and returns that commit information back. This buildInformation.gen.js file is inside of the application source code in the klondike folder. Let's run this quick, just standalone. So I will switch into the klondike folder, and you can see the buildInformation script. And in this case, I'll run the node command with the interactive REPL. And don't get too caught up in the details of how I'm executing this module. I'm going to require that buildInformation.gen.js. It just so happens that that module returns back a function, which generates the code, so I'll execute that function. That function is asynchronous, so it returns a promise. So when it's done, in then, I'll take the result here, which is our generated module, and I'll go ahead and print that out. There you go. You can see the Promise that was created, but most importantly down here we have our object that comes back once the code generation is complete. And that should look like an object that has a property called code on it, with the code inside of that string. I could just change this to log just the code, and that'll be a bit easier to see, and there you go. You can see our two lines of code are generated. 
So I just wanted to run this standalone because I think that helps you understand that I could run this script at any point in time. Here I am running it on my computer without even being inside of the context of the webpack compiler. And just keep in mind that my generic codegen-loader needs to do what I just did here at the command line when webpack runs the compiler. So that's the whole purpose of this codegen-loader. The code that's generated here is our runtime code. We can then import both of these constants into any other part of our application and have our commit and the date or time of compilation. All right, next up I want to explain the codegen-loader itself. By setting aside exceptions, problems, and edge cases, I was able to simplify this down to about five lines of code. So first up, I am exporting my loader function here from the loader module. I used an async function, so I can use await. This loader function receives the contents of the module that was loaded. I named this compileTimeModule to get across the point that this parameter to the loader is going to contain this code right here, as a string. First up, I'm just aliasing this and giving it a more meaningful name of loaderContext, because that's what it is in this context. Then comes the fun part. I use that loadModule function that I mentioned down below, I pass to it our compileTimeModule, so basically that string with this code inside of it, and I pass the loaderContext. It needs those two things to load this module, to take it from a string and turn it into an actual module, and in this case to give me back the exports of that module, the exports being this function here that can return my generated code when it's executed. So this function is what's being exported here, and that's what I'm getting right back here. So this codeGenerator is just this captureBuildInformation function. Maybe I'll split the screen here. 
Once I get that function over here, basically, imagine this captureBuildInformation function is now available right here, and now let that sink in for a second. What does this loadModule function look like? And let me preface that by saying this should look familiar from Node.js development. So what does this look like? What if I do this? How about I call require instead of loadModule? I've hard coded the module we are working with, but that could be parameterized. In many ways, loadModule is exactly like Node.js's require function. I guess that would be a question to you then. Why don't I just call Node.js's require function? Well, keep in mind that webpack has a series of loaders, and as crazy as it might sound, we could put a loader before our codegen-loader. There could be three or four of them, in which case we can't just reload the file from disk; we have to use the source code that was passed to us, which might already have been transformed by something like Babel or another loader. So this loadModule is simply loading a module that we have in memory. So we get back our generator function, which is this captureBuildInformation function here. I need to execute it then, which kicks off the code generation process, hence the variable name here. And then I just await the completion of codeGeneration, after which, well, I will get back my generatedRuntimeModule. And you can see that down here if you scroll down into the captureBuildInformation function. When this is done, it returns to me an object that has the generated code on it. So I then take that object and return back the code. So if I wanted more generator modules, like I said, we could generate fake information for our application, maybe a series of simulated stock prices. 
Well, I could use this exact same return contract here and return back any type of code inside of here, and then this codegen-loader would understand how to execute that code at compile time and return the runtime code back to webpack. So that's what we're doing here: returning that code to webpack. Now beyond these two pieces, we just need a little bit of glue to put this all together. Inside of my webpack configuration, I'm importing a new codeGenConfig, so this is another partial configuration object, and I'm adding that, in this case, just to my development configuration. So I'm merging it with the rest of my development configuration. If I take a look at that file, it's up inside of the configs folder next to the Babel config that I extracted out. And inside of here I'm just registering my codegen-loader. I'm saying, hey, codegen-loader should run on anything that ends in .gen.js. I also set up a resolveLoader alias here and point at the file where I have my codegen-loader. This wouldn't be necessary if you were using, say, the val-loader; you would just install that npm package and webpack would discover it in your node_modules folder. But in this case, I have to point webpack at the location of my loader, just like we did with our custom tee-loader. So this adds our loader to the pipeline, and then the last piece of glue: the application needs to request the buildInformation. So that's where I had those pre-canned pieces. The board itself has some code here to display the two bits of information, and then behind the scenes we are importing that buildInformation from the buildInformation generator. At runtime, this will receive back a module with our generated code. So it's almost as if that compile-time information is available at runtime. I assign that to the scope here, and then it's available on the view to bind. At the end of the day, we get our nice information here at the bottom of our application. 
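The glue just described might look roughly like this partial configuration; the file locations and alias target are assumptions about the project layout:

```javascript
// configs/codegen.js (sketch) -- a partial config merged into
// the development configuration.
const path = require("path");

module.exports = {
  module: {
    rules: [
      // Run the codegen-loader on anything that ends in .gen.js:
      { test: /\.gen\.js$/, use: "codegen-loader" },
    ],
  },
  resolveLoader: {
    alias: {
      // Needed because this loader isn't an installed npm package that
      // webpack could discover in node_modules (unlike val-loader):
      "codegen-loader": path.resolve(__dirname, "../loaders/codegen-loader.js"),
    },
  },
};
```

With this merged in, any `import { commit, builtAt } from "./buildInformation.gen.js"` in application code receives the generated runtime module instead of the compile-time source.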
If I come over to the terminal and recompile the application, refresh here, then the time updates. And of course, if I changed my commit, that would update as well.
We have now reached the end of the course. Before I go, I want to encourage you to spend some more time learning about webpack beyond what I've covered in this course. Keep an eye out for the subsequent courses in this series. I will build upon where we left off in this course and take us a notch up and a notch up, until we get to a final destination where we can see all the powerful things that webpack can help us do with our applications. Also, keep in mind, there's some pretty good documentation available. The GitHub repository for the course is open source and available for you to read. The release notes are great if you want to keep up to date with what's coming. And if you would like to get a hold of me, I have a website here with a blog. You can reach out and contact me, and if you'd like you can subscribe to my newsletter and see what else it is that I write about.
Wes Higbee is passionate about helping companies achieve remarkable results with technology and software. He’s had extensive experience developing software and working with teams to improve how...
Released 30 Jan 2018