As you develop HTML apps, one of the issues you’ll face is that your application doesn’t come to the browser in one fell swoop. A typical web page receives content from a number of sources. Below you can see the first run of requests from a site (in this case MSNBC.com) as shown in Firebug:
While reducing the number and size of these requests is laudable, you will also want to take the browser cache into account. In the image above, you can see that some of the assets (e.g. jquery-1.5.2.min.js) returned a status of “304 Not Modified”. This status means that the browser found the latest version of this asset in its cache and didn’t need to download a new one (as it hasn’t changed…or was “NOT MODIFIED” from its current version).
For me, this meant that I wanted two things from packaging of assets:

- Keep the common libraries (e.g. jQuery, jQuery UI and plugins) as separate files so they benefit from the browser cache; and
- Merge (and minify) my own scripts to reduce the number and size of requests.
What Do I Mean by Packaging?
Packaging Style Sheets
While most solutions for packaging assets take style sheets into account, I decided that the dynamic style sheet languages (LESS and SASS) do this adequately. (If you haven’t read my post on using dynamic style sheet languages, see it here.)
While using the @import declaration will merge style sheets, you may also want to minify the style sheets too. In the case of dotless (which I am using for delivering my LESS files), you can use the configuration to turn on minimizing of the style sheets:
<dotless minifyCss="false" cache="true" web="false" />
The minifyCss property can be turned on (usually in my Web.Release.Config file) so that the output of your LESS files is minified to decrease its size.
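As a quick illustration (the file names here are hypothetical), a single entry-point LESS file can pull its pieces together with @import, and dotless serves the merged result, minifying it when minifyCss is enabled:

```less
// site.less: dotless serves this as a single style sheet
@import "reset.less";       // merged into the output
@import "typography.less";  // merged into the output

@highlight: #ffcc00;
.banner { color: @highlight; }
```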
If you read the docs for SquishIt, Justin shows it used directly in your Razor code like so:
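A sketch of what that Razor code looks like, based on SquishIt’s Bundle API (the script names and output path here are illustrative):

```csharp
@* In a Razor view: build and render a SquishIt bundle inline *@
@Html.Raw(
    SquishIt.Framework.Bundle.JavaScript()
        .Add("~/js/jquery-1.5.2.min.js")
        .Add("~/js/site.js")
        .Render("~/js/combined.js"))
```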
The idea is that you add individual scripts (or whole directories) into a bundle of related scripts. When Render is called, it determines whether to package up all the scripts into a single script (named combined.js in this case) or to leave them as separate scripts.
I find that managing this kind of code in the markup makes it harder to maintain, so I decided to do it as extension methods (of HtmlHelper):
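A minimal sketch of what such an extension method might look like (the method name PackageLibs matches the call in the layout; the specific script paths are assumptions):

```csharp
using System.Web.Mvc;
using SquishIt.Framework;

public static class PackagingExtensions
{
    // Renders the script tags for the shared libraries from one place
    public static MvcHtmlString PackageLibs(this HtmlHelper helper)
    {
        var result = Bundle.JavaScript()
            .Add("~/js/jquery-1.5.2.min.js")
            .Add("~/js/jquery-ui-1.8.11.min.js")
            .Render("~/js/libs_#.js");  // "#" becomes a content hash for cache-busting
        return MvcHtmlString.Create(result);
    }
}
```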
To get this into my Razor files, I simply call the HtmlHelper extension method:
<!-- _Layout.cshtml -->
<!DOCTYPE html>
<html>
  <head>
    ...
    @Html.PackageLibs()
  </head>
  <body>
    @RenderBody()
  </body>
</html>
This works fine…except that I am packaging the minimized versions of the libraries in all cases. This will make debugging more difficult, so it would be better if I could do this differently in debug or release builds. Easy:
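One way to sketch that (the IsDebuggingEnabled check is a common approach; the file names are assumptions):

```csharp
using System.Web;
using System.Web.Mvc;
using SquishIt.Framework;

public static class PackagingExtensions
{
    public static MvcHtmlString PackageLibs(this HtmlHelper helper)
    {
        var bundle = Bundle.JavaScript();
        if (HttpContext.Current.IsDebuggingEnabled)
        {
            // readable, un-minified versions while debugging
            bundle.Add("~/js/jquery-1.5.2.js");
        }
        else
        {
            // minified versions for release
            bundle.Add("~/js/jquery-1.5.2.min.js");
        }
        return MvcHtmlString.Create(bundle.Render("~/js/libs_#.js"));
    }
}
```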
This is better as we’re not using the minified versions, but in release mode, I don’t want to use my local versions; I want to rely on a content delivery network for very common scripts (mostly jQuery and jQuery UI). SquishIt lets me do this by adding a CDN link (with the local link as a backup) using the AddRemote method:
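The AddRemote call takes the local path plus the remote URL, something like this (the Google CDN URL is one common choice):

```csharp
var result = Bundle.JavaScript()
    // first argument is the local fallback, second is the CDN copy
    .AddRemote("~/js/jquery-1.5.2.min.js",
               "http://ajax.googleapis.com/ajax/libs/jquery/1.5.2/jquery.min.js")
    .Render("~/js/libs_#.js");
```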
Pretty clean so far. And as I use plugins, I’ll just add them here so all my plugins work. But one issue for me is that the Render method uses the debug flag in the web.config to determine whether it merges all the scripts into a single file. For my needs, I want the libraries to *always* be separate. To accomplish this, the SquishIt framework allows you to call ForceDebug before Render:
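With ForceDebug in the chain, the bundle always renders one script tag per file, whatever the web.config debug flag says (a sketch; the plugin file name is hypothetical):

```csharp
var result = Bundle.JavaScript()
    .AddRemote("~/js/jquery-1.5.2.min.js",
               "http://ajax.googleapis.com/ajax/libs/jquery/1.5.2/jquery.min.js")
    .Add("~/js/jquery.myplugin.js")  // hypothetical plugin
    .ForceDebug()                    // never merge the library scripts
    .Render("~/js/libs_#.js");
```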
The way this works is crucial for me, as I want to keep my library scripts (e.g. jQuery, jQuery UI and plugins) separate so I can gain the benefit of the browser cache as much as possible. When I create a bundle of my own scripts, I go ahead and let it bundle them into a single file. In fact, for my own scripts, instead of adding the scripts one by one, I add all the scripts in the js directory:
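A sketch of that bundle, passing the directory to Add and using ForceRelease so my own scripts are always merged into one file (directory path and output name follow the text; treat the exact chain as illustrative):

```csharp
var result = Bundle.JavaScript()
    .Add("~/js")                        // add every script in the js directory
    .ForceRelease()                     // always merge, even with debug="true"
    .Render("~/js/ModernWebDev_#.js");  // "#" becomes a content hash
```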
Be aware that merging the scripts (e.g. with ForceRelease) will actually generate the ModernWebDev_#.js file in the file system. Because the code is building a list of scripts dynamically, we need to ensure that the generated file isn’t included in our list of scripts (as it would then become recursive and the same methods would be defined twice).
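One way to build that list while excluding the generated file is sketched below (this would live inside the extension method; the ModernWebDev_ prefix check mirrors the output name in the text, and the exact shape is an assumption):

```csharp
using System.IO;
using System.Web;
using SquishIt.Framework;

// Build the bundle file-by-file so the previously generated
// ModernWebDev_*.js output never gets re-added to the bundle.
var jsPath = HttpContext.Current.Server.MapPath("~/js");
var bundle = Bundle.JavaScript();
foreach (var file in Directory.GetFiles(jsPath, "*.js"))
{
    var name = Path.GetFileName(file);
    if (!name.StartsWith("ModernWebDev_"))  // skip the merged output
    {
        bundle.Add("~/js/" + name);
    }
}
var result = bundle.Render("~/js/ModernWebDev_#.js");
```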
You may notice that since I am building all my scripts into one file, every script will be loaded on the first page and all subsequent pages. There isn’t a right or wrong answer here. I made the decision that loading all of it and keeping it in the browser cache was easier and faster than segmenting it. But my site isn’t enormous; when you start working with very large applications, you will find the need to break it up into modules (e.g. multiple different merged scripts, probably split along directories is how I’d do it).
What about ASP.NET 4’s Web Optimization Stuff?
As you may know, Microsoft (as of the writing of this article) has just released the Beta of ASP.NET MVC 4, and that includes a new stack for optimizing assets called System.Web.Optimization. There is an article by Scott Guthrie that covers the basics (though it’s a little out of date):
I considered moving to this as it’s pretty slick and pluggable, but it was missing some key features for me:
Because of these issues, I am sticking with SquishIt (and following the adage that “old code is good code”). I am sure System.Web.Optimization will improve (either by people writing plugins for it or MS fixing it) to address these issues. For many projects that don’t need the level of control I wanted, it is a great solution.
You can see it work in action with the latest version of this example: