A faster web with Resource Packages – Mozilla suggestion to have just one HTTP request

One of the most common problems on the web is slow web sites, wasting the time of end users. Now, perhaps, Mozilla has come up with a solution to this, one that could be applicable to all web browser vendors.

Background

One of the main things slowing web sites down is the number of HTTP requests, i.e. the number of times a separate file needs to be retrieved from the server; e.g. images, CSS and JavaScript files. Depending on the web browser, you can only have between 2 and 8 concurrent HTTP requests. There can also be delays between requests, depending on the file type asked for, redundant header information being sent, etc.

You can always combine all CSS files into one and all JavaScript files into one through a clever deployment script, but that still leaves a couple of requests. For images you can use CSS sprites, but those can affect memory use, and there's still no solution for the inline images in a web page.

So, what if we could combine all HTTP requests into one?

A picture of someone surfing a wave

Picture from standing on the break on Flickr

Resource Packages

Alexander Limi of Mozilla has been working on a solution to this, described in detail in Making browsers faster: Resource Packages. The idea is to be able to combine all HTTP requests into one, and have this implemented in all web browsers, and to have it backwards compatible.

Sounds too good to be true, right? :-)

Implementation

The idea is to use the ZIP format, which is supported on all platforms, and to package all resources in a web page into one ZIP file. You then instruct the web browser, via a link element, to download that ZIP package of resources.

	<link rel="resource-package" type="application/zip" 
		href="site-resources.zip">

Note that the type attribute is not needed in HTML5. Also worth mentioning is that the files in the resource package take precedence over the same files included in the page through other elements (e.g. the src attribute on img and script elements). Paths to the files are resolved relative to where the resource package itself sits in the hierarchy.

You can also complement it with a manifest file that lists all the files in the resource package, and use it in conjunction with Offline Resources in HTML5.

	javascript/jquery.js
	styles/reset.css
	styles/grid.css
	styles/main.css
	images/save.png
	images/info.png

To make this work for any web site you build, you could either manually, or through a deploy script, zip all the necessary files together into your resource package. And when it comes to inline images in a web page, you could actually let the server zip them on the fly the first time that page is requested – it will be some work for the server, but the performance gain of just one HTTP request is very likely to make it worth it.
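Such a deploy step can be sketched in a few lines of Python (the file names below are illustrative, echoing the manifest above; they are not part of the proposal):

```python
# Sketch of a deploy-time step that bundles a site's static resources
# into a single ZIP "resource package". File names are illustrative.
import zipfile

def build_resource_package(archive_name, resources):
    # ZIP_DEFLATED compresses each entry; the paths are stored
    # relative, which is what the proposal expects when resolving
    # files against the package's location in the hierarchy.
    with zipfile.ZipFile(archive_name, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in resources:
            zf.write(path, arcname=path)

# Example:
# build_resource_package("site-resources.zip",
#     ["javascript/jquery.js", "styles/reset.css", "styles/main.css"])
```

Running this once per deploy keeps the package in sync with the files it replaces, since the archive paths match the URLs referenced in the page.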

A picture of a running cheetah

Picture from Mast Farm Picnic – 019 on Flickr

Web browser implementations

What web browser vendors need to do is implement support for a link element to include a resource package, unpack it and then use those files instead of those referenced at other places in a web page. Older web browsers lacking support will just ignore this and work as they always have.

What is really exciting is that this is already slated to be implemented in Firefox 3.7!

And what is also very encouraging is that Mozilla has been sending this proposal out to other web browser vendors, so everyone can offer it in their respective web browser (the beautiful thing about an organization such as Mozilla :-) ). They have been in talks with performance guru Steve Souders, Alex Russell of Google and Ben Galbraith & Dion Almaer of Palm to get good feedback from an implementor's perspective.

Perhaps the web will actually be fast(er) in the future!


15 Comments

  • Gerben says:

    So now you have to wait for the whole zip to load before being able to view the page.

    Very bad when the site has a 1mb decorative background image on the body.

    Or even worse when a bad deploy script just adds all the images that are in the images folder.

    I think persistent http connections are a far better alternative.

  • Robert Nyman says:

    Gerben,

    Well, first, having a 1MB background image doesn't sound good for performance no matter the approach. But I do acknowledge the point, and I guess something visually important could be included first in the package to give the feeling of a fast load.

    However, I guess what would make this really powerful is allowing multiple link elements of this kind, to first include the visual vitals for perceived performance, and then load the rest.

  • Milo says:

    Why not use tar instead of zip? This would allow the client to use the contents of the tar as it's downloaded (since a tar is just a concatenation of files), rather than having to wait for the whole zip to be downloaded before it can be decompressed.

    And if that doesn't work out, standard compression via the Accept-Encoding/Content-Encoding headers seems like a better idea than mandating zip.

  • Milo says:

    I guess I should read the spec before commenting next time. The "use the contents as it's downloaded" part is accounted for.

    I'd still suggest tar + optional Encoding per headers just from a purity standpoint, although I'd assume that all browsers already support zip files anyway.
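Milo's streaming point can be illustrated with Python's tarfile module: opened in stream mode, a tar archive is consumed member by member, so each file could in principle be used as soon as its bytes have arrived (a sketch of the idea in the comments above, not part of the actual proposal):

```python
# Illustration of why tar suits streaming: members can be read
# sequentially without the whole archive in hand.
import io
import tarfile

def build_tar(files):
    # Build an uncompressed tar in memory from {name: bytes} pairs.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tf:
        for name, data in files.items():
            info = tarfile.TarInfo(name)
            info.size = len(data)
            tf.addfile(info, io.BytesIO(data))
    return buf.getvalue()

def stream_members(raw):
    # mode="r|" reads strictly sequentially; in a browser, "raw" could
    # be the network stream, with each file usable as it completes.
    with tarfile.open(fileobj=io.BytesIO(raw), mode="r|") as tf:
        for member in tf:
            yield member.name, tf.extractfile(member).read()
```

A ZIP, by contrast, keeps its central directory at the end of the file, which is why naive readers wait for the full download.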

  • Jeena says:

    Hm, but what about the cache? The second time you load the page your browser already has all the images, scripts and CSS – will it download the zip file one more time? That would waste a lot of bandwidth.

    And by the way, it sounds like it is related to SPDY: An experimental protocol for a faster web

  • Robert Nyman says:

    Milo,

    I guess zip is just a safer way that will more easily work in all web browsers on all platforms, but it's an interesting idea.

    Jeena,

    The Expires headers/Etags you set for the ZIP file will apply to all files within it, i.e. nothing needs to be re-downloaded.

    Alexander Limi's comment about SPDY (from http://limi.net/articles/resource-packages/):

        While this effort from Google aims to make everything faster, it is largely orthogonal to what we’re trying to do with Resource Package. It also requires you to retrofit both web browsers and web servers to make it work, which means it will take quite a while before this will be in common use. Resource Packages work without any changes to the web server software, and will work as soon as any browser supports it — with no adverse effects to the browsers that don’t.

  • PJ says:

    Couldn't they just use the same manifest file that HTML5 / Google Gears uses for offline storage, and then just request that the server combine and optionally compress them … seems like that would make the most sense to me

    http://www.w3.org/TR/html5/offline.html

    https://developer.mozilla.org/en/Offline_resource

    http://code.google.com/apis/gears/tutorial.html

  • Robert Nyman says:

    PJ,

    It's a nice idea, although that requires that the server knows how to handle it, as opposed to Resource Packages, which will, well, mostly work out of the box for any web site on any server.

  • [...] Resource Packages, a simple solution to an old problem. Best part of it all: it’s landing in Firefox 3.7 Spread the word! [...]

  • Jason says:

    Even if this is implemented in all browsers you would still have the problem of legacy browsers not supporting it.

    I don't see any method of saying "Get the package" / "Get the files separately".

    It's a nice idea for the future. I just don't see much use in the present.

  • Robert Nyman says:

    Jason,

    All the files are referred to in the HTML and CSS file as normal. The packaging is additional, on top of that. If the web browser doesn't recognize the <code>link</code> element, it will include the resources as normal.

  • Corey says:

    What about multipart HTTP responses? Same as email attachments. Most browsers support them just fine, and they're a lot more flexible than forcing archiving and compression.
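Corey's idea can be sketched with Python's standard email package, which builds the same MIME multipart container that email attachments use (purely illustrative; nothing here reflects how any browser or the proposal actually behaves):

```python
# Sketch of the multipart idea from the comment above: pack several
# resources into one multipart/related body, as with email attachments.
from email.message import EmailMessage

def bundle(parts):
    # parts: dict of {filename: bytes}. add_related() turns the message
    # into multipart/related and appends each file as a sub-part.
    msg = EmailMessage()
    for name, data in parts.items():
        msg.add_related(data, maintype="application",
                        subtype="octet-stream", filename=name)
    return msg.as_bytes()
```

One body, one response, and each part keeps its own filename and content type, which is the flexibility Corey is pointing at.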

  • Robert Nyman says:

    Corey,

    Might be an idea. Please contact Alexander Limi, as mentioned in the article, and see what he thinks.

  • David T. says:

    We already have pipelining and compression, which seems to be the win-win-win solution (don't have to wait for everything to download before seeing things, but don't have to establish a new connection for each resource, and the data is compressed.)

    Firefox doesn't enable pipelining by default, but you can do it yourself.

  • Robert Nyman says:

    David,

    It's an option, but what we are looking for here is a way for us, as web site developers, to offer this to all of our visitors.
