Captain Codeman

Polymer 2.0 Build with HTTP/2 Server-Push

High performance without JavaScript bundling


Introduction

One of the really great things about Polymer 2 beyond the framework itself is the well thought-out tooling with the polymer-cli helping you with project templates, local dev-serving, testing, linting, build and packaging.

The only thing it doesn’t do for me right out of the box is create the information needed to add http/2 server-push. It will actually generate link prefetch tags that go into the index.html page, which does speed things up, but what I really wanted was to push the dependencies for a view with the very first request.

It turned out not to be too difficult at all …

Here are the results from the Lighthouse PWA testing tool to show where I got to:

[Lighthouse results: Progressive Web App score and page-load performance]

This is how:

The first step was to store a mapping of fragment to URL route. If you don’t know, Polymer helps you follow the PRPL pattern which splits the app into the shell, shared dependencies and fragments for views. Unlike with other frameworks, this makes it easy to only load the files needed for the first page view, with a service-worker then taking care of pre-caching the other views for instant-loading.

If you’re using http/2 you don’t always need the bundling - that can effectively be done on-the-fly by the server. There are also new loading options that you can use in the browser; check out this information-rich article all about Preload & Prefetch.

What I wanted to do was generate the list of files to push for a given view and do it with link http headers (taking advantage of Google AppEngine’s built-in support for http/2). Each view would need to push any dependencies that were unique to it as well as any shared ones for the shell.

It turned out there was already a spec for defining an http/2 push manifest which it made sense to use, and I added my own Go middleware to serve the headers from it.

Thankfully, the polymer-cli and polymer-build projects provided all the project analysis needed, delivering a neatly packaged set of dependencies to build the file from, which made the whole thing easy.

Oh, one last thing to add? I wanted to use “differential serving” where newer ES6 capable browsers got a newer-language version of the site and ye-olde-laggard browsers could still be supported with an ES5 version. The easiest way to do this was to split the build into two separate folders and switch the main app-shell reference in index.html for each. It also meant that two service worker files were needed to handle the different set of dependencies.

So, given a polymer.json builds definition of:

"builds": [
  {
    "name": "es6",
    "bundle": false,
    "js": { "compile": false, "minify": true },
    "css": { "minify": true },
    "html": { "minify": true },
    "addServiceWorker": true,
    "insertPrefetchLinks": false
  },
  {
    "name": "es5",
    "bundle": false,
    "js": { "compile": true, "minify": true },
    "css": { "minify": true },
    "html": { "minify": true },
    "addServiceWorker": true,
    "insertPrefetchLinks": false
  }
]

It would generate the following output:

build\
  es5\
    bower_dependencies\
    src\
    index.html
    service-worker.js
  es6\
    bower_dependencies\
    src\
    index.html
    service-worker.js

But I wasn’t serving either the es5 or es6 folder as the root; I wanted to serve the build folder as the static root. This means that the index.html and service-worker.js files have to be promoted up a level and their references altered to give:

build\
  es5\
    bower_dependencies\
    src\
  es6\
    bower_dependencies\
    src\
  es5-push-manifest.json
  es5-service-worker.js
  es5-index.html
  es6-push-manifest.json
  es6-service-worker.js
  es6-index.html

The files in the root of the build folder all reference their appropriate es5 / es6 dependencies.

Here’s the script I came up with to build it:

'use strict';

const fs = require('fs');
const del = require('del');
const gulp = require('gulp');
const gulpif = require('gulp-if');
const gulpdebug = require('gulp-debug');
const gulpuglify = require('gulp-uglify');
const gulpfilter = require('gulp-filter');
const mergeStream = require('merge-stream');
const polymerBuild = require('polymer-build');

// Here we add tools that will be used to process our source files.
const imagemin = require('gulp-imagemin');

// Additional plugins can be used to optimize your source files after splitting.
// Before using each plugin, install with `npm i --save-dev <package-name>`
const babel = require('gulp-babel');
const babelPresetES2015 = require('babel-preset-es2015');
const babiliPreset = require('babel-preset-babili');
const babelPresetES2015NoModules = babelPresetES2015.buildPreset({}, {modules: false});

const cssSlam = require('css-slam').gulp;
const htmlMinifier = require('gulp-html-minifier');

const swPrecache = require('sw-precache');
const swPrecacheConfig = require('./sw-precache-config.js');
const polymerJson = require('./polymer.json');
const buildRoot = 'build';
const debug = false;

// Waits for the given ReadableStream
function waitFor(stream) {
  return new Promise((resolve, reject) => {
    stream.on('end', resolve);
    stream.on('error', reject);
  });
}

function build(prefix, compile) {
  return function() {
    return new Promise((resolve, reject) => {

      let polymerProject = new polymerBuild.PolymerProject(polymerJson);
      let sourcesStreamSplitter = new polymerBuild.HtmlSplitter();
      let dependenciesStreamSplitter = new polymerBuild.HtmlSplitter();

      let buildDirectory = `${buildRoot}/${prefix}`;
      console.log(`Deleting ${buildDirectory} directory...`);
      del([buildDirectory])
        .then(() => {

          // sources
          let sourceUntouchables = gulpfilter(['**', '!**/*.min.js', '!**/webcomponentsjs/*'], {restore: true, passthrough: true});
          let sourcesStream = polymerProject.sources()
            .pipe(sourceUntouchables)
            .pipe(gulpif(/\.(png|gif|jpg|svg)$/, imagemin()))
            .pipe(sourcesStreamSplitter.split());

          if (compile) {
            sourcesStream = sourcesStream.pipe(gulpif(/\.js$/, babel({ presets: [babelPresetES2015NoModules] })));
          }

          sourcesStream = sourcesStream
            .pipe(gulpif(/\.js$/, babel({ presets: [babiliPreset] })))
            .pipe(gulpif(/\.css$/, cssSlam({stripWhitespace: true})))
            .pipe(gulpif(/\.html$/, cssSlam({stripWhitespace: true})))
            .pipe(gulpif(/\.html$/, htmlMinifier({collapseWhitespace: true, removeComments: true})))
            .pipe(sourcesStreamSplitter.rejoin())
            .pipe(sourceUntouchables.restore)
            .pipe(gulpif(debug, gulpdebug({title: 'source:'})));

          // dependencies
          let dependenciesUntouchables = gulpfilter(['**', '!**/webcomponentsjs/*'], {restore: true, passthrough: true});
          let dependenciesStream = polymerProject.dependencies()
            .pipe(dependenciesUntouchables)
            // .pipe(gulpif(/\.(png|gif|jpg|svg)$/, imagemin()))
            .pipe(dependenciesStreamSplitter.split());

          if (compile) {
            dependenciesStream = dependenciesStream.pipe(gulpif(/\.js$/, babel({ presets: [babelPresetES2015NoModules] })));
          }

          dependenciesStream = dependenciesStream
            .pipe(gulpif(/\.js$/, babel({ presets: [babiliPreset] })))
            .pipe(gulpif(/\.css$/, cssSlam({stripWhitespace: true})))
            .pipe(gulpif(/\.html$/, cssSlam({stripWhitespace: true})))
            .pipe(gulpif(/\.html$/, htmlMinifier({collapseWhitespace: true, removeComments: true})))
            .pipe(dependenciesStreamSplitter.rejoin())
            .pipe(dependenciesUntouchables.restore)
            .pipe(gulpif(debug, gulpdebug({title: 'dependency:'})));


          let buildStream = mergeStream(sourcesStream, dependenciesStream)
            .once('data', () => {
              console.log(`Creating ${prefix} build files...`);
            });

          // we're not bundling - http/2 server-push manifests FTW!
          // buildStream = buildStream.pipe(polymerProject.bundler());

          // Okay, time to pipe to the build directory
          buildStream = buildStream.pipe(gulp.dest(buildDirectory));

          // waitFor the buildStream to complete
          return waitFor(buildStream);
        })
        .then(() => {
          console.log(`Analyzing ${prefix} build dependencies...`);
          return polymerProject.analyzer.analyzeDependencies;
        })
        .then((depsIndex) => {
          console.log(`Generating ${prefix} push manifest...`);

          var pushManifest = {};
          var shellDeps = depsIndex.fragmentToFullDeps.get(polymerProject.config.shell);
          var shellKey = polymerProject.config.shell.substring(polymerProject.config.root.length + 1);

          // every path needs the shell
          var shellPush = { [`/${prefix}/${shellKey}`]: { "type": "document", "weight": 1 }};
          shellPush = Object.assign(shellPush, ...shellDeps.imports.map(s => ({[`/${prefix}/${s}`]: { "type": "document", "weight": 1 }})));
          shellPush = Object.assign(shellPush, ...shellDeps.scripts.map(s => ({[`/${prefix}/${s}`]: { "type": "script", "weight": 1 } })));
          shellPush = Object.assign(shellPush, ...shellDeps.styles.map(s => ({[`/${prefix}/${s}`]: { "type": "style", "weight": 1 } })));

          // then for every fragment ...
          depsIndex.fragmentToFullDeps.forEach((depImports, key) => {
            if (key.startsWith(polymerProject.config.root)) {
              key = key.substring(polymerProject.config.root.length + 1);
            }

            // get the route path ...
            let route = polymerJson.fragmentRoutes[key];
            if (!route) return;

            // add the shell dependencies ...
            var routePush = Object.assign({}, shellPush);

            // and the route-specific dependencies
            routePush = Object.assign(routePush, ...depImports.imports.map(s => ({[`/${prefix}/${s}`]: { "type": "document", "weight": 1 }})));
            routePush = Object.assign(routePush, ...depImports.scripts.map(s => ({[`/${prefix}/${s}`]: { "type": "script", "weight": 1 }})));
            routePush = Object.assign(routePush, ...depImports.styles.map(s => ({[`/${prefix}/${s}`]: { "type": "style", "weight": 1 }})));

            pushManifest[route] = routePush;
          });

          fs.writeFileSync(`${buildRoot}/${prefix}-push-manifest.json`, JSON.stringify(pushManifest, null, '  '));

          return depsIndex;
        })
        .then((depsIndex) => {
          console.log(`Generating ${prefix} Service Worker...`);

          // precache all the things
          const precachedAssets = new Set(polymerProject.config.allFragments);
          for (let depImports of depsIndex.fragmentToFullDeps.values()) {
            depImports.imports.forEach((s) => precachedAssets.add(s));
            depImports.scripts.forEach((s) => precachedAssets.add(s));
            depImports.styles.forEach((s) => precachedAssets.add(s));
          }

          let staticFileGlobs = Array.from(swPrecacheConfig.staticFileGlobs || []);
          staticFileGlobs = staticFileGlobs.concat(Array.from(precachedAssets));
          staticFileGlobs = staticFileGlobs.concat(polymerProject.config.sources);
          staticFileGlobs = staticFileGlobs.concat(polymerProject.config.extraDependencies);

          // make the paths relative
          staticFileGlobs = staticFileGlobs.map(path => {
            if (path.startsWith(polymerProject.config.root)) {
              path = path.substring(polymerProject.config.root.length + 1);
            }
            return path;
          });

          // handle index.html as a dependency using "/", don't re-request it as index.html
          swPrecacheConfig.staticFileGlobs = staticFileGlobs.filter(path => path !== 'index.html');
          swPrecacheConfig.dynamicUrlToDependencies = {
            '/': ['index.html']
          };

          // add the build specific prefix
          swPrecacheConfig.stripPrefixMulti = {
            'images/': `/images/`,
            'src/': `/${prefix}/src/`,
            'bower_components/': `/${prefix}/bower_components/`,
            'node_modules/': `/${prefix}/node_modules/`
          };

          // build it, and they will come ... without having to re-request files!
          swPrecache.write(`${buildRoot}/${prefix}-service-worker.js`, swPrecacheConfig, function(e) {
            if (e) {
              reject(e);
            } else {
              resolve();
            }
          });
        })
        .catch(reason => console.log('failed:', reason));
    });
  }
}

gulp.task('clean', () => del([buildRoot]));
gulp.task('build-es5', build('es5', true));
gulp.task('build-es6', build('es6', false));
gulp.task('build', gulp.series(['build-es5', 'build-es6']));

I mentioned earlier that I needed to push the dependencies per-route. This requires a mapping, usable during the build process, from polymer src fragment to URL route. I add this as an extra property in the polymer.json file:

"fragmentRoutes": {
  "src/my-home.html": "/",
  "src/my-about.html": "/about",
  "src/my-profile.html": "/profile"
}

The route is then used as the key for each block of dependencies within the push manifest, e.g.:

{
  "/about": {
    "/es6/src/my-app.html": {
      "type": "document",
      "weight": 1
    },
    "/es6/bower_components/polymer/polymer.html": {
      "type": "document",
      "weight": 1
    },

There are a few other pieces - on the server, for instance, I check the UserAgent header to decide whether to send the es5 or es6 version, and as well as the link preload headers for the view being rendered, I also send some preconnect headers for the external domains used to load polyfills, Google Analytics and images.
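The real check lives in the Go server, but the es5/es6 decision can be sketched like this - the browser list, version cut-offs and `buildPrefix` name are all illustrative assumptions, not the post’s actual code:

```javascript
// Decide which build folder to serve based on the UserAgent string.
// Rough rule of thumb: evergreen browsers with solid ES2015 support get
// "es6"; everything else (old IE, old Safari, unknown bots) gets "es5".
function buildPrefix(ua) {
  const checks = [
    // EdgeHTML's UA also contains "Chrome/...", so test it first
    { re: /Edge\/(\d+)/, min: 15 },
    { re: /Chrome\/(\d+)/, min: 51 },
    { re: /Firefox\/(\d+)/, min: 54 },
    // Safari reports its version as "Version/NN ... Safari/..."
    { re: /Version\/(\d+)[\d.]* Safari/, min: 10 },
  ];
  for (const { re, min } of checks) {
    const m = ua.match(re);
    if (m) return parseInt(m[1], 10) >= min ? 'es6' : 'es5';
  }
  return 'es5'; // unknown browsers get the safe ES5 build
}
```

The server then serves `${prefix}-index.html`, `${prefix}-service-worker.js` and the matching push manifest for whichever prefix this returns.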

Although the service-worker takes care of caching and repeat visits (so I don’t feel too bad about the preload push for new views), another optimization that I may look at adding is CASPer - Cache Aware Server Push. This uses a bloom filter and cookie so the server only pushes assets that the client doesn’t already have.

I’ll try to post more information about the server-side of things and the AppEngine configuration and middleware used to serve the site.