First of all, let me make clear that I have no inside knowledge and am not privy to any secrets - my views are based purely on using the platform, hanging out in Slack and watching the presentations.
It’s also maybe useful to describe where I “am” as a web developer. I used to use Angular 1 and then 2, never really cared much for React but was aware of it (and liked Redux), and was totally sold on WebComponents and the benefits of any framework being built on the platform. When Angular 2 turned out to be a huge letdown and Polymer had reached 1.x, it was time to change, and I’ve been happy with the choice. I find it quicker to develop apps, I’m spending more time on app development and much less on endless framework upgrades, and the end results start and run faster.
But while the version upgrade of the framework itself was great, and we have the promise of Polymer being a much smaller, 10kb-ish framework, it feels like we never really got to see those benefits. Sure, if we just used Polymer on its own … but who does that? Part of the appeal of Polymer is the rich ecosystem of elements ready to use, and a significant number (when it comes to usage) are Polymer’s own Material Design elements - the set of iron- and paper- components.
Sadly, these are still in hybrid mode, so the promised size gains of 2.0 are harder to attain. Whatever code you write, you’ll still likely be pulling in all the legacy compatibility code that supports hybrid mode, because the elements you depend on still use it.
My hope was that after the 2.0 dust had settled, the focus would shift to updating the elements to take advantage of the new version. It seems like this might happen, but not as immediately as I thought - though probably with better end results.
Instead, Polymer 3.0 was announced and with it a major shift for those of us sold on the HTML-first world view. Polymer was going to become one of the npm crowd.
Personally, I loathe npm. I think it’s a pile of lies and I don’t understand why it is so popular. It probably makes sense if you develop for node, but I only use that for build systems - I was happy with Bower for front-end components. By comparison, npm was slow and didn’t solve versioning at all IMO; it just caused new issues (often with versions of npm itself). If it were so great, why are so many people now in love with Yarn? But Yarn can be made to work similarly to Bower (a flat structure), and as Bower isn’t going to be developed going forward, it makes sense to change.
I suspect there are still many things to work out with projects using npm - it’s maybe not as simple as importing modules when some are resolved from node_modules and others loaded by path, and some tools expect a flat structure while others don’t, but we’ll have to see. If there are issues, they can be solved - it’s just software, after all.
Part of the change to npm means that we’ll be able to use other JS ecosystem tooling. That is both good and bad. I personally think the JS ecosystem is a bit of a dumpster fire - I watched the talk on webpack and wanted to cry afterwards. Escaping that stuff was one of the things I liked about moving from Angular to Polymer, and there are still lots of unknowns about how the PRPL lazy-loading pattern will work (presumably with ES modules and no building required, at some point?). I really like unbundled code combined with HTTP/2 server push - it’s faster and less effort, win-win.
The biggest impact for 3.0, and the reason for the shift, is not simply to use npm - it’s that HTML Imports are being abandoned. They were great, but the world seems more obsessed with JS-everywhere and JS-first, so ES Modules are going to be the thing to use. As part of that reasoning, npm makes total sense, and Yarn makes npm workable.
Maybe it’s the Go developer side of me that finds the namespaced npm packages so jarring. With Go, it’s typically good practice to try to avoid “stuttering” in packages and APIs, but npm, oh boy … it was bad enough with Bower, where you imported ../polymer/polymer.html, but now the package lives under a scope as @polymer/polymer, so “polymer” shows up in the scope, the package name and the file name. I wonder if that has anything to do with Polymer at all?! I find that repetition a bit annoying, and it’s repeated for every element as well.
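For example (paths based on the scoped package layout shown in the 3.0 preview - the exact file and export names may differ):

```javascript
// Bower: the package name appears once, in a relative path.
// <link rel="import" href="../polymer/polymer.html">

// npm, with a scoped package: "polymer" appears three times per import
// (scope, package, file) - and again for every element you pull in.
import { Element } from '@polymer/polymer/polymer-element.js';
```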
But that’s a minor quibble. What about things to like?
Well, my disappointment about the elements being stuck in hybrid mode seems to have been misplaced, as it looks like they are already working on upgraded and simplified versions that promise to be much smaller and much faster. One of the presentations showed some paper-element-like components that went from 100Kb to 6Kb - that’s one heck of a reduction, so let’s hope they arrive soon.
It hasn’t been said explicitly AFAIK, but it seems like some of these gains have something to do with lit-html. This is a new templating project that they are working on to fit better with the HTML-in-JS approach that is inevitable as we move from HTML Imports to ES Modules for loading things.
It’s a less-than-2Kb templating system that uses tagged template literals to create “parts” for efficient DOM updates, without the overhead of the virtual DOM diff operations that React uses. It should be faster, and from testing I’ve done creating a js-repaint-perfs version of it, it is. Very fast.
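Conceptually, the trick is that a tagged template literal hands the tag function the static strings and the dynamic values separately. Here’s a sketch of that idea (my own simplification, not lit-html’s actual implementation):

```javascript
// Sketch of the core idea: a tag function receives the static strings
// (identical on every call, so the DOM built from them can be cached)
// and the dynamic values - the only "parts" that need updating on
// re-render. No virtual DOM diff is required.
function html(strings, ...values) {
  return { strings, values };
}

const name = 'world';
const template = html`<h1>Hello ${name}!</h1>`;

console.log(Array.from(template.strings)); // → [ '<h1>Hello ', '!</h1>' ]
console.log(template.values);              // → [ 'world' ]
```

A renderer only has to write the values into the slots between the cached strings, which is why updates can be so cheap.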
So, will Polymer 3.0 use lit-html as the template engine, with an optional 2.0-style template system for backward compatibility? That would seem to make sense to me. They have a modulizer tool to convert Polymer 2.0 hybrid elements into ES-module-compatible code, so the existing templates need to be supported somehow.
I’ve been experimenting with using lit-html with pure WebComponents and Redux / Reselect all wrapped up with a Rollup build and it makes me very happy and excited at what might be coming in future.
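To give a flavour of that wiring, here’s a minimal sketch - a hand-rolled store standing in for the real redux package, and a logging listener standing in for the render call; all the names are mine:

```javascript
// A tiny Redux-style store (a sketch, not the real redux package):
// state lives in one place, and every dispatch notifies subscribers.
function createStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    dispatch(action) {
      state = reducer(state, action);
      listeners.forEach((fn) => fn());
    },
    subscribe(fn) {
      listeners.push(fn);
    },
  };
}

// Reducer for a hypothetical photo browser: tracks the selected photo.
function reducer(state, action) {
  switch (action.type) {
    case 'SELECT_PHOTO':
      return { ...state, selected: action.id };
    default:
      return state;
  }
}

const store = createStore(reducer, { selected: null });

// In the real app this listener would call lit-html's render();
// here it just logs so the sketch runs anywhere.
store.subscribe(() => console.log('render with', store.getState()));
store.dispatch({ type: 'SELECT_PHOTO', id: 42 });
// → render with { selected: 42 }
```

The elements themselves stay plain WebComponents; the store subscription is the only glue needed.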
I reproduced the guts of an event-photo browsing PWA and the entire thing comes in at under 10Kb. That’s 10Kb for the complete app, not an app that also needs a 150Kb+ vendor file - 10Kb for everything! That includes Redux + Reselect + Redux-Thunk, things like a web-worker-powered image-packing layout engine, off-thread image decoding and lazy loading, animated app drawers (60 fps, naturally) and a simple router. It also includes all the unminified HTML and CSS within the elements, plus the WebComponents polyfill loader, which I included to save an extra request (check out the branch that removes HTML Imports, which avoids any polyfills being needed in more browsers).
I remember when I was using Angular 2, getting things under 100Kb just for “hello world” took an inordinate amount of effort, and just last week I reviewed a React app that was 1.8Mb when all that was visible was a login page. Having so much functionality in so little JS is simply fantastic.
Of course, there are still the WebComponent polyfills themselves to load for browsers that don’t support all the features natively, but guess what? Because HTML Imports are being dropped, the majority of evergreen browsers won’t need to load anything. Think about that - the framework (WebComponents) is baked into the browser, and all we need are some very small utility layers to make it easy to handle template creation and rendering, and a few other things.
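That loading decision boils down to a feature check. A sketch of the idea (the real polyfill loader is more thorough; the function name and bundle path here are made up):

```javascript
// Sketch: only fetch the polyfill bundle when the browser lacks native
// support. The global object is a parameter so the check can run
// anywhere; in a browser you would pass window.
function needsWebComponentsPolyfill(global) {
  return !(
    global.customElements !== undefined &&
    global.Element &&
    'attachShadow' in global.Element.prototype
  );
}

// In the page (browser-only; '/webcomponents-polyfill.js' is a
// hypothetical bundle path):
// if (needsWebComponentsPolyfill(window)) {
//   const s = document.createElement('script');
//   s.src = '/webcomponents-polyfill.js';
//   document.head.appendChild(s);
// }

// An evergreen-browser-shaped global needs nothing extra:
const modern = {
  customElements: {},
  Element: { prototype: { attachShadow() {} } },
};
console.log(needsWebComponentsPolyfill(modern)); // → false
```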
This is the Polymer 3.0 future that is really exciting - even closer to the platform, and smaller and faster as a result. It’s built on the WebComponents standards that all the other frameworks now seem to be rushing to retrofit support for - but instead of treating them as an afterthought, Polymer was designed precisely for them.
With 3.0, the Polymer future looks very bright, and you should really start looking at it as a great option for future web development.