In his post, @ppk argues something I’ve heard before: that we would do well to consider slowing down on new developments, and to focus instead on improving the most rudimentary things that are still so painful to work with on the web. While I don’t think a moratorium, or even slowing down the pace, is the right answer, the problem is unquestionably there:
How did we get to the point where we are able to have realtime bi-directional peer-to-peer communication between visitors to our site, yet we can’t determine the styles for a measly <select> input, we can’t customize form elements like <input type='checkbox'>, we can’t use pseudo-selectors like :before or :after on certain elements like <img> or <input>, because these are “replaced elements” – what the hell is that?
To me, the problem is not that we are moving too fast on cutting-edge technologies such as WebRTC, HTTP/2, enforcing TLS, Service Workers, ES6, ES7, WebP, Web Audio, WebGL, and countless others. Rather, the problem is that we are making no effort whatsoever to fix things that have been broken forever, like being able to give basic styling to form elements in a non-over-engineered way.
I vehemently agree that constraints drive innovation, as @ppk points out in his post. I think the web platform is constrained enough as it is, though. However, I do get what ppk is trying to get across: we are worrying so much about these complicated things that we’re missing out on the essentials. If we want to keep focusing on the complicated problems we all know and love, we should first deal with the inexcusable pain points of the web, such as how lousy form elements behave, how lame native form validation is (where, by the way, you can’t style the validation errors consistently or effectively), and how JavaScript ends up becoming the de-facto excuse for all of this: “well, you can do it with jQuery”, “well, it works if you sprinkle some JavaScript sugar on top”. The problem is, this is as flimsy an excuse as “it works on my machine”, and we all know by now that’s a terrible excuse.
Right?
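To make that last point concrete, here’s a minimal sketch of the kind of “JavaScript sugar” we routinely end up writing just because native validation messages can’t be styled reliably. It leans on the standard Constraint Validation API (checkValidity and validationMessage); the .error class and the surrounding markup are assumptions made up for this example.

```js
// A sketch of the userland validation dance we do because native error
// bubbles can't be styled consistently. The .error class is made up here.
var form = document.querySelector('form');
form.noValidate = true; // take over from the native bubbles

form.addEventListener('submit', function (e) {
  // Clear messages from a previous attempt
  Array.prototype.forEach.call(form.querySelectorAll('.error'), function (el) {
    el.parentNode.removeChild(el);
  });

  var fields = form.querySelectorAll('input, select, textarea');
  Array.prototype.forEach.call(fields, function (field) {
    if (field.checkValidity()) {
      return; // nothing wrong with this field
    }
    e.preventDefault(); // block submission and render our own message
    var error = document.createElement('span');
    error.className = 'error';
    error.textContent = field.validationMessage; // reuse the native message text
    field.parentNode.insertBefore(error, field.nextSibling);
  });
});
```

It works, but it’s precisely the sort of boilerplate the platform shouldn’t be forcing on something this basic.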
On “Web vs. Native”
This is the ultimate argument, right? The biggest issue here is misinformation. Yes, obviously, the web platform is not as well-suited as native mobile apps in some ways – you probably wouldn’t want your realtime mobile video chat experience to be web-based, because it would be terribly tricky to get that right. But there’s also a lot of bandwagon effect going on, where somebody says something and almost everyone else just nods their head in agreement.
Instant Articles
A great example is one of Facebook’s recently unveiled features, the so-called “Instant Articles”. Instant Articles is fancy marketing speak for Facebook crawling news sites that have paid a hefty fee for the privilege, storing their articles in a standardized data structure of Facebook’s choosing, rendering a “faithful-enough” version of each article, and presenting that “view” when humans try to access one of the media site’s pages, instead of the actual website.
This is useful because media sites are the bane of the web. They sometimes take double-digit seconds to load and incur several MB worth of downloads. Data charges go through the roof, not to mention when the user is roaming and has to pay €10 for 200MB worth of content, essentially paying roughly €2 per page view. Of course, you could try and pin that on the mobile networks, but they’ve been charging roamers insane amounts of money for over a decade, and we’ve only made it a problem recently.
With Instant Articles, media sites are taking the worst possible step in the right direction. They’re effectively saying:
Our site is garbage, BUT, if you visit us through Facebook, well… then they’ll solve our embarrassing performance issues on our behalf. If you’re on mobile, make sure to visit our site through Facebook, otherwise it might take you seven seconds to see this page.
– Implied message from every media site on Instant Articles
This screams of missed opportunity. Investing tons of money in ensuring that Facebook solves your performance problems when people visit your site through Facebook (and only in that case) is as terrible a value proposition as they come. You would be way better off hiring a performance consultant like Tim Kadlec to fix your web performance issues for good.
This brings me to my last point, the “let’s blame our problems on the web” approach.
Blaming slowness on the web, The Verge way
If you’ve gotten this far, chances are you haven’t missed The Verge’s article on how “the mobile web sucks”. In it, they basically blame the web for the slowness caused by the dozens of media deals and data-siphoning analytics scripts that kick in when the page is first loaded. This is well summarized by Les Orchard in their article.
Holy crap. It took over 30 seconds. In the end, it fetched over 9.5MB across 263 HTTP requests. That’s almost an order of magnitude more data & time than needed for the article itself.
Head over to blog.lmorchard.com for their full analysis.
A couple hundred requests and almost 10MB in data transfers – how the hell do you expect the mobile web not to suck under those conditions? That page view just cost you €2 in data roaming charges. You could buy a couple of physical WIRED magazines for that.
Just because your website is slow doesn’t mean the web platform is at fault. Any platform you misuse is going to slow down to a crawl. Instead of spending time and energy figuring out who’s to blame for your slow site, try and do something about it. Maybe you don’t need to talk to 22 different analytics providers on every page load; maybe a single service could broadcast events on your behalf to all of them. Maybe you don’t need to load 1.3MB worth of images on page load; maybe you could defer the ones below the fold until after the text content has loaded, as in the sketch below.
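Here’s a minimal sketch of that last idea, assuming the markup ships images with a data-src attribute instead of src so nothing downloads up front; the data-src attribute and the 200px margin are choices invented for this example. It uses the standard IntersectionObserver API to start the real download only once an image is about to scroll into view.

```js
// Sketch: defer below-the-fold images until they approach the viewport.
// Assumes markup like <img data-src="photo.jpg" alt="A photo"> with no src.
var lazyImages = document.querySelectorAll('img[data-src]');

var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (!entry.isIntersecting) {
      return;
    }
    var img = entry.target;
    img.src = img.getAttribute('data-src'); // kick off the real download
    img.removeAttribute('data-src');
    observer.unobserve(img); // each image only needs loading once
  });
}, { rootMargin: '200px' }); // begin a little before the image scrolls into view

Array.prototype.forEach.call(lazyImages, function (img) {
  observer.observe(img);
});
```

The text content renders immediately, and the image bytes are only paid for if the reader actually scrolls to them.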
“Dis-services” like t.co don’t do much in the way of helping, either.
It’s time for media sites to begin acting as media sites and stop acting as advertisement platforms that only care about cramming more ads into their pages. As long as that keeps happening, their websites will continue to be as slow as snails.
It’s time we stop breaking the web, we start building performance into our sites, and we start caring about the fundamentals of the web platform. Otherwise, how can we expect to advance the more sophisticated aspects of its development?
Back to the Classical Elements
Consider this. If <select>, <input>, <textarea>, and client-side validation were to start being developed today, what would they look like? Would they look like impenetrable black boxes with arbitrary constraints set forth by each individual user agent? Or would they be modular, extensible, easy to integrate into your designs without obscure hacks, and as flexible as any userland component that’s developed on top of them? If your answer is anything different than “exactly how they are today”, why on earth are we not doing anything to change them? They’re still the most ubiquitous form of user interaction on the web (besides links), and we’ve been resting on our laurels for far too long.
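For a sense of what “modular and extensible” could look like, here’s a purely illustrative sketch of a userland select built with Custom Elements. The fancy-select name, its placeholder attribute, and all of its behavior are invented for this example (and it assumes the script runs after the markup has been parsed); the point is simply that every part of it is plain DOM you can style, inspect, and extend, unlike the built-in controls.

```js
// Purely illustrative: a userland "select" with no black boxes in sight.
// The fancy-select element and its markup are invented for this sketch.
class FancySelect extends HTMLElement {
  connectedCallback() {
    this.button = document.createElement('button');
    this.button.type = 'button';
    this.button.textContent = this.getAttribute('placeholder') || 'Pick one';
    this.list = document.createElement('ul');
    this.list.hidden = true;

    // Plain DOM all the way down, so every piece can be styled with regular CSS
    Array.from(this.querySelectorAll('option')).forEach(option => {
      const item = document.createElement('li');
      item.textContent = option.textContent;
      item.addEventListener('click', () => this.pick(option.value, option.textContent));
      this.list.appendChild(item);
    });

    this.button.addEventListener('click', () => {
      this.list.hidden = !this.list.hidden;
    });
    this.append(this.button, this.list);
  }

  pick(value, label) {
    this.value = value;
    this.button.textContent = label;
    this.list.hidden = true;
    this.dispatchEvent(new Event('change')); // behaves like any other form control
  }
}

customElements.define('fancy-select', FancySelect);
```

Whether this particular design is any good is beside the point; a component like this is trivially styleable and extensible, which is more than can be said for the controls we actually ship in every browser.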
I’ve always found standards bodies to be black boxes of a similar sort, though I’ll admit that’s mostly because of my ignorance. I have little idea of how they operate, how a proposal moves through the ranks, how much effective “power” an average developer’s voice has (from what I hear, it’s mostly browser vendors that get a say), or how the effort to implement certain features across browsers is coordinated.
That being said, I’d love for more people to get involved in the standards process, asking for these things everyone seems to be taking for granted even though we’ve been developing hacks around them for years.
I’ll leave you with a video you might want to watch: Chris Heilmann talking about “Advancing the Web Without Breaking it”.