Assumptions

Last year Benedict Evans wrote about the worldwide proliferation and growth of smartphones. Nolan referenced that post when he extrapolated the kind of experience people will be having:

As Benedict Evans has noted, the next billion people who are poised to come online will be using the internet almost exclusively through smartphones. And if Google’s plans with Android One are any indication, then we have a fairly good idea of what kind of devices the “next billion” will be using:

  • They’ll mostly be running Android.
  • They’ll have decent specs (1GB RAM, quad-core processors).
  • They’ll have an evergreen browser and WebView (Android 5+).
  • What they won’t have, however, is a reliable internet connection.

This is the same argument that Tom made in his presentation at Responsive Field Day. The main point is that network conditions are unreliable, and I absolutely agree that we need to be very, very mindful of that. But I’m not so sure about the other conditions on that list. They smell like assumptions:

Assumptions are the problem. Whether it’s assumptions about screen size, assumptions about being able-bodied, assumptions about network connectivity, or assumptions about browser capabilities, I don’t think any assumptions are a safe bet. Now you might quite reasonably say that we have to make some assumptions when we’re building on the web, and you’d be right. But I think we should still aim to keep them to a minimum.
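
To make that concrete: one way to stop assuming a reliable connection is a service worker that serves an offline fallback when the network fails. This is my own minimal sketch, not something from Tom’s talk; the cache name and the /offline.html page are assumptions:

    // sw.js: a minimal offline fallback (illustrative sketch).
    const CACHE = 'offline-fallback-v1';

    addEventListener('install', function (event) {
      // Pre-cache the fallback page while the service worker installs.
      event.waitUntil(
        caches.open(CACHE).then(function (cache) {
          return cache.add('/offline.html'); // assumed to exist
        })
      );
    });

    addEventListener('fetch', function (event) {
      // Only intercept page navigations; other requests pass through untouched.
      if (event.request.mode === 'navigate') {
        event.respondWith(
          fetch(event.request).catch(function () {
            return caches.match('/offline.html');
          })
        );
      }
    });

Browsers without service worker support simply ignore this and hit the network as usual, which is exactly the kind of graceful layering that keeps assumptions to a minimum.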

It’s not necessarily true that all those new web users will be running a WebView browser like Chrome. There are millions of Opera Mini users, and I would expect that number to rise, given all the speed and cost benefits that proxy browsing brings.
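
Opera Mini is a proxy browser: pages are rendered on Opera’s servers, and most client-side JavaScript never runs. The defensive pattern is to keep the core interaction as a plain HTML form and treat script purely as an enhancement. Here’s a hedged sketch of mine; the /search endpoint and the #results container are hypothetical:

    // Assumes the page contains a working HTML form, e.g.
    //   <form action="/search" method="get"><input name="q"></form>
    // which submits normally whenever this script doesn't run (as in Opera Mini).
    var form = document.querySelector('form[action="/search"]');

    if (form && 'fetch' in window) {
      form.addEventListener('submit', function (event) {
        // Enhance: fetch results in place instead of a full page load.
        event.preventDefault();
        var query = encodeURIComponent(form.elements.q.value);
        fetch(form.action + '?q=' + query)
          .then(function (response) { return response.text(); })
          .then(function (html) {
            document.querySelector('#results').innerHTML = html; // hypothetical container
          });
      });
    }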

I also don’t think that just because a device is a smartphone it necessarily means that it’s a pocket supercomputer. It might seem like a reasonable assumption to make, given the specs of even a low-end smartphone, but the specs don’t tell the whole story.

Alex gave a great presentation at the recent Polymer Summit. He dives deep into exactly how smartphones at the lower end of the market deal with websites.

I don’t normally enjoy listening to talk of hardware and specs, but Alex makes the topic very compelling by tying it directly to how we build websites. In short, we’re using waaaaay too much JavaScript. The message here is not “don’t use JavaScript” but rather “use JavaScript wisely.” Alas, many of the current crop of monolithic frameworks aren’t well suited to this.
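
One way to use JavaScript wisely is the “cuts the mustard” test that the BBC News team popularised: feature-detect up front and only load the heavier enhancement bundle in browsers that pass. A sketch, assuming a hypothetical /js/enhancements.js bundle:

    // Low-powered or older browsers fail the test and stick with the core
    // HTML experience; capable browsers opt in to the extra script.
    if ('querySelector' in document &&
        'addEventListener' in window &&
        'fetch' in window) {
      var script = document.createElement('script');
      script.src = '/js/enhancements.js'; // hypothetical bundle
      document.head.appendChild(script);
    }

Everyone gets the content either way; only devices that can comfortably run the extra JavaScript ever have to download and parse it.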

Alex’s talk prompted Michael Scharnagl to take a look back at past assumptions and lessons learned on the web, from responsive design to progressive web apps.

We are consistently improving and we often have to realize that our assumptions are wrong.

This is particularly true when we’re making assumptions about how people will access the web.

It’s not enough to talk about the “next billion” in abstract, like an opportunity to reach teeming masses of people ripe for monetization. We need to understand their lives and their priorities with the sort of detail that can build empathy for other people living under vastly different circumstances.

That’s from an article Ethan linked to.
