Sunday, October 6, 2019

React rant

A transcript of something that happened in Slack the other day...

me, trying to avoid starting a rant: [vicious off-topic React rant]
(ed: literally what I wrote, brackets and all)

FE specialist friend & former coworker, stepping in it: ...since you brought it up again, I'm curious why you say it's user-hostile (I'll agree up-front that a lot of its uses can be)

(ed: lightly edited to conceal identities)

me: so like, once upon a time the web was documents, right? and people had slow computers and poor internet connections and retrieving a few KB could be perceptibly slow, as in multiple seconds

me: and then a lot of stuff got better. The experience of browsing a document-based site in 2019, even on a bad machine on a bad connection, is generally lightning-fast. The highest-profile example, though imperfect, is Wikipedia

me: but we don't see that, in general. In general, we see "modern" web development, where, for the convenience of the developer, we routinely ship multiple megabytes of executable code to the client, not just once but repeatedly

me: we consume the client's bandwidth, the client's CPU cycles (and battery power!), to do the same exact operations, over and over again. Instead of sending them the document that results from those operations.

me: And sometimes, this is justified. When you are building an app. Facebook, twitter, google maps, gmail, etc., these are all applications that use the web as a delivery mechanism. But Wikipedia, retailers (including Amazon), newspapers, blogs, these are document sites just like we had in 1995

me: and we are constructing them in such a way that they are hundreds, even thousands of times slower than they could be

me: and we do this in a way that systematically disregards who our client is. We stand at our electrically-raisable desks, with our $3k Macs, 27" monitors, and multi-hundred-Mbps internet connections and revel in the beauty of what we can create. But our client has a 2-year-old mid-range Android phone and intermittent/noisy metered 3G service

me: and we are shipping that client five goddamned megabytes that render into less than a kilobyte of content

me: WE. SHOULD. BE. ASHAMED.

me: </rant>

So yeah, my throwaway "I don't want to rant about this today" comment became a kind of self-fulfilling prophecy and I ranted anyway.

I'm sure I'll have more to say about this in the future, but having transcribed it here, I feel it's worth expanding on a little.

Circa 1997 I got my first Pentium-era PC, which was also my first internet-connected machine. It had a 200MHz CPU, 32MB of RAM (somewhat luxurious at the time, when 16MB was much more common), a 3.2GB hard disk, and a 33.6Kbps dial-up connection to a local telco ISP. The internet was much less a part of our daily lives than it is today, but, huge nerd that I was, it was certainly important to me even then. This comic is from a few years later, but it captures the spirit of what living with dial-up was like.

Fast forward to now. My current PC has a 3.8GHz (up to 4.4 boosted) CPU with 6 cores, 16GB of RAM, a 1TB SSD plus 4TB of spinning rust, and a 200Mbps cable internet connection. Let's review the gains:
  • CPU: rounded to 4GHz, a 20x gain (or if we count all the cores, 120x)
  • RAM: 512x
  • Disk: roughly 1500x in capacity (counting both drives); speed-wise I don't even know, but if we use bus bandwidth as a rough proxy, Ultra SCSI topped out at 160Mbps vs. NVMe at 32Gbps, which works out to 200x
  • Network: ~5952x, yes that's right, a nearly six thousand-fold gain
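Here's a quick back-of-the-envelope check of those ratios in Python, in case you want to quibble with the rounding (the 1997 and 2019 figures are the ones quoted above; I'm counting the disk as the full 5TB and the CPU at its rounded 4GHz base clock):

    # Back-of-the-envelope check of the hardware gains, using the numbers above.
    specs_1997 = {"cpu_hz": 200e6, "ram_bytes": 32 * 2**20, "disk_bytes": 3.2e9, "net_bps": 33.6e3}
    specs_2019 = {"cpu_hz": 4e9,   "ram_bytes": 16 * 2**30, "disk_bytes": 5e12,  "net_bps": 200e6}

    for key in specs_1997:
        ratio = specs_2019[key] / specs_1997[key]
        print(f"{key:>10}: {ratio:,.0f}x")

    # Prints roughly: cpu_hz 20x (or 120x counting all 6 cores), ram_bytes 512x,
    # disk_bytes ~1,562x, net_bps ~5,952x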
Wow! Surely in light of all this hardware improvement, the experience of using a computer, and using the internet in particular, must be much faster now, right?

See, about that...

Let's use Firefox developer tools to measure a few websites:
  • nytimes.com: 1.58MB (on the wire) / 5.39MB (decompressed), 6.16s, 47 requests
  • washingtonpost.com: 4.67MB / 6.15MB, ~8s, 93 requests
  • slatestarcodex.com: 1.15MB / 1.47MB, 2.31s, 48 requests
  • marginalrevolution.com: 0.98MB / 2.47MB, 2.98s, 35 requests 
  • that Wikipedia page linked above: 219KB / 858KB, 1.71s, 22 requests
Out of this entirely non-scientific sample of two major newspapers, two of my favorite blogs, and a Wikipedia article, only Wikipedia comes out looking vaguely close to something possibly approximating fast.
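If you want to roughly reproduce the "on the wire" vs. "decompressed" split without opening devtools, here's a sketch using Python's third-party requests library. To be clear, the numbers above came from Firefox devtools; this script only fetches the top-level HTML document, not the dozens of subresources a real page load pulls in, so its totals will come out much smaller:

    import requests  # third-party; pip install requests

    def page_weight(url):
        """Return (compressed bytes on the wire, decompressed bytes) for one URL."""
        resp = requests.get(url, stream=True, timeout=30)
        # Read raw bytes off the socket without letting urllib3 decompress them.
        wire = sum(len(chunk) for chunk in resp.raw.stream(8192, decode_content=False))
        # Fetch again and let requests decompress, to get the decoded size.
        decoded = len(requests.get(url, timeout=30).content)
        return wire, decoded

    for site in ("https://www.nytimes.com/", "https://en.wikipedia.org/"):
        wire, decoded = page_weight(site)
        print(f"{site}: {wire / 2**20:.2f}MB on the wire, {decoded / 2**20:.2f}MB decompressed")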

It's so disappointing, so draining, I don't even have the energy to keep ranting about it. All the hard work by hardware engineers over the last two decades has been eaten away by developers making their own lives easier at the expense of their customers, and I just can't fucking stand it.