Sunday, December 15, 2019

A Different Take on The Failings of Open Source Software

http://marktarver.com/thecathedralandthebizarre.html

The above is a very good article describing why most of the promises and predictions made in ESR's The Cathedral and the Bazaar (1998) about the coming triumph of open source over closed source failed to come true. As someone who read TCatB and fell for its promises -- in 1998, no less -- I cannot help but nod along to the arguments presented. It's helpful to go back to the early promises because while in 2019 open source may feel dominant, it has nevertheless clearly failed to live up to the original hype.

If anything, the author isn't critical enough when it comes to the claim that the world would converge on optimal solutions and avoid duplicate effort. I've got about ten thousand JavaScript frameworks waiting outside to have a word with that one.

But I want to come at it from a different angle, and make what may be a novel criticism of open source.

Back in the day, if you wanted software to run your business on, you had to pay for it. Even if you only needed the tools to build your own software to run your business, you still had to pay for them. Operating systems, compilers, databases: money, money, money. The obvious consequence was that running a business on software was rather expensive. (This world sort of still exists, in Microsoft shops.)

But there was a secondary consequence that made things even more expensive than that: you needed to hire people who knew how to use and take care of all this software. This feels a bit like adding insult to injury, but it was actually something that businesses didn't mind at all, because it fit their existing mental model of how industrial systems worked. You see, if I had a factory, it was obviously necessary to hire people to work in it. If I bought a Bridgeport mill, it was only sensible that I needed to hire a machinist to run it. A crane needed an operator, a forklift needed a driver, and so on. Capital was never sufficient on its own; it was always necessary to add labor to get output.

So even as late as 1999, if you forked out a few million dollars a year for Oracle Database, it only made sense to spend a few hundred thousand extra employing a couple of professional Oracle DBAs. Likewise if you had a fleet of Windows NT servers in the racks, you would have a team of administrators trained (and likely Certified) on Microsoft software to look after them. And so it went for all the large proprietary business software vendors.

Then along comes open source software and... it costs nothing. Oh sure, the message is "free as in speech" not "free as in beer", but in practice it's all priced at $0 and if we're honest that's a big part of the attraction. An interesting thing happens psychologically. Paying $100k/yr for a professional DBA to support a $1M/yr Oracle installation feels very reasonable. Paying $100k/yr for a professional DBA to support a $0/yr MySQL installation... somehow does not.

There's another phenomenon developing right around the same time that reinforces this: the amateurization of business software development. Used to be one needed all this expensive software (and hardware!) to get a tech business off the ground. Then suddenly all you need is a cheap x86 server plus the zero-dollar LAMP stack and you're off to the races. For a while it was easy to dismiss this approach as the domain of hobbyists, but then the hobbyists started launching successful businesses with it, forcing the entire industry to take it seriously. I say "amateurization" because the key driver here was the availability of free (as in beer) software that ran on cheap hardware, which allowed motivated hackers to get experience doing stuff without training, certifications, mentorship, or even (in many cases) college.

This deeply affected the culture of tech companies. In the proprietary high-dollar era, a developer was happy to enlist the help of a DBA, because the DBA was the expert on the database. The DBA was happy to enlist the help of the SysAdmin, because the SysAdmin was the expert on the OS and hardware. The SysAdmin was happy to enlist the help of the Network Admin... and so on. In the LAMP era, it's just four guys in a garage, and they all have to do everything just well enough to ship. The hardware, OS, network, database, compiler suite, various server software, and everything else is easy enough to procure, install, and configure that any motivated hacker can do it. There's neither a need nor time for specialized professionals.

This in turn has deeply affected the career development of technologists. Oracle DBA and Microsoft Server Admin used to be stable, high-paying jobs with long-term career prospects. Satellite firms built businesses around selling tools to these folks. These career-slash-cultures had their own conferences, newsletters, even glossy monthly magazines. Almost all of that is absent from the open source world. Do you know anyone who got training on how to install Linux? Anyone who's made a career out of MySQL administration? Someone certified on nginx?

I think it's been about 20 years since this evolution got going in earnest, so it seems reasonable to take a look back, as the author of the opening link did, and ask where it's gotten us.

In the "pro" column, it's a hell of a lot easier to start a company than it ever has been. If you have an idea and the drive to pursue it, it's never been cheaper or easier to try giving it a go.

In the "con" column, we have a systematic loss of expertise and deep understanding. We assume now that any piece of software that's no further away than apt-get install should be something we can run professionally, in production, with real money on the line, with no training, no practice, hell maybe not even a skim of the documentation.

Tuesday, November 26, 2019

Kubernetes is Anti-DevOps

(bias warning: I think Kubernetes is basically cancer)

So over the last 10 years or so there's been this whole DevOps... movement... thing. The industry got the idea into its collective head that developers should participate in the operation of the software that they build, that operators should adopt practices from development like using source control, and that in general development and operations should work more closely. In the limiting/idealized case, developers are the operators and there's no organizational separation at all.

In general this was a good idea! Developers who have ops responsibilities build software that is more operable, and operators become more effective by using source control and programming languages better than /bin/bash.

There has also been a lot of pretense and bullshit. Companies have undertaken "DevOps transformations" and crap like that which ultimately accomplished nothing at all. System Administrators have had their titles changed to "DevOps Engineers" with zero change in responsibilities or organizational structure. Any company that uses the cloud has declared that they "do DevOps" and left it at that.

And then there's Kubernetes.

Kubernetes runs containers, containers mean Docker, and Docker is super DevOps, right?? Yeah, about that...

An interesting thing about containers is they simplify deployment. If I have a Python program that depends on a specific interpreter version and specific versions of a few dozen libraries, I need to be able to manage the deployment target to make sure all that important stuff is there. But if I package it in a container, now it's a singular artifact that I can plop down anywhere and just docker run that action (so the theory goes, anyway).
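To make that concrete, here's a sketch of the packaging (the app name, versions, and file layout are all made up for illustration):

    # Hypothetical Dockerfile for the Python app described above.
    # Pin the exact interpreter version the app depends on:
    FROM python:3.7.4-slim

    WORKDIR /app

    # requirements.txt pins the few dozen library versions
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Then add the application code itself
    COPY . .

    CMD ["python", "main.py"]

One docker build -t myapp . later, the interpreter, the libraries, and the code are a single image; any host with Docker can docker run myapp without caring what's installed on it.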

That simplified deployment can act as an inter-organizational interface. This is a fancy way of saying that it enables that practice we all just decided was bad: developers throwing their code over the wall to be ops' problem. And once they start doing that, the next step is overly complicated systems of containers with elaborate dependencies on each other, and now you need "container orchestration".

Kubernetes thus becomes the icing on the anti-DevOps cake. In theory it enables all this flexibility and solves all these container orchestration problems (that, it should be noted, we didn't have at all just 5 short years ago). In reality it's a hyper-complex operations layer that requires a handful of specialists in order to use at all.
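To give a sense of the scale involved: this is roughly the minimum YAML Kubernetes wants just to run a single copy of a single container (all names invented for illustration), before you've touched Services, Ingresses, ConfigMaps, RBAC, or anything else you'd need in production:

    # Hypothetical bare-minimum Deployment: one replica of one container
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: myapp
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: myapp
      template:
        metadata:
          labels:
            app: myapp
        spec:
          containers:
          - name: myapp
            image: myapp:1.0
            ports:
            - containerPort: 8080

And that's the hello-world case. Everything real is more.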

Kubernetes does nothing for the developer, but it doesn't hurt the developer either. Being just an execution substrate, Kubernetes is irrelevant to the developer. Thus in the ordinary course of business, a developer would have no need to learn and understand how it works. Nor would it be efficient for them to do so, given Kubernetes' off-the-charts complexity. It's reasonable for, say, a Java developer to learn how to manage the JVM as a runtime and what it takes to deploy applications with it. By comparison, learning Kubernetes is like learning how to run an entire private cloud: simply not something it's worth a developer's time to do.

So ultimately, adopting Kubernetes is about the most anti-DevOps move you could make as a software organization. The wall between dev and ops that we've spent the last decade tearing down is going right back up, and we'll set about throwing our code over it. Enjoy!

This all doesn't make the argument as clearly as I would like but hey this is my blog and I get to rant if I want.

Saturday, November 23, 2019

Why Engineers Are Grumpy

https://humanwhocodes.com/blog/2012/06/12/the-care-and-feeding-of-software-engineers-or-why-engineers-are-grumpy/

A near-perfect explanation of why engineers are grumpy and say "no" all the time.

Sunday, October 27, 2019

Solid Advice

https://rachelbythebay.com/w/2019/10/25/enabler/


"When you're already in a hole, QUIT DIGGING."

Tuesday, October 22, 2019

Andromeda in the Future Sky

https://kottke.org/19/10/behold-our-dazzling-night-sky-when-the-milky-way-collides-with-andromeda-in-4-billion-years

I have a long-running fascination with cosmology, which is part of how this blog got its name.

If you live in a city, with its attendant light pollution, you likely have never even seen the Milky Way except in photos. If that's true you owe it to yourself to visit somewhere far from city lights where the pale glow of our galactic disk is visible to the unaided eye. Then imagine seeing not just our own galaxy but another galaxy in the night sky... (now click the link)

Wednesday, October 16, 2019

THIS is what America needs

https://www.politico.com/magazine/story/2019/10/13/america-cultural-divide-red-state-blue-state-228111

Caleb Wright, who’s from Chapel Hill, says, “The value is that you can staunchly disagree with someone, but also humanize the person.” Adds Gaby, “It was more to learn about each other than to change people’s minds.”

The point, in other words, is to combat “othering.”

Thursday, October 10, 2019

Why Remote War is Bad War

https://www.technologyreview.com/s/614488/why-remote-war-is-bad-war/

Just one link today, and it's not one that's going to make you feel good. About anything.

The moral distance a society creates from the killing done in its name will increase the killing done in its name. We allow technology to increase moral distance; thus, technology increases the killing. More civilians than combatants die in modern warfare, so technology increases worldwide civilian murder at the hands of armies large and small.

Sunday, October 6, 2019

Sunday Assorted Links 2019-10-06

https://kevinlynagh.com/notes/pricing-niche-products/
I love keyboards and I love auction theory and this link has both!

https://www.eidel.io/2019/04/24/making-my-own-glasses/
Know thyself.

https://qz.com/1721901/hong-kong-anti-mask-law-a-history-of-mask-bans-around-the-world/
Masks are about inverting power dynamics.

https://www.theatlantic.com/education/archive/2019/10/college-students-dont-want-fancy-libraries/599455/
Books are good.

https://www.youtube.com/watch?v=1OfxlSG6q5Y
Toasters: another thing that's gotten worse.
I will say though that this machine is awesome, and if you need a toaster just forget all the stuff that only toasts bread and get this instead, because it's better at that and it does other stuff (e.g. it reheats pizza like a boss).

https://williamyaoh.com/posts/2019-10-05-you-are-already-smart-enough.html
"...the perception of what tools, libraries, and concepts are important ends up distorted by novelty and excessive cleverness."
Honestly that feels like reason enough to avoid this whole scene.

http://rachelbythebay.com/w/2019/10/05/nxdomain/
Vendor shit is awful. ISPs are awful. Know how your tools work. It's worth spending extra effort to do things properly.

https://towardsdatascience.com/coding-ml-tools-like-you-code-ml-models-ddba3357eace
This is cool and I want to try it.

https://www.slashgeek.net/2016/05/17/cloudflare-is-ruining-the-internet-for-me/
American tech companies pretend most of the rest of the world doesn't exist, film at 11.

React rant

A transcript of something that happened in Slack the other day...

me, trying to avoid starting a rant: [vicious off-topic React rant]
(ed: literally what I wrote, brackets and all)

FE specialist friend & former coworker, stepping in it: ...since you brought it up again, I'm curious why you say it's user-hostile (I'll agree up-front that a lot of its uses can be)

(ed: lightly edited to conceal identities)

me: so like, once upon a time the web was documents, right? and people had slow computers and poor internet connections and retrieving a few KB could be perceptibly slow, as in multiple seconds

me: and then a lot of stuff got better. The experience of browsing a document-based site in 2019, even on a bad machine on a bad connection, is generally lightning-fast. The highest-profile example, though imperfect, is Wikipedia

me: but we don't see that, in general. In general, we see "modern" web development, where, for the convenience of the developer, we routinely ship multiple megabytes of executable code to the client, not just once but repeatedly

me: we consume the client's bandwidth, the client's CPU cycles (and battery power!), to do the same exact operations, over and over again. Instead of sending them the document that results from those operations.

me: And sometimes, this is justified. When you are building an app. Facebook, twitter, google maps, gmail, etc., these are all applications that use the web as a delivery mechanism. But Wikipedia, retailers (including Amazon), newspapers, blogs, these are document sites just like we had in 1995

me: and we are constructing them in such a way that they are hundreds, even thousands of times slower than they could be

me: and we do this in a way that systematically disregards who our client is. We stand at our electrically-raisable desks, with our $3k Macs, 27" monitors, and multi-hundred-Mbps internet connections and revel in the beauty of what we can create. But our client has a 2-year-old mid-range Android phone and intermittent/noisy metered 3G service

me: and we are shipping that client five goddamned megabytes that render into less than a kilobyte of content

me: WE. SHOULD. BE. ASHAMED.

me: </rant>
So yeah, my throwaway "I don't want to rant about this today" comment became a kind of self-fulfilling prophecy and I ranted anyway.

I'm sure I'll have more to say about this in the future, but having transcribed this here I feel it's worth expanding on it a little.

Circa 1997 I got my first Pentium-era PC, which was also my first internet-connected machine. It had a 200MHz CPU, 32MB of RAM (which was somewhat luxurious at the time, 16MB being much more common), a 3.2GB hard disk, and a 33.6Kbps dial-up connection to a local telco ISP. The internet was much less a part of our daily lives than it is today, but to a huge nerd like me it was certainly important even at the time. This comic is from a few years later, but it captures the spirit of what living with dial-up was like.

Fast forward to now. My current PC has a 3.8GHz (up to 4.4 boosted) CPU with 6 cores, 16GB of RAM, a 1TB SSD plus 4TB of spinning rust, and a 200Mbps cable internet connection. Let's review the gains:
  • CPU: rounded to 4GHz, a 20x gain (or if we count all the cores, 120x)
  • RAM: 512x
  • Disk: over 1000x in capacity, speed-wise I don't even know, but if we use bus bandwidth as a proxy, this suggests Ultra SCSI was 160Mbps vs NVMe at 32Gbps, so that would be 200x
  • Network: ~5952x, yes that's right, a nearly six thousand-fold gain
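For the skeptical, here's the back-of-the-envelope in Python:

    # 2019 box vs. 1997 box, using the specs above
    print(4e9 / 200e6)                  # CPU: 20.0 (x6 cores = 120)
    print((16 * 2**30) / (32 * 2**20))  # RAM: 512.0
    print(5e12 / 3.2e9)                 # disk capacity: 1562.5
    print(200e6 / 33.6e3)               # network: ~5952.4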
Wow! Surely in light of all this hardware improvement, the experience of using a computer, and using the internet in particular, must be much faster now, right?

See, about that...

Let's use Firefox developer tools to measure a few websites:
  • nytimes.com: 1.58MB (on the wire), 5.39MB (decompressed), 6.16s, 47 requests
  • washingtonpost.com: 4.67MB / 6.15MB, ~8s, 93 requests
  • slatestarcodex.com: 1.15MB / 1.47MB, 2.31s, 48 requests
  • marginalrevolution.com: 0.98MB / 2.47MB, 2.98s, 35 requests 
  • that Wikipedia page linked above: 219KB / 858KB, 1.71s, 22 requests
Out of this entirely non-scientific sample of two major newspapers, two of my favorite blogs, and a Wikipedia article, only Wikipedia comes out looking vaguely close to something possibly approximating fast.

It's so disappointing, so draining, I don't even have the energy to keep ranting about it. All the hard work by hardware engineers over the last two decades has been eaten away by developers making their own lives easier at the expense of their customers, and I just can't fucking stand it.

Thursday, October 3, 2019

Thursday Assorted Links 2019-10-03

https://www.theverge.com/2019/10/3/20895798/bird-scooter-fundraising-valuation-unit-economics
Bird managed to lose ~$100M in Q1 on revenue of only ~$15M. Despite that I guess they managed to raise a bunch more money. For some reason The Verge parrots (get it?) Bird's claims about the lifespan of its new scooter models. I get that you can project 15 months of life based on testing, but when the things have only been on the street for 1-4 months you need to say it's a projection.

https://www.delish.com/food-news/a29351591/mcrib-back-mcdonalds-2019/
The McRib is coming back. Does that mean Chipotle Chorizo is next? That wacky theory that McRib runs have something to do with pork futures always struck me as crazy enough to be true.

https://www.objectstyle.com/agile/why-developers-hate-agile
Assertion: Agile Process Bullshit is about legibility, in the Seeing Like A State sense.

https://doisinkidney.com/posts/2019-10-02-what-is-good-about-haskell.html
Look, I get what you're trying to do here, but it's not working. "Let's implement a basic data structure and a sorting algorithm!" is such a Haskell programmer thing to do. If I mostly write code in [general purpose language] I don't need those things, because they're in the standard library, or a popular-consensus third party library.

I think programmers frequently forget just how different other kinds of programming are from whatever it is they do. Web programming is not game programming is not systems programming is not embedded programming is not kernel programming is not [et cetera]. Speaking as a web/data guy, showing me how awesome your thing is at foundational data structures / algorithms stuff is just 100% irrelevant, because I will never implement those things myself.

Ask HN: Who Wants To Be Fired?
You think your job sucks? Reading this is gonna make you feel better.

Microsoft Surface Laptop 3 pre-order
What they want for extra RAM and disk is fucking reprehensible. It's three hundred dollars to go from 128GB to 256GB. Buying 128GB of NVMe at retail is like forty bucks. A nice 1TB NVMe drive can be had for $170. They know this. They know 8GB/128GB is for suckers, and that almost everyone will upgrade. It's Apple-style price anchoring. The margins on this must be fantastic for Microsoft.

...I may buy one anyway.

https://www.pcgamer.com/world-of-warcraft-classic-players-cant-stop-feuding-over-the-abbreviation-for-an-old-dungeon/
Every time I'm in Westfall someone is having this argument and it drives me nuts.

Introducing the blog

It's finally happening. A blog.

Expect a mix of links and rants, starting Very Soon Now.