justin.searls.co

I was surprised and delighted to learn that my friend Len had been invited to write an essay about Bob Chapek's ouster in the New York Times. The article serves as a great primer on some of the issues those of us who live near Disney World have been griping about for years. The whole thing is well worth reading.

His conclusion really stood out to me:

Mr. Iger is reportedly already scrutinizing the reservation system and is alarmed by the price increases his predecessor instituted. To further mend the relationship with our community, Mr. Iger should explain how Disney is going to use the revenue from upcharge programs to improve the guest experience.

If he wants to learn more, I sincerely suggest Mr. Iger try to plan, book and take a Disney World vacation on a middle-class budget, relying only on Disney’s website and app. When he’s overwhelmed by the cost and complexity, I know many fans who’d be happy to talk him through it. No charge.

In software we talk about the value of "dogfooding" an app, because it forces us to embody the persona of the user. If I, as a developer, experience any confusion, encounter any bugs, or feel any friction using the app, I can go to work and fix it. Immediately. No need to route the feedback indirectly through focus group testing, customer support, or product management.

If you're the CEO of a theme park company, it may not seem like a huge sacrifice to dogfood your product by going to a theme park to experience it as the average guest would. But as soon as the company starts down the path of selling priority access for people who can pay more (fast lanes, VIP tours, backstage entrances), you'd surely have access to those luxuries yourself—you're the CEO, after all. It would take remarkable self-restraint not to indulge in those conveniences and instead wait in full-length lines—you know, like an average guest would.

I've seen this phenomenon impact countless software teams as well. If an app features multiple differentiated pricing tiers, the experiences at the lower levels of access tend to accumulate more bugs, simply because nobody inside the company is compelled to dogfood them. When was the last time an Amazon engineer tried buying something without a Prime membership? Or a Netflix employee with a Basic subscription? Or an Apple engineer whose iCloud quota is capped at the 5GB free tier? It's no surprise that these experiences are terrible for customers, if they even work at all.

Apple's commitment to accessibility is nothing short of remarkable. They pull features all the time. They ship so many bugs that many of my friends wait months before updating their software. But nothing ever ships until every feature supports every accessibility modality.

It generates nearly zero direct revenue, but it surely makes up for it in good karma. And one reason I started taking accessibility seriously as a developer was having a blind friend show me how magical his iPhone 4 was back in 2011. It didn't just set a high standard for excellence, it expanded my understanding of what was even possible.

Now, why should we bring back that artisan, hand-crafted Web? Oh, I don’t know. Wouldn’t it be nice to have a site that’s not run by an amoral billionaire chaos engine, or algorithmically designed to keep you doomscrolling in a state of fear and anger, or is essentially spyware for governments and/or corporations? Wouldn’t it be nice not to have ads shoved in your face every time you open an app to see what your friends are up to? Wouldn’t it be nice to know that when your friends post something, you’ll actually see it without a social media platform deciding whether to shove it down your feed and pump that feed full of stuff you didn’t ask for?

Wouldn’t that be great?

Few endeavors have felt so immediately "right" as investing in an overhaul of this site and its RSS (well, Atom) feed last week. Looking back, the time in my life that I got the most out of the Internet and put the most back onto it was 1997-2009.

Whatever pulled me away in the years since didn't leave much of an impression beyond my frayed dopamine pathways and a thumb always anxious to scroll up to refresh.

Hard not to conclude that reading and writing blogs is better for the mind than scrolling social media timelines.

An updating monorepo full of self-hostable Open Source fonts bundled into individual NPM packages!

I just stumbled across Fontsource for the first time and it's brilliant. And not because Fontsource provides developers a way to import free fonts, pin them to a particular version range, and host them along with the rest of their applications, but for the simple reason that they make it easy to find, filter, and test countless fonts in a web UI that's free of ads, marketing, and visual clutter.

I literally clicked through 228 handwriting-style fonts today before settling on Handlee for a new project. It feels good to finally have a free font site I can recommend without reservation.

Luthen is the most nuanced character Star Wars has ever had. He has all the gravitas of the living myths like Luke Skywalker and Han Solo, all the convictions of Qui-Gon Jinn, all the complications and commitment of Obi-Wan Kenobi, all the showmanship of Kylo Ren, all the cleverness of Leia Organa, and deeper, more human flaws than anyone the series has ever seen. In the capable hands of Andor creator Tony Gilroy and Skarsgård, Luthen is the kind of complicated, thorny, fascinating character Star Wars just never seemed built to contain.

One of my favorite things about Tony Gilroy's Andor series was that by telling a story that doesn't incorporate the Jedi, the Sith, or the Force—and thereby avoiding Star Wars' traditional, simplistic narrative arc of Very Obviously Good overcomes Very Obviously Evil—it created room for characters to react realistically to the circumstance of a fascist, bureaucratic empire encroaching on their daily existence.

If viewed as a role-playing game, the "smuggler" has long been designated by Lucas and Disney as the third playable class after the rebels and imperials, but until recently they've been relegated to comic relief and MacGuffin couriers. And we'd probably never have seen this kind of believable character development if the final Skywalker trilogy hadn't ballooned into such a sloppy and overblown mess. I guess we have JJ Abrams to thank.

I was grateful to be hosted on the Season 1 finale of Matt Swanson's new podcast, YAGNI. It's an interview show questioning whether widely popular tools and practices are really as worthwhile as people think—as the eponymous agile acronym ("You Ain't Gonna Need It") suggests.

In this episode I share some history about what was going on at the time RSpec rose to prominence, why its continued dominance in the Ruby community is at odds with the declining relevance of the ideas that begat it, and why I personally stopped using and promoting RSpec in the mid-2010s.

I hope you enjoy it!

I'm genuinely excited about this site's rewrite. I thought it'd be fun to make a video to explain my thought process about the redesign—both why I'm excited about finally publishing a linklog/linkblog/microblog and how all these various tools plug together to enable a (mostly) painless continuous deployment pipeline.

I hope you enjoy it!

There's something new coming in Ruby 3.2 this Christmas that I can't wait to start using in all my projects:

Ruby 3.1 [sic] adds a new core class called Data to represent simple immutable value objects. The Data class helps define simple classes for value-alike objects that can be extended with custom methods.

While the Data class is not meant to be used directly, it can be used as a base class for creating custom value objects. The Data class is similar to Struct, the key difference being that it is immutable.

It'd be easy to look at this and conclude that this is just the same Struct we've been using for decades, with the added constraint that it's immutable. On the other hand, that's a pretty important constraint!

But if you read the pull request, there are some serious quality of life improvements over Struct, like built-in translation of positional arguments to keyword arguments, which allows for easy-to-define default values:

Measure = Data.define(:amount, :unit)

Measure.new(1, 'km') # => OK
Measure.new(amount: 1, unit: 'km') # => OK

Measure = Data.define(:amount, :unit) do
  def initialize(amount: 0, unit: 'cm')
    super
  end

  # imagine other elucidative methods here
  def metric?
    # …
  end
end
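
Put together (a quick sketch based on the pull request's description; the exact inspect output may differ), those keyword defaults mean callers can omit arguments entirely, and positional arguments still get translated to keywords before they reach initialize:

Measure.new              # => #<data Measure amount=0, unit="cm">
Measure.new(amount: 5)   # => #<data Measure amount=5, unit="cm">
Measure.new(3, 'km')     # positional args become amount: 3, unit: 'km'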

This is great news for people who go out of their way to separate their code into two categories: units that implement feature logic and things that represent values. The units implementing application behavior have no state and the values they receive and return are nothing but state. Adopting this approach rigorously transformed my programming practice, allowing for clearer thinking and making progress more predictable.
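
As a concrete sketch of that separation (entirely hypothetical—the ConvertsToMeters unit and its conversion factors are my own illustration, not from the post), reusing the Measure value from above: the behavior lives in a stateless unit whose only job is to take values in and hand new values back.

# A "unit" of behavior: holds no instance state, just logic over values
class ConvertsToMeters
  FACTORS = { 'cm' => 0.01, 'm' => 1.0, 'km' => 1000.0 }.freeze

  def call(measure)
    Measure.new(amount: measure.amount * FACTORS.fetch(measure.unit), unit: 'm')
  end
end

ConvertsToMeters.new.call(Measure.new(amount: 5, unit: 'km'))
# => #<data Measure amount=5000.0, unit="m">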

It's exciting to see Ruby core continue to make consistent iterative progress year after year!

I've been a VR gaming enthusiast for years (having owned the Rift DK1, the HTC Vive and Vive Pro, and the Valve Index before settling on a Quest 2), so I preordered the Meta Quest Pro to see what all the fuss was about.

The Quest Pro’s resolution is 1800 x 1920 pixels per eye, roughly the same as the Quest 2’s 1832 x 1920 pixels. In theory, it provides better contrast and a very slightly higher pixel density per eye, but comparing both devices head-to-head, I was hard-pressed to tell the difference. It’s still grainy enough that images look all right, but small text is fuzzy.

I, for one, decided to return this thing within 5 minutes of unboxing it.

The $1500 Quest Pro makes trade-offs that add up to a significantly worse headset than its (until recently) $299 predecessor. The Pro's pancake lenses improve the field-of-view slightly, but they also magnify each pixel more, reducing the sharpness of the image. Placing the battery on the back helps balance the Pro's weight distribution, but it also forecloses the possibility of 3rd-party straps—which matters, because the Pro is much less comfortable than my Quest 2 with this excellent $35 strap. The Pro's open design (it barely obstructs the user's peripheral vision) makes it a statement piece that VR doesn't have to be antisocial, but its "wings" let in so much ambient light that it makes most games instantly nausea-inducing.

Do not buy the Meta Quest Pro.

Worlds and Workrooms are available for the Quest 2 and Quest Pro alike, but Workrooms is particularly aimed at Pro users. And—there’s just no nice way to put it—it’s one of the worst apps I’ve ever used.

This is what really kills me about this product, though. The hardware is pitched as one's entry point to "the metaverse", but there is no metaverse! Just a couple broken apps. They're so bad that management can't even force the programmers making the apps to try using them.

Superficially, sure, Horizon Worlds is a worse version of RecRoom and Horizon Workrooms is a much worse version of BigScreen VR. On a deeper level, though, this failure is emblematic of what large companies often get wrong when they undertake greenfield software projects. They dream big, staff big, and then start building.

Ready. Fire. Aim.

One way I think about this is that almost every stimulus that a very small team building a very small thing encounters amounts to direct product feedback in some form or another. Even if the team members themselves are the only users, the feedback loop couldn't be tighter. Change a thing. Try it out. Repeat.

The larger a human organization surrounding a product grows—especially when it outpaces the maturation of the product—the less attention will be paid to the product itself. That attention will instead be diverted to the superlinearly-growing needs of all the humans (logistics, consensus-building, rework) and the affordances they demand of the product to accommodate so many people (modularized design patterns, internal tooling, service orchestration).

Big teams don't result in successful products, but successful products sometimes result in big teams.

Connecting a gaming PC to Apple Studio Display

…You're right, it shouldn't be this hard

I'll never forget when I bought the first 5K Retina iMac. Almost as soon as I ripped it out of the box, I booted it while holding down Command-F2, assuming it would support Target Display Mode, with the intention of using its one-of-a-kind display with my gaming PC. I was heartbroken when Mac OS X booted anyway and I slowly realized that Target Display Mode hadn't survived the transition to retina resolutions. And it never came back, either. (I haven't really been happy with my setup ever since.)

Well, here we are, 8 years later and Apple has introduced the 5K Studio Display. I ordered one the minute that they hit the store in the hope I would receive what I thought I had purchased in 2014: a single 5K Apple display that could drive both a Mac and a PC desktop. (Nevermind the fact that it's damn near the exact same panel that I bought 8 years ago.) When my Studio Display arrived, I tore it out of its environmentally-friendly origami box and excitedly plugged it into one of my Nvidia RTX 3090's DisplayPorts with a DisplayPort-to-USB-C cable.

I booted up the PC: nothing happened.

Okay, I'm interested…

Cramming a gaming GPU into your MacBook Pro

…without actually doing that

How we got here

After Apple released its (soon-to-be) previous generation Mac Pro, it probably didn't take long for them to realize they had a trash can fire on their hands, especially with regard to GPU performance. When Apple announced eGPU support for macOS in 2017's High Sierra release, it was hard to see the announcement as anything more than an admission that Apple's top-of-the-line desktops and notebooks shipped with subpar GPUs due to their severe thermal constraints. Of course, because Apple has never considered AAA gaming to be an important function of its products, the Mac has always lagged behind Windows in GPU availability and support. But since 2017 (and until the new Mac Pro tower ships this fall), the situation has been especially grim: even for workstation tasks like video encoding and 3D modeling, the internal GPUs Apple has been selling are so bad that they're driving a nontrivial number of creative professionals—a market Apple actually does care about—off its platform.

The world may be excited to close the door on the ill-conceived trash can Mac Pro, but if it hadn't been for its glaring design flaws, Apple and Intel probably wouldn't have prioritized the engineering needed to make running an eGPU over Thunderbolt 3 a commercial reality.

Keep reading…

You make less money than you used to. Blame your iPhone.

For years, economists have been puzzling over why, despite unprecedented technological innovation since the dawn of the Internet, productivity is flat. Really, nobody seems to know why! Look no further than this week's news to find a consensus opinion that the just-around-the-corner cure for lagging productivity numbers is—wait for it—more technological innovation.

Productivity is a curiously-named economic measure that essentially boils down to "amount of money you generate for your employer over time." And because the promise of most technology is to enable people to do work faster, we should expect technology's useful impact to be measurable, even with an (oversimplified) equation like Labor + Technology = Productivity.

But something has clearly gone wrong. If we work backwards, we already know productivity is flat. And we are equally certain that technology has improved over the last twenty years. That leaves just one variable for which a negative value could explain the productivity gap: maybe we're literally doing less useful work every day. Reflecting on my own experience, I'd go a step further and ask, what if recent technological advances are actually decreasing our productivity?

You'll never guess what happens next…

10 Rules

Here it is, the post that enumerates all of the ways in which remote work has turned me into a total weirdo.

For almost a decade now I've been working from home, enjoying the unusual freedoms—and anxieties—that it brings. If a single theme has emerged, it's this: by default, I'm an undisciplined mess. When given the choice between short-term distractions and long-term goals, I'll take the passing hit of instant gratification every time. (This paragraph took me ten minutes to write because I was text messaging back-and-forth with @hone02.)

Only one thing can overcome my lack of self-control: replacing all my good intentions with hard and fast rules, then sticking to them so rigidly that my constant fear of failure will inadvertently be put to productive use.

But wait, there's more…

Giving the iPad a full-time job

[A translation of this post is available in Chinese and in Spanish]

Programmers often describe their ideal tools with adjectives like "powerful", "feature-rich", and "highly-configurable". Few users are seen as wanting more from their computers than programmers.

This popular notion agrees with our general intuition that more capability intrinsically yields greater productivity. My lived experience suggests, however, that while capability is a prerequisite for productivity, the two hardly share a linear correlation. A dozen ways to do the same thing just results in time-wasting analysis paralysis. Apps packed with features to cover every conceivable need will slowly crowd out the tool's primary use. Every extra configuration option that I delight in tweaking is another if-else branch in the system, requiring its developers to test more and change less, slowing the pace of innovation.

You'll never guess what happens next…

My counter-cultural iOS 11 wish list

In seven days, it will have been ten years since the keynote to end all keynotes.

A decade on, however, I'm uncomfortable with declarations that the iPhone and iOS are "mature." Unlike other mature platforms that became commoditized and absorbed into daily life, smartphones have not receded into the periphery of our attention. If anything, the entire planet is more glued to its phones today than ever before. "Mature" is code for "set in stone", and it's my view that any device able to capture the attention of the human race as thoroughly as the iPhone-era smartphone demands continuous reinvestment. Past decisions should be constantly reconsidered.

And after a year like 2016, I like to hope that Apple is reconsidering some of the decisions they've made about how their platforms have influenced life on earth, including our politics.

Keep reading…

Warm Takes on Microsoft's Surface Pro 4

First, some background

I embarked on a spiritual journey this week to answer these questions: do all the developers I see switching (or threatening to switch) to Windows see something that I don't? Has Microsoft actually turned the Windows user experience around? Is the combination of Microsoft's hardware & software superior to Apple's on the Mac?

At the heart of these questions: hope. Hope that people have placed in Microsoft's embrace of open source, its in-house hardware design, and its broadened cross-platform support. My suspicion at the outset of this experiment was that the recent, near-universal praise of Windows is, instead, mostly hype—fueled in large part by a general frustration that Apple has been the only serious contender for developer mindshare for over a decade. Most of the people I know switching to Windows of late are furious about Apple's apparent product direction, and I'm biased to think their praise of Windows represents a sort of motivated reasoning. But I can't test my bias without seeing Windows appraised by someone who, like me, is a genuine fan of Apple's products… so, reluctantly, I figured I'd do it myself.

So, last week, I bought a Surface Pro 4 from the local Microsoft Store and put it through its paces on my own terms and carrying my own biases. I'm not a dispassionate reviewer, poring over each feature checkbox and assessing its objective quality. I'm just someone who really likes using iOS and macOS, but is interested in challenging Microsoft's products by asking them to "make me switch." These are my initial notes.

You'll never guess what happens next…

Registering a Microsoft Surface Pro 4

I'll have a lot more to say about my experiments in trying out Windows over the coming days, but as a special Christmas bonus edition, I thought I'd share the steps that were apparently required for me to register my Surface Pro 4 with Microsoft.

As I got in bed last night, I activated tablet mode for the first time and while perusing the don't-call-it-Metro tile page, I saw an app called "Surface". I have one of those, I thought, I should tap that!

At first blush, the purpose of the app is to introduce you to the Surface's features, process device registration, solicit customer feedback, and so forth. The first thing the app asks of its users is to register the Surface device for benefits that include requesting (and cancelling!) hardware service. Since part of my aforementioned experiment is to begrudgingly click "yes" to every asinine pop-up and prompt the operating system throws at me, I decided to go ahead and register the device.

You'll never guess what happens next…

Giving Windows a Chance

I bought a Surface Pro 4 tonight. Over the next week I'm going to share my notes on getting used to it. And over the next month I'm going to attempt to do all of my open source work from it. But tonight, I want to first comment on the state of the Mac and PC so that I can get my own initial perspective and biases on the table.

Subsequent posts in this series include: this and this.

Keep reading…

How-to: Thanksgiving for Millennials

My wife & I have been returning to my parents' house for Thanksgiving every year since we graduated college. Customarily, people stop schlepping home to exploit the free labor of their parents' holiday cheer around the time they have kids—but for a growing number of millennials who aren't interested in bringing children into the world, this presents a dilemma: do I expect my mom to cook me turkeys until she is physically unable?

Short answer: mostly, sure. [Love you, mom!]

Longer answer: this year in particular, having logged over 25 weeks traveling, I was eager to stay home for the holidays for once. I still wanted a traditionally large and wasteful Thanksgiving feast, but I just didn't have it in me to drive three hours for one.

To be continued…