justin.searls.co

Great reader e-mail over at TPM:

Musk’s behavior has been atrocious. But he shouldn’t have been allowed to be in that position in the first place. That’s on the Pentagon and the US government more generally. In the first rush of enthusiasm and support for Ukraine, Musk shipped a bunch of free Starlink devices to Ukraine and agreed to cover the cost of the service. Later when he cooled on Ukraine he started threatening to shut the service off if the Pentagon didn’t pick up the tab. That’s standard mercurial behavior from Musk. But of course the Pentagon and more broadly the US should be picking up the tab. Much as I loathe the person Musk has turned out to be, I remember thinking at the time, how can this even be a question? Of course they should pick up the tab. The idea that we’d leave it to the whim of someone like Musk to be covering the cost of mission-critical technology for an ally at war is crazy.

This was my thought at the time. That Musk's offer to blanket Ukraine with Starlink satellites and terminals was "free as in blackmail", especially if it bypassed (what would have surely been expedited) defense procurement processes that would have mandated their availability and security.

Now Musk has half the US government over a barrel, with no real way out until credible competitors to SpaceX emerge.

DeSantis's campaign manager, emphasis mine:

“Iowa is a real state for us because of its education — it’s a highly educated state — because of income, because of Bible reading,” said Jeff Roe, in audio obtained by POLITICO. “New Hampshire is a terrible state for Donald Trump. That’s a terrible state for him. He’s gonna get like 28 percent. Now there is more people who will have a slice of that and some people are just betting on New Hampshire overall. But he’s going to lose the first two states. We’re going to beat him in Iowa.”

When you're a conservative who doesn't believe in school and for whom universities are bastions of woke ideology, the two hallmarks of the highly educated are apparently income and Bible reading. TIL.

This essay almost exactly mirrors my feelings about the AI innovations we've seen spring forth over the last year. If you know me at all, you know that I've made my career by sticking my head in the sand and ignoring new technology trends in favor of following the market's fundamentals, even when I'm very excited about those innovations personally. (It's why I'm still making full-stack web apps instead of building everything with native Apple SDKs.)

That said, I've been dismayed to see so many of my friends who reside along the same pessimistic-bordering-on-cynical gradient continue to be stubbornly dismissive of AI as just another fad. This isn't crypto. The real-world economic impact of only the most basic tools has already been profound, and humans are nowhere close to catching up with its implications for a huge swath of jobs in the knowledge economy.

Sure. Few claim that LLMs possess human-like intelligence, “think” like humans, or exhibit self-awareness. Then again, there are also schools of thought that argue that humans are also just glorified response generators. Regardless of philosophical dispositions, it is important to note that a large number of white collar jobs on top of which economies are built involve just reasonable amounts of comprehension, bounded decision making, and text generation—paper pushing, “code monkeying” and whatnot. Really though, probabilistic text generation that maintains context in numerous computer and human languages while being meaningful, useful, and conversational, at the same time exhibiting at least an illusion of reason, in what world is that a trivial achievement ought to be dismissed with a “just”!? Those goal posts have shifted hilariously fast.

Ever since my post about AI and jobs in March, I have felt my take was overly optimistic. The obvious limitations of the tools we see today (e.g. LLM hallucination) do indeed limit the practical application of AI, but the potential for composability to address these concerns (e.g. a supervisory model that audits an LLM's output and corrects its hallucinations) is sobering, and should distress anyone who would prefer that AI didn't devour the middle-class economy.

The rumors in the run-up to the iPhone 15 have been particularly maddening when it comes to the inevitable switch to the USB-C connector.

Speculation therefore remains rife about the USB-C port capabilities of the iPhone 15 lineup, and nothing is certain beyond the switch from Lightning. Rumors suggest the cables supplied in iPhone 15 boxes are limited to USB 2.0 data transfer speeds at a rate of 480 Mbps, which is the same as Lightning.

In contrast, the iPhone 15 Pro models are expected to be supplied with cables capable of USB 3.2 or Thunderbolt 3 transfer speeds.

Here's what I would have predicted a year ago: iPhone 15 would get a USB-C connector with USB data speeds and fast-ish charging speeds, and the Pro models would get a Thunderbolt/USB 4 port with typical transfer speeds and slightly faster-than-the-15-but-nothing-like-a-MacBook-fast charging speeds.

Simple, straightforward, consistent with other products as well as Apple's strategy of widening the segmentation between the phone lines.

The rumors have confirmed this at every step, if that's what you'd been expecting. But as far as I can tell, everyone reporting on Apple rumors seems to be befuddled. Last week it was Thunderbolt 3, which has effectively been discontinued across the rest of Apple's lineup. I realize they need to put clicks on the table like everyone else, but they almost seem intentionally dense on this one.

Vision Pro has its work cut out for it

I’m as excited for using Vision Pro as a remote display for my Mac as anybody, but the fact that neither of my brand new, clean install Macs can initiate screen sharing reliably doesn’t bode well that it’ll somehow work better in a VR headset.

Love too set up a new iPhone

I love the iOS Photos widget

Every time I unlock my phone I'm greeted with an oddball memory. If you don't have the Photos widget on your first homescreen, you really should.

I got a chance to sit down with the Changelog crew for a second time this year. This time it was to discuss the provocative blog post I wrote last month. It also featured my colleague Landon, who got to represent The Youth in tech today.

It was a great discussion. It's not technical and doesn't require much prior context to follow. I'm also always blown away by how much better the Changelog's audio engineering is than that of any other programming podcast I've heard. Anyone who can make me able to listen to myself is working some kind of magic.

If you use Apple Podcasts (as I do), here's the direct link.

The iTunes revolution of selling individual songs for 99¢ was something I fought against at the time, because I conceived of my favorite albums as complete, integrated works. I worried decoupling the song from the album would optimize the industry to pump out ever-more-saccharine pop hooks.

It’s interesting now, looking back and seeing the thread connecting the 99¢ song to infinitely-scrolling algorithmic video feeds as the logical endgame. Back then, I could never understand why someone would want to buy a single song à la carte, and today I can't get my head around the appeal of TikTok or Instagram Reels. 🤷‍♂️

Your PS5 analog sticks are melting!

UPDATE: A few folks told me that WD-40 works to resolve this and… they were kind of right! Just buy a pen applicator and dab it on.

Years ago, I unpacked an old PS3 and was disgusted to realize that the controllers’ analog sticks had all melted into gooey, sticky black blobs. I wondered if they’d gotten too hot at some point, somehow, but it turns out that the low-grade rubber and oil Sony used just inevitably denatures over time. I figured they’d have fixed this in later generations, but it turns out they haven’t! Both my PS5 controllers are now completely unusable after sitting idle for six months.

Entropy always wins… neat!

A friend mentioned the Sumitomo corporation yesterday and it prompted a conversation about Japanese-style conglomerates and how they've gone out of fashion in the West.

One thing that never came into fashion in the West? Educational comic books about your conglomerate that you link off your corporate home page.

132 pages! Things really start cooking about 12 pages in.

Not Delivered (loljk)

Neat when a messaging app incorporates a payment system. What could go wrong.

Incredibly relatable content in this provocative post by Thorsten Ball:

Many times in your day-to-day programming life you have to wait. Wait for your development environment to boot up, wait for the formatting-on-save command to finish, wait for the website you just opened to load, wait for tests to run, wait for the CI build to finish.

The waiting doesn’t really cause me physical pain, but it does evoke a physical reaction alright. I just can’t stand it. Maybe it’s because I’m impatient by nature. Maybe it’s knowing that things could be faster that causes it. When I have to wait ten seconds for a test to finish that I plan to run many times over the next hour, I tell you, it feels as if I’m about to lose my mind.

"Impatience" has been considered a virtue in software for literal decades, because nearly every single action a programmer redounds to a call-and-response with a computer that can't be considered complete until the computer has delivered its result and a human has interpreted it.

Imagine yourself texting with someone. If the other party replies quickly, it will promote focus and secure your attention—you'll stare at your phone and reply promptly as well. If many seconds or minutes go by between responses, however, you'll rationally lose interest, go do something else, and return to the conversation whenever you happen to come back to it. Most importantly, a fast-paced chat results in many more total messages exchanged than a slow-paced conversation, because time is stubbornly finite.

No one has any problem conceptualizing the above, but perhaps because we tend not to conceive of programming as a two-way conversation between a human and a computer, developers often lack a keen sense of this issue's salience.

I should see more people wince when a website takes longer than two seconds to load. There are very few reasons most websites should take long to load. Yet many times when, together with colleagues, I’d watch a website that we built load for longer than two seconds and say “something’s off, I bet there’s an N+1 query here” and turn out to be right – nobody else noticed anything.
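For anyone unfamiliar with the "N+1 query" Ball mentions: it's the pattern where code runs one query to fetch a list, then a separate query for each item in it. Here's a minimal sketch of the pattern and its fix, using Python's sqlite3 and made-up tables (not code from his post or any site mentioned here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ann'), (2, 'Ben');
    INSERT INTO posts VALUES (1, 1, 'Hello'), (2, 1, 'Again'), (3, 2, 'Hi');
""")

# The N+1 pattern: one query for the list, then one more per row.
# With N authors, this fires N + 1 queries -- unnoticeable in
# development with two rows, seconds of overhead with thousands.
titles = {}
for author_id, name in conn.execute("SELECT id, name FROM authors"):
    titles[name] = [
        t for (t,) in conn.execute(
            "SELECT title FROM posts WHERE author_id = ?", (author_id,))
    ]

# The fix: a single joined query that fetches everything at once.
rows = conn.execute("""
    SELECT authors.name, posts.title
    FROM authors JOIN posts ON posts.author_id = authors.id
""").fetchall()
```

ORMs make the slow version dangerously easy to write by accident, which is why a page quietly crossing the two-second mark is so often this exact bug.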

Over the course of my career, very few programmers have seemed as constitutionally impatient as I am with slow computer responses. I've only become more radical in my impatience over time, as my understanding of programming as a two-way "conversation" has deepened.

Here's one way to think about it.

The upper bound of a programmer's productivity is the speed, fidelity, and correctness of the answers they're able to extract from each "feedback loop" they complete with a computer:

  • Speed: if your page is slow to load, you can't refresh it as many times in a given working session, so you can't iterate on it quickly
  • Fidelity: if you run a command that pulls down far too much or far too little information to answer your question, you'll spend additional time parsing and interpreting its results
  • Correctness: if you have the wrong question in mind, you'll run the wrong commands, and you'll probably also waste feedback cycles to ask the wrong follow-up questions, too

I wrote a click-bait title referencing 10x developers a couple weeks ago. That post was careful to minimize value judgments and to avoid venturing into offering advice. Well, if you want some advice, here you go: take to heart the compounding nature of the effects that feedback loops have on productivity, and you'll set yourself apart as a programmer.

To illustrate, compare the potential productivity of two programmers, given a script to compute the upper bound of activities performed in the 480 minutes that comprise an 8-hour workday.

  • Programmer A completes one feedback loop with their computer every 45 seconds. 1 in 10 of their commands asks the wrong question, causing the next 5 questions to be wrong as well. 1 in 3 of their commands produces low-fidelity results that take 5 minutes to interpret. They complete 85 productive feedback loops per day.
  • Programmer B completes one feedback loop with their computer every 15 seconds. 1 in 25 of their commands asks the wrong question, causing the next 3 questions to be wrong as well. 1 in 10 of their commands produces low-fidelity results that take 2 minutes to interpret. They complete 902 productive feedback loops per day.

85 vs 902. There you go, a 10x difference in productivity.

It would be very fair to quibble over which numbers to measure, whether the numbers I chose are feasible, and so forth. This is only meant to illustrate that the difference between waiting a few hundred milliseconds versus a few seconds versus multiple minutes really adds up, especially when you factor in that expertise and focus can be learned and practiced to get better at asking the right questions and maintaining a clear mindset.
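If you want to quibble concretely, here's one way such a script might look. To be clear, this is my reconstruction, not the original: it charges each loop its base duration plus the average interpretation penalty, then discounts the loops wasted on wrong questions and their follow-ups. The exact outputs (79 and 895 here) differ from the 85 and 902 above because the modeling choices differ, but the order-of-magnitude gap survives any reasonable set of assumptions:

```python
# Back-of-the-envelope model of the two programmers' workdays.
WORKDAY_S = 480 * 60  # 8-hour workday in seconds

def productive_loops(loop_s, wrong_rate, cascade, lofi_rate, lofi_penalty_s):
    # Average cost of one loop: base duration plus the expected
    # extra time spent interpreting low-fidelity results.
    avg_cost_s = loop_s + lofi_rate * lofi_penalty_s
    total = int(WORKDAY_S // avg_cost_s)
    # A wrong question wastes itself plus the next `cascade` loops.
    wasted_fraction = wrong_rate * (1 + cascade)
    return int(total * (1 - wasted_fraction))

a = productive_loops(45, 1 / 10, 5, 1 / 3, 5 * 60)   # Programmer A
b = productive_loops(15, 1 / 25, 3, 1 / 10, 2 * 60)  # Programmer B
print(a, b, round(b / a, 1))  # 79 895 11.3
```

Notice that the loop duration itself is the biggest lever: shrinking it from 45 to 15 seconds roughly triples the budget before the error rates even come into play.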

Something else this silly script doesn't capture is the human element of what it feels like to be frequently kept waiting. Beyond a certain point, people develop the habit of tabbing away to more responsive user interfaces like Slack or social media, resulting in minutes lost to distraction and minutes more to their attention residue. There are also reinforcing social effects of working in an organization where these phenomena are normalized, which people rarely consider.

If I could go back and change one thing about how I learned to program, it would have been to emphasize the importance of internalizing this lesson and seizing control of the feedback loop between myself and my computer. I've been beating this drum for a while (and it was the primary thrust of my RailsConf 2017 keynote), but it still doesn't feel like the industry is getting any closer to acknowledging its importance or applying it to how we teach people programming, manage programmers, and design systems.

Perks of working at Google in 2007:

“Let me pull this up because there are so many,” he says. When his computer produces a list a moment later, Kallayil makes his way down the screen and continues: “The free gourmet food, because that’s a daily necessity. Breakfast, lunch and dinner I eat at Google. The next one is the fitness center, the 24-hour gym with weights. And there are yoga classes.”

There is a pause before he adds that he also enjoys the speaker series, the in-house doctor, the nutritionist, the dry cleaners and the massage service. He has not used the personal trainer, the swimming pool and the spa — at least not yet, anyway. Nor has he commuted to and from the office on the high-tech, wi-fi equipped, bio-diesel shuttle bus that Google provides for employees, but that is only because he lives nearby and can drive without worrying about a long commute.

Let's check in on how 2023's going:

Being banned from the entire Internet would be tough, but Googlers in the high-security program will still get access to "Google-owned websites," which is actually quite a bit of the Internet. Google Search would be useless, but you could probably live a pretty good Internet life, writing documents, sending emails, taking notes, chatting with people, and watching YouTube.

Somewhere along the way—even by the time I visited the Googleplex in 2007—Google lost their way on this. Employment policies that promote people's autonomy and agency can pay companies like Google massive dividends in increased creativity, productivity, and loyalty. But perks that attempt to squeeze blood from the stone by making the office feel more luxurious than home always obfuscate the nature of the work itself and will inevitably distract everyone involved—a recipe for resentment between workers and management the minute the going gets tough.

Now that the going's gotten tough, it's too bad Google could never tell the difference between the two.

Maybe Sorkin has a "The Neural Network" treatment left in him.

Stability AI is being sued by a co-founder, who claims he was deceived into selling his 15% stake in one of the hottest startups in the sector for $100 to CEO Emad Mostaque, months before the company raised millions at a $1 billion valuation.

Should've run the contract terms through ChatGPT for a summary first.

The iPad's relative uselessness for getting real work done is probably one reason I get so much value out of it. When I'm on my iPad, there's not much I can do but think through, sketch out, and plan my work—whether that's with the Pencil in Notes or organizing my to-do items in Things.

I rarely do these things as diligently on my Mac, because it's so much easier to stick my head in the digital sand and bury myself in whatever work is right in front of me.

Not exactly a ringing endorsement of iPad, but it is genuinely useful.

Gotta appreciate ingenuity on the platforms when you see it. Because this Amazon seller's name is "Shopping cart", what a user sees before adding an item to their cart is:

Ships from Amazon
Sold by Shopping cart

Almost got me.

Your new Mac comes with an Accessory Kit

If you've been ordering Macs online since 2004 like I have, the lineage of technical and marketing decisions behind what's going on in Apple's store makes this make sense.

Back in the day, Macs actually came with a handful of necessary accessories. My G4 iBook came with several, I think. I know I had to install the AirPort Extreme card under the keyboard myself, for some reason.

But the Mac Studio's "Accessory Kit" is literally a power cable. That might be a little generous.

This is worth a read. If you've been harboring any illusions that machine learning and AI are cleanroom scientific breakthroughs, this should dispel it.

There are people classifying the emotional content of TikTok videos, new variants of email spam, and the precise sexual provocativeness of online ads. Others are looking at credit-card transactions and figuring out what sort of purchase they relate to or checking e-commerce recommendations and deciding whether that shirt is really something you might like after buying that other shirt. Humans are correcting customer-service chatbots, listening to Alexa requests, and categorizing the emotions of people on video calls. They are labeling food so that smart refrigerators don’t get confused by new packaging, checking automated security cameras before sounding alarms, and identifying corn for baffled autonomous tractors.

If you sit with the thought that AI models are only valuable when they're provided with painstaking and voluminous feedback from poorly paid workers, the associated "intelligence" begins to evoke thoughts of the mechanical Turk (the one from history, not the Amazon product).