justin.searls.co

A lot of content around here boils down to links to someplace else, for which all I have to add is a brief call-out or commentary. As such, the headlines for each of these posts link to the original source article. (If you want a permalink to my commentary, try clicking the salt shaker.)


The companies have talked about offering a combination of Paramount+ and Apple TV+ that would cost less than subscribing to both services separately, according to people familiar with the discussions. The discussions are in their early stages, and it is unclear what shape a bundle could take, they said.

I have no problem with this story (Apple News+ Link), but I do insist that any bundle containing multiple services following the "{Brand}+" convention include an additional "+" for each such service it includes. I'm willing to pay for "Apple++", but if it's called "Apple+ plus Paramount+", then no deal.

Bitcoin mines aren’t just energy-hungry, it turns out they’re thirsty, too. The water consumption tied to a single Bitcoin transaction, on average, could be enough to fill a small backyard pool, according to a new analysis. Bitcoin mines are essentially big data centers, which have become notorious for how much electricity and water they use.

The first time I read this I figured it referred to the amount of water consumption to mine a coin, as that would seem somewhat reasonable. Nope, it's the amount of water consumed to simply add a transaction to the blockchain. To think, Bitcoin is the one that coffee shops and bodegas were ostensibly accepting for everyday purchases—imagine draining a swimming pool to buy a bottle of water at a corner store!

Sheer madness.

The website's search feature is implemented by a very clever and well-engineered library called Pagefind. It is many of my favorite things: fast, small, and free of dependencies I need to worry about.

There was one thing its built-in user interface couldn't do, but in a miracle of GitHub responsiveness, Liam Bigelow responded to my feature request within an hour and shipped the feature inside a week.

Tip of the hat to Liam and his colleagues at CloudCannon. If you have a static site, I strongly encourage you to check out Pagefind for your search feature. It's free, but even if it weren't, I'd still prefer it to all of its paid competition.
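For the curious, wiring up Pagefind's default UI is about as minimal as a search integration gets. This sketch follows Pagefind's documented quick start; the `#search` element id is just a placeholder, and it assumes you've already indexed your built site with the Pagefind CLI (e.g. `npx pagefind --site <your output dir>`), which writes the `/pagefind/` assets referenced below:

```html
<link href="/pagefind/pagefind-ui.css" rel="stylesheet">
<script src="/pagefind/pagefind-ui.js"></script>

<div id="search"></div>
<script>
  // Once the page loads, mount Pagefind's prebuilt search UI
  // into the placeholder element above.
  window.addEventListener("DOMContentLoaded", () => {
    new PagefindUI({ element: "#search" });
  });
</script>
```

That's the whole integration: no server, no third-party service, just static assets generated from your own HTML.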

The Discourse has delighted in the unusual narrative that AI will only affect knowledge workers and spare physical laborers from displacement. Reality will probably be more complicated:

Ekobot AB has introduced a wheeled robot that can autonomously recognize and pluck weeds from the ground rapidly using metal fingers.

Today the story is about reducing the use of harmful herbicides, but as advances in AI software continue to be married to advances in robotics, it will be interesting to see which categories of physically laborious jobs will be impacted over the next decade.

(Worth a click just to see the video of how violent the clank of rapid steel finger snatching is, by the way.)

I survived the first half dozen rounds of ✨Web Components™✨ hype before jumping off the wagon to preserve my front-end productivity (if not dignity) somewhere around 2015. I almost didn't read this article, but I'm glad I did, because it looks like in my absence the browsers actually made Web Components a thing. A thing that you can define and build without the help of an ill-considered UI framework and without needing a compilation pipeline to convert high-fantasy JavaScript into something browsers can actually execute.

A spritely simplification of the code to Do A Component from Jake's piece:

// 1. Define it
class MyComponent extends HTMLElement {
  constructor() {
    super();
    this.shadow = this.attachShadow({ mode: "open" });
  }

  connectedCallback() {
    this.shadow.innerHTML = '<span>Hello, World!</span>';
  }
}

// 2. Wire it up
customElements.define("my-component", MyComponent);

And then drop it into your HTML:

<my-component></my-component>

And your page will print: Hello, World!

(If the above text reads "Hello, World!", that's the component working because this post actually executes that script. Go ahead and view source on it!)

Cool to see that this actually works. I was so sure that the customElements API was some bullshit polyfill that I opened Safari's console to verify that it was, indeed, real, before I continued reading the post.

Will I start using Web Components anytime soon? No. For now, I'm still perfectly happy with Hotwire sprinkles on my Rails apps. But I am glad to see that Web Components are no longer merely a pipe dream used to sell people snake oil.

Google Drive users are reporting files mysteriously disappearing from the service, with some netizens on the goliath's support forums claiming six or more months of work have unceremoniously vanished.

As somebody who for years has expressed total distrust in Google's interest (much less ability) in keeping user data available and secure, this story confirms my biases. I've been burned by so many tools at this point that I'll choose a diffable and mergeable file format that I can store and back up on my own hardware whenever feasible.

The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.

Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.

The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.

Neat.

Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.

Since the dawn of the Internet, people have been consuming innocuous content alongside not-at-all innocuous content, often for the same not-at-all innocuous purpose. Who could have possibly predicted that a naive correlation-matching algorithm might reflect this? Where's my fainting couch when I need it?

Just pushed a major update with a minor version number to my feed2gram gem, which reads an Atom/RSS feed for specially-crafted entries and posts them to Instagram. The initial 1.0 release only supported photos and carousels of photos as traditional posts, but v1.1 supports posting reels, carousels containing videos, and posting images and videos as stories.

Was it a pain in the ass to figure out how to do all this given the shoddy state of Meta's documentation? Why yes, it was, thanks for asking!

It seems to work pretty reliably, but YMMV with aspect ratios on videos. I don't think there's any rhyme or reason to which videos the API will reject when included in a carousel alongside images at different aspect ratios (unless the rhyme is "the aspect ratio must be an exact match" and the reason is "the Instagram app trims the aspect ratio of these videos on the client before uploading them", which I guess makes sense now that I type that out).

The media loves a clash of the titans narrative when it comes to the big tech companies, but the fact that they've all carved out such careful moats for themselves means none of them really compete head-on with one another. That said, the cultural competition is always fascinating to me. For example, while the broad story here is that Google is making ad blocking less accessible to Chrome users despite a pretty obvious perverse incentive to cram display ads down users' gullets, there's a beneath-the-surface contrast with Apple that's just as interesting.

I knew that the v3 manifest limited ad blockers, but I didn't realize it does so by drastically limiting the number of rules extensions can define (and, beyond that, favoring dynamic rules over static ones):

Originally, each extension could offer users a choice of 50 static rulesets, and 10 of these rulesets could be enabled simultaneously. This changes to 50 rulesets enabled simultaneously and 100 in total… Extensions could add up to 5,000 rules dynamically, which encouraged using this functionality sparingly

So going forward, Chrome extension developers will still be able to execute as much JavaScript as they want—potentially invading users' reasonable expectations of privacy and slowing their machines down—but they will be limited in how many ads and trackers they can block.
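For context, here's roughly what one of those static rules looks like. A minimal sketch of a Manifest V3 `declarativeNetRequest` ruleset file (the filter domain is a made-up placeholder), which the extension's `manifest.json` references under `declarative_net_request.rule_resources`:

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image", "xmlhttprequest"]
    }
  }
]
```

Every network request an ad blocker wants to stop has to be enumerated as a rule like this one, so caps on rulesets and rule counts are, in effect, caps on how much an extension is allowed to block.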

Compare this to how Apple's Content Blocking API ties the hands of developers to ensure users are protected from nefarious ad blockers that scrape user data or replace all the ads with their own. They do this by forcing every rule to be statically defined—the very thing Chrome is restricting now. As Andy Ibanez puts it:

At the most elemental level, content blockers are rules represented in a JSON file. Everything all the Content Blocker apps in the App Store do is create a JSON file that gets handed to Safari. You could theoretically create an extension that all it does is hand a static JSON file to Safari without writing a line of code.
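To illustrate Andy's point, a complete Safari content blocker ruleset really can be this small. A hedged sketch using the trigger/action shape Apple's Content Blocking API consumes (the `url-filter` regex and the `.sponsored-banner` selector are placeholders I made up):

```json
[
  {
    "trigger": { "url-filter": "ads\\.example\\.com" },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": ".*" },
    "action": { "type": "css-display-none", "selector": ".sponsored-banner" }
  }
]
```

Because the whole blocker is declarative data like this, Safari can evaluate it without ever executing the extension's code—which is exactly the design choice that forecloses most categories of abuse.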

So on one end, we have Chrome restricting its extension API to protect itself from lost advertising revenue at the expense of users, which will incentivize developers to implement resource-intensive workarounds at best and to publish bogus "ad blocking" extensions that do underhanded, skeevy stuff at worst. On the other end, Apple's Content Blocking API protects the company from scandal by preventing most categories of scams outright, at the expense of developers having any tools to differentiate their extensions from others, with the trade-off of speed (fast) versus effectiveness (middling) decided for users in advance.

It's interesting to think about how the structural incentives of each company lead them to approach problems like this one so differently. For Google, it's always Google first, developers second, users third. For Apple, it's always Apple first, customers second, developers third. (Which explains why so many programmers are happy Apple customers while remaining wholly uninterested in being Apple-ecosystem developers.)

A great little post on how much Google has changed over the years. When I visited for an on-site interview in 2007, my takeaway vibe was, "yeah, right, all this nonsense won't last." Of course, the satisfaction I take from correctly predicting bad things happening brings me less joy than I wish it did.

The effects of layoffs are insidious. Whereas before people might focus on the user, or at least their company, trusting that doing the right thing will eventually be rewarded even if it's not strictly part of their assigned duties, after a layoff people can no longer trust that their company has their back, and they dramatically dial back any risk-taking. Responsibilities are guarded jealously. Knowledge is hoarded, because making oneself irreplaceable is the only lever one has to protect oneself from future layoffs.

I am very proud that Test Double's leadership has designed the company around minimizing the likelihood that layoffs will ever be necessary, and that in the 12 years since we founded it, we have succeeded in avoiding them.

A recurring theme of my own career is that people generally fail to appreciate how big an impact one's emotional state has on knowledge workers. When people feel unsafe, they tend to prioritize making themselves feel safe over everything else. When work starts to feel like a slog, people move slower—that's literally what slogging means!

The best way to avoid the deleterious effects of layoffs, of course, is to do things to prevent them. Manage to profitability over revenue. Build a long runway of cash-equivalent reserves. Hire fewer full-timers and don't be shy about bringing on temporary help (which I wrote about earlier this year). Simple stuff—seemingly common sense, even—but increasingly unconventional.

"The game was to take place on a windswept foggy Scottish island. The player would be under constant attack from zombies. The player would need to use vehicles to get around but vehicles would need fuel. Acquiring the fuel would be a big part of the game."

Z was only in development for "maybe a month or so", however. According to Vermeij, the concept proved to be a bit of a downer. "The idea seemed depressing and quickly ran out of steam. Even the people who originally coined the idea lost faith."

To be fair, gray Scottish landscapes are a bit of a downer even without the zombies.

Programmers like fantasy. Artists like zombies. Not sure why that is.

The over-representation of fantasy and zombie themes in games is such a bummer. My favorite series take place in contemporary settings and eschew the supernatural—it's too bad games like that are so few and far between.

Was delighted to be interviewed by Jerod on The Changelog for what must be the third time this year. This time, we discuss our approaches to managing dependencies (an evergreen debate) before moving on to discuss the emerging POSSE trend, which I seem to have backed myself into with the updates I've made to the design of this website over the past year.

Here's an Apple Podcasts link, if that's more your thing.

One call-to-action mentioned in the episode: if you're interested in how this website or its syndication tools work, shoot me an e-mail and I'll log your interest. Once enough people ask about it, I'll figure out how to open source or otherwise chart a path for others who want to wrest back control of their work from The Platforms.

From the department of Life Comes At You Fast:

Altman holding talks with the company just a day after he was ousted indicates that OpenAI is in a state of free-fall without him. Hours after he was axed, Greg Brockman, OpenAI’s president and former board chairman, resigned, and the two have been talking to friends about starting another company. A string of senior researchers also resigned on Friday, and people close to OpenAI say more departures are in the works.

Today we learned (as if we couldn't have guessed) that Satya Nadella was furious at the news of Altman's ouster. It's not hard to imagine why: OpenAI's top talent could all be gone by Monday, start a new company they actually own shares of, and let OpenAI fall apart. If that comes to pass, something tells me the only "thermonuclear" lawsuit to be born this weekend will have been Microsoft's, in a bid to extract every last dollar it invested in OpenAI.

Anyway, you're not alone if you're feeling whiplash. Makes me wonder: if Steve Jobs had had access to our current era's instantaneous communication channels the day he was kicked out of Apple, would his acolytes have mobilized to such an extent that the board felt pressured to bring him back the very next day? (I doubt it.)

Elon Musk said on Saturday that he will file a "thermonuclear lawsuit" against non-profit watchdog Media Matters and others, as companies including Disney, Apple and IBM reportedly have paused advertising on X amid an antisemitism storm around the social media platform.

So, basically, "if your free speech on your website reduces advertising revenue on my website, then your speech must be curtailed in the interest of free speech." Got it.

He also said that "for speech to be truly free, we must also have the freedom to see or hear things that some people may consider objectionable" and added that "we will not allow agenda driven activists, or even our profits, to deter our vision."

I suppose it's still possible to view Elon as a once-in-a-generation genius if the generation you're referring to is the fascistic incel progeny of GamerGate.

In a surprising move, Apple has announced today that it will adopt the RCS (Rich Communication Services) messaging standard. The feature will launch via a software update “later next year” and bring a wide range of iMessage-style features to messaging between iPhone and Android users.

Didn't see this coming. Six years too late, but sooner than expected.

Apple's statement:

Later next year, we will be adding support for RCS Universal Profile, the standard as currently published by the GSM Association. We believe RCS Universal Profile will offer a better interoperability experience when compared to SMS or MMS. This will work alongside iMessage, which will continue to be the best and most secure messaging experience for Apple users.

This is a welcome improvement—green bubbles should turn more aquamarine, I guess—but by no means will it eradicate the lock-in effect that iMessage has in certain markets, particularly the US. When Google and others started badgering Apple to adopt RCS, they were implicitly arguing for the abandonment of iMessage, but that was never going to happen. Feature parity aside, the fact that RCS is not truly end-to-end encrypted makes it a non-starter for Apple. Besides, there is no way they'd volunteer to wait on a standards body to approve new features in the Messages app. In spite of this news, we should expect the two-class system of blue bubbles and green bubbles to remain largely unchanged—the green bubbles will just be marginally less shitty to deal with.

I suspect this is a calculated capitulation to get the EU to back off from forcing Apple to open iMessage to other platforms. I imagine Apple's argument will be straightforward: this is us adopting the modern industry standard to facilitate interoperability, and any further regulations by the EU would have no effect but to stifle innovation. Because their competitors spent all their energy lobbying for Apple adopting RCS (as opposed to opening iMessage), maybe the EU will go along with this.

One last thought: the "later next year" timeframe helps Apple's case in escaping the EU's imminent ruling on messaging platform rules, as most observers will probably fail to realize that Apple's adoption of RCS will not actually dissolve the blue-green class divide.

What is interesting from all this is how quickly it appears that Starfield has tailed down from its impressive launch, dropping down just two months after it released; by way of comparison, Skyrim’s original release had fewer players at launch in 2011, peaking at around 290,000, but didn’t fall below 20,000 players until almost seven years later in May 2018. (Two months after release, Skyrim was still on somewhere around 90,000 concurrent players on Steam.)

Keep in mind, Skyrim’s players are far likelier to be playing on Steam (both because it launched there and due to better mod support) than on the official Xbox app, where Game Pass subscribers have always had “free” access to Starfield.

That said, it’s hard to read this without thinking about the chorus of anecdotal reports I’ve heard from people who’ve played Starfield: Bethesda successfully delivered another “one of those” games, but it simply lacks the magic and awe that gave Skyrim its staying power.

Personally, I think a lot of this is due to the game’s setting. High fantasy lends itself to dramatic moments celebrated by epic orchestral movements in a way that feels tacked on when applied to a sterile sci-fi environment. Fast travel between star systems in Starfield undercuts its sense of scale to a degree that manages to make it feel smaller than Skyrim’s technically-not-quite-as-big-in-aggregate map.

These combine to rob Starfield of what industry folks call “emergent gameplay.” A random hail from a starship feels like a procedurally-generated interruption, whereas a fellow traveler approaching you along a trail at night feels like a natural coincidence or kismet. Both might be random scripted events, but only one maintains suspension of disbelief. Similarly, knowing that I could walk thirty minutes and eventually reach a mountain in the center of Skyrim’s map lends credibility to its environment, but being forced to constantly fast travel between planets shatters any illusion that Starfield players are exploring a single connected universe.

As a result, Starfield is merely a good game. It’s more Fallout than Elder Scrolls. And where Fallout oozes with charm and satire, Starfield feels flat and mundane. The premise may be more realistic, but the world feels less real as a result.

Finishing Starfield’s campaign and then deciding to continue exploring its world raises the question, “why?” It would be like checking out at Costco and then deciding to walk another lap through the store. Maybe there was something you didn’t check off your list the first time around, but one imagines you’re going to impatiently hustle to get it and get out.

While nothing has yet been confirmed, it appears people were able to figure out (or someone accidentally shared) Emergency Pizza codes that could be used over and over again by the same customer. This, obviously, isn’t how the program was intended to work.

Expensive bug.

During the M1 generation of Apple Silicon Macs, one area of controversy was whether the default 8GB of RAM was less of a performance bottleneck than it was under Intel Macs. The pundits contended that a paltry 8GB was unacceptably stingy on Apple's part, but in practice—and who's to say whether it had to do with the ARM CPU or the unified memory architecture or the blazing-fast SSD performance—no matter how much you threw at the M1 Macs, they never seemed to get bogged down by swapping out to disk.

Well, that era is apparently over, because the new M3 with 8GB is getting absolutely smoked by the same chip when it has 16GB to work with:

The 8GB model suffered double-digit losses in Cinebench benchmarks, and took several minutes longer to complete photo-merging jobs in Photoshop as well as media exports in Final Cut and Adobe Lightroom Classic.

If you don't watch the video, it's worth clicking through to MacRumors' summary for the charts alone.