
In 2011, the same month Todd and I decided to start Test Double, Steve Jobs died, and we both happened to watch his incredible 2005 Stanford commencement speech. Among the flurry of remembrances and articles being posted at the time, the video of this speech in particular broke through and became the lodestone for those moved by his passing.

The humble "just three stories" structure, the ephemera described in Isaacson's book, and the folklore about Steve's brooding in the run-up to the speech became almost as powerful as his actual words. The fact that Jobs, the ruthlessly focused product visionary and unflinching pitchman, was himself incredibly nervous about this speech might be the most humanizing thing any of us have ever heard about him.

Well, it's been twenty years, and the Steve Jobs Archive has written something of a coda on it. They've also released the e-mails Steve wrote to himself in lieu of proper notes (perhaps the second-most humanizing thing), and they've spruced up and remastered the video of the speech itself on YouTube.

Looking through his e-mails, I found I actually prefer this draft phrasing on the relieving clarity of our impending demise:

The most important thing I've ever encountered to help me make big choices is to remember that I'll be dead soon.

In 2011, Todd and I ran out of good reasons not to take the leap and do what we could to make some small difference in how people wrote software. In 2025, I believe we're now at an inflection point that we haven't seen since then. If you can see a path forward to meet this moment and make a meaningful impact, do it. Don't worry, you'll be dead soon.

I've never regretted failing to succeed; I've only regretted failing to try.

If you just read this month's newsletter, you might have gotten the impression that, whoa, it's really hard as a foreigner to buy property in Japan. And the fact that it took me over a month, mostly on-site, to enter into a contract to buy a condo in cash should serve as ample evidence of that.

However, multiple seemingly conflicting things can be true at once, and Bloomberg's Gearoid Reidy calls out several great points in a saucy column (archive link) which he wrote after I got myself into this mess:

But increasingly, the spotlight is falling on foreign buyers, particularly wealthy Chinese, seeking a safe place for their capital and drawn by Japan's political stability and social safety net. Lawmakers and commentators have been raising the lack of restrictions on property in parliament in recent weeks, as well as in the media. Former international soccer-star-turned-investor Keisuke Honda summed up what many think when he recently tweeted that he thought foreigners should not be allowed to buy land here.

Japan wouldn't be alone in seeing foreign non-residents snap up a bunch of attractive real estate—whether to park capital in a stable economy or to exploit increased tourism by flooding the zone with cheap Airbnb listings. What's different is that Japan's government does almost nothing to document, constrain, or tax foreign buyers.

Amazingly, it was only this decade that Japan first began making it harder for foreigners to buy properties even in sensitive areas next to military bases or nuclear plants. Beyond that, it's open season: Buyers don't even have to be resident in the country, there are no additional taxes or stamp duties for foreign purchasers, nor are there extra levies for second or holiday homes.

Japan is an outlier in the region. Singapore doubled its stamp duty on foreign buyers to 60% in 2023 as part of a series of disincentives, while Hong Kong only recently removed a similar curb in an effort to breathe life into the property market. Elsewhere, Australia announced a two-year outright ban on foreigners buying some homes, a step Canada last year extended.

All of this is genuinely surprising when you consider Japan's general hesitation around immigration. The suppressed value of the yen over the last four years has only exacerbated the issue and led to a run on housing inventory since 2021. Nevertheless, over-tourism has gotten far more attention from the media—pointing a camera at throngs of poorly-behaved white people outside Sensoji Temple makes for better TV than footage of largely-empty luxury condominiums popping up on every corner in Nakameguro.

Ultimately, the barriers to buying real estate in Japan have less to do with legal restrictions or taxes and more to do with language, culture, and the lack of comprehensive regulation against discrimination. What this adds up to is that real estate agencies specializing in serving foreign buyers (there are dozens in Tokyo alone, many marketing to a single locale like Singapore or Hong Kong) can do deals all day long while asking almost nothing of the buyer beyond the funds for the purchase. However, there are very few such real estate agents outside Tokyo and a handful of foreign-friendly mid-market metros like Fukuoka.

Once you venture outside Tokyo, if you intend to buy desirable homes or new construction (i.e. not an abandoned house in the middle of nowhere), few realtors will have experience dealing with foreign non-residents and, regardless, many developers will insist on working with buyers directly—which means foreigners are often boxed out entirely. (About 40% of foreign buyers report having been turned away as a result of not being Japanese.)

Anyway, Gearoid describes a very real affordability crisis. Many Japanese workers with well-paid jobs have lost all hope of ever becoming homeowners despite a rapidly-declining population. Personally, I wouldn't be thrilled to have to pay more in tax when our purchase closes, but I'd completely understand and support the policy outcome such a tax would serve.

These 4 Code Snippets won WWDC

WWDC 2025 delivered on the one thing I was hoping to see from WWDC 2024: free, unlimited invocation of Apple's on-device language models by developers. It may have arrived later than I would have liked, but all it took was the first few code examples from the Platforms State of the Union presentation to convince me that the wait was worth it.

Assuming you're too busy to be bothered to watch the keynote, much less the SOTU undercard presentation, here are the four bits of Swift that have me excited to break ground on a new LLM-powered iOS app:

  1. @Generable and @Guide annotations
  2. #Playground macro
  3. LanguageModelSession's async streamResponse function
  4. Tool interface

The @Generable and @Guide annotations

Here's the first snippet:

@Generable
struct Landmark {
  var name: String
  var continent: Continent
  var journalingIdea: String
}

@Generable
enum Continent {
  case africa, asia, europe, northAmerica, oceania, southAmerica
}

let session = LanguageModelSession()
let response = try await session.respond(
  to: "Generate a landmark for a tourist and a journaling suggestion",
  generating: Landmark.self
)

You don't have to know Swift to see why this is cool: just tack @Generable onto any struct and you can tell the LanguageModelSession to return that type. No fussing with marshalling and unmarshalling JSON. No custom error handling for when the LLM populates a given value with an unexpected type. You simply declare the type, and it becomes the framework's job to figure out how to color inside the lines.
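
For instance, here's a hedged sketch of what using that typed response might look like. I'm assuming the generated value hangs off a content property on the response; that property name is my guess from the demo, not something I've verified against the final API:

// Assumption: the response wraps the generated Landmark in a `content`
// property; the property name is a guess, not confirmed API.
let landmark = response.content
print(landmark.name)           // already a String, no JSON decoding
print(landmark.continent)      // already a Continent enum case
print(landmark.journalingIdea)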

And if you want to make sure the LLM gets the spirit of a value as well as its basic type, you can prompt it on an attribute-by-attribute basis with @Guide, as shown here:

@Generable
struct Itinerary: Equatable {
  let title: String
  let destinationName: String
  let description: String

  @Guide(description: "An explanation of how the itinerary meets user's special requests.")
  let rationale: String

  @Guide(description: "A list of day-by-day plans.")
  @Guide(.count(3))
  let days: [DayPlan]
}

Thanks to @Guide, you can name your attributes whatever you want and separately document for the LLM what those names mean for the purpose of generating values.
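
One thing the slide doesn't show is the DayPlan type that days refers to. Presumably it's just another @Generable struct with its own @Guide hints, along the lines of this purely illustrative sketch:

// Hypothetical sketch: DayPlan isn't shown in the slide, but nested
// generation presumably means it's just another @Generable struct.
@Generable
struct DayPlan: Equatable {
  @Guide(description: "A short title for this day of the trip.")
  let title: String

  @Guide(description: "A list of activities planned for this day.")
  let activities: [String]
}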

The #Playground macro

My ears perked up when the presenter Richard Wei said, "then I'm going to use the new playground macro in Xcode to preview my non-UI code." Because when I hear, "preview my non-UI code," my brain finishes the sentence with, "to get faster feedback." Seeing magic happen in your app's UI is great, but if going end-to-end to the UI is your only mechanism for getting any feedback from the system at all, forward progress will be unacceptably slow.

Automated tests are one way of getting faster feedback. Working in a REPL is another. Defining a #Playground inside a code listing is now a third tool in that toolbox.

Here's what it might look like:

#Playground {
  let session = LanguageModelSession()
  for landmark in ModelData.shared.landmarks {
    let response = try await session.respond(
      to: "What's a good name for a trip to \(landmark.name)?
          Reply only with a title."
    )
  }
}

Which brings up a split view with an interactive set of LLM results, one for each landmark in the set of sample data.

Watch the presentation and skip ahead to 23:27 to see it in action.

Streaming user interfaces

Users were understandably mesmerized when they first saw ChatGPT stream its textual responses as it plopped one word in front of another in real-time. In a world of loading screens and all-at-once responses, it was one of the reasons that the current crop of AI assistants immediately felt so life-like. ("The computer is typing—just like me!")

So, naturally, in addition to being able to await a big-bang respond request, Apple's new LanguageModelSession also provides an async streamResponse function, which looks like this:

let stream = session.streamResponse(generating: Itinerary.self) {
  "Generate a \(dayCount)-day itinerary to \(landmark.name). Give it a fun title!"
}
for try await partialItinerary in stream {
  itinerary = partialItinerary
}

The fascinating bit—and what sets this apart from mere text streaming—is that by simply re-assigning the itinerary to the streamed-in partialItinerary, the user interface is able to recompose complex views incrementally. So now, instead of some plain boring text streaming into a chat window, multiple complex UI elements can cohere before your eyes. Which UI elements? Whichever ones you've designed to be driven by the @Generable structs you've demanded the LLM provide. This is where it all comes together:

Scrub to 25:29 in the video and watch this in action (and then re-watch it in slow motion). As a web developer, I can only imagine how many dozens of hours of painstaking debugging it would take me to approximate this effect in JavaScript—only for it to still be hopelessly broken on slow devices and unreliable networks. If this API actually works as well as the demo suggests, then Apple's Foundation Models framework is seriously looking to cash some of the checks Apple wrote over a decade ago when it introduced Swift and, more recently, SwiftUI.
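
To make the recomposition idea concrete, here's a minimal sketch of what the consuming SwiftUI view might look like. It rests on assumptions: that @Generable vends a partially-generated variant of Itinerary whose properties are optional until the model fills them in, and that re-assigning a piece of @State on each partial snapshot is all SwiftUI needs to recompose. Treat the type and property names as guesses rather than confirmed API:

import SwiftUI
import FoundationModels

struct ItineraryView: View {
  let landmark: Landmark
  let dayCount = 3

  // Assumption: streamed snapshots are a partially-generated Itinerary
  // whose fields are optional until the model has produced them.
  @State private var itinerary: Itinerary.PartiallyGenerated?

  var body: some View {
    VStack(alignment: .leading, spacing: 8) {
      // Each sub-view appears as soon as its field streams in,
      // so the layout assembles itself piece by piece.
      if let title = itinerary?.title {
        Text(title).font(.title)
      }
      if let rationale = itinerary?.rationale {
        Text(rationale).italic()
      }
    }
    .task {
      let session = LanguageModelSession()
      let stream = session.streamResponse(generating: Itinerary.self) {
        "Generate a \(dayCount)-day itinerary to \(landmark.name). Give it a fun title!"
      }
      do {
        // Re-assigning the @State value on every partial snapshot is what
        // lets SwiftUI recompose the view incrementally.
        for try await partial in stream {
          itinerary = partial
        }
      } catch {
        // A real app would surface this error; the sketch just ignores it.
      }
    }
  }
}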

The Tool interface

When the rumors were finally coalescing around the notion that Apple was going to allow developers to invoke its models on device, I was excited but skeptical. On device meant it would be free and work offline—both of which, great—but how would I handle cases where I needed to search the web or hit an API?

It didn't even occur to me that Apple would be ready to introduce something akin to Model Context Protocol (which Anthropic didn't even coin until last November!), much less the paradigm of the LLM as an agent calling upon a discrete set of tools able to do more than merely generate text and images.

And yet, that's exactly what they did! The Tool interface, in a slide:

public protocol Tool: Sendable {
  associatedtype Arguments
  var name: String { get }
  var description: String { get }
  func call(arguments: Arguments) async throws -> ToolOutput
}

And what a Tool that calls out to MapKit to search for points of interest might look like:

import FoundationModels
import MapKit

struct FindPointOfInterestTool: Tool {
  let name = "findPointOfInterest"
  let description = "Finds a point of interest for a landmark."

  let landmark: Landmark

  @Generable
  enum Category: String, CaseIterable {
    case restaurant
    case campground
    case hotel
    case nationalMonument
  }

  @Generable
  struct Arguments {
    @Guide(description: "This is the type of destination to look up for.")
    let pointOfInterest: Category

    @Guide(description: "The natural language query of what to search for.")
    let naturalLanguageQuery: String
  }

  func call(arguments: Arguments) async throws -> ToolOutput {}

  private func findMapItems(nearby location: CLLocationCoordinate2D,
    arguments: Arguments) async throws -> [MKMapItem] {}
}
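
The slide leaves both method bodies empty, so here's a hedged sketch of the shape they might take. The MKLocalSearch usage is standard MapKit, but the 5km radius, the way the category gets folded into the query, the landmark's coordinate property, and the assumption that ToolOutput can be built from a plain string are all mine:

// Sketch of the elided bodies; these would slot into the struct above.
func call(arguments: Arguments) async throws -> ToolOutput {
  let items = try await findMapItems(nearby: landmark.coordinate, arguments: arguments)
  let names = items.compactMap { $0.name }.joined(separator: ", ")
  // Assumption: ToolOutput accepts a plain String.
  return ToolOutput("Nearby \(arguments.pointOfInterest.rawValue)s: \(names)")
}

private func findMapItems(nearby location: CLLocationCoordinate2D,
  arguments: Arguments) async throws -> [MKMapItem] {
  // Fold the generated category into a natural-language query so we don't
  // have to map it onto MKPointOfInterestCategory values.
  let request = MKLocalSearch.Request()
  request.naturalLanguageQuery = "\(arguments.pointOfInterest.rawValue) \(arguments.naturalLanguageQuery)"
  request.region = MKCoordinateRegion(
    center: location,
    latitudinalMeters: 5_000,   // arbitrary radius for the sketch
    longitudinalMeters: 5_000
  )
  return try await MKLocalSearch(request: request).start().mapItems
}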

And all it takes to pass that tool to a LanguageModelSession constructor:

self.session = LanguageModelSession(
  tools: [FindPointOfInterestTool(landmark: landmark)]
)

That's it! The LLM can now reach for and invoke whatever Swift code you want.

Why this is exciting

I'm excited about this stuff, because—even though I was bummed out that none of this came last year—what Apple announced this week couldn't have been released a year ago, when basic concepts like agents invoking tools didn't exist yet. The ideas themselves needed more time in the oven. And because Apple bided its time, version one of its Foundation Models framework is looking like a pretty robust initial release and a great starting point from which to build a new app.

It's possible you skimmed this post and are nevertheless not excited. Maybe you follow AI stuff really closely and all of these APIs are old hat to you by now. That's a completely valid reaction. But what's significant here is not that Apple put out an API that kinda sorta looks like the state of the art as of two or three months ago; it's that this API sits on top of a strongly-typed language and a reactive, declarative UI framework that can take full advantage of generative AI in a way web applications simply can't—at least not without a cobbled-together collection of unrelated dependencies and mountains of glue code.

Oh, and while every other app under the sun is trying to figure out how to reckon with the unbounded costs that come with "AI" translating to "call out to a hilariously-expensive API endpoint", all of Apple's stuff is completely free for developers. I know a lot of developers are pissed at Apple right now, but I can't think of another moment in time when Apple made such a compelling technical case for building on its platforms specifically and to the exclusion of cross-compiled, multi-platform toolkits like Electron or React Native.

And now, if you'll excuse me, I'm going to go install some betas and watch my unusually sunny disposition turn on a dime. 🤞

I checked to see what the Internet thought about Apple's WWDC keynote, and I gotta say that I'm very impressed to see that despite the woke backlash, every stereotypical tech bro is suddenly an accessibility advocate.

This fucking fish

Felt extremely stupid not knowing how to pronounce this fish.

Asked waitress. She didn't know

Elderly couple next to me didn't know.

Sushi chef across counter didn't know.

Took three staff members to identify it as "isaki"

Was an incredibly validating moment

If Apple does decide to curry some nerd favor by introducing a DeX-like desktop mode for iPhone, the thought that I could travel and code with an iPhone, Xreal glasses, and portable keyboard is extremely appealing.