Fantastic write-up by Nowfal comparing AI's current moment to the Internet's dial-up era. This bit in particular points to a cleavage that far too few people understand:
Software presents an even more interesting question. How many apps do you need? What about software that generates applications on demand, that creates entire software ecosystems autonomously? Until now, handcrafted software was the constraint. Expensive software engineers and
their labor costs limited what companies could afford to build. Automation changes this equation by making those engineers far more productive. Both consumer and enterprise software markets suggest significant unmet demand because businesses have consistently left projects unbuilt. They couldn't justify the development costs or had to allocate limited resources to their top priority projects. I saw this firsthand at Amazon. Thousands of ideas went unfunded not because they lacked business value, but because of the lack of engineering resources to build them. If AI can produce software at a fraction of the cost, that unleashes enormous latent demand. The key question then is if and when that demand will saturate.
Two things are simultaneously true:
- The creation of custom software has been supply-constrained throughout the entire history of computing. Nobody knows how many apps were never even imagined—much less developed—due to this constraint, but it's probably fair to say there's an unbelievably massive, decades-long backlog of unmet demand for custom software
- We aren't even six months into the Shovelware era of coding agents. Exceedingly few developers have even tried these things; much of the tooling is still so bad as to be counterproductive; and yet experienced early adopters (like me) have concluded that today's mediocre agents are already substantially better at writing software than most people realize
It's long been my view that the appropriate response to the current moment is to ride this walrus and leverage coding agents to increase the scope of our ambitions. By the time software demand saturates and puts us out of jobs, the supply of programmers will have already tapered off as the next generation sees the inflection point coming.
In the short term, the only programmers actually losing their jobs to "AI" are those who refuse to engage with the technology. Using coding agents effectively is a learned skill like any other—and if you don't keep your skills current, fewer people will want to hire you.
The single most destructive metric is a key performance indicator that improves for reasons the business doesn't understand. When it inevitably goes back down, people panic because nobody understands why it went up in the first place. news.ycombinator.com/item?id=45807775
Watching Becky explore the world of bodybuilding, all I know is I would really struggle with my body being scrutinized by others. Surprisingly, though, I'm actually most impressed with the vegan bodybuilders—somehow they all seem to maintain a really healthy self-esteem. Maybe it's because they never whey themselves.
How to downgrade Vision Pro
For stupid reasons, I had to downgrade my Vision Pro from visionOS 26.1 to 26.0.1 today. Here's how to put Vision Pro into Device Firmware Update ("DFU") mode and downgrade.
Here's how to restore a Vision Pro in 9 easy steps:
- Buy a Developer Strap for $299
- Go to ipsw.me and do your best to dodge its shitty ads as you try to download the IPSW restore file for your model Vision Pro at the version you need (if you don't see that version, it's likely because Apple isn't signing it anymore and you're SOL; see the checksum sketch after this list to make sure your download isn't corrupt)
- Install Apple Configurator to your Mac
- Connect the Developer Strap to your Mac via USB-C, and disconnect Vision Pro from power
- Get ready to press and hold the top button (not the digital crown, the other one), then reconnect power to Vision Pro and immediately press and hold the top button until the outer screen shows a cable icon
- Open Apple Configurator, and you should see a Vision Pro icon
- Drag the IPSW file over the Vision Pro icon and click Restore
- Click things and hope it works
- Ask yourself what the fuck you did in a past life that brought you to this moment
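Before you drag anything anywhere, it's worth verifying the download didn't get mangled. Here's a minimal Ruby sketch, assuming the download page publishes a SHA1 checksum for your IPSW (the usage and filename below are made up for illustration):

```ruby
require "digest"

# Usage: ruby verify_ipsw.rb path/to/VisionPro_Restore.ipsw <expected-sha1>
ipsw_path = ARGV.fetch(0)
expected  = ARGV.fetch(1)

# Hash the (multi-gigabyte) file and compare against the published checksum
actual = Digest::SHA1.file(ipsw_path).hexdigest
if actual == expected.downcase
  puts "Checksum matches; safe to drag into Apple Configurator"
else
  abort "Checksum mismatch! Got #{actual}. Re-download before restoring."
end
```

A corrupt IPSW is one of the dumber ways to end up stuck at step 9, so this thirty-second check is cheap insurance.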
Good luck, have fun. 🕶️
Seems like there's a bug in visionOS 26.1 where the first time you connect the new Developer Strap you get a glorious 20 Gbps connection, but then all subsequent connections are stuck at USB 2 480 Mbps speeds. Neat. reddit.com/r/VisionPro/comments/1ok1lye
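If you want to check which link speed your own strap negotiated, here's a tiny Ruby sketch that scrapes macOS's system_profiler output (assuming its plain-text USB report, which includes a "Speed:" line per device):

```ruby
# Print the negotiated speed of every USB device the Mac can see.
# Look for "Up to 20 Gb/s" vs. "Up to 480 Mb/s" next to the Vision Pro.
report = `system_profiler SPUSBDataType`
report.each_line do |line|
  puts line.strip if line.include?("Speed:")
end
```

Handy for confirming whether you've been demoted to USB 2 without digging through System Information every time you replug.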
Excited to engage with the Orlando developer community for the first time tonight. Going to do a hotseat Q & A on agentic coding and what it means for your weekend. meetup.com/orlandodevs/events/310925222/
A nonstop Orlando-Tokyo route is absolutely HUGE. Zipair may be a budget airline, but the lowest tier of Japanese service is frankly a superior experience to United/Delta/AA. Becky flew them to NRT earlier this year and it was great—lie-flat seats from SFO for like $1100! wesh.com/article/first-nonstop-flights-connecting-orlando-to-tokyo-announced/69177387
Video of this episode is up on YouTube.
This may be the version 45 release of Breaking Change, but when you factor in its Hotfixes and Feature Release entries, this is somehow the 50th episode of the show!
Why? Why are we still doing this to ourselves? Write in your answer and how you feel about yourself as a result to podcast@searls.co. Seriously, I need some new material.
The web runs on links, so have some:
The new Developer Strap delivers 20 Gbps to M2 Vision Pro
Like many other Vision Pro sickos, I was far more excited about this week's announcement of a newly-updated Developer Strap than I was about last week's news of the M5 Vision Pro itself.
Why? The original strap allowed you to connect your Vision Pro to a Mac, but at unacceptably slow USB 2.0 (480 Mbps) speeds. That still achieved a much lower-latency connection than WiFi, but the image when running Mac Virtual Display over USB was far too blurry to be worthwhile. The new strap, however, offers a massively upgraded 20 Gbps connection. I rushed to order one at the news because, in theory, those speeds ought to offer the absolute best experience possible when using Vision Pro as an immersive Mac display.
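For perspective, some back-of-envelope math (the resolution, refresh rate, and bit depth below are my illustrative assumptions; Apple's actual pipeline is compressed and foveated):

```ruby
usb2  = 480 * 10**6   # old strap: 480 Mbps
strap = 20 * 10**9    # new strap: 20 Gbps

# Hypothetical uncompressed 4K stream at 60 Hz, 24 bits per pixel
raw_4k60 = 3840 * 2160 * 60 * 24

puts "speedup: #{(strap.to_f / usb2).round(1)}x"                    # => 41.7x
puts "raw 4K60: ~#{(raw_4k60 / 1e9).round(1)} Gbps"                 # => ~11.9 Gbps
puts "USB 2 covers ~#{(usb2 * 100.0 / raw_4k60).round(1)}% of it"   # => ~4.0%
```

In other words, the old strap had to compress the stream to within an inch of its life, while the new one has bandwidth to spare for a genuinely sharp picture.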
While Apple's support documentation says both devices "support" connecting to the strap, what wasn't clear was whether the original hardware could actually deliver the increased bandwidth.
Well, I'm happy to report that after plugging the new Developer Strap into my original Vision Pro, System Information indicates a 20 Gbps connection! Moreover, I can confirm Mac Virtual Display performs better than ever.
Seriously, I don't think I'll be able to go back. The increase in visual sharpness and the lightning-quick latency beat the pants off anything I've experienced, and I've been using Mac Virtual Display daily since the product's initial release. Up to now, others who've tried using Vision Pro for this purpose have reported that the display quality is poor—likely attributable to the need for a carefully-tuned WiFi environment to sustain the connection. That Apple finally offers a wired connection that delivers the definitive experience is a huge win.
If you own a Vision Pro and use it as a display for your Mac, you're already a dummy who blew $3500 on this thing—go spend $300 more and treat yourself to a massive upgrade.
Joe Leo and Valentino Stoll sat with me to talk about why I quit speaking and an exciting year of iteration on AI development workflows.
Appearing on: The Ruby AI Podcast
Published on: 2025-10-25
Original URL: https://www.therubyaipodcast.com/2388930/episodes/18044989-the-tldr-of-ai-dev-real-workflows-with-justin-searls
Comments? Questions? Suggestion of a podcast I should guest on? podcast@searls.co
The Generative Creativity Spectrum
This is a copy of the Searls of Wisdom newsletter delivered to subscribers on October 18, 2025.
It's me, your friend Justin, coming at you with my takes on September, which are arriving so late in October that I'm already thinking about November. To keep things simple, I'll just try to focus on the present moment for once.
Below is what I apparently put out this month. I'm sure I did other shit too, but none of it had permalinks:
- Added Tot to my (very) short list of apps I use every day, finding it helps me manage the ephemeral text needed to juggle multiple coding agents
- Cut only one major release of the podcast, but did apply two Hotfixes with José Valim and Mike McQuaid
- Iterated on how I work with coding agents. At this point, it is extremely rare for me to write code by hand
- Coaxed said AI agents into building me a tool that automatically adds chapters to a podcast based on the presence of stereo jingles, which I thought was a clever idea (`brew install searlsco/tap/autochapter`); a rough sketch of the detection trick follows below
- Created a GitHub badge to disclose/celebrate software projects that are predominantly AI-generated shovelware
- Marked one year since "exiting" the Ruby community by giving my last conference talk, then proceeded to entangle myself all over again
- Bought the iPhone Air because I thought I'd love it. Now that I've had it a month, I'm pleased to report it's exactly what I wanted—probably the happiest I've been with a phone since the iPhone 12 Mini
By the way, if you've heard things that make you wonder why anyone would want the iPhone Air (e.g., it looks fragile, it's slower, it only has one camera, it gets worse battery life), this picture was all I needed to stop caring about any of that:

I lift weights, so I know I am literally capable of holding a half-pound phone all day, but I personally just couldn't abide the heft of the iPhone 17 Pro. Carrying it feels like a chore.
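As promised above, here's the core idea behind the jingle detection, in minimal Ruby (this is not autochapter's actual code; the format assumptions and threshold are made up). In a podcast where speech is mixed dead-center, the left and right channels only diverge during a stereo jingle, so chapter boundaries fall wherever that divergence spikes:

```ruby
# Minimal sketch: find stereo jingles in an otherwise mono-mixed podcast.
# Assumes a canonical 44-byte WAV header and 16-bit stereo PCM at 44.1 kHz.
SAMPLE_RATE = 44_100
WINDOW = SAMPLE_RATE      # analyze one second at a time
THRESHOLD = 500           # average |L - R| that counts as "stereo" (tune it)

samples = File.binread(ARGV.fetch(0))
  .byteslice(44..)        # skip the WAV header
  .unpack("s<*")          # interleaved signed 16-bit samples: L, R, L, R...

samples.each_slice(WINDOW * 2).with_index do |frame, second|
  diffs = frame.each_slice(2).map { |l, r| (l - r.to_i).abs }
  next unless diffs.sum / diffs.size > THRESHOLD
  puts format("possible jingle at %02d:%02d", second / 60, second % 60)
end
```

Real audio needs smarter smoothing (and real WAV headers aren't always 44 bytes), but the core observation holds: stereo width is a weirdly reliable fingerprint for a jingle.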
To be honest, over the last month I mostly stuck to my knitting and kept my head down trying to get POSSE Party over the line. The experience has been a textbook case of how a piece of software can be 100% "done" and "working" when designed for one's own personal use, but the minute you decide to invite other people to use it, the number of edge cases it needs to cover increases tenfold. Not enjoying it.
Another reason this newsletter is arriving late is that for two days I completely lost myself in OpenAI's video-generation app, Sora. It's very impressive and terrifying! I posted some examples of my "work", much to the confusion of both my hairstylist and Whatever God You Pray To. I also wrote some thoughts on what tools like Sora might mean for the future of visual storytelling, if you're interested.
Interestingly, Sora is designed as a social media app, and its resemblance to Instagram and TikTok is striking. As someone who banished social networking apps from my devices years ago, I (and my wife/accountability partner) was immediately concerned by how quickly it sucked me in. But where those platforms addict users into endless passive consumption of content and advertising, Sora's "SlopTok" feed couldn't be less interesting. After you sign up, create your avatar, and follow your friends, it's all about creating your own videos. There is functionally no reason for anyone to visit their feed. Whatever appeal other people's videos might have is dwarfed by the revolutionary creative potential of typing a sentence and seeing your blockbuster movie idea come to life, with you and your friends playing the starring roles.
I guess that explains why I spent so much time thinking about AI and its relationship to creative expression this month. I manually typed that just now, by the way. And an hour ago, I was waffling over whether to manually or generatively(?) fix a bug on my blog. And now I'm typing this sentence right after command-tabbing back into my editor because the realization that everybody is always in the "starring role" on Sora gave me the idea to generate a series of videos where my avatar merely lurks in the background. It is creepy as hell and fantastic.
That distracted impulse to go make a 10-second movie mid-paragraph raises a question: why do I so thoughtlessly reach for AI to generate videos, but agonize over whether to use it to write code? And what does it say that I categorically refuse to let LLMs write these essays?
Great questions, because that is today's topic.
The Generative Creativity Spectrum
Add creativity to the long list of things I've had to fundamentally rethink since the introduction of generative AI. Up until that singular moment when Stable Diffusion and GitHub Copilot and ChatGPT transformed how people create images, code, and prose, I held a rather unsophisticated view of what it meant to be creative. If you'd asked me in 2021 to distill the nature of creativity, I would have given you a boolean matrix of medium vs. intent. I'd probably hammer out three bullets like these:
TIL from John Hawthorn: If you use Kagi Search you can now search `!rb String#gsub`
This is so cool! Zero configuration required johnhawthorn.com/2025/searching-ruby-docs/