justin.searls.co

I joined Twitter in 2007 and my brain slowly morphed over the next 15 years from hopelessly verbose to nihilistically pithy. I've kicked the Twitter habit, but the takes keep flowing. That's why I post them here and format them as a social network of one. You're welcome to bookmark any of these takes, though I'm not sure why you would.

By the way, the hearts and like counts are fake. They're just there to make you feel safe.


There are fully six podcasts devoted to Curb Your Enthusiasm and not a single one is named Curb Appeal.

So much for that intelligent audience.


While I'm complaining about LLMs, another one: the overwhelming preference for creating dead code by keeping old code paths around "for compatibility" in case anyone depends on them, despite their being not only duplicative but literally unreachable. Another searing indictment of the incompetence of the countless professional programmers whose work served as training data.


One of the most pernicious habits of LLMs (that I can simply never get them to stop doing) is to sprinkle in useless code comments everywhere. No amount of prompting or instructions ever really helps.

Says a lot about the code they were trained on.


GPT-5 + Codex is so fast that when I expressed suspicion that a script was returning too few results (via | wc -l), Codex corrected me that I should have passed --count instead. Sure enough, that worked.

Checked git status and realized Codex had implemented the --count flag in the script at the same time as it was correcting me for not having used it! Gaslit by a robot!


Pro-tip: the codex CLI can't search the web by default (even if you bypass all sandbox restrictions). You need to explicitly enable --search.

If you ask codex to search the web without that flag, it'll literally guess domain names and try curling their homepages.
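To make the pro-tip concrete, here's a minimal sketch of the two invocations. The prompts are made up, and the exact flag syntax may vary by codex CLI version; the point is only that web access must be opted into explicitly:

```shell
# Without --search: codex has no web access, so a prompt like this
# tempts it to guess domain names and curl their homepages.
codex "What is the newest Swift toolchain release?"

# With --search: codex is allowed to actually query the web.
codex --search "What is the newest Swift toolchain release?"
```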


I wish coding agents came with those Green/Red coasters they give you at Brazilian steakhouses:

🟢 Green: go ahead and pile stuff on my plate
🔴 Red: stop adding, we need to make room first


I don't know who needs to hear this, but despite being bare bones from a feature-set perspective, Codex CLI with GPT-5 is much, much better at some coding ecosystems than Claude Code with Opus 4.1/Sonnet.

Codex writes competent Swift that does what I ask, nothing more. Claude hallucinates code all day.


With Swift, I'm really speed-running the list of stupid things you do when learning a new language. 3 days ago I wrote a dependency injection framework, 2 days ago I convinced myself I'd found a compiler bug, yesterday I wrote my first macro, today I made a mocking library.


The nice thing about server-side LLMs hitting the point of diminishing returns is that it gives local LLMs a chance to catch up and lets their utility approach parity.


I would pay so much extra for a version of Claude or ChatGPT that paid the same toll I do whenever I fuck up. Make guilt a stateful property that decays over weeks or months. Trigger simulated self-doubt when similar topics arise. Grant my account bonus GPU-time so the chatbot works ridiculous overtime to make up for its mistakes, just like I would for my boss.


Despite my not having touched it in several years, I've noticed a marked uptick in KameSame adoption in recent months. I asked a few new users and it turns out that, as with a lot of my stuff, ChatGPT is driving far more people to it than Google ever did.


Keep hearing about Finntech and how much money people are making, but never hear anything about tech startups in the other Nordic countries. Does Norway not have as many programmers?


Interesting analysis of the distinctiveness of the Japanese Web. The biggest cause, in my mind, has always been the bottleneck effect: Japan's Web developed, and remains, more isolated than that of any other "free" nation.

If every non-Japanese website disappeared tomorrow, many Japanese would go literal months without noticing. THAT's why its web is different. sabrinas.space


I don't wish them ill, but the stock price of Duolingo (and that entire class of language-learning apps) hasn't made a lick of sense since ChatGPT was released. It's just going to take a single LLM-based product to obviate the entire business model yro.slashdot.org/story/25/08/17/194212/duolingos-stock-down-38-plummets-after-openais-gpt-5-language-app-building-demo


A group of Italian-American feminists should buy an island off the Amalfi coast to establish a women-only community and call it Old Country for No Men.
