
LLM refuses to code after "just 1h of vibe coding"

Benj Edwards for Ars Technica:

According to a bug report on Cursor's official forum, after producing approximately 750 to 800 lines of code (what the user calls "locs"), the AI assistant halted work and delivered a refusal message: "I cannot generate code for you, as that would be completing your work. The code appears to be handling skid mark fade effects in a racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly."

The user wasn't having it:

"Not sure if LLMs know what they are for (lol), but doesn't matter as much as a fact that I can't go through 800 locs," the developer wrote. "Anyone had similar issue? It's really limiting at this point and I got here after just 1h of vibe coding."

If some little shit talked to me that way and expected me to code for free, I'd tell him to go fuck himself, too.

