I haven't been happy with the size or weight of any of Apple's notebook computers since the 12" MacBook. And since the MacBook Pro moved to 14" and 16", I've been praying for a 12" MacBook Pro to be released. But now, with the Vision Pro among us, my desire for an ultralight Mac has only increased. In fact, about a month ago, I tried to get DALL•E and Midjourney to generate images of a headless MacBook Pro device (like an Apple II but in a modern industrial design language), but they weren't up to the task.
I had never even considered beheading a MacBook Air, but this guy sure did:
The fact this mod is as straightforward as it is is a real testament to Apple's improvements in repairability over the last few years. The only shame of it is that there's no way to reliably log into the device after a reboot (short of guessing at the state of the password entry screen). If it weren't for that I seriously might consider doing this to my own M2 MacBook Air after its warranty is up.
UPDATE: Rob Carlson inspires some hope that a headless MacBook isn't as unusable at boot as I might have worried:
I log into a headless MacBook Pro all the time. Just hit the "up volume" key a bunch of times until VoiceOver turns on, then it'll prompt you for username, then password, and give three beeps if you're right.
Accessibility truly is for everyone!
UPDATE: Confirmed
As I discussed on my Breaking Change podcast a couple weeks ago, it's downright impressive how well Apple is executing on a strategy of malicious compliance in response to the EU's Digital Markets Act.
One tidbit from a week ago was that web apps rendered by third-party browser engines couldn't be saved to the home screen and launched as progressive web apps (PWAs). Well, in order to level the playing field, Apple's apparently decided to just disable PWAs entirely for EU customers:
Apple appears to be turning off the ability to use web apps right from the iPhone's homescreen in the European Union. Support for progressive web apps appeared to be broken in the EU during the first two betas of iOS 17.4, but today developer Maximiliano Firtman said in a post on X that web apps are still turned off in the third beta, which arrived yesterday. "At this point, it's a feature disabled on purpose," Firtman wrote.
If you haven't heard Firtman's name before, he's been the guy following Apple's largely-undocumented, unnecessarily circuitous path to progressive web app support in iOS, so if that's his read, that's probably the case.
I know a lot of people are angry at Apple's flagrant display of spite as they take their ball and go home on all these issues, but I really can't manage any other reaction but to be impressed by their commitment to the bit. After the DMA passed but before Apple released their response, there was an expectation that similar laws would pass in other developed countries over the next 18-24 months. But Apple's ability to outpace this months-old legislation so completely is probably throwing cold water on those plans, if they exist. The EU could choose to deny Apple's proposals, but something tells me that substantive change would require even more complex legislation, which would only invite even more sophisticated technical and policy countermeasures from Apple.
One-shotting git pull-commit-push in VS Code
A frustration I've had since switching to VS Code last year from terminal vim is that the built-in source control extension isn't very keyboard-friendly. As a result, I've been tabbing back and forth between VS Code and Fork and kicking myself every single time, especially when I'm just editing a single file and I really don't need to review my changes before I push.
Well, I finally took the five minutes to write a VS Code macro to do this for me. First, run Open Keyboard Shortcuts (JSON) and add this to the array of keyboard shortcuts:
{
  "command": "runCommands",
  "key": "cmd+alt+ctrl+p",
  "args": {
    "commands": [
      "workbench.action.files.save",
      "git.sync",
      "git.stageAll",
      "git.commitAll",
      "git.push"
    ]
  }
}
Now when I smoosh command, option, and control, then hit P, it'll pull from the tracked remote branch, stage & commit everything, open a window for me to enter a quick message (usually "lol"), and then when I hit command-w, the result will be pushed. Saves me about 10 seconds per commit.
GPT 3.5 is a lot worse than GPT 4
It should shock no one to learn that OpenAI's newer language model is an improvement over the old one, but if you aren't an active user of any of this newfangled AI stuff, it can be easy to lose track of just how much better things are getting and how quickly.
If you subscribe to ChatGPT Plus, ChatGPT will also apply the ReAct pattern to requests it thinks can be formalized, which is one way to mitigate hallucinations.
Pictured here, asking "days between 12/10 and 2/11":
- GPT 3.5, which gets it completely wrong with nonsensical reasoning
- GPT 4, which gets it right and even has a little terminal prompt link
- Opening up that link will actually show you a Python script GPT 4 used to compute the answer
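I won't claim to reproduce GPT 4's script verbatim here, but the computation behind that terminal link boils down to a couple of lines of Python. Here's a reconstruction of the idea; the specific years are my assumption, since the prompt never names any:

from datetime import date

# Reconstruction, not GPT 4's actual output: interpret "12/10" and "2/11"
# as December 10 and February 11 of the following year.
start = date(2023, 12, 10)
end = date(2024, 2, 11)
print((end - start).days)  # => 63

Handing the arithmetic off to an interpreter instead of letting the language model free-associate an answer is the whole point of the ReAct pattern: reason about which tool fits the request, then act by running it.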
Cool beans. 🫘
January's Searls of Wisdom contained some big news about my new line of software products, including the launch of my first NEVER app, Beckygram. Subscribe to learn more: justin.searls.co/newsletter/
Vision Pro with the Good Strap
Following up on my post from this morning on how to use a 3rd party "halo" strap for Vision Pro, my incredible brother Jeremy printed and sanded these adapters for me. The experience wearing this strap is night-and-day better than either of Apple's built-in straps. The headset's weight is finally where it should be.
PSA: This is the first good Vision Pro strap
UPDATE: It works! Photo here
As I mentioned in my review podcast, the two straps that ship with the $3500 Apple Vision Pro are god-awful and mediocre, respectively.
If you just spent that much money on this thing, do yourself a favor and buy two more things:
- A BOBOVR M2 Plus strap
- This 3D-printed conversion kit for connecting it to Vision Pro (you can also print it yourself)
And boom: for under $50 you'll have a comfortable way to actually use the Vision Pro. Shame on Apple for dropping the ball so badly in the name of aesthetics (what happened to "design is how it works"?), but hat tip to Mark Miranda for pointing me to this Etsy listing.
One of the strangest things about United Airlines is that booking a premium economy seat (of which there are only a dozen or so on this plane) has zero bearing on your boarding zone unless you buy an extra add-on. It's so bizarre because it results in United being wholly excluded from Premium Economy searches on sites like Google Flights. Feels strange paying twice as much to hang out in Group 5.
HTML fragment caching really works!
I have somehow been using Ruby on Rails since 2005 and have never worked on an app that needed to think seriously about web request caching, probably because of my proclivity to reach for static site generators and simple asset hosting whenever anything I make will be public-facing. But the current app I'm working on is actually mostly accessible without requiring users be logged in, which means it will both (1) run the risk of having bursts of hard-to-anticipate traffic to certain pages and (2) render pretty much the exact same markup for everyone.
I'll start with the results. Here's a mostly-empty, public-facing page on my basic Heroku dyno without caching:
Completed 200 OK in 281ms (Views: 201.9ms | ActiveRecord: 47.5ms | Allocations: 37082)
And now with a few lines of caching setup:
Completed 200 OK in 9ms (Views: 3.5ms | ActiveRecord: 1.6ms | Allocations: 2736)
So over 30 times faster. And that's on a very basic page. Once the site is primed with content it'll probably be even more dramatic.
Here's how to do it.
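A minimal sketch of the idea, with hypothetical view and model names rather than the app's actual code, is to wrap the expensive markup in a cache block whose key changes whenever the underlying records do:

<%# app/views/events/index.html.erb -- hypothetical example, not the real app %>
<%# The fragment's key includes the newest updated_at, so any edit busts the cache. %>
<% cache ["events/index", @events.maximum(:updated_at)] do %>
  <%= render partial: "events/event", collection: @events, cached: true %>
<% end %>

Fragment caching only kicks in when config.action_controller.perform_caching is enabled (toggle it locally with bin/rails dev:cache), and passing cached: true to the collection render caches each row's partial individually on top of the outer fragment.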
If you're curious at all about this Vision Pro thing and you've read the reviews and you want to go DEEPER, then I invite you to listen to me drill into all kinds of details for two hours. My takes are, if nothing else, incisive (if not outright biting): justin.searls.co/casts/breaking-change-v4-facial-computing/
This podcast is a month old and four episodes in and the singular event looming over all of it has finally arrived! The era of facial computing has begun!
Join me for a Vision Pro extravaganza in which I detail all of my first impressions using the device, including dozens of things that seemingly every media and YouTube reviewer missed or excluded. Listen to this podcast and you'll hear tell of bugs you wouldn't believe even if you did see them!
The headline takeaway is: Apple Vision is clearly the future, because it's clearly not yet the present. (And why I'm probably keeping it anyway.)
As always, e-mail me your reviews, reactions, and errata at podcast@searls.co and I'll absorb them into the bubbling stew of opinions I'm forming about this futuristic-and-not-necessarily-in-a-good-way computing platform.
Scant show notes follow:
All aboard the Persona Polar Express! Next stop, Uncanny Valley!
I'm going to be cutting an episode of my Breaking Change podcast about what it's like to use Vision Pro. Have any questions you want me to answer? E-mail me at podcast@searls.co 💚