I love writing scripts against the ChatGPT API that are intentionally constrained to particular input and output formats. Here's one I wrote this morning to help me write spreadsheets with a REPL-like interface to continuously improve the output until I'm happy with it.
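The script itself is just a thin loop. A minimal sketch of the pattern might look like this (not the actual script: the CSV-only system prompt and the bare Net::HTTP call are illustrative, and it assumes an OPENAI_API_KEY environment variable):

```ruby
# Sketch: constrain the model to emit CSV only, then loop REPL-style,
# feeding back revision requests until the output looks right.
require "net/http"
require "json"

ENDPOINT = URI("https://api.openai.com/v1/chat/completions")

def chat(messages)
  request = Net::HTTP::Post.new(ENDPOINT, {
    "Authorization" => "Bearer #{ENV.fetch("OPENAI_API_KEY")}",
    "Content-Type" => "application/json"
  })
  request.body = { model: "gpt-4", messages: messages }.to_json
  response = Net::HTTP.start(ENDPOINT.host, ENDPOINT.port, use_ssl: true) do |http|
    http.request(request)
  end
  JSON.parse(response.body).dig("choices", 0, "message", "content")
end

messages = [{
  role: "system",
  content: "You generate spreadsheets. Respond with raw CSV only: no prose, no code fences."
}]

puts "Describe the spreadsheet you want (blank line to quit):"
while (input = $stdin.gets&.chomp) && !input.empty?
  messages << { role: "user", content: input }
  csv = chat(messages)
  messages << { role: "assistant", content: csv }
  puts csv
  puts "Revise it (blank line to quit):"
end
```

Because the conversation history accumulates, each revision request ("add a totals row", "format dates as ISO 8601") refines the previous CSV instead of starting over.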
A tale of artificial intelligence in four acts
I was wondering if I should keep dragging my hacky little OpenAI API wrapper class from script to script, so:
- Search rubygems.org for "gpt"
- Find one called `chat_gpt`, described as "This is OpenAI's ChatGPT API wrapper for Ruby"
- Click the "Homepage" link
- The code repository is archived and contains the disclaimer "NOTE this code was written by ChatGPT and may not work"
Great job, everyone.
MacRumors, based on forum and social media posts:
Over the past week, some BMW owners have complained that their iPhone 15's NFC chip no longer works after charging the device with their vehicle's wireless charging pad, according to comments shared on the MacRumors Forums and X, formerly known as Twitter. Affected customers say the iPhone goes into a data recovery mode with a white screen, and the NFC chip is no longer functional after the device reboots.
Apparently it wasn't enough for BMWs to damage themselves during normal use.
This video gives a pretty good perspective on what gaming-oriented VR enthusiasts think of Meta's Quest 3 headset. Reading the coverage of their event, it seems like Zuck and company are embracing the device as more of a game console than a metaverse client.
This seems smart, because more people want immersive VR games than want a metaverse client.
But it's really interesting to consider the Quest 3 rollout with knowledge of how Apple carefully framed the Vision Pro as being built for computing, going well out of their way to avoid ever using the phrase "virtual reality" and treating gaming as a total afterthought. In fact, visionOS doesn't allow for immersive roomscale apps, which dramatically decreases the number of games that could be ported to it.
What consumers want from VR, so far: games.
What Meta is giving them: games (with the hope that they will convert people into social metaverse experiences).
What Apple is giving them: a computing platform (with an express deemphasis on gaming, to the point of kneecapping its capabilities).
Watching this play out will be a great test of Apple's ability to "skate where the puck is going". I suspect later revisions will allow for better game capabilities but (like the rest of their platforms) never enough to excite gamers. Meta, meanwhile, happily lets gamers use their sold-at-a-loss headset as a wireless or wired dumb terminal connected to a more powerful gaming PC.
My 2¢:
- Meta will indeed capture a lot of market share, but (similar to the Wii) most devices will play a couple of games and then collect dust. And the people who buy Quest units won't be the same people who want the experiences Meta really wants to offer. The most likely case seems to be that they'll start to look more and more like a traditional game console platform holder like Sony or Microsoft. And, unlike their current business, that's at least an honest one that can turn a tidy profit without selling out their customers.
- Apple Vision Pro will sell in such low volumes for the first several years that we'll keep hearing premature obituaries from the media, until 7 or 8 years from now Apple is able to break through with a mainstream computing platform by sheer force of will.
If you ever had to settle for "Sudafed PE" instead of "Actually works Sudafed", you will want to know about the FDA's vote validating everyone's perception that phenylephrine is useless as a decongestant:
Advisers for the Food and Drug Administration this week voted unanimously, 16 to 0, that oral doses of phenylephrine—found in brand-name products like Sudafed PE, Benadryl Allergy Plus Congestion, Mucinex Sinus-Max, and Nyquil Severe Cold & Flu—are not effective at treating a stuffy nose.
I got sick in Greece and noticed straight away that low-dose pseudoephedrine is widely available over the counter and without any ID or registration requirements. This ruling definitely makes one wonder how the industry will respond if the only decongestant that actually works sells in low volume because it's kept behind the counter to prevent people from using it to cook meth.
Great reader e-mail over at TPM:
Musk's behavior has been atrocious. But he shouldn't have been allowed to be in that position in the first place. That's on the Pentagon and the US government more generally. In the first rush of enthusiasm and support for Ukraine, Musk shipped a bunch of free Starlink devices to Ukraine and agreed to cover the cost of the service. Later when he cooled on Ukraine he started threatening to shut the service off if the Pentagon didn't pick up the tab. That's standard mercurial behavior from Musk. But of course the Pentagon and more broadly the US should be picking up the tab. Much as I loathe the person Musk has turned out to be, I remember thinking at the time, how can this even be a question? Of course they should pick up the tab. The idea that we'd leave it to the whim of someone like Musk to be covering the cost of mission-critical technology for an ally at war is crazy.
This was my thought at the time: that Musk's offer to blanket Ukraine with Starlink satellites and terminals was "free as in blackmail", especially if it bypassed (what would surely have been expedited) defense procurement processes that would have mandated their availability and security.
Now Musk has half the US government over a barrel, with no real way out until credible competitors to SpaceX emerge.
DeSantis's campaign manager, emphasis mine:
"Iowa is a real state for us because of its education — it's a highly educated state — because of income, because of Bible reading," said Jeff Roe, in audio obtained by POLITICO. "New Hampshire is a terrible state for Donald Trump. That's a terrible state for him. He's gonna get like 28 percent. Now there is more people who will have a slice of that and some people are just betting on New Hampshire overall. But he's going to lose the first two states. We're going to beat him in Iowa."
When you're a conservative who doesn't believe in school and for whom universities are bastions of woke ideology, the two hallmarks of the highly educated are apparently income and Bible reading. TIL.
This essay almost exactly mirrors my feelings about the AI innovations we've seen spring forth over the last year. If you know me at all, you know that I've made my career by sticking my head in the sand and ignoring new technology trends in favor of following the market's fundamentals, even when I'm very excited about those innovations personally. (It's why I'm still making full-stack web apps instead of building everything with native Apple SDKs.)
That said, I've been dismayed to see so many of my friends who reside along the same pessimistic-bordering-on-cynical gradient continue to be stubbornly dismissive of AI as just another fad. This isn't crypto. The real-world economic impact of even the most basic tools has already been profound, and humans are nowhere close to catching up with its implications for a huge swath of jobs in the knowledge economy.
Sure. Few claim that LLMs possess human-like intelligence, "think" like humans, or exhibit self-awareness. Then again, there are also schools of thought that argue that humans are just glorified response generators. Regardless of philosophical dispositions, it is important to note that a large number of white-collar jobs on top of which economies are built involve just reasonable amounts of comprehension, bounded decision making, and text generation—paper pushing, "code monkeying" and whatnot. Really though, probabilistic text generation that maintains context in numerous computer and human languages while being meaningful, useful, and conversational, at the same time exhibiting at least an illusion of reason: in what world is that a trivial achievement that ought to be dismissed with a "just"!? Those goal posts have shifted hilariously fast.
Ever since my post about AI and jobs in March, I have felt my take was overly optimistic. The obvious limitations of the tools we see today (e.g. LLM hallucination) do indeed limit the practical application of AI, but the potential for composability to address these concerns is sobering (e.g. a supervisory model that audits an LLM's output and catches its hallucinations) and should distress anyone who would prefer that AI didn't devour the middle-class economy.
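To make that composability idea concrete, here's a hypothetical sketch of what a supervisory loop might look like in Ruby (the auditor prompt, the PASS/FAIL convention, and the retry count are all invented for illustration; `chat` stands in for any helper that sends messages to an LLM and returns the text of its reply, like the one in the spreadsheet sketch earlier):

```ruby
# Hypothetical sketch of composing models: one answers, a second
# "supervisor" audits the answer and triggers a retry when it suspects
# a hallucination. `chat(messages)` is any helper that sends chat
# messages to an LLM and returns the text of its reply.
def audited_answer(question, attempts: 3)
  attempts.times do
    answer = chat([{ role: "user", content: question }])
    verdict = chat([
      { role: "system",
        content: "You are an auditor. Reply PASS if the answer is accurate and well-supported, or FAIL plus a reason." },
      { role: "user", content: "Question: #{question}\n\nAnswer: #{answer}" }
    ])
    return answer if verdict.to_s.start_with?("PASS")
  end
  nil # nothing survived the audit; better to say nothing than hallucinate
end
```

None of this eliminates hallucination, but layering even a crude auditor on top of a generator is exactly the kind of cheap composition that makes "the tools are too unreliable" a weaker comfort than it sounds.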
The rumors in the run-up to the iPhone 15 have been particularly maddening when it comes to the inevitable switch to the USB-C connector.
Speculation therefore remains rife about the USB-C port capabilities of the iPhone 15 lineup, and nothing is certain beyond the switch from Lightning. Rumors suggest the cables supplied in iPhone 15 boxes are limited to USB 2.0 data transfer speeds at a rate of 480 Mbps, which is the same as Lightning.
In contrast, the iPhone 15 Pro models are expected to be supplied with cables capable of USB 3.2 or Thunderbolt 3 transfer speeds.
Here's what I would have predicted a year ago: iPhone 15 would get a USB-C connector with USB 2.0 data speeds and fast-ish charging speeds, and the Pro models would get a Thunderbolt/USB 4 port with typical Thunderbolt transfer speeds and slightly-faster-than-the-15-but-nothing-like-a-MacBook-fast charging speeds.
Simple, straightforward, consistent with other products as well as Apple's strategy of widening the segmentation between the phone lines.
If that's what you'd been expecting, the rumors have confirmed it at every step. But as far as I can tell, everyone reporting on Apple rumors seems to be befuddled. Last week it was Thunderbolt 3, which has effectively been discontinued across the rest of Apple's line. I realize they need to put clicks on the table like everyone else, but they almost seem intentionally dense on this one.
There's a reason I exported my Twitter archive the day Elon took over. theverge.com/2023/8/20/23838823/twitter-x-deleted-pictures-links-2014-metadata-t-co-shortener
I got a chance to sit down with the Changelog crew for a second time this year. This time it was to discuss the provocative blog post I wrote last month. It also featured my colleague Landon, who got to represent The Youth in tech today.
It was a great discussion. It's not technical and doesn't require much prior context to follow. I'm also always blown away by how much better the Changelog's audio engineering is than that of any other programming podcast I've heard. Anyone who can make me willing to listen to myself is working some kind of magic.
If you use Apple Podcasts (as I do), here's the direct link.
The iTunes revolution of selling individual songs for 99¢ was something I fought against at the time, because I conceived of my favorite albums as complete, integrated works. I worried decoupling the song from the album would optimize the industry to pump out ever-more-saccharine pop hooks.
It's interesting now, looking back and seeing the thread connecting the 99¢ song to infinitely-scrolling algorithmic video feeds as the logical endgame. Back then, I could never understand why someone would want to buy a single song à la carte, and today I can't get my head around the appeal of TikTok or Instagram Reels. 🤷‍♂️
Your PS5 analog sticks are melting!
UPDATE: A few folks told me that WD-40 works to resolve this and… they were kind of right! Just buy a pen applicator and dab it on.
Years ago, I unpacked an old PS3 and was disgusted to realize that the controllers' analog sticks had all melted into gooey, sticky black blobs. I wondered if they'd gotten too hot at some point, somehow, but it turns out that the low-grade rubber and oil Sony used just inevitably denatures over time. I figured they'd have fixed this in later generations, but it turns out they haven't! Both my PS5 controllers are now completely unusable after sitting idle for six months.
Entropy always wins… neat!
A friend mentioned the Sumitomo corporation yesterday and it prompted a conversation about Japanese-style conglomerates and how they've gone out of fashion in the West.
One thing that never came into fashion in the West? Educational comic books about your conglomerate that you link off your corporate home page.
132 pages! Things really start cooking about 12 pages in.
Incredibly relatable content in this provocative post by Thorsten Ball:
Many times in your day-to-day programming life you have to wait. Wait for your development environment to boot up, wait for the formatting-on-save command to finish, wait for the website you just opened to load, wait for tests to run, wait for the CI build to finish.
The waiting doesn't really cause me physical pain, but it does evoke a physical reaction alright. I just can't stand it. Maybe it's because I'm impatient by nature. Maybe it's knowing that things could be faster that causes it. When I have to wait ten seconds for a test to finish that I plan to run many times over the next hour, I tell you, it feels as if I'm about to lose my mind.
"Impatience" has been considered a virtue in software for literal decades, because nearly every single action a programmer redounds to a call-and-response with a computer that can't be considered complete until the computer has delivered its result and a human has interpreted it.
Imagine yourself texting with someone. If the other party replies quickly, it will promote focus and secure your attention—you'll stare at your phone and reply promptly as well. If many seconds or minutes go by between responses, however, you'll rationally lose interest, go do something else, and return to the conversation whenever you happen to come back to it. Most importantly, a fast-paced chat results in many more total messages exchanged than a slow-paced conversation, because time is stubbornly finite.
No one has any problem conceptualizing the above, but perhaps because we tend not to conceive of programming as a two-way conversation between a human and a computer, developers often lack a keen sense of this issue's salience.
I should see more people wince when a website takes longer than two seconds to load. There are very few reasons most websites should take long to load. Yet many times when, together with colleagues, I'd watch a website that we built load for longer than two seconds and say "something's off, I bet there's an N+1 query here" and turn out to be right – nobody else noticed anything.
Over the course of my career, very few programmers have seemed as constitutionally impatient as I am with slow computer responses. I've only become more radical in my impatience over time, as my understanding of programming as a two-way "conversation" has deepened.
Here's one way to think about it.
The upper bound of a programmer's productivity is the speed, fidelity, and correctness of the answers they're able to extract from each "feedback loop" they complete with a computer:
- Speed: if your page is slow to load, you can't refresh it as many times in a given working session, so you can't iterate on it quickly
- Fidelity: if you run a command that pulls down far too much or far too little information to answer your question, you'll spend additional time parsing and interpreting its results
- Correctness: if you have the wrong question in mind, you'll run the wrong commands, and you'll probably also waste feedback cycles to ask the wrong follow-up questions, too
I wrote a click-bait title referencing 10x developers a couple weeks ago. That post was careful to minimize value judgments and to avoid venturing into offering advice. Well, if you want some advice, here you go: take to heart the compounding nature of the effects that feedback loops have on productivity, and you'll set yourself apart as a programmer.
To illustrate, compare the potential productivity of two programmers, using a script to compute an upper bound on the feedback loops each can complete in the 480 minutes that comprise an 8-hour workday (a sketch of such a script follows below).
- Programmer A completes one feedback loop with their computer every 45 seconds. 1 in 10 of their commands asks the wrong question, resulting in the next 5 questions also being wrong. 1 in 3 of their commands produce low-fidelity results that take 5 minutes to interpret. They complete 85 productive feedback loops per day.
- Programmer B completes one feedback loop with their computer every 15 seconds. 1 in 25 of their commands asks the wrong question, resulting in the next 3 questions also being wrong. 1 in 10 of their commands produce low-fidelity results that take 2 minutes to interpret. They complete 902 productive feedback loops per day.
85 vs 902. There you go, a 10x difference in productivity.
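Here's a sketch of the kind of script I mean (how you model the penalties changes the exact totals, and `rand` will wiggle them run to run, so treat the outputs as illustrating the gap rather than precisely reproducing the numbers above):

```ruby
# Crude simulation of a workday of feedback loops. The penalty model is
# only one plausible way to wire it up; the point is the order of
# magnitude of the gap, not the exact totals.
def productive_loops(loop_seconds:, wrong_rate:, wrong_chain:, low_fi_rate:, low_fi_minutes:)
  time_left = 480 * 60 # an 8-hour workday, in seconds
  productive = 0
  wasted_streak = 0    # loops remaining in a wrong-question chain

  while time_left > 0
    cost = loop_seconds
    cost += low_fi_minutes * 60 if rand < low_fi_rate
    time_left -= cost

    if wasted_streak > 0
      wasted_streak -= 1          # still chasing a wrong question
    elsif rand < wrong_rate
      wasted_streak = wrong_chain # this loop and the next N are wasted
    else
      productive += 1
    end
  end
  productive
end

a = productive_loops(loop_seconds: 45, wrong_rate: 1.0 / 10, wrong_chain: 5,
                     low_fi_rate: 1.0 / 3, low_fi_minutes: 5)
b = productive_loops(loop_seconds: 15, wrong_rate: 1.0 / 25, wrong_chain: 3,
                     low_fi_rate: 1.0 / 10, low_fi_minutes: 2)
puts "Programmer A: #{a} productive feedback loops"
puts "Programmer B: #{b} productive feedback loops"
```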
It would be very fair to quibble over which numbers to measure, whether the numbers I chose are feasible, and so forth. This is only meant to illustrate that the difference between waiting a few hundred milliseconds versus a few seconds versus multiple minutes really adds up, especially when you factor in that expertise and focus can be learned and practiced to get better at asking the right questions and maintaining a clear mindset.
Something else this silly script doesn't capture is the human element: what it feels like to frequently find yourself waiting. Beyond a certain point, people will develop habits of tabbing away to more responsive user interfaces like Slack or social media, resulting in minutes lost to distraction and minutes more to their attention residue. And there are reinforcing social effects, which people rarely consider, of working in an organization where these phenomena are normalized.
If I could go back and change one thing about how I learned to program, it would have been to emphasize the importance of internalizing this lesson and seizing control of the feedback loop between myself and my computer. I've been beating this drum for a while (and it was the primary thrust of my RailsConf 2017 keynote), but it still doesn't feel like the industry is getting any closer to acknowledging its importance or applying it to how we teach people programming, manage programmers, and design systems.
Perks of working at Google in 2007:
"Let me pull this up because there are so many," he says. When his computer produces a list a moment later, Kallayil makes his way down the screen and continues: "The free gourmet food, because that's a daily necessity. Breakfast, lunch and dinner I eat at Google. The next one is the fitness center, the 24-hour gym with weights. And there are yoga classes."
There is a pause before he adds that he also enjoys the speaker series, the in-house doctor, the nutritionist, the dry cleaners and the massage service. He has not used the personal trainer, the swimming pool and the spa — at least not yet, anyway. Nor has he commuted to and from the office on the high-tech, wi-fi equipped, bio-diesel shuttle bus that Google provides for employees, but that is only because he lives nearby and can drive without worrying about a long commute.
Let's check in on how 2023's going:
Being banned from the entire Internet would be tough, but Googlers in the high-security program will still get access to "Google-owned websites," which is actually quite a bit of the Internet. Google Search would be useless, but you could probably live a pretty good Internet life, writing documents, sending emails, taking notes, chatting with people, and watching YouTube.
Somewhere along the way—even by the time I visited the Googleplex in 2007—Google lost their way on this. Employment policies that promote people's autonomy and agency can pay companies like Google massive dividends in increased creativity, productivity, and loyalty. But perks that attempt to squeeze blood from the stone by making the office feel more luxurious than home always obfuscate the nature of the work itself and will inevitably distract everyone involved—a recipe for resentment between workers and management the minute the going gets tough.
Now that the going's gotten tough, it's too bad Google could never tell the difference between the two.