A lot of content around here boils down to links to someplace else, for which all I have to add is a brief call-out or commentary. As such, the headlines for each of these posts link to the original source article. (If you want a permalink to my commentary, try clicking the salt shaker.)
Starfield’s player count slips below Skyrim on Steam, just two months after Bethesda’s ‘biggest launch ever’
What's interesting from all this is how quickly Starfield appears to have tailed off from its impressive launch, just two months after release; by way of comparison, Skyrim's original release had fewer players at launch in 2011, peaking at around 290,000, but didn't fall below 20,000 players until almost seven years later, in May 2018. (Two months after release, Skyrim still had somewhere around 90,000 concurrent players on Steam.)
Keep in mind, Skyrim’s players are far likelier to be playing on Steam (both because it launched there and due to better mod support) than the official Xbox app, where Game Pass subscribers have always had “free” access to Starfield.
That said, it’s hard to read this without thinking about the chorus of anecdotal reports I’ve heard from people who’ve played Starfield: Bethesda successfully delivered another “one of those” games, but it simply lacks the magic and awe that gave Skyrim its staying power.
Personally, I think a lot of this is due to the game’s setting. High fantasy lends itself to dramatic moments celebrated by epic orchestral movements in a way that feels tacked on when applied to a sterile sci-fi environment. Fast travel between star systems in Starfield undercuts its sense of scale to a degree that manages to make it feel smaller than Skyrim’s technically-not-quite-as-big-in-aggregate map.
These combine to rob Starfield of what industry folks call “emergent gameplay.” A random hail from a starship feels like a procedurally-generated interruption, whereas a fellow traveler approaching you along a trail at night feels like a natural coincidence or kismet. Both might be random scripted events, but only one maintains suspension of disbelief. Similarly, knowing that I could walk thirty minutes and eventually reach a mountain in the center of Skyrim’s map lends credibility to its environment, but being forced to constantly fast travel between planets shatters any illusion that Starfield players are exploring a single connected universe.
As a result, Starfield is merely a good game. It’s more Fallout than Elder Scrolls. And where Fallout oozes with charm and satire, Starfield feels flat and mundane. The premise may be more realistic, but the world feels less real as a result.
Finishing Starfield’s campaign and then deciding to continue exploring its world raises the question, “why?” It would be like checking out at Costco and then deciding to walk another lap through the store. Maybe there was something you didn’t check off your list the first time around, but one imagines you’re going to impatiently hustle to get it and get out.
While nothing has yet been confirmed, it appears people were able to figure out (or someone accidentally shared) Emergency Pizza codes that could be used over and over again by the same customer. This, obviously, isn’t how the program was intended to work.
During the M1 generation of Apple Silicon Macs, one area of controversy was whether the default 8GB of RAM was less of a performance bottleneck than it had been under Intel Macs. The pundits contended that a paltry 8GB was unacceptably stingy on Apple's part, but in practice—and who's to say whether it had to do with the ARM CPU or the unified memory architecture or the blazing-fast SSD performance—no matter how much you threw at the M1 Macs, they never seemed to get bogged down by swapping out to disk.
Well, that era is apparently over, because the new M3 with 8GB is getting absolutely smoked by the same chip when it has 16GB to work with:
The 8GB model suffered double-digit losses in Cinebench benchmarks, and took several minutes longer to complete photo-merging jobs in Photoshop as well as media exports in Final Cut and Adobe Lightroom Classic.
Big day at work. Emphases mine:
Test Double’s mission is to improve the way the world builds software. We’ve done that by building great teams and great software—with a special focus on building things right.
Pathfinder Product’s mission is to unleash greatness through modern product management. They’ve done that by being passionate problem solvers—with a special focus on building the right thing.
That about sums it up. When you put the two teams side by side, it's uncanny how good a fit they are. Each has hired top practitioners in their field, united by an insatiable drive to make this broken, messy world of software work better for everybody.
And as much as I bristle at the word "synergy", it's really there in this case: brilliant product strategy goes nowhere without execution, and high-performance delivery is a waste of money if it drives you in the wrong direction. The real beneficiaries of that synergy, though, will be the clients that entrust Test Double to help them accomplish both.
Apple PR representative Starlayne Meza confirmed the company’s plans to The Verge. The company encourages those who have been holding out hope for a larger iMac to consider the Studio Display and Mac Studio or Mac mini, which pair a 27-inch 5K screen with a separate computer, compared to the all-in-one design of the iMac.
In the post-Jobs era, has Apple PR ever confirmed they had no plans to create a particular product like this?
Last year, I was featured on the season 1 finale of Matt Swanson's excellent YAGNI podcast. Well, season 2 kicked off yesterday with an interview with Charity Majors, and as one of the few programming podcast series I can stand to listen to (high praise, trust me), I recommend you or the programmer in your life consider subscribing!
I'd been thinking about my appearance last year on YAGNI because in his lead-in, Matt joked that he had cajoled me into writing my own test framework as an alternative to RSpec. I have no recollection of this, but he turned out to have succeeded, because 9 months later I indeed published the definitive Ruby testing framework, TLDR.
Remote workers will sign off at 1 pm to participate in a digital rally and picket via Zoom.
That’s sure to get their attention.
In late 2018, Google CEO Sundar Pichai floated a bold idea to Apple CEO Tim Cook. Cook had just told Pichai he wanted to be “deep, deep partners, deeply connected where our services end and yours begin,” according to notes from the meeting. Pichai responded with a proposal: What if Apple preinstalled a Google Search app on every iOS device?
What Pichai heard: deep love as in, “we want to go to market as a dream team, as a package deal.”
What Cook meant: deep down as in, “we don’t want anyone to know Google has anything to do with our products.”
Finally! Some Magic Mouse news for the first time in 8 years:
Apple will likely announce new versions of its Magic Keyboard, Magic Mouse, and Magic Trackpad for the Mac at its "Scary Fast" event on Monday, October 30, according to a report today from Bloomberg's Mark Gurman. The updated accessories are expected to be equipped with USB-C ports instead of Lightning ports for charging.
When the original Mac launched in 1984, it pioneered the use of a mouse over command-line interfaces. A mouse was undoubtedly a much more direct, tactile input than a keyboard. But ever since the release of the iPhone in 2007, it's been clear that Apple believes touch screens and its glass trackpads are more direct inputs than mice and therefore superior. Apple doesn't want you to use a mouse when you could be using one of its trackpads.
(This emphasis on eliminating indirection in how users express their intent clearly energizes Apple's aspirations for visionOS, where the primary interaction metaphor is to "look at a thing" to select it.)
While the Magic Mouse lasts around a month or longer between charges, the position of the Lightning port on the bottom of the mouse has been a meme in the Apple community for many years, as it prevents the mouse from being used while charging. It's unclear if the USB-C port on the new version will be placed in a more convenient location.
I honestly expect them to keep the port on the bottom out of spite.
Alex Heath at the Verge, with yet another internal meeting leak:
Elon Musk wants X to be the center of your financial world, handling anything in your life that deals with money. He expects those features to launch by the end of 2024, he told X employees during an all hands call on Thursday, saying that people will be surprised with “just how powerful it is.”
“When I say payments, I actually mean someone’s entire financial life,” Musk said, according to audio of the meeting obtained by The Verge. “If it involves money. It’ll be on our platform. Money or securities or whatever. So, it’s not just like send $20 to my friend. I’m talking about, like, you won’t need a bank account.”
This will almost surely be underwritten by a real bank, like Apple Card and Savings accounts are handled by Goldman Sachs, as Musk lacks the discipline to clear the regulatory hurdles to become the actual center of one’s financial life. But if this launches, I fully expect Musk will mandate his employees use this account for direct deposit for their paychecks and may even go further, requiring it of Tesla and SpaceX employees as well.
But the thing I can’t stop imagining is that he might promise social boosting for his mostly red-pilled, socially-disaffected male acolytes who park their money in X’s coffers. Maybe they’ll get a special checkmark or even more prominent placement in people’s timelines and replies.
Hard to see how this doesn’t end in some kind of cross between It’s a Wonderful Life and The Purge when tens of thousands of incredibly online, gun-toting libertarian men realize their life savings have been eradicated following a bank run caused by, surely, some stupid thing Elon will have tweeted.
MacRumors, based on forum and social media posts:
Over the past week, some BMW owners have complained that their iPhone 15's NFC chip no longer works after charging the device with their vehicle's wireless charging pad, according to comments shared on the MacRumors Forums and X, formerly known as Twitter. Affected customers say the iPhone goes into a data recovery mode with a white screen, and the NFC chip is no longer functional after the device reboots.
Apparently it wasn't enough for BMWs to damage themselves during normal use.
This video gives a pretty good perspective of what gaming-oriented VR enthusiasts think of Meta's Quest 3 headset. Reading the coverage of their event, it seems like Zuck and company are embracing the device as more of a game console than a metaverse client.
This seems smart, because more people want immersive VR games than want a metaverse client.
But it's really interesting to consider the Quest 3 rollout with knowledge of how Apple carefully framed the Vision Pro as being built for computing, going well out of their way to never use the phrase "virtual reality," and with gaming as a total afterthought. In fact, visionOS doesn't allow for immersive roomscale apps, which dramatically decreases the number of games that could be ported to it.
What consumers want from VR, so far: games.
What Meta is giving them: games (with the hope that they will convert people into social metaverse experiences).
What Apple is giving them: a computing platform (with an express deemphasis on gaming, to the point of kneecapping its capabilities).
Watching this play out will be a great test of Apple's ability to "skate where the puck is going". I suspect later revisions will allow for better game capabilities but (like the rest of their platforms) never enough to excite gamers. Meta, meanwhile, happily lets gamers use their sold-at-a-loss headset as a wireless or wired dumb terminal connected to a more powerful gaming PC.
- Meta will indeed capture a lot of market share, but (similar to the Wii) most devices will play a couple games and then collect dust. And the people who buy Quest units won't be the same people who want the experiences Meta really wants to offer. The most likely case seems to be that they'll start to look more and more like a traditional game console platform holder like Sony or Microsoft. And, unlike their current business, that's at least an honest one that can turn a tidy profit without selling out their customers.
- Apple Vision Pro will sell in such low volumes for the first several years that we'll keep hearing premature obituaries from the media until 7 or 8 years from now Apple is able to break through with a mainstream computing platform by sheer force of will
If you ever had to settle for "Sudafed PE" instead of "Actually works Sudafed", you will want to know about the FDA's vote validating everyone's perception that phenylephrine is useless as a decongestant:
Advisers for the Food and Drug Administration this week voted unanimously, 16 to 0, that oral doses of phenylephrine—found in brand-name products like Sudafed PE, Benadryl Allergy Plus Congestion, Mucinex Sinus-Max, and Nyquil Severe Cold & Flu—are not effective at treating a stuffy nose.
I got sick in Greece and noticed straight away that low-dose pseudoephedrine is widely available over the counter and without any ID or registration requirements. This ruling definitely makes one wonder how the industry will respond if the only available decongestant sells in low volume because it's kept behind the counter to prevent people from using it to cook meth.
Great reader e-mail over at TPM:
Musk’s behavior has been atrocious. But he shouldn’t have been allowed to be in that position in the first place. That’s on the Pentagon and the US government more generally. In the first rush of enthusiasm and support for Ukraine, Musk shipped a bunch of free Starlink devices to Ukraine and agreed to cover the cost of the service. Later when he cooled on Ukraine he started threatening to shut the service off if the Pentagon didn’t pick up the tab. That’s standard mercurial behavior from Musk. But of course the Pentagon and more broadly the US should be picking up the tab. Much as I loathe the person Musk has turned out to be, I remember thinking at the time, how can this even be a question? Of course they should pick up the tab. The idea that we’d leave it to the whim of someone like Musk to be covering the cost of mission-critical technology for an ally at war is crazy.
This was my thought at the time. That Musk's offer to blanket Ukraine with Starlink satellites and terminals was "free as in blackmail", especially if it bypassed (what would have surely been expedited) defense procurement processes that would have mandated their availability and security.
Now Musk has half the US government over a barrel, with no real way out until credible competitors to SpaceX emerge.
DeSantis's campaign manager, emphasis mine:
“Iowa is a real state for us because of its education — it’s a highly educated state — because of income, because of Bible reading,” said Jeff Roe, in audio obtained by POLITICO. “New Hampshire is a terrible state for Donald Trump. That’s a terrible state for him. He’s gonna get like 28 percent. Now there is more people who will have a slice of that and some people are just betting on New Hampshire overall. But he’s going to lose the first two states. We’re going to beat him in Iowa.”
When you're a conservative who doesn't believe in school and for whom universities are bastions of woke ideology, the two hallmarks of the highly educated are apparently income and Bible reading. TIL.
This essay almost exactly mirrors my feelings about the AI innovations we've seen spring forth over the last year. If you know me at all, you know that I've made my career by sticking my head in the sand and ignoring new technology trends in favor of following the market's fundamentals, even when I'm very excited about those innovations personally. (It's why I'm still making full-stack web apps instead of building everything with native Apple SDKs.)
That said, I've been dismayed to see so many of my friends that reside along the same pessimistic-bordering-on-cynical gradient continue to be stubbornly dismissive of AI as just another fad. This isn't crypto. The real-world economic impact of only the most basic tools has already been profound, and humans are nowhere close to catching up with its implications for a huge swath of jobs in the knowledge economy.
Sure. Few claim that LLMs possess human-like intelligence, “think” like humans, or exhibit self-awareness. Then again, there are also schools of thought that argue that humans are also just glorified response generators. Regardless of philosophical dispositions, it is important to note that a large number of white collar jobs on top of which economies are built involve just reasonable amounts of comprehension, bounded decision making, and text generation—paper pushing, “code monkeying” and whatnot. Really though, probabilistic text generation that maintains context in numerous computer and human languages while being meaningful, useful, and conversational, at the same time exhibiting at least an illusion of reason, in what world is that a trivial achievement ought to be dismissed with a “just”!? Those goal posts have shifted hilariously fast.
Ever since my post about AI and jobs in March, I have felt my take was overly optimistic. The obvious limitations of the tools we see today (e.g. LLM hallucination) do indeed limit the practical application of AI, but the potential for composability to address these concerns is sobering (e.g. a supervisory model that audits an LLM's output and corrects its hallucinations) and should distress anyone who would prefer that AI didn't devour the middle-class economy.
The rumors in the run-up to the iPhone 15 have been particularly maddening when it comes to the inevitable switch to the USB-C connector.
Speculation therefore remains rife about the USB-C port capabilities of the iPhone 15 lineup, and nothing is certain beyond the switch from Lightning. Rumors suggest the cables supplied in iPhone 15 boxes are limited to USB 2.0 data transfer speeds at a rate of 480 Mbps, which is the same as Lightning.
In contrast, the iPhone 15 Pro models are expected to be supplied with cables capable of USB 3.2 or Thunderbolt 3 transfer speeds.
Here's what I would have predicted a year ago: iPhone 15 would get a USB-C connector with USB data speeds and fast-ish charging speeds, and the Pro models would get a Thunderbolt/USB 4 port with typical transfer speeds and slightly faster-than-the-15-but-nothing-like-a-MacBook-fast charging speeds.
Simple, straightforward, consistent with other products as well as Apple's strategy of widening the segmentation between the phone lines.
The rumors have confirmed this at every step if that's what you'd been expecting. But as far as I can tell, everyone reporting on Apple rumors seems to be befuddled. Last week it was Thunderbolt 3, which has effectively been discontinued across the rest of Apple's line. I realize they need to work to put clicks on the table like everyone else, but they almost seem intentionally dense on this one.
I got a chance to sit down with the Changelog crew for a second time this year. This time it was to discuss the provocative blog post I wrote last month. It also featured my colleague Landon, who got to represent The Youth in tech today.
It was a great discussion. It's not technical and doesn't require much prior context to follow. I'm also always blown away by how much better the Changelog's audio engineering is than any other programming podcasts I've heard. Anyone that can make me able to listen to myself is working some kind of magic.
If you use Apple Podcasts (as I do), here's the direct link.
A friend mentioned the Sumitomo corporation yesterday and it prompted a conversation about Japanese-style conglomerates and how they've gone out of fashion in the West.
One thing that never came into fashion in the West? Educational comic books about your conglomerate that you link off your corporate home page.
132 pages! Things really start cooking about 12 pages in.
Incredibly relatable content in this provocative post by Thorsten Ball:
Many times in your day-to-day programming life you have to wait. Wait for your development environment to boot up, wait for the formatting-on-save command to finish, wait for the website you just opened to load, wait for tests to run, wait for the CI build to finish.
The waiting doesn’t really cause me physical pain, but it does evoke a physical reaction alright. I just can’t stand it. Maybe it’s because I’m impatient by nature. Maybe it’s knowing that things could be faster that causes it. When I have to wait ten seconds for a test to finish that I plan to run many times over the next hour, I tell you, it feels as if I’m about to lose my mind.
"Impatience" has been considered a virtue in software for literal decades, because nearly every single action a programmer takes redounds to a call-and-response with a computer that can't be considered complete until the computer has delivered its result and a human has interpreted it.
Imagine yourself texting with someone. If the other party replies quickly, it will promote focus and secure your attention—you'll stare at your phone and reply promptly as well. If many seconds or minutes go by between responses, however, you'll rationally lose interest, go do something else, and return to the conversation whenever you happen to come back to it. Most importantly, a fast-paced chat results in many more total messages exchanged than a slow-paced conversation, because time is stubbornly finite.
No one has any problem conceptualizing the above, but perhaps because we tend not to conceive of programming as a two-way conversation between a human and a computer, developers often lack a keen sense of this issue's salience.
I should see more people wince when a website takes longer than two seconds to load. There are very few reasons most websites should take long to load. Yet many times when, together with colleagues, I’d watch a website that we built load for longer than two seconds and say “something’s off, I bet there’s an N+1 query here” and turn out to be right – nobody else noticed anything.
Over the course of my career, very few programmers have seemed as constitutionally impatient as I am with slow computer responses. I've only become more radical in my impatience over time, as my understanding of programming as a two-way "conversation" has deepened.
Here's one way to think about it.
The upper bound of a programmer's productivity is the speed, fidelity, and correctness of the answers they're able to extract from each "feedback loop" they complete with a computer:
- Speed: if your page is slow to load, you can't refresh it as many times in a given working session, so you can't iterate on it quickly
- Fidelity: if you run a command that pulls down far too much or far too little information to answer your question, you'll spend additional time parsing and interpreting its results
- Correctness: if you have the wrong question in mind, you'll run the wrong commands, and you'll probably also waste feedback cycles to ask the wrong follow-up questions, too
I wrote a click-bait title referencing 10x developers a couple weeks ago. That post was careful to minimize value judgments and to avoid venturing into offering advice. Well, if you want some advice, here you go: take to heart the compounding nature of the effects that feedback loops have on productivity, and you'll set yourself apart as a programmer.
To illustrate, compare the potential productivity of two programmers, given a script to compute the upper bound of activities performed in the 480 minutes that comprise an 8-hour workday.
- Programmer A completes one feedback loop with their computer every 45 seconds. 1 in 10 of their commands ask the wrong question and result in the next 5 questions also being wrong. 1 in 3 of their commands produce low-fidelity results that take them 5 minutes to interpret the answer. They complete 85 productive feedback loops per day.
- Programmer B completes one feedback loop with their computer every 15 seconds. 1 in 25 of their commands ask the wrong question and result in the next 3 questions also being wrong. 1 in 10 of their commands produce low-fidelity results that take them 2 minutes to interpret the answer. They complete 902 productive feedback loops per day.
85 vs 902. There you go, a 10x difference in productivity.
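The actual script isn't reproduced here, but a minimal Ruby sketch shows one way such a day might be modeled (the `productive_loops` method and its parameters are my own hypothetical construction, not the original code): each loop costs its base time, low-fidelity results add interpretation time, and a wrong question wastes itself plus the next several loops. Under these assumptions, the model happens to reproduce the 85 and 902 figures.

```ruby
# Simulate a programmer's 8-hour day as a series of feedback loops.
# Hypothetical model: each loop costs a base number of seconds; every
# lowfi_every-th loop costs extra minutes to interpret; every
# wrong_every-th loop asks the wrong question, wasting itself and the
# next wrong_cascade loops.
def productive_loops(loop_seconds:, wrong_every:, wrong_cascade:,
                     lowfi_every:, lowfi_minutes:, workday_minutes: 480)
  budget = workday_minutes * 60
  time = productive = count = poisoned = 0

  while time < budget
    count += 1
    cost = loop_seconds
    cost += lowfi_minutes * 60 if (count % lowfi_every).zero?
    time += cost

    if (count % wrong_every).zero?
      poisoned = wrong_cascade # this loop and the next few are wasted
    elsif poisoned > 0
      poisoned -= 1            # still paying for an earlier wrong question
    else
      productive += 1
    end
  end

  productive
end

# Programmer A: 45s loops, 1-in-10 wrong (spoils next 5), 1-in-3 low fidelity (+5 min)
a = productive_loops(loop_seconds: 45, wrong_every: 10, wrong_cascade: 5,
                     lowfi_every: 3, lowfi_minutes: 5)
# Programmer B: 15s loops, 1-in-25 wrong (spoils next 3), 1-in-10 low fidelity (+2 min)
b = productive_loops(loop_seconds: 15, wrong_every: 25, wrong_cascade: 3,
                     lowfi_every: 10, lowfi_minutes: 2)
puts "A: #{a} productive loops, B: #{b}" # => A: 85 productive loops, B: 902
```

Change any of the modeling assumptions (whether interpretation time stacks on top of the loop, whether the last loop of the day is counted) and the exact totals shift, but the order-of-magnitude gap doesn't.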
It would be very fair to quibble over which numbers to measure, whether the numbers I chose are feasible, and so forth. This is only meant to illustrate that the difference between waiting a few hundred milliseconds versus a few seconds versus multiple minutes really adds up, especially when you factor in that expertise and focus can be learned and practiced to get better at asking the right questions and maintaining a clear mindset.
Something else this silly script doesn't capture is the human element of what it's like to constantly be waiting. Beyond a certain point, people will develop habits of tabbing away to more responsive user interfaces like Slack or social media, resulting in minutes lost to distraction and minutes more to attention residue. There are also reinforcing social effects of working in an organization where these phenomena are normalized that people rarely consider.
If I could go back and change one thing about how I learned to program, it would have been to emphasize the importance of internalizing this lesson and seizing control of the feedback loop between myself and my computer. I've been beating this drum for a while (and it was the primary thrust of my RailsConf 2017 keynote), but it still doesn't feel like the industry is getting any closer to acknowledging its importance or applying it to how we teach people programming, manage programmers, and design systems.