Google finally gave up on its failing podcast directory and just launched a proper one hosted by YouTube, so you can now listen to (watch?) my new podcast from the comfort of your YouTube app.
Behind the scenes, YouTube just parses the RSS feed, creates a video from it with a static image of the podcast artwork, and then appends it to a playlist. The fact that any of this is necessary is a depressing reminder of just how badly RSS and other tools have failed to empower average people to curate content for themselves rather than have it mediated by major platforms. Alas!
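To illustrate just how mechanical that is, here's a minimal sketch in Ruby of everything a platform needs to pull out of a feed, using the stdlib `rss` library (the feed URL is a placeholder):

```ruby
require "open-uri"
require "rss"

# Placeholder URL: any podcast RSS feed works the same way
feed = RSS::Parser.parse(URI.open("https://example.com/feed.xml").read)

feed.items.each do |episode|
  # An episode's title and audio enclosure (plus channel-level artwork)
  # are all a platform needs to fabricate a static-image "video"
  puts episode.title
  puts episode.enclosure.url
end
```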
(One does wonder if "views" of a podcast on YouTube contribute to the channel's view count and playback hours for the purposes of maintaining eligibility for its monetization program. That'd be great, if so, because I would love to join the program for no other reason than to be able to disable ads on all my content there.)
One bummer about Apple Vision Pro is that it appears Apple has no plans to sell a USB-C adapter for plugging the device directly into a computer as opposed to the battery pack (despite a couple photos of such a dongle floating around the WWDC reveal) apple.com/shop/accessories/all/apple-vision-pro
Heads up gang, there's a new breaking change with this release: my wallet is like $8000 lighter thanks to the Vision Pro preorders.
As always as of a week ago, you can e-mail me at podcast@searls.co and I'll read it silently in my head. If all goes well, I'll read it out loud, too. And if everything works out, I may even read it out loud and into a microphone for the next show.
Okay, let's dig into this latest version:
Trying VS Code's Terminal Loader instead of foreman or overmind
I did a video about debugging Rails in Visual Studio Code a couple years ago that showed off how to use the remote interface of the `debug` gem with Rails 7's Procfile-based `bin/dev` workflow. Using `foreman` or `overmind` and the remote debugger interface is fine, but it's honestly no replacement for the developer experience of running `binding.irb` directly against a dedicated terminal running nothing other than `rails server`.
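For reference, the remote workflow that video demonstrated boils down to two commands (as documented in the `debug` gem's README):

```sh
# Terminal 1: boot Rails with the debugger listening on a UNIX domain socket
RUBY_DEBUG_OPEN=true bin/rails server

# Terminal 2: attach a debugging console to the running server process
rdbg --attach
```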
So I decided to give this Terminal Loader extension a try to see if I could have my cake and eat it too: a one-shot way to run all of my development servers across multiple terminals. The verdict? It works!
- Install the extension
- Run the command `TLoader: Load Terminals` (via Command-Shift-P) once, which will:
  - Launch a couple dummy terminals with split-views, which you can feel free to kill manually
  - Create a `workspaceConfiguration/LoadTerminal.json`, which you can (and should, IMO) add to `.gitignore`
- Edit the `LoadTerminal.json` file to specify which terminal groups you want to open, how many columns per group, and the commands to run for each one
This is my config for a straightforward Rails 7 app that runs the Rails server, the Tailwind CLI, a Rails console, and Solid Queue. Because I don't typically need to interact with Tailwind or my queue daemon, I relegated those to a shared terminal group. And while I didn't have need for it in this case, I appreciate that the extension allows you to set a different working directory for each terminal, which will be a huge boon to my projects that embed sub-libraries and example apps (more on that below).
Here's my first crack at a `LoadTerminal.json` for this project:
```json
{
"version": "1.2.1",
"groups": [
{
"name": "Rails Server",
"description": "Rails Server",
"enabled": true,
"terminals": [
{
"name": "server",
"path": ".",
"cmd": [
"env RUBY_DEBUG_OPEN=true bin/rails server -p 3000"
],
"num": 0
}
]
},
{
"name": "Rails Console",
"description": "Rails Console",
"enabled": true,
"terminals": [
{
"name": "console",
"path": ".",
"cmd": [
"bin/rails console"
],
"num": 0
}
]
},
{
"name": "Other",
"description": "Tailwind / Queue",
"enabled": true,
"terminals": [
{
"name": "tailwind",
"path": ".",
"cmd": [
"bin/rails tailwindcss:watch"
],
"num": 0
},
{
"name": "queue",
"path": ".",
"cmd": [
"bin/rake solid_queue:start"
],
"num": 0
}
]
}
]
}
```
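To make that last point concrete, here's roughly what a terminal entry might look like for one of those embedded example apps, with `path` pointed at a subdirectory (hypothetical values on my part, assuming the extension resolves relative paths):

```json
{
  "name": "example app",
  "path": "./example",
  "cmd": ["bin/dev"],
  "num": 0
}
```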
Seems to work fine! Nice change of pace not having to juggle virtual-terminals-within-an-Electron-wrapper-within-a-terminal anymore.
FedEx announced today that it will launch a new "data-driven commerce platform" this fall called fdx that it says will give online merchants "end-to-end e-commerce solutions." The company's new platform is aimed at helping businesses manage their supply chain, sell to customers, and manage deliveries.
Launching a new platform whose name sounds exactly like "FTX" is a bold choice. Let's see how this plays out.
Update: never mind, we fixed it. This was a really, really frustrating and demoralizing pairing session. What a mess. github.com/rails/rails/pull/50758
Boy it'd be super cool if someone read this GitHub comment on this issue I found in Rails' ActiveStorage and helped figure out how to fix it. I'm just gonna leave this here: github.com/rails/rails/issues/50560#issuecomment-1892134270
MacRumors, citing a Kuo report:
Apple is planning to produce between 60,000 and 80,000 Vision Pro units for the February 2 launch of the device, according to analyst Ming-Chi Kuo. With the small size of the shipment, Kuo believes that the Vision Pro will "sell out soon after the release."
Relative to Apple's current scale (churning out hundreds of iPhones per minute), a launch this small would be downright artisanal.
I read this post after seeing the news that the purchase process will indeed require a face scan and any prescription lens selection (though, mercifully, one can wait to upload a scan of a prescription until after checkout). Suddenly, it became clear that manically force-relaunching the Apple Store app at 8 AM on January 19th might not be enough to ensure a launch day delivery date.
Or would it?
Lately whenever I've found myself uselessly fretting about something, I've asked ChatGPT 4. So let's see what it has to say in response to this query:
Searls: help me estimate how many programmers between age 20 and 50 work in the US, make over $150,000 per year, and would be likely to buy the Apple Vision Pro at launch for $3,499
Of course, my premise is bullshit: people who aren't programmers and people who aren't between 20 and 50 will buy this product. But given that it'll be pretty hard to estimate what proportion of humans writ large will want to fork over $3,499 (or more), narrowing the demographic seemed necessary.
ChatGPT's response was characteristically humorous:
ChatGPT: Estimating the number of programmers in the U.S. who meet specific criteria like age, salary, and interest in a high-end product requires a multi-step approach:
1. Identify Total Programmers in the U.S.: We need to determine the total number of programmers or software developers in the U.S.
2. Age Filter (20-50 years): Estimate the proportion of these professionals who are between the ages of 20 and 50.
3. Salary Filter (>$150,000/year): Identify what percentage of these individuals earn over $150,000 per year.
4. Likelihood to Purchase High-End Product: Estimate the likelihood of this demographic purchasing a high-end product like the Apple Vision Pro at launch.
Let's start with step one and use available data to estimate the total number of programmers in the U.S. We'll use data from sources like the U.S. Bureau of Labor Statistics (BLS) or similar reports.
After that, we can apply demographic and income filters based on typical distributions in the tech industry. Finally, we'll estimate the purchasing likelihood based on consumer behavior patterns in similar income brackets and interest in technology products.
I'll begin by finding the total number of programmers or software developers in the U.S.
Based on a placeholder number for illustrative purposes, let's assume there are approximately 4,000,000 programmers or software developers in the U.S. Now, we'll proceed to the next steps:
1. Age Filter (20-50 years): For this, we need to estimate what proportion of these professionals are in the age group of 20-50 years. Let's assume, based on typical age distributions in the tech industry, that about 70% of programmers fall within this age group.
2. Salary Filter (>$150,000/year): Next, we'll estimate what percentage of these individuals earn over $150,000 per year. This might vary widely, but for the sake of estimation, let's assume that 20% of programmers in this age group earn over $150,000 per year.
3. Likelihood to Purchase High-End Product: Finally, we'll estimate the likelihood of this demographic purchasing a high-end product like the Apple Vision Pro at launch. This is quite speculative, but for the sake of estimation, let's assume 10% of the high-earning programmers are interested in purchasing such a product at launch.
Let's calculate these estimates.
Based on these estimates:
- Approximately 70% of programmers are aged between 20 and 50.
- About 20% of these programmers earn over $150,000 per year.
- And roughly 10% of this high-earning group might be interested in purchasing a high-end product like the Apple Vision Pro at launch.
The estimated number of programmers in the U.S. who fit all these criteria and would likely buy the Apple Vision Pro at launch is around 56,000.
Please note that these figures are based on assumptions and should be used as a rough estimate. Actual numbers may vary based on more precise data and research.
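To its credit, the arithmetic at least checks out. A quick sanity check in Ruby:

```ruby
programmers   = 4_000_000 # GPT's placeholder count of US programmers
aged_20_to_50 = 0.7       # assumed share aged 20-50
over_150k     = 0.2       # assumed share earning over $150,000/year
launch_buyers = 0.1       # assumed share willing to buy at launch

estimate = programmers * aged_20_to_50 * over_150k * launch_buyers
puts estimate.round
# => 56000
```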
Incidentally, 60,000-80,000 might be close to the actual launch-day market for this thing in the US! Unfortunately, GPT failed to account for the prevalence of scalpers. This makes me wonder whether the face-scanning step was added not only to improve the customer experience (recall the Solo Loop band fiasco based on shoddy in-app sizing guidelines), but also to slow scalpers down and to reduce the fungibility of the Vision Pro by obfuscating which size light shield someone is actually buying.
Of course, if Apple really wanted to discourage scalpers, they would lock the device to the iCloud account of whoever purchased it for some period of time. But since that would cause significant customer support headaches, it's unlikely Apple would ever bother.
Whether or not you listened to the first episode of Breaking Change, if you DIDN'T e-mail podcast@searls.co with a question/comment/take, do it! Don't think, "that's something other people will do"—they won't! It's really on you this time!
MacRumors gives a rundown of how Vision Pro can interface with your Mac based on what the keynote video indicated. I sometimes forget that I've been scrutinizing Apple keynote videos since 2001, back when they would talk at length about things like the number of pipeline stages in PowerPC CPUs relative to Intel, and that not everyone has developed the same eagle eye.
In summary, here's what you apparently can do with a Mac from your Vision Pro:
- Control the Mac's main display in a single 4K-resolution screensharing window called Mac Virtual Display (this is almost certainly why macOS Sonoma added High Performance screen sharing)
- Connect Bluetooth accessories like mice, trackpads, and keyboards to Vision Pro
Here's what you probably can't do:
- Screen share to any Mac over the Internet using the built-in screen-sharing feature (doing so will probably require a third-party VNC-like app like Screens, as Apple will likely keep things simple by requiring being on the same local network and leveraging Bonjour and peer-to-peer wifi)
Here's what you almost certainly won't be able to do:
- Control Intel Macs (much less non-Macs)
- Control multiple Macs at once
- Bring up multiple screens on a Mac
- Run at resolutions higher than 4K (surely a bandwidth limit of high-performance screen sharing, which one presumes sports lower latency and a higher refresh rate)
- Break apps out of Mac Virtual Display to be organized in space (similar to how Parallels allowed Windows apps to be run as free-floating Mac windows circa 2007 with the advent of x86 virtualization in Mac OS X)
Important to get expectations sorted before pre-orders. I'm already trying to think through ways I could conceivably travel exclusively with a Vision Pro and without an accompanying Mac. The 13" M2 MacBook Air has always felt way larger and heavier than it should, and it definitely would be for this purpose. If there were ever a time for Apple to resuscitate the 12" MacBook, it would be now. Maybe I'll settle for a VNC app to remotely connect to a Mac in the cloud? It'd be great if someone could build an Apple TV-sized puck that could run macOS for no other purpose but to be a remote display for Vision Pro.
Of course the real solution would be for a future version of the Vision Pro hardware to support virtualization of macOS… but I'm not going to hold my breath.
Wow, Iconduck seems like a rad place to find open source SVG icons. Been looking for a good directory of them for a while iconduck.com/
Big day, everyone: I've expanded the justin.searls.co family of brands by launching a solo podcast that's slow-paced, long-form, and world-weary. It's called Breaking Change: podcasts.apple.com/us/podcast/breaking-change/id1725165751
I finally gave up and recorded a podcast. And now I'm writing show notes. I tried not to let this happen but here I am typing this and it's happening. Forgive me.
This is a podcast for anybody who shares my interests and likes passing the time with a friendly, opinionated voice in their head. I have done thousands of hours of chores and errands over the years listening to mostly agreeable, sometimes interesting 1.5-3 hour podcasts to keep my mind busy. Now it's time for me to give back, I guess.
Here's the deal: e-mail me at podcast@searls.co and I'll talk about whatever you want me to talk about. Write whatever you want into that e-mail body: ask a question about video games, ask for my take on a new technology, ask for relationship advice, tell me my podcast sucks. You write it, I'll read it.
Okay, here's what I covered in version one of Breaking Change:
If you're a Ruby developer and you've never been to RubyKaigi, the 2024 edition from May 15-17 in Okinawa would be a fantastic place to start. Despite being nearly thirty years old, Ruby is still ascendant in Japan—inspiring a diverse and enthusiastic community of Rubyists who are passionate about pushing the language further. Kaigi has played an important role in that success, gradually picking up steam over the last ten years as it travels between many of the best cities in Japan (owing largely to the organizers' impeccable taste in food and drink).
If you've ever considered coming to Kaigi but hesitated out of concern about Japanese language ability, transportation complexity, unfamiliar customs, or whether you'll like the food, Okinawa offers a uniquely comfortable on-ramp for first-time visitors:
- Okinawa's nature and beaches make it one of the most beautiful places in Japan, with the best weather in the nation in May—tropical and sunny, interspersed with brief heat-breaking rain showers
- Despite the laid-back island atmosphere, English is as widely understood and spoken in Okinawa as anywhere else in Japan (one silver lining of the continuous presence of the American military). Additionally, RubyKaigi's organizers try to schedule at least one English-language talk for every time slot and pay for live translation services into English for all Japanese-language talks
- While all of Japan is incredibly welcoming to foreigners, Okinawa is one of the few places in Japan designed to be a tourist destination—easy to get around and with the amenities people expect to see on vacation, while refreshingly free of the congestion and strain tourists' presence has placed on other cities like Osaka and Kyoto
- It's always been my perspective that to experience the full range of Japanese culture, one needs to escape the urban sprawl of Tokyo and visit more rural areas. Okinawa is unique in that it's every bit as accommodating to people unfamiliar with Japanese customs as Tokyo is, while still offering a broad and authentic view into Japanese culture and tradition
- If you're not sure whether you'll like Japanese food, Okinawa is such a melting pot that it's become home to a fusion of American and Asian cuisine that should satisfy any palate. When in doubt, there's always taco rice
For more information on what it's like to attend Kaigi, check out the field report I filed for Test Double last year in Matsumoto. It was easily the most educational and fun conference I've attended since the pandemic, with genuinely useful information about still-in-development Ruby features I wouldn't have found anywhere else and absolutely fantastic official and unofficial activities and parties. I'm sure this year will be no different! Tickets aren't on sale yet, but you can sign up to be notified when they go live.
I'm not planning to speak this year, but you should consider submitting to the CFP before it closes at the end of January. RubyKaigi's program is geared for giving open source Ruby programmers a platform to share their work along with their experiences and insights. In addition to straightforward presentations of new gems and language features, the conference also celebrates the joy of programming by highlighting creative and quirky talks like Shioi-san's Ruby fork to add an oft-requested `i++` operator and @tompng's increasingly fantastical quines. If you write open source Ruby, I hope you'll consider submitting a talk! (And if you'd like some free advice on responding to CFPs, I published a video about speaking last month you might enjoy.)
Anyway, if you're thinking of going, drop me a line if you have any questions or concerns and I'll be happy to help. Hope to see you in Okinawa this May! 🏝️
Ruby's REPL, `irb`, has gotten a ton of love over the past few years, essentially obviating most popular debuggers in the process. This year is another huge leap forward.
This is my favorite item in this list because I didn't even know `show_source` existed:
The show_source command has always been an essential tool for many IRB users, especially for debugging. This year it received two enhancements that will make it even more useful:
- You can now use -s to get the method's super definition if it has one
- It can now display private methods too
Awesome!
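If you've never used it, here's roughly what invoking it looks like at an `irb` or Rails console prompt (`User#save` is a hypothetical stand-in for whatever method you're spelunking through):

```ruby
# Typed at an irb prompt, not in a script:
show_source User#save     # prints the method's source and file location
show_source User#save -s  # new: prints its super definition instead
```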
Having a great REPL is essential in dynamic languages, so quality-of-life improvements have a direct impact on the speed and quality of feedback during development.
Rounding out what feels like six months of contemplative and retrospective blog posts over at the Test Double blog is this one on my own idiosyncratic approach to finding meaning in my work and how I wish more managers would approach their direct reports' career development.
To wit, I've never accomplished anything I felt proud of by setting a goal. In fact, the surest way to ensure I don't do something is to set a goal. When asked to set goals for myself, I've found that expressing the goal (as opposed to achieving it) becomes my overriding objective. The moment a manager approved my list of goals, I felt that I had completed the work asked of me and I would instantly lose all motivation to pursue the goals themselves.
This explains why planting flags can succeed where goal-setting fails. If what I'm searching for is meaning in my work, setting a goal creates an expectation of where, when, and how my future self should find that meaning. High pressure. Focusing on doing my job well and reflecting on whatever I did in retrospect, however, has allowed me to sift through my experiences, identify patterns, and give meaning to them. Low pressure.
I feel like managers as well as employees find comfort in planning what their future state will look like, as if it reduces any of the fear-inducing uncertainty that lies ahead. It doesn't, but I can't imagine why else people would so quickly reach for goal-setting as their primary career development approach when it almost always fizzles and fails.
I need exactly two things to do great work: awareness of others' expectations of what they think I should be doing and a shared definition of what success will look like when I achieve it. Nothing more is necessary to excavate metaphorical tons of marble from which I can chisel sculptures to mark the moments that mattered along the way.
One more note on what people get wrong about ✨Thought Leadering✨:
But here's the thing: I create these things for me and me alone. When a bunch of people read something I wrote or show up to one of my talks, do I find it encouraging and validating? Sure. But it's not what drives me. I started creating things to punctuate my life's sentences long before anybody took an interest in me, and I wouldn't stop even if everyone lost interest in me.
If the lack of an audience is the reason you've failed to apply creativity in expressing the meaning of your life's work, that's a real shame.
Kudos to Instagram for being so confidently unconcerned that its creators would consider pushing back against the platform for saying this:
90% decrease in cost per person recalling seeing ads with a mix of virtual creator content and campaign video, compared to ads with campaign video only
I've talked to several people in advertising about the rise of "The Influencer" as a target for ad spend over the last five years and the one thing—literally, the only thing—ad executives seem to like about influencers is that they're cheap. If they like two things about working with influencers, it's that so many users are competing for so few dollars that moral scruples aren't a factor.
I don't see how AI talent doesn't give ad buyers more of what they want (even lower prices and even fewer scruples) while also escaping many of the downsides of working with influencers (managing amateur talent, qualifying promoted content, ensuring brand safety). It will be interesting to see how durable influencers' place in the market turns out to be.
Annnnnd just got burned by #8 yet again. Glad I made this list so I could keep straight which bug I'm encountering justin.searls.co/posts/2023-buggies/