A lot of content around here boils down to links to someplace else, for which all I have to add is a brief call-out or commentary. As such, the headlines for each of these posts link to the original source article. (If you want a permalink to my commentary, try clicking the salt shaker.)
Apple is in discussions with Google to integrate its Gemini AI engine into the iPhone as part of iOS 18, according to Bloomberg's Mark Gurman.
Through iOS 5, Maps and YouTube were native apps that Apple built and which were backed by Google services. This was advantageous for both parties at first. Apple wasn't nearly ready to roll out its own mapping service and Google was more focused on growing YouTube's reach than monetizing it. Eventually, it stopped making sense for either party, and they went their separate ways.
The primary media narratives about this focused on Steve Jobs' "thermonuclear" threat over Android's copying of the iOS UI and the degree to which the two companies had begun to compete on services. But one thing that was lost in the discussion—which never really squared with the fact Google has continued to pay Apple tens of billions a year to be Safari's default search engine—was that both companies maintain relatively tenuous moats to lock in customers.
Right now, Google needs people to reach for its AI and search stack before a generation of users learn to "GPT it", and Apple needs an AI stack for its platform that can compete with the dozens of devices set to launch that are little more than thin candy shells on top of OpenAI's API.
I really hate the idea of this deal, and I bet executives at both companies do, too. Which is why it's so unfortunate that it also makes sense.
Gripping story, overall, and worth a read. This bit stuck out to me as something I'd never considered before, but felt obvious as soon as I was exposed to it:
Political communicators are sticking to approaches developed for an era when ticket-splitters and swing voters composed a sizeable chunk of the electorate. But with a body politic that has sorted into two highly polarized parties—with just one-tenth of voters torn between them—the logic of persuading voters to support a candidate has grown obsolete. Ad campaigns should instead promote the Democratic Party itself, Malchow proposes, particularly at moments when news events might help it win new adherents, such as after a mass shooting, which thrusts gun-control policy back into the news and voters might be ready to reconsider their allegiances.
To wit: in an era of extreme party polarization, 90% of people in the US are voting based on party affiliation, but campaign advertising is still centered on candidate choice. This isn't just inefficient, it's counter-productive, since most candidates run away from their parties in general elections because both parties' brands are so toxic. Focusing money and messaging on bolstering a party's brand seems like a much smarter way to meet this moment of overwhelmingly party-line voting.
I can only hope I'll still have meaningful insights to offer others during my final week on earth.
My Twitter account had been peacefully lying dormant since November 18th, 2022, but so many people are still using X that I finally caved a few weeks ago and spent 45 minutes wiring up a syndication strategy, effectively adding it to this site's POSSE.
If you want to do this, here's what you'll need:
- The azu/rss-to-twitter GitHub Action and a modest amount of a free account's budget
- A Twitter developer account and app
- An Atom feed to read from (here's mine, specially crafted to cram the full content of each tweet into the <title> of each <entry>; see the sketch after this list)
- The handle you're posting as must be marked as an automated account and registered as managed by some other account
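If your site doesn't already publish a feed like this, here's a minimal sketch of generating one with Ruby's built-in rss library. The posts collection and its attributes are hypothetical stand-ins for however your site stores content; the important bit is cramming each post's full text into the entry's <title>, since that's what the %title% template in the workflow below will tweet:

require "rss"

feed = RSS::Maker.make("atom") do |maker|
  maker.channel.about = "https://example.com/shorter-form.xml" # the feed's own URL
  maker.channel.title = "Shorter-form posts"
  maker.channel.author = "Your Name"
  maker.channel.updated = Time.now.to_s

  posts.each do |post| # `posts` is a hypothetical collection of short posts
    maker.items.new_item do |item|
      item.link = post.url
      item.updated = post.updated_at.to_s
      # The trick: the post's entire text goes in the title, since that's
      # what rss-to-twitter's %title% template posts verbatim
      item.title = post.body
    end
  end
end

puts feed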
Once you have all that, you can add a YAML file defining the action to your GitHub repo, like the one for this site:
# .github/workflows/rss_to_twitter.yml
name: rss-to-twitter
on:
  schedule:
    - cron: "*/30 * * * *"
  workflow_dispatch:
jobs:
  twitter:
    runs-on: ubuntu-latest
    steps:
      - uses: azu/rss-to-twitter@v1
        with:
          RSS_URL: "https://justin.searls.co/shorter-form.xml"
          TWEET_TEMPLATE: '%title%'
          TWITTER_APIKEY: ${{ secrets.TWITTER_APIKEY }}
          TWITTER_APIKEY_SECRET: ${{ secrets.TWITTER_APIKEY_SECRET }}
          TWITTER_ACCESS_TOKEN: ${{ secrets.TWITTER_ACCESS_TOKEN }}
          TWITTER_ACCESS_TOKEN_SECRET: ${{ secrets.TWITTER_ACCESS_TOKEN_SECRET }}
The above cron schedule translates to checking the Atom feed "every 30 minutes", which is more frequent than it needs to be in my case. And as for all those ${{ secrets… }} directives, here's how to configure your repository's GitHub Action secrets.
And that's it. Easy!
Apart from being one of my favorite people—and someone whose wisdom had a big impact on my own professional development—Joel Helbing shares a bit about his experience giving himself just enough of a crash-course on whatever skill a new job needs to be able to show up to work on Monday:
Then I ended the call, got in my car, and drove an hour to the nearest Borders bookstore. I purchased two promising books on Microsoft SQL Server, went to the bookstore's in-house Starbucks, purchased a venti iced coffee, sat down with those two books and a legal pad, and mapped out my weekend in fine detail. It came down to 15 minutes for this chapter, 10 for that chapter, skip this other chapter, etc. Then I drove home and followed my script meticulously for the whole weekend. This was not easy for me; I'm a curiosity-driven learner who loves to follow a thread and go deeper. Not this weekend, though. I stuck to the plan, and on Sunday night I got back in my car and started the long drive to my new gig.
(Imagine I spent the thirty seconds to insert a "this is the way" GIF here.)
But seriously, Joel's not kidding. He's indeed one of the most deliberate, curious people I've ever met, and I bet rushing through a bunch of content in order to get his arms around a topic was acutely painful for him. But when you're a consultant, your clients need you to be conversant in whatever they're focused on, and frequently that means brushing up on topics and technologies your previous ten clients weren't focused on.
Doing this was always painful for me, too. In part, because I'm a perfectionist who really struggles to move on to chapter two until I've absorbed, critiqued, and improved on everything the author posited in chapter one. For me, the act of learning is an exhausting war of attrition. But when my job depended on me showing up knowing something, I had no choice but to swallow my pride and immerse myself in a topic uncritically in order to learn enough to be dangerous.
This is an intensely uncomfortable activity. That immersion feels like drowning at first. I'm even feeling it this morning, as I'm upgrading a database I haven't touched in 3 years and which I presently can't remember how to back up—my gut is churning in worry as a result. As a consultant, though, I always understood that the stakes were higher for my client than for me, and every ounce of discomfort I shouldered almost always translated to an ounce of burden I could remove from their plate. Sometimes that meant using their preferred technology over mine. Or assigning myself to their legacy systems so their full-timers could be the ones to break ground on the next green-field app. Or, and this was always the hardest for me, relenting and using their janky issue trackers and time-keeping systems.
Some consultancies blunt this reality by deploying large teams where there are levels of indirection between clients and practitioners—placing an engagement manager in front of the client who can run interference while the team of consultants behind him ramp up at a more leisurely and comfortable pace (as they bill every hour at a full rate). Other consultancies prioritize their convenience over their clients' needs by narrowing their services and prescribing a single monolithic "Way" to work with them—often requiring clients to build systems in the agency's favorite tech stack—firmly ensconcing each consultant in a cocoon of comfort. But at Test Double, it never occurred to us to do anything other than lean in and rise to the challenge of actually meeting our clients where they are. We spent years demonstrating to our clients that no matter how arcane their technology or byzantine their org chart, our people will get up to speed so fast it'll make their heads spin. You get good at what you do, and if the thing you do is stomach discomfort as you learn hard things in service of others, then there's almost no limit to what you can accomplish.
One last note: showing up to a client following a weekend-long crash course in a particular technology doesn't make you a fraud. Nearly twenty years in consulting has taught me that the people most worried about misrepresenting themselves and their abilities are the people who have the least reason to worry. The fact they care so much almost always means they'll put in the work when they need to. The real frauds, meanwhile, don't worry at all. And while Joel was holed up in a Starbucks for 72 hours, I'm sure they were having a delightful and relaxing weekend. And Joel's much richer for it, as he's gotten four careers' worth of experience by repeatedly diving into new industries, organizations, and technologies, whereas the real imposters only learned how to talk a good game as they skated through life without ever stretching themselves.
Nice set of reminders on how to validate e-mail addresses in Rails models. I was glad to find his second example to be almost identical to what I found in my newest app:
class User < ApplicationRecord
  validates :email,
    format: { with: URI::MailTo::EMAIL_REGEXP },
    presence: true,
    uniqueness: { case_sensitive: false }
end
One thing I'd be sure to add, though, is the new normalizes class method to ensure all email addresses saved by your app are easily compared by stripping whitespace and lower-casing them. From my User class:
normalizes :email, with: ->(email) { email.strip.downcase }
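For illustration, here's roughly how that behaves (a sketch with made-up values; normalize_value_for is the companion class method Rails provides for applying a declared normalization to an arbitrary value, handy before lookups):

user = User.new(email: "  Jane.Doe@EXAMPLE.com ")
user.email # => "jane.doe@example.com"

# The same normalization can be applied outside of assignment, too:
User.normalize_value_for(:email, "  Jane.Doe@EXAMPLE.com ")
# => "jane.doe@example.com"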
Never hurts to revisit the stuff you wrote on the first day of an app's life.
"It is currently estimated that new models with significant changes to the Vision Pro specification may not be in mass production until 2027," Kuo said today.
This makes me glad I bought at launch as opposed to waiting, and even more glad I opted for monthly AppleCare payments as opposed to buying the fixed two-year contract.
I haven't been happy with the size or weight of any of Apple's notebook computers since the 12" MacBook. And since the MacBook Pro moved to 14" and 16" sizes, I've been praying for a 12" MacBook Pro to be released. But now, with the Vision Pro among us, my desire for an ultralight Mac has only increased. In fact, about a month ago, I tried to get DALL•E and Midjourney to generate images of a headless MacBook Pro device (like an Apple II but in a modern industrial design language), but they weren't up to the task.
I had never even considered beheading a MacBook Air, but this guy sure did:
The fact this mod is as straightforward as it is is a real testament to Apple's improvements in repairability over the last few years. The only shame of it is that there's no way to reliably log into the device after a reboot (short of guessing at the state of the password entry screen). If it weren't for that I seriously might consider doing this to my own M2 MacBook Air after its warranty is up.
UPDATE: Rob Carlson inspires some hope that a headless MacBook isn't as unusable at boot as I might have worried:
I log into a headless MacBook Pro all the time. Just hit the "up volume" key a bunch of times until VoiceOver turns on, then it'll prompt you for username, then password, and give three beeps if you're right.
Accessibility truly is for everyone!
UPDATE: Confirmed
As I discussed on my Breaking Change podcast a couple weeks ago, it's downright impressive how well Apple is executing on a strategy of malicious compliance in response to the EU's Digital Markets Act.
One tidbit from a week ago was that third-party browser rendering engines wouldn't be able to be saved to the home screen and launched as progressive web apps (PWAs). Well, in order to level the playing field, Apple's apparently decided to just disable PWAs entirely for EU customers:
Apple appears to be turning off the ability to use web apps right from the iPhone's homescreen in the European Union. Support for progressive web apps appeared to be broken in the EU during the first two betas of iOS 17.4, but today developer Maximiliano Firtman said in a post on X that web apps are still turned off in the third beta, which arrived yesterday. "At this point, it's a feature disabled on purpose," Firtman wrote.
If you haven't heard Firtman's name before, he's been the guy following Apple's largely-undocumented, unnecessarily circuitous path to progressive web app support in iOS, so if that's his read, that's probably the case.
I know a lot of people are angry at Apple's flagrant display of spite as they take their ball and go home on all these issues, but I really can't manage any other reaction but to be impressed by their commitment to the bit. After the DMA passed but before Apple released their response, there was an expectation that similar laws would pass in other developed countries over the next 18-24 months. But Apple's ability to outpace this months-old legislation so completely is probably throwing cold water on those plans, if they exist. The EU could choose to deny Apple's proposals, but something tells me that forcing substantive changes would require even more complex legislation, which would only invite even more sophisticated technical and policy countermeasures by Apple.
This'll be an insta-purchase for me before it can get yanked from the App Store:
Juno delivers a fully native visionOS UI that taps into YouTube's embed API, which is designed to allow videos to be embedded in external webpages. When you want to browse YouTube's video catalog, Juno pulls up a tweaked version of the YouTube website. Apparently, the app is even clever enough to not show ads for YouTube Premium subscribers, though it remains to be seen how Google feels about a third-party developer being in control of an app for one of its biggest services on a new piece of hardware. Selig notes that he didn't use any private/internal APIs to develop the app.
I'm not one to kink shame, but it's starting to feel like this guy's fetish is building businesses whose survival depends on the mercy of hostile platform holders.
Apple PR:
More than 600 apps and games designed to take advantage of the unique and powerful capabilities of Apple Vision Pro will be available this Friday, February 2.
For context, iPad launched with just shy of 5000 apps in its App Store, and that number nearly doubled to 10,000 within a couple months.
Real beaut of a lede in the WSJ (News+ link) today:
Meta Platforms is hoping Apple's launch of the Vision Pro can reinvigorate its $50 billion metaverse effort, which consumers have yet to widely embrace.
Love the optimism here, but I wouldn't be surprised if this ages about as well as the CEO of Palm's, "PC guys are not going to just figure this out," quote prior to the iPhone's launch.
Update: this take by Kristopher Browne on Mastodon raises a great point:
I believe them. Goggles are the last device category where Apple didn't have designs for others to crib from. It's a space like "smart phones" were in before the iPhone, when android was going to be a blackberry clone because it's not like google knew how to design a device or interface.
I almost forgot about this mess. I guess the analogy really holds: just like Google had no clue that touch metaphors like "pinch-to-zoom" were going to be the breakthrough interface paradigm for smartphones, Meta (as indicated in this WSJ article) wasn't seriously chasing down mixed-reality computing of the sort implemented in visionOS until after WWDC.
Ben kicks off a screamer this morning with a quote from a prior interview with Om Malik:
But the thing is you actually have to be mobile-native to actually appreciate something like this. So if you've grown up watching a 75-inch screen television, you probably would not really appreciate it as much. But if you are like me who's been watching iPad for ten-plus years as my main video consumption device, this is the obvious next step. If you live in Asia, like you live in Taiwan, people don't have big homes, they don't have 85-inch screen televisions. Plus, you have six, seven, eight people living in the same house, they don't get screen time to watch things so they watch everything on their phone. I think you see that behavior and you see this is going to be the iPod.
I'll go one further: when I moved to Florida, I went to great pains to create the best home theater I could with a complete blank slate. That landed me with:
- A $6000 ultra-short throw 4K projector
- A $1500, 123" screen situated 5 feet from my eyeholes
- Blackout blinds with edge side channels to prevent the Florida sun from sneaking in
- The usual accouterment: surround sound, receiver, streaming box, wired networking, a thousand cables to connect it all, etc.
I think I read six hundred posts on AVS Forum and Reddit before settling on this setup, and three years later it's still the best configuration money can buy. People who come over and watch a movie are usually blown away. Indeed, it is pretty fucking rad.
But, you know what? Vision Pro already smokes this setup. Wider field of view, tighter perceivable dot pitch, better color reproduction. Using an individual headset isn't just better for "mobile-native" users one might imagine in a far-flung Asian country where everyone lives in 200-square foot studio apartments… today's high-end headsets are already better than the best flat displays, period.
The bulk of the post is spent comparing Apple Vision's launch with the introduction of the iPod. iPod's success was enabled by Apple's clever persuasion over the record labels to dramatically devalue their songs on iTunes, while the biggest story about the Vision Pro is that Netflix and YouTube are skipping its launch due to Apple's draconian App Store policies. Wanna watch Netflix? You'll have to use Safari:
Apple may be unhappy that Netflix viewers have to go to the Netflix website to watch the service on the Vision Pro (and thus can't download shows for watching offline, like on a plane); Netflix might well point out that going to the web is exactly what Apple makes Netflix customers do to sign up for the service.
Ben goes on, mostly about what this conflict means for the short-term. We'll have to see. I think his analysis is pretty much bang on.
However.
One thing I'd suggest is that visionOS is actually Apple's first bimodal platform, in that it has the potential to be the ideal experience for both consumption and creation (something that, to date, Apple has divided with iOS/iPadOS/tvOS/watchOS on one side and macOS on the other). My primary use case for a Vision Pro is to just mirror my Mac screen to it as I develop software, despite having a $3000 6K Dell display a few feet away, simply because the Vision Pro offers a bigger, better screen (despite having fewer pixels).
In Steve Jobs parlance, the Vision Pro is simultaneously a car and a truck.
I think it's exactly because Apple can move the ball down both ends of the field simultaneously with a single futuristic platform that they're so obviously committed to it. In the past, the thing that's given Apple leverage over software and media companies has been that they dominate hardware sales to the juiciest demographic of consumers. Today, the conversation is over whether regulators or circumstance will force them to loosen their grip on the App Store. But it's unlikely we're going to know how the next chapter of this power struggle will play out any time soon—it'll be at least 5 years before Vision Pro has the kind of gravitational pull to attract either a massive installed base or regulatory scrutiny, so smart money would probably bet on a stalemate (for now).
I suspect Apple is clear-eyed about all this and they're prepared to patiently plug away at this platform for ten years before rethinking their strategy. On that score, there's not really any point in changing course until the underlying technology reaches a tipping point to appeal to a larger, mainstream audience. If and when that happens, if Apple once again owns the platform that powers the market's best product, they'll once again have all the leverage they had with iPhone.
One thing Steve Jobs and Tim Cook have in common is the degree to which they are preternaturally stalwart in their convictions. If Tim Apple is shook about any of this drama, he's sure as hell doing a good job hiding it.
Google finally gave up on its failing podcast directory and just launched a proper podcast directory hosted by YouTube, so you can now listen to (watch?) my new podcast from the comfort of your YouTube app.
Behind the scenes, YouTube just parses the RSS feed, creates a video from it with a static image of the podcast artwork, and then appends it to a playlist. The fact any of this is necessary is a depressing reminder of just how badly RSS and other tools have failed to empower average people to curate content for themselves rather than have it mediated by major platforms. Alas!
(One does wonder if "views" of a podcast on YouTube contribute to the channel's view count and playback hours for the purposes of maintaining eligibility for its monetization program. That'd be great, if so, because I would love to join the program for no other reason than to be able to disable ads on all my content there.)
FedEx announced today that it will launch a new "data-driven commerce platform" this fall called fdx that it says will give online merchants "end-to-end e-commerce solutions." The company's new platform is aimed at helping businesses manage their supply chain, sell to customers, and manage deliveries.
Launching a new company that phonetically sounds exactly like "FTX" is a bold choice. Let's see how this plays out.
MacRumors, citing a Kuo report:
Apple is planning to produce between 60,000 and 80,000 Vision Pro units for the February 2 launch of the device, according to analyst Ming-Chi Kuo. With the small size of the shipment, Kuo believes that the Vision Pro will "sell out soon after the release."
Relative to Apple's current scale (churning out hundreds of iPhones per minute), a launch this small would be downright artisanal.
I read this post after seeing the news that the purchase process will indeed require a face scan and any prescription lens selection (though, mercifully, one can wait to upload a scan of a prescription until after checkout). Suddenly, it became clear that manically force-relaunching the Apple Store app at 8 AM on January 19th might not be enough to ensure a launch day delivery date.
Or would it?
Lately whenever I've found myself uselessly fretting about something, I've asked ChatGPT 4. So let's see what it has to say in response to this query:
Searls: help me estimate how many programmers between age 20 and 50 work in the US, make over $150,000 per year, and would be likely to buy the Apple Vision Pro at launch for $3,499
Of course, my premise is bullshit: people who aren't programmers and people who aren't between 20 and 50 will buy this product. But given that it'll be pretty hard to estimate what proportion of humans writ large will want to fork over $3,499 (or more), narrowing the demographic seemed necessary.
ChatGPT's response was characteristically humorous:
ChatGPT: Estimating the number of programmers in the U.S. who meet specific criteria like age, salary, and interest in a high-end product requires a multi-step approach:
- Identify Total Programmers in the U.S.: We need to determine the total number of programmers or software developers in the U.S.
- Age Filter (20-50 years): Estimate the proportion of these professionals who are between the ages of 20 and 50.
- Salary Filter (>$150,000/year): Identify what percentage of these individuals earn over $150,000 per year.
- Likelihood to Purchase High-End Product: Estimate the likelihood of this demographic purchasing a high-end product like the Apple Vision Pro at launch.
Let's start with step one and use available data to estimate the total number of programmers in the U.S. We'll use data from sources like the U.S. Bureau of Labor Statistics (BLS) or similar reports.
After that, we can apply demographic and income filters based on typical distributions in the tech industry. Finally, we'll estimate the purchasing likelihood based on consumer behavior patterns in similar income brackets and interest in technology products.
I'll begin by finding the total number of programmers or software developers in the U.S.
Based on a placeholder number for illustrative purposes, let's assume there are approximately 4,000,000 programmers or software developers in the U.S. Now, we'll proceed to the next steps:
- Age Filter (20-50 years): For this, we need to estimate what proportion of these professionals are in the age group of 20-50 years. Let's assume, based on typical age distributions in the tech industry, that about 70% of programmers fall within this age group.
- Salary Filter (>$150,000/year): Next, we'll estimate what percentage of these individuals earn over $150,000 per year. This might vary widely, but for the sake of estimation, let's assume that 20% of programmers in this age group earn over $150,000 per year.
- Likelihood to Purchase High-End Product: Finally, we'll estimate the likelihood of this demographic purchasing a high-end product like the Apple Vision Pro at launch. This is quite speculative, but for the sake of estimation, let's assume 10% of the high-earning programmers are interested in purchasing such a product at launch.
Let's calculate these estimates.
Based on these estimates:
- Approximately 70% of programmers are aged between 20 and 50.
- About 20% of these programmers earn over $150,000 per year.
- And roughly 10% of this high-earning group might be interested in purchasing a high-end product like the Apple Vision Pro at launch.
The estimated number of programmers in the U.S. who fit all these criteria and would likely buy the Apple Vision Pro at launch is around 56,000.
Please note that these figures are based on assumptions and should be used as a rough estimate. Actual numbers may vary based on more precise data and research.
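For what it's worth, the arithmetic behind that bottom line is just the product of ChatGPT's four assumed numbers, none of which is real data (reproduced in Ruby for clarity):

programmers_in_us = 4_000_000 # ChatGPT's placeholder total
aged_20_to_50     = 0.7       # assumed share aged 20-50
earning_150k_plus = 0.2       # assumed share earning over $150,000
buying_at_launch  = 0.1       # assumed share willing to spend $3,499 at launch

(programmers_in_us * aged_20_to_50 * earning_150k_plus * buying_at_launch).round
# => 56000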
Incidentally, 60,000-80,000 might be close to the actual launch day market for this thing in the US! Unfortunately, GPT failed to account for the prevalence of scalpers. This makes me wonder whether the face-scanning step was added not only to improve customer experience (recall the Solo loop band fiasco based on shoddy in-app sizing guidelines), but also to slow scalpers down and to reduce the fungibility of the Vision Pro by obfuscating which size light shield someone is actually buying.
Of course, if Apple really wanted to discourage scalpers, they would lock the device to the iCloud account of whoever purchased it for some period of time. But since that would cause significant customer support headaches, it's unlikely Apple would ever bother.
MacRumors gives a rundown of how Vision Pro can interface with your Mac based on what the keynote video indicated. I sometimes forget that I've been scrutinizing Apple keynote videos since 2001 (back when they would talk at length about things like the number of pipeline stages in PowerPC CPUs relative to Intel), and that not everyone has developed the same eagle eye.
In summary, here's what you apparently can do with a Mac from your Vision Pro:
- Control the Mac's main display in a single 4K-resolution screensharing window called Mac Virtual Display (this is almost certainly why macOS Sonoma added High Performance screen sharing)
- Connect Bluetooth accessories like mice, trackpads, and keyboards to Vision Pro
Here's what you probably can't do:
- Screen share to any Mac over the Internet using the built-in screen-sharing feature (doing so will probably require a third-party VNC-like app like Screens, as Apple will likely keep things simple by requiring being on the same local network and leveraging Bonjour and peer-to-peer wifi)
Here's what you almost certainly won't be able to do:
- Control Intel Macs (much less non-Macs)
- Control multiple Macs at once
- Bring up multiple screens on a Mac
- Run at resolutions higher than 4K (surely a bandwidth limit of high-performance screen sharing, which one presumes sports lower latency and a higher refresh rate)
- Break apps out of Mac Virtual Display to be organized in space (similar to how Parallels allowed Windows apps to be run as free-floating Mac windows circa 2007 with the advent of x86 virtualization in Mac OS X)
Important to get expectations sorted before pre-orders. I'm already trying to think through ways I could conceivably travel exclusively with a Vision Pro and without an accompanying Mac. The 13" M2 MacBook Air has always felt way larger and heavier than it should, and it definitely would be for this purpose. If there were ever a time for Apple to resuscitate the 12" MacBook, it would be now. Maybe I'll settle for a VNC app to remotely connect to a Mac in the cloud? It'd be great if someone could build an Apple TV-sized puck that could run macOS for no other purpose but to be a remote display for Vision Pro.
Of course the real solution would be for a future version of the Vision Pro hardware to support virtualization of macOS… but I'm not going to hold my breath.
If you're a Ruby developer and you've never been to RubyKaigi, the 2024 edition from May 15-17 in Okinawa would be a fantastic place to start. Despite being nearly thirty years old, Ruby is still ascendant in Japan—inspiring a diverse and enthusiastic community of Rubyists who are passionate about pushing the language further. Kaigi has played an important role in that success, gradually picking up steam over the last ten years as it travels between many of the best cities in Japan (owed largely to the organizers' impeccable taste in food and drink).
If you've ever considered coming to Kaigi but hesitated out of concern about Japanese language ability, transportation complexity, unfamiliar customs, or whether you'll like the food, Okinawa offers a uniquely comfortable on-ramp to first-time visitors:
- Okinawa's nature and beaches make it one of the most beautiful places in Japan, with the best weather in the nation in May—tropical and sunny, interspersed with brief heat-breaking rain showers
- Despite the laid-back island atmosphere, English is as widely understood and spoken in Okinawa as anywhere else in Japan (one silver lining of the continuous presence of the American military). Additionally, RubyKaigi's organizers try to schedule at least one English-language talk for every time slot and pay for live translation services into English for all Japanese-language talks
- While all of Japan is incredibly welcoming to foreigners, Okinawa is one of the few places in Japan designed to be a tourist destination—easy to get around and with the amenities people expect to see on vacation, while refreshingly free of the congestion and strain tourists' presence has placed on other cities like Osaka and Kyoto
- It's always been my perspective that to experience the full range of Japanese culture, one needs to escape the urban sprawl of Tokyo and visit more rural areas. Okinawa is unique in that it's every bit as accommodating to people unfamiliar with Japanese customs as Tokyo is, while still offering a broad and authentic view into Japanese culture and tradition
- If you're not sure whether you'll like Japanese food, Okinawa is such a melting pot that it's become home to a fusion of American and Asian cuisine that should satisfy any palate. When in doubt, there's always taco rice
For more information on what it's like to attend Kaigi, check out the field report I filed for Test Double last year in Matsumoto. It was easily the most educational and fun conference I've attended since the pandemic, with genuinely useful information about still-in-development Ruby features I wouldn't have found anywhere else and absolutely fantastic official and unofficial activities and parties. I'm sure this year will be no different! Tickets aren't on sale yet, but you can sign up to be notified when they go live.
I'm not planning to speak this year, but you should consider submitting to the CFP before it closes at the end of January. RubyKaigi's program is geared for giving open source Ruby programmers a platform to share their work along with their experiences and insights. In addition to straightforward presentations of new gems and language features, the conference also celebrates the joy of programming by highlighting creative and quirky talks like Shioi-san's Ruby fork to add an oft-requested i++ operator and @tompng's increasingly fantastical quines. If you write open source Ruby, I hope you'll consider submitting a talk! (And if you'd like some free advice on responding to CFPs, I published a video about speaking last month you might enjoy.)
Anyway, if you're thinking of going, drop me a line if you have any questions or concerns and I'll be happy to help. Hope to see you in Okinawa this May! 🏝️
Ruby's REPL, IRB, has gotten a ton of love over the past few years, essentially obviating most popular debuggers in the process. This year brought another huge leap forward.
This is my favorite item in this list because I didn't even know show_source existed:
The show_source command has always been an essential tool for many IRB users, especially for debugging. This year it received two enhancements that will make it even more useful:
- You can now use -s to get the method's super definition if it has one
- It can now display private methods too
Awesome!
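For the curious, here's a rough sketch of what that looks like, assuming an app with these made-up classes loaded into an irb session (the flags come straight from the post's description):

class Greeter
  def greet
    "hello"
  end
end

class LoudGreeter < Greeter
  private

  def greet
    super.upcase
  end
end

# In irb:
#   show_source LoudGreeter#greet      # now displays the private override
#   show_source LoudGreeter#greet -s   # -s shows the super definition, Greeter#greet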
Having a great REPL is essential in dynamic languages, so quality-of-life improvements have a direct impact on the speed and quality of feedback during development.
Rounding out what feels like six months of contemplative and retrospective blog posts over at the Test Double blog is this one on my own idiosyncratic approach to finding meaning in my work and how I wish more managers would approach their direct reports' career development.
To wit, I've never accomplished anything I felt proud of by setting a goal. In fact, the surest way to ensure I don't do something is to set a goal. When asked to set goals for myself, I've found that expressing the goal (as opposed to achieving it) becomes my overriding objective. The moment a manager approved my list of goals, I felt that I had completed the work asked of me and I would instantly lose all motivation to pursue the goals themselves.
This explains why planting flags can succeed where goal-setting fails. If what I'm searching for is meaning in my work, setting a goal creates an expectation of where, when, and how my future self should find that meaning. High pressure. Focusing on doing my job well and reflecting on whatever I did in retrospect, however, has allowed me to sift through my experiences, identify patterns, and give meaning to them. Low pressure.
I feel like managers as well as employees find comfort in planning what their future state will look like, as if it reduces any of the fear-inducing uncertainty that lies ahead. It doesn't, but I can't imagine why else people would so quickly reach for goal-setting as their primary career development approach when it almost always fizzles and fails.
I need exactly two things to do great work: awareness of others' expectations of what they think I should be doing and a shared definition of what success will look like when I achieve it. Nothing more is necessary to excavate metaphorical tons of marble from which I can chisel sculptures to mark the moments that mattered along the way.
One more note on what people get wrong about ✨Thought Leadering✨:
But here's the thing: I create these things for me and me alone. When a bunch of people read something I wrote or show up to one of my talks, do I find it encouraging and validating? Sure. But it's not what drives me. I started creating things to punctuate my life's sentences long before anybody took an interest in me and I wouldn't stop even if everyone loses interest in me.
If the lack of an audience is the reason you've failed to apply creativity in expressing the meaning of your life's work, that's a real shame.
Kudos to Instagram for being so confidently unconcerned its creators would consider pushing back against the platform for saying this:
90% decrease in cost per person recalling seeing ads with a mix of virtual creator content and campaign video, compared to ads with campaign video only
I've talked to several people in advertising about the rise of "The Influencer" as a target for ad spend over the last five years and the one thing—literally, the only thing—ad executives seem to like about influencers is that they're cheap. If they like two things about working with influencers, it's that so many users are competing for so few dollars that moral scruples aren't a factor.
I don't see how AI talent doesn't give ad buyers more of what they want (even lower prices and even fewer scruples) while also escaping many of the downsides of working with influencers (managing amateur talent, qualifying promoted content, ensuring brand safety). It will be interesting to see how durable influencers' place in the market turns out to be.