My counter-cultural iOS 11 wish list
In seven days, it will have been ten years since the keynote to end all keynotes.
A decade on, however, I'm uncomfortable with declarations that the iPhone and iOS are "mature." Unlike other mature platforms that became commoditized and absorbed into daily life, smartphones have not receded into the periphery of our attention. If anything, the entire planet is more glued to its phones today than ever before. "Mature" is code for "set in stone," and it's my view that any device able to so thoroughly capture the attention of the human race as the iPhone-era smartphone demands continuous reinvestment. Past decisions should be constantly reconsidered.
And after a year like 2016, I like to hope that Apple is reconsidering some of the decisions they've made about how their platforms have influenced life on earth, including our politics.
How Apple chooses to market iOS 11 will show us how they see their platform. If they agree that iOS is "mature," then we should expect a release focused on mere quality-of-life improvements. If they see iOS as the highly influential, culturally significant touchpoint it has become, they might make fundamental changes that shake up how we use (and others exploit) their platform.
This post will share one potential direction Apple might take its platforms in these uncertain times.
When Apple's executives began making a public case for prioritizing user privacy and security, their software platforms became overtly political. In the United States, there is no way to avoid politics when advocating for or against an individual's right to privacy. And anyone familiar with encryption technology was not at all surprised by the fracas that unfolded in response to Apple's unwillingness (or, had the device been newer than an iPhone 5c, inability) to create a backdoor for the government to retrieve data from a customer's device—even though the customer had committed an act of terrorism.
Many critics of Apple's stance on privacy have said theirs is a high-horse of convenience: Apple makes their money selling tangible products to customers, and their financial success is not impeded when they deny themselves information about who their users are and how they live their lives. That Apple's circumstance is (however bizarrely) rare in the technology world is used as evidence that it is uncompetitive or unfair for Apple to seize upon the opportunity to build an inherently private platform.
What gets missed in all this is that every company gets to choose how it makes its money. Apple has continuously reaffirmed its choice to make money by selling products (and more recently, services) directly to its customers. Google, Facebook, Twitter, and most other startups, initially chose to make their money by selling equity to investors, valued by the popularity of their unpaid services. This exit strategy maneuver has eventually backed each service into the same corner: the only way to appease investors is to turn a profit, but when all you have is popularity, the only available business strategies require you to sell out your users.
This crisis of "how do businesses make money" is not a drama unfolding among the gods on Mount Olympus. It's a choice we all make, even if we don't realize it. For instance, years ago, I learned that I love speaking, teaching, and writing open source software. I could have chosen to put my screencasts behind a paywall or charge a speaker's fee, but I wanted to be incentivized to share what I created freely with everyone. That's what led me to consulting—my freely available creative work can serve as both a public good and as a loss-leader for marketing our team's consulting services. If I had naively decided to simply charge others for doing what I loved, I'd have reached a much smaller audience, composed of people who probably already agree with me. That was just one intentional choice of many; that every new company must find a billion unpaid users to be considered successful is not a foregone conclusion.
We should applaud Apple's old-fashioned-yet-exceptional business model of "charge people money for things," because it has enabled them to take a bold stand on behalf of our own privacy & security. But stop and ask yourself: what else could the owners of iOS (and macOS, tvOS, and watchOS) do for the good of humanity with the profound freedom-of-movement their business model affords them?
I have one answer: challenge the ubiquitous belief that access to information is an unambiguous good.
When the iPhone was first unveiled, it offered people a chance to hold all of the world's information in their hands. For the first time, humans could be connected to the sum of all knowledge everywhere they went and at every moment of the day. This represented unambiguous progress at the time. From the invention of the printing press to the release of the iPhone, we saw ourselves as within grasp of completing our march to the Enlightenment's promised land: unfettered, instantaneous availability of all the world's information.
Before the iPhone, our ideology had framed information in terms of its scarcity or availability. Scarce information was a tool of oppressive fascism, whereas freely-available information was a hallmark of liberty. But more recently, we've started to realize what happens when infinite information comes into contact with the finite human mind: somebody else gets to sort it. Whether it's Google's search, Facebook's Newsfeed algorithms, or our own self-made echo chambers on Twitter, the sampling of the Internet's infinite information that each of us receives is at best arbitrary and at worst manipulated.
I actually think Trump surrogate Scottie Nell Hughes was right when she claimed that there is no such thing as facts. Not anymore. In an era when anyone can confirm their bias by bringing their own sources to the table, facts themselves have lost whatever authority they once had in discourse and debate. It could be decades before the authority of education, science, and facts are restored in society, and we need to start planning for life after the death of the concept of "public trust." And until facts regain their power to settle arguments, technology companies should pause before assuming that more information is inherently better than less.
Unfortunately for technology companies, they've spent the last 15 years building an economy designed to maximize our attention to information.
Trump won the election because an increasing share of our economy is optimized for maximizing user engagement, and Trump exploited its blindspots to literally unbelievable success. He received near-constant attention in liberal and conservative corners of the media alike. He made me pull-to-refresh my Twitter timeline so frequently that my thumb often hurt at the end of the day. And whenever the zeitgeist naturally began to drift to other topics, Trump had the uncanny ability to say something just provocative enough to regain the world's attention.
"It may not be good for America, but it's damn good for CBS," said CBS chief, Les Moonves. He was criticized for the quote, but it could have applied to any engagement-driven business, from the Washington Post to Facebook. In the short-term at least, Trump-the-engagement-phenomenon has represented, for Google and CNN alike, a tremendous windfall in marketing revenue.
In the days that followed Trump's ascension, a lot of attention was paid to fake news on Facebook as a contributing factor. Fake stories were peddled not only by white supremacists, but also by entrepreneurial outrage manufacturers throughout the campaign. The spread of this news was not only the fault of Facebook's faceless engagement-optimizing Newsfeed algorithm, but also of the many Americans who feel genuine racial antipathy. The popular solution—remove fake news from people's feeds—suggests that correcting or hiding false information will somehow result in a better-informed citizenry. This approach is doomed to fail, however, because it doesn't take the Streisand effect into account.
What media companies and marketing platforms alike don't want to allow for is the possibility that inaccurate information isn't the enemy; information addiction is. I propose instead that systems which attempt to maximize our consumption of information are the true cause of Trump's success. No one should be surprised to learn that researchers found a correlation between the proliferation of broadband Internet and a rise in political polarization. In an engagement-driven economy, dispassionate objective reality doesn't sell; polarizing outrage does. And it's a vicious cycle—as users acclimate to outrageous information, they'll set forth on their own, looking for stronger and stronger doses. Fox News gave way to The Daily Caller gave way to Breitbart gave way to Info Wars gave way to Stormfront.
[To his credit, Andrew Sullivan predicted this pattern of polarization years ago with what he dubbed epistemic closure, the lack of a limiting factor on polarization in an era of infinitely spreadable and accessible information.]
But infiniteness and openness may not be as inherent to how we experience the Internet as we once believed. If anything, the recent success of walled gardens over open protocols has proven that openness doesn't necessarily win. In many ways, we're now at the mercy of a handful of garden owner-operators anyway, so we may as well demand that they design their gardens well. The platform-holders, like Apple, can do more than occasionally weed things out, they can determine what sort of plants are allowed to grow in the first place. This level of corporate control and influence is a frightening thing to think about, but better to debate it in the open than to be left in the dark, as we have in the cases of Facebook's Newsfeed and Google's search results.
In a roundabout way, Apple's success at popularizing always-on, always-connected devices has led us to this point. Regardless of whether that makes them responsible for its effects, it does suggest they may have the power to start stuffing the genie back into the bottle.
If we've all become information addicts, perhaps Apple could build into its platforms a means to start treating ourselves. If it works, Apple would be rewarded with greater customer satisfaction and, one presumes, device sales. And it would be a differentiator that Google and many others would be unlikely to ever full-throatedly imitate, since giving users greater control would stand in direct conflict to the motives of engagement-maximizing businesses.
Here are some ideas of things Apple could do to combat engagement abuse, many of which could pass under the guise of strengthening user privacy by way of undermining unsolicited marketing:
- Apps and services designed to help people ration their own information consumption like Moment, Freedom and Focus are becoming more popular, but are hamstrung by iOS platform limitations. Apple could leverage its role as platform-owner to sherlock all of these services with a simple preference pane that enables users to limit how much time they spend on certain sites and apps
- iOS could build-in a sort of Skinner box detection. In addition to arming self-aware users with the tools to monitor or restrict their access to sites and apps they tend to "over-engage" with, Apple could develop heuristics that help users identify information addiction in the first place. For example, if an app makes the same API request dozens of times an hour, iOS might suggest enabling controls to remind or limit oneself from repeatedly getting sucked into refreshing that app's news feed
- iOS 6's introduction of Do Not Disturb gave users a blunt defense from the ever-increasing deluge of notifications being sent to iOS devices, but it casts far too wide a net to be reasonable for most people to leave enabled 24/7. What if iOS 11 raised the bar for what's allowed to disturb users by learning to label and filter out "low-priority" and "junk" notifications, just like Mail did? In a similar vein, what if iOS offered users a way to batch all of their notifications at set time intervals, to prevent the constant distraction of taps and banners and buzzes? I have to imagine I'm not alone in responding to pointless notifications with the foolish ritual of also opening Twitter, Facebook, or Instagram for a fresh hit of Like until I come to my senses and realize that 20 minutes have elapsed
- Intention-based unlocking: a dozen times a day I pick up my phone to perform an action, but get distracted by the siren songs of engagement-driven social media apps instead—only to forget why I picked up the phone to begin with! What if the new stickier lock screen introduced in iOS 10 could enable users to register their intent such that iOS could then help them stick to it? Maybe this would look like a more-prominently-located-than-notifications widget of user-initiated activities (e.g. set a reminder, take a note, etc.). Maybe it could be as simple as Siri shortcuts to sandboxed experiences; telling Siri "I want to write a tweet" might open a keyboard-driven tweet composer that spares the user from ever seeing their own timeline, for instance. The building blocks for this already exist with the interactive notifications introduced in iOS 10, the key difference being that actions could be user-initiated instead of reactions to notifications
- Apple could regulate its App Store marketplace by dictating monthly and lifetime spend limits on in-app purchases in an effort to eliminate the abusive whale-based economics of mobile gaming. This would affect Apple's bottom line in the short run, but (aside from protecting a small class of high-spending IAP users) the catastrophic impact such an action would have on the free-to-play game industry would have the added benefit of dramatically improving the health of the overall app ecosystem
- iOS 11 could expand on the iOS 9 Content Blocker API to include a means for inspecting the content displayed to a user and alerting them about it. Beyond being a simple fake news-detecting browser extension, Apple could wedge in the ability for such extensions to access native user interface elements within apps like Facebook and Twitter themselves. If designed carefully, creative extensions could be designed to report conflicts-of-interest, flag native advertising, and promote opposing viewpoints
- In addition to the web, Apple's mail clients could ship preconfigured not to load images from untrusted senders. Magic pixel images have turned e-mail into a marketing platform driven by open rates and a surprising array of analytics about recipients, and with an action as simple as blocking images by default, Apple could unilaterally put the sleaziest of e-mail marketing companies out of business
- E-mail also remains a huge vector for those who would attack an individual's privacy. Many have claimed this is because e-mail was designed for an earlier era, or that the fact that no one "owns" e-mail means that fundamental improvements (like encryption) can't be made. Hogwash. Apple could use the same technology that enables iMessage users to detect one another as a mechanism for encrypting e-mail by default. First, Apple would quietly generate a PGP key-pair, store it in each user's encrypted keychain, and then invisibly distribute users' public PGP keys to any e-mail address that's also registered with iMessage. Apple's own mail clients, in turn, would encrypt and decrypt e-mail with as little fanfare as iMessage's iconic blue bubbles. Third-party mail clients on Apple's devices would be pushed to get on board, as encrypted e-mail would spread virally—just like iMessage itself did. The public conversation would then shift to Google (who reads users' e-mail to tailor its marketing) and other mail providers: why aren't you encrypting your users' email by default, as well?
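The "Skinner box detection" idea above boils down to a rolling-window rate heuristic. Here's a minimal sketch of the logic the OS would need; the `RefreshMonitor` class, its thresholds, and the endpoint strings are entirely hypothetical, since iOS exposes no such hook today:

```python
from collections import defaultdict, deque
import time

class RefreshMonitor:
    """Hypothetical sketch: flag an app when it hits the same endpoint
    more than `threshold` times within a rolling `window` of seconds."""

    def __init__(self, threshold=30, window=3600):
        self.threshold = threshold
        self.window = window
        self.requests = defaultdict(deque)  # endpoint -> recent timestamps

    def record(self, endpoint, now=None):
        """Record one request; return True once the threshold is crossed."""
        now = time.time() if now is None else now
        times = self.requests[endpoint]
        times.append(now)
        # Drop timestamps that have aged out of the rolling window.
        while times and now - times[0] > self.window:
            times.popleft()
        return len(times) >= self.threshold
```

When `record` starts returning `True`, the system could surface the "you've refreshed this feed a lot today" prompt described above, rather than blocking anything outright.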
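The notification-batching idea is similarly simple at its core: hold every incoming notification and release the whole queue only once per interval. This sketch is illustrative only; the class and its API are invented, not anything iOS ships:

```python
class NotificationBatcher:
    """Hypothetical sketch of batched delivery: notifications queue up
    silently and are released together once per `interval` seconds."""

    def __init__(self, interval, start=0.0):
        self.interval = interval
        self.last_delivery = start
        self.queue = []

    def enqueue(self, app_id, message, now):
        """Queue a notification; return the batch due for delivery, if any."""
        self.queue.append((app_id, message))
        if now - self.last_delivery < self.interval:
            return []  # still inside the quiet period; nothing buzzes
        self.last_delivery = now
        batch, self.queue = self.queue, []
        return batch
```

A real implementation would presumably exempt calls and other time-critical alerts, but the point is how little machinery the feature requires.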
These are just the first ideas that popped into my head when given the prompt "what if we stopped believing that more information was always good?" I have to imagine there are much better ideas out there. Unfortunately, Apple (along with perhaps Microsoft) may be the only platform holder with both the power and incentive to do anything about the toxic effect the engagement-driven economy has had on us and our world.
For lack of many alternatives, here's to hoping that Apple agrees that iOS is not-quite-yet a mature platform.