walled gardens in your mind

Jan 30 2024

technology

Apple needs you to know that you're doing it wrong.

Note: I wrote this article before I had upgraded to iOS 17, where they seem to have patched the specific Siri behavior I am talking about. I maintain that the fact this behavior was ever present still says a great deal about Apple’s design philosophy.

Let’s begin this week with an exercise for the reader. If you’re an Android user, continue reading quietly and indulge in the smug satisfaction that you’re not victim to the petty tyranny I’m about to describe. If you’re an iPhone user, pick it up and say the following phrase:

“Hey Siri, set a timer for five pm.”

At time of writing, Siri will respond:

“Timers can’t be set for a time of day, so I’ve set your alarm for five pm.”

I have been thinking about this for years now. I can’t remember exactly when I noticed how strange this response is, but as soon as I did, I couldn’t help but intentionally trigger it every time I needed to set an alarm just to bask in its absurd passive-aggression. It tells a story about user interface design, the way we interact with artificial “intelligence,” and the way that tech companies frame our thought.

The story starts with a simple technical problem. When designing a service like Siri, Apple has to first decide how to translate all the iPhone’s existing functions into human-readable (and expressible) phrases. The iPhone’s clock app has two related but separate functions: alarms, which trigger a sound at a specific time of day, and timers, which trigger a sound after a certain period has elapsed. The only functional difference between the two is that a timer shows a visible countdown within the clock app, while an alarm merely has a toggle switch that tells you whether it’s enabled.

Initially, Apple must have decided that this separation of functionality also had to be reflected in the phrases the user has to say to Siri to activate either function. And so, if it’s noon, and you want to set a countdown timer for thirty minutes from now, you say “set a timer for thirty minutes.” If you want to set an alarm for thirty minutes from now, you say “set an alarm for twelve thirty.” All is well and good so far.

But in normal human conversation, which artificial “intelligence” tools like Siri are in theory supposed to emulate, we often use the terms “timer” and “alarm” interchangeably. At some point, either in production or in the testing process, Apple found that a large portion of people were saying “timer” when they really wanted the “alarm” function. We know this because - as about half of those reading this just demonstrated - they have programmed in a fallback for this case: if you set a timer for a specific time of day, Siri will use the “alarm” function instead.

In many cases, that would be the end of it. As a programmer myself, I can confidently say most of the job is trying to translate the spirit of the user’s command into action rather than the letter. But where things get really interesting is how Apple has decided to handle this use case. Before following the command, Siri makes sure you know that you can’t set a timer for a time of day - despite the fact that, from the user’s perspective, this is exactly what Siri has just done. Apple understands the practical reality that Siri should simply follow the spirit of the user’s command and move on, yet it was clearly important to them that the user knows they are doing it wrong.
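To make the design choice concrete, here is a minimal sketch of what a fallback like this might look like. This is entirely hypothetical - I have no knowledge of Siri's actual implementation - but it illustrates the point: once the parser detects that the "timer" argument is actually a clock time, rerouting to the alarm function takes one line, and the lecture is a deliberate extra step.

```python
import re

def handle_timer_command(phrase: str) -> str:
    """Hypothetical sketch of Siri's timer/alarm fallback logic.

    If the argument of "set a timer for ..." parses as a clock time
    rather than a duration, quietly set an alarm instead - no need
    to tell the user they phrased it "wrong".
    """
    m = re.search(r"set a timer for (.+)", phrase.lower())
    if not m:
        return "unrecognized command"
    arg = m.group(1).strip().rstrip(".")

    # A duration: "thirty minutes", "5 minutes", "an hour"
    if re.fullmatch(r"(\w+|\d+)\s+(second|minute|hour)s?", arg):
        return f"timer set for {arg}"

    # A clock time: "five pm", "5 pm", "12:30 am"
    if re.fullmatch(r"(\w+|\d+(:\d+)?)\s*(am|pm)", arg):
        # Follow the spirit of the command: just set the alarm.
        return f"alarm set for {arg}"

    return "unrecognized time"
```

The interesting part is what this sketch leaves out: the apology. The rerouting itself is trivial; the "Timers can't be set for a time of day" preamble is a separate, conscious addition.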

This is the part of the interaction that fascinates me so. There’s no technical reason Apple needs to insist that the user uses the “correct” terminology for the iPhone’s functionality. The motivation for this is entirely ideological.

Discussion of Apple’s “walled garden” - the way that they lock users into buying their products in perpetuity by adding integrations between all of their devices (and making it difficult to integrate others) - is common in tech circles. My personal favorite tech YouTuber, Marques Brownlee, gave the most succinct explanation of it in his video about the topic. In the video, Brownlee talks to other Apple users within the tech journalism space, and he asks them why they continue to use the iPhone. Almost all of them respond with some variation of the “ecosystem,” essentially admitting that they’re trapped within the walled garden, and that the transition costs of moving away from it would be too high to justify.

But this Siri interaction reveals something else: the walls of the garden are not exclusively technological. Siri will correct you on the timer / alarm distinction every single time you ask without fail, forever. And, as such, many users are likely to absorb it - to redefine the concepts “timer” and “alarm” in their own mind to align with the functionality of their phone, rather than human language.

This interaction is small, and on its own, largely inconsequential. But Apple designing software specifically with the purpose of social engineering is not limited to this. Most cynically, until last year they refused to implement RCS - the new standard for text messages that enables advanced features - to ensure that iMessage retains its social power to divide people into a blue bubble / green bubble caste system. I am aware that description may sound dramatic, but consider that almost every Apple user Brownlee surveyed in that video cited iMessage as the reason they choose to remain within the walls of the garden. Consider that the demand to circumvent the Apple exclusivity of iMessage is so large that an entire cottage industry of companies has sprouted up to fulfill it, mostly unsuccessfully. Consider that the social ostracization that comes with the green bubble is most powerful among teens.

Most of the time when we discuss the way technology influences our habits of mind, we talk about black-box machine learning algorithms whose directives are to increase engagement. Those algorithms are obviously still given those directives by humans with financial and social priorities - and discussion of how those priorities still carry human prejudices is ongoing and important. But Apple’s decisions about timers, alarms, and iMessage are not algorithmically driven - they are very straightforward and conscious choices to use their market share to shape society in their image. It is not enough for them to build a walled garden around your technology. They’ll build one around your mind, too - which makes it all the more important to stay aware of the ways they’re trying.
