As a company, Apple has a well-earned reputation for privacy. Apple iPhones have among the most robust protections for user data in the industry, which means that users feel secure using Apple products in a way that is both uncommon and coveted. Users expect their data to be safeguarded—and under CEO Tim Cook, Apple has generally been seen as holding up its end of the privacy bargain.
But recent allegations concerning contractors tasked with improving the performance of Siri, Apple’s virtual assistant software, have threatened to undermine some of that trust. According to the allegations, contractors had broad access to recordings of users’ Siri interactions, even in instances when the voice assistant was unintentionally activated.
While this might not represent a specific breach of Apple’s terms of service, it certainly does signify a breach of trust, which threatens to undermine Apple’s reputation for protecting privacy at all costs.
A Program Designed to Improve Siri
Most smartphones and smart devices these days come with voice-operated “assistants,” such as Google Assistant or Amazon’s Alexa. On Apple devices, the voice-operated assistant is named Siri. Ideally, Siri listens to your instructions and acts accordingly. You can tell Siri to play music or look up obscure information on the internet (you can ask Siri who played Captain Picard in Star Trek: The Next Generation, for example).
Siri works pretty well, but it’s not perfect. Sometimes it misunderstands what you really want. In large part, that’s because human speech is incredibly complicated. Even people misunderstand each other regularly.
That’s why Apple invested in a project designed to improve the performance of Siri (referred to as a “grading” program). In theory, these contractors would evaluate Siri’s responses and behaviors and develop ways to improve interactions. The problem is that users were not aware of just how much information they were giving to these contractors, nor were users given the opportunity to opt in to the program.
What Information Was Compromised?
That Apple would look to improve Siri’s performance is not particularly shocking. That’s what the company should be doing, and most users tacitly accepted that Apple would use certain data to accomplish it. What was shocking was that Apple contractors were given access to users’ private conversations. What’s more, these recordings were kept on file (presumably so engineers could refer back to them when necessary).
Apple’s response comes roughly one month after the listening program was first detailed in the pages of The Guardian. In late August, Apple announced the following steps, which are slated to go into effect when the Siri grading program resumes this fall:
- The contractors involved were, for the most part, replaced.
- Audio recordings will no longer be kept, and contractors will have access only to computer-generated transcripts of Siri interactions.
- Audio recordings that currently exist will be destroyed.
- Siri users will have to opt in to the program (rather than needing to opt out, as the program was previously structured).
Combined with a simple apology, these steps seem designed to prevent long-term damage to Apple’s reputation for privacy protection. Whether that strategy proves successful in the long run is difficult to predict. It’s entirely possible that Apple has built up enough goodwill among users to remain relatively unscathed.
Not Just an Apple Problem
Handling data gathered by mobile devices—and specifically through these voice assistant apps—is not just a problem for Apple. Google recently faced heightened scrutiny over a similar practice. To protect users’ privacy, Google has temporarily suspended human reviews of its own recordings. Likewise, Amazon (which operates Alexa) has made it easier for users to opt out of its recording reviews.
These massive tech giants are navigating some relatively new and murky ethical territories. User privacy is almost universally coveted, but those same users also expect technology to function consistently and without error. In some cases, those two demands might seem incompatible, leaving tech giants in a rather untenable situation.
Phones Are Always Listening
It might be a slight exaggeration to say that our mobile devices and smart speakers are always listening. But it’s true enough: all of these devices are capable of surreptitious eavesdropping. And that makes trust an essential commodity for these tech giants.
In its response, Apple seems to grasp the seriousness of the issue and is taking steps to safeguard its reputation as a stalwart defender of privacy. The recent revelations about the Siri grading program have damaged that reputation. But Apple now has the opportunity to follow through on its changes and show its community of users that the company is responsive and responsible.
How that works out for Apple in the long run depends on that follow-through. Users will certainly be watching.
Do you have additional information about this Apple news? Is there another hot topic you’d like to see us cover? Tell us in the comments below. You can also contact us for more information! Feel free to shoot us an email at Outreach@ConsiderTheConsumer.com. You can also find us on Twitter, Facebook, Instagram, LinkedIn, or even connect with us directly on our website!