By now, most of you have read about Apple's forceful public refusal to comply with a court order compelling the firm to help the FBI gain access to the data stored on the San Bernardino shooter's iPhone 5C. Specifically, the FBI wants Apple's help disabling the device's auto-erase function, which wipes the phone's data after 10 incorrect passcode attempts, and Apple is refusing to modify its software to make that possible.
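To make the mechanism concrete, here is a minimal, hypothetical sketch in Swift of the kind of retry-limit policy at issue. The PasscodeGuard type and its wipe behavior are our illustrative assumptions, not Apple's actual implementation, which is enforced deep inside iOS rather than in application-level code like this.

```swift
// Hypothetical sketch of an auto-erase policy like the one at issue:
// after 10 consecutive wrong passcodes, the device wipes itself (in
// practice, by destroying the keys that encrypt user data).
struct PasscodeGuard {
    let maxAttempts = 10
    private(set) var failedAttempts = 0
    private(set) var isWiped = false

    mutating func tryPasscode(_ entered: String, correct: String) -> Bool {
        guard !isWiped else { return false }   // a wiped device accepts nothing
        if entered == correct {
            failedAttempts = 0                 // success resets the counter
            return true
        }
        failedAttempts += 1
        if failedAttempts >= maxAttempts {
            isWiped = true                     // stand-in for destroying the encryption keys
        }
        return false
    }
}

var device = PasscodeGuard()
for guess in ["0000", "1111", "2222"] {
    _ = device.tryPasscode(guess, correct: "4821")
}
print(device.failedAttempts, device.isWiped)   // prints: 3 false
```

Removing that limit, along with the escalating delays iOS imposes between attempts, is precisely what would let investigators brute-force a short numeric passcode without destroying the data.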

Over the past three years, Apple has hitched its brand wagon to privacy, because the firm believes that a) customers care enough about privacy to vote with their dollars and b) as the steward of people’s most personal, sensitive data, Apple has an obligation to serve their best interests. While this isn’t the first time that Apple has found itself targeted by regulators over privacy, this is the firm’s staunchest defense yet against government intrusion. Forrester believes that, with this move:

  • Apple is putting its money where its mouth is. Until recently, there has been plenty of debate about whether Apple was simply paying lip service to privacy. But this move, along with the recent shuttering of its iAd business, proves that Apple is making serious product and business model changes in support of user privacy. Tim Cook is holding fast to the line he drew in the sand last year at the Electronic Privacy Information Center's (EPIC's) Champions of Freedom event in Washington, DC, where he said:

“We believe the customer should be in control of their own information. . . . There’s another attack on our civil liberties that we see heating up every day — it’s the battle over encryption. Some in Washington are hoping to undermine the ability of ordinary citizens to encrypt their data. . . . We think this is incredibly dangerous.”

  • Apple is applying its current privacy thinking retroactively. Since the release of iOS 8, Apple has steadily beefed up both its hardware and software privacy and security capabilities. Touch ID, Apple Pay, and randomized MAC addresses (sketched after this list) are all evidence of the firm's desire to provide its customers with stronger protection against tracking of all kinds. By refusing to create a backdoor in an older iPhone model and iOS version, Apple is signaling that this level of protection isn't just for customers who own the most recent models; it's for all Apple customers.
  • Apple is leveraging its technical assets in new ways. The combination of hardware and software digital rights management (DRM) plays a key role in Apple's control of the iOS ecosystem. Apple uses DRM to lock down which operating system versions, applications, and hardware capabilities are available (see the second sketch after this list), leading many users to jailbreak their devices. But Apple also leverages those same DRM capabilities to enforce privacy measures. Using DRM as a privacy enforcer benefits both the source and the destination of users' data, a unique position given DRM's shaky history in the hardware and software world.
  • Apple is doing what's right to protect its devices and its brand. While the FBI's request isn't technically difficult, fulfilling it would create a vulnerability in the iOS operating system that, in the wrong hands, could put user data at serious risk. Apple has already suffered media embarrassment over iCloud vulnerabilities. If the public caught wind of a deliberate move to weaken iOS security, Apple's brand could be irreparably damaged.
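On the MAC randomization point above: the idea is that a device scanning for Wi-Fi networks presents a fresh, random, locally administered hardware address, so passive sniffers can't tie successive scans to one phone. Here is a short sketch; the function name is our own invention, since iOS performs this inside its Wi-Fi stack.

```swift
import Foundation

// Generate a random, locally administered, unicast MAC address.
// A fresh address per scan means Wi-Fi sniffers can't follow one device around.
func randomizedMAC() -> String {
    var bytes = (0..<6).map { _ in UInt8.random(in: 0...255) }
    bytes[0] = (bytes[0] | 0b0000_0010) & 0b1111_1110  // set "locally administered," clear "multicast"
    return bytes.map { String(format: "%02X", $0) }.joined(separator: ":")
}

print(randomizedMAC())  // e.g. "D6:3A:91:0C:7F:2E"
```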
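And on the DRM point: at its core, Apple's lockdown rests on signature verification, where software runs only if its bytes verify against a vendor-held key. The sketch below shows that core check using CryptoKit; the key names and flow are illustrative assumptions, not Apple's actual signing chain.

```swift
import CryptoKit
import Foundation

// Conceptual core of code signing: software installs only if its bytes
// verify against the vendor's public key. Illustrative only.
let vendorKey = Curve25519.Signing.PrivateKey()           // vendor-held signing key
let osImage = Data("os-image-9.3".utf8)                   // stand-in for an OS build
let signature = try! vendorKey.signature(for: osImage)    // signed at build time

// On the device: accept the image only if the signature verifies.
if vendorKey.publicKey.isValidSignature(signature, for: osImage) {
    print("signature valid: install allowed")
} else {
    print("signature invalid: install refused")           // blocks tampered or downgraded builds
}
```

The same gate that blocks unsigned apps can refuse to load a weakened OS, which is why the FBI needs Apple itself to build and sign any bypass.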

What It Means: Many Sides Have A Stake In Apple’s Fight

This case highlights the broader philosophical conflict over surveillance, employer-employee relationships, and the role of internet giants in society. Many stakeholders will weigh in as this case unfolds, and for the foreseeable future:

  • Law enforcement gets mixed responses based on the nature of its requests. Apple's cooperation with law enforcement in the San Bernardino case, and its history of supporting iOS forensics for law enforcement agencies, show that the company understands the sensitive needs of investigations. However, this is a public case with open records, and one that would require custom software development to bypass the device's security features. If it signals to other companies that pushing back on the expansion of legal boundaries wins customers' favor, then law enforcement and the government will see more brands adopting privacy as a cornerstone, eroding what appear to have been tight working relationships between government and tech companies.
  • Tech companies have an example to follow when thinking about privacy. Since the US judicial process relies heavily on precedent, the slippery-slope argument is reasonable here. Legal precedent already allows for compelled decryption; the main differences here are that Apple doesn't store the keys and that the person with the ability to decrypt is deceased. A precedent of cooperation could require Apple to unlock any phone named in a valid warrant, and could eventually extend to other technologies, certificate providers, encryption providers, and more. The pressure is now on: after Tim Cook's message, a tech company can't reasonably claim to care about privacy and then do nothing to prove it.
  • Apple competitors have a teachable moment. Tech-literate users and developers alike often rail against Apple's "walled garden." Yet law enforcement's inability to decrypt the device in question shows other companies, and consumers, just how much a technology platform can affect privacy. The Android operating system, application store, and hardware are all substantially more open than Apple's platform, which creates advantages for skilled technologists and developers but also raises privacy concerns, given how differently the many combinations of operating system versions, applications, and hardware may behave. If the device in this case were Android- or Windows-based, would this order even be necessary?
  • Apple’s customers may feel conflicted. The truth is that the average consumer may not side with Apple on this one. On the one hand, most people value their privacy and wish to have it protected; it’s why they use passcodes and security software and install cookie-blocking extensions in their browsers. On the other hand, the fear of home-turf terrorist attacks is real, reinforced every day in the press and on the political stage. This conflict hasn’t fully played out in the court of public opinion, which means Apple should be prepared for a mild backlash from consumers who choose state security over personal privacy.
  • Employers see their data access and protection capabilities at risk. The device in this case is technically the property of the attacker’s employer, and that organization has granted the authorities permission to examine what’s on the phone. But this raises the question: if the employer had not granted permission, what access could the FBI reasonably require Apple to provide? There is a very real threat to corporate data and intellectual property under these circumstances.
  • The US economy and tech sector could see long-term negative impact if Apple loses. If US law enforcement succeeds in forcing Apple to break its own encryption, it would create a repeatable mechanism that governments and law enforcement agencies around the world could invoke again and again. In the aftermath of the NSA scandal, consumers’ and businesses’ trust in government plummeted, and many US companies saw their revenues shrink as a result. Other brands with a similar stake in consumer privacy should watch this case closely.

This post is a collaboration among several Forrester analysts, including Enza Iannopollo, John Kindervag, Jeff Pollard, and Fatemeh Khatibloo, with Chris McClean, Laura Koetzle, Martin Whitworth, Chris Sherman, and Salvatore Schiano.