Narrative: Apple Wins on Privacy

Each narrative page (like this one) opens with an introduction describing and evaluating the narrative, followed by all the posts on the site tagged with that narrative. Scroll down beyond the introduction to see the posts.

Each post below is tagged with Company/Division names, Topics, and Narratives, as appropriate.

    Narrative: Apple Wins on Privacy (Jan 24, 2017)

    One of the defining characteristics of Tim Cook’s leadership of Apple over the past five years has been his commitment to protecting its users’ privacy. The topic has been mentioned again and again in WWDC keynotes, product launch announcements, and in a letter to Apple’s customers accompanied by additional information on Apple’s approach to privacy. Perhaps most famously, Apple stood up to the FBI in early 2016 when it was asked to help break into a phone used by a suspected terrorist, prompting another Tim Cook letter on the subject.

    It’s clear that Tim Cook feels very strongly about protecting user privacy, and that he has made this a priority in the way Apple hardware, software, and services are designed. Hardware features such as Touch ID and the Secure Enclave that protects user data, together with approaches to cloud services that keep data on user devices, are intended to safeguard privacy and secure that data. Apple sees its commitment to privacy as a positive differentiator against major competitors – notably Google – which have ad-based business models and therefore have to gather and make use of individual user data in order to make money.

    There are two big questions around all of this: firstly, whether users actually care about this approach; and secondly, whether Apple’s strict approach to privacy prevents it from providing the best possible services. It’s worth addressing both of those.

    I’m actually in the middle of conducting a survey on user attitudes towards privacy and security now, and should have some solid data shortly to answer the first question in depth. But from other surveys I’ve done, as well as from observing behavior at both a large and a personal scale, I’m convinced that the answer is far more complicated than a binary yes/no. Some users care deeply about their privacy and refuse to engage with any service which would collect and use personally identifiable data, while others are entirely willing to trade some of their privacy for free services, better targeted advertising, or more effective personalization of services. There’s a spectrum here, and people lie at every point along it. As such, for some people Apple’s privacy stance is critical and a key reason why they buy Apple products and use Apple services in preference to those provided by Google or others. For others, who simply don’t care about the privacy tradeoffs, Apple’s stance seems entirely academic.

    That leads us to the second question, because quite a few observers have suggested that Apple is actually making its services poorer by not collecting individual profile data in the cloud and applying far greater computing power to it in order to personalize and improve its services. Apple would respond that device-level collection is fine for profile building and keeps the data out of both Apple’s and third parties’ hands entirely, and that it uses techniques such as differential privacy to aggregate user data in a way that preserves the value of large data sets in the cloud without making any of the data personally identifiable. It has also made its devices increasingly adept at performing computing tasks which others perform in the cloud, notably facial recognition.

    On balance, Apple’s stance is a competitive benefit for at least some users, many of whom are in any case more likely to be willing to pay a premium than to seek the lowest possible cost, so there’s likely a good alignment between those who prefer to pay for privacy and Apple’s base. However, it has little appeal for users who prefer free services or the benefits that come from being profiled and targeted in a highly individualized way. Apple undoubtedly has at least some disadvantages versus its competitors in providing such personalization, but for now those are at least as much about will as about any insurmountable barriers flowing from its privacy stance. (See Apple Doesn’t Get Services)

    Apple’s Long-Running iOS Account Sign-In Dialogs Create Vulnerabilities (Oct 10, 2017)

    A developer named Felix Krause has surfaced an issue that’s been present in Apple’s iOS for a long time, and which I’ve often wondered about myself: the operating system periodically pops up what appear to the user to be random dialog boxes asking for their Apple ID passwords. Because of the seemingly random times and places these dialogs show up, they train users to enter their passwords while using apps, which means that apps could at least theoretically recreate these dialogs themselves and thereby phish users’ Apple ID details, creating a security vulnerability. The post Krause wrote about this suggests several fixes, the most obvious of which is that these dialogs should direct users to the Settings app rather than prompting for a password directly. In my opinion, it would also be nice if the dialogs explained why the user suddenly had to re-enter their password – the lack of explanation is another long-standing niggle I have with these dialogs. But this feels like a rare goof by Apple, which is normally so strong on privacy and security but has here created a situation which could easily be exploited by malicious parties. It’s easily fixed, though, and hopefully Apple will do so soon.
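
    To see how easy the spoof would be, here’s a minimal sketch in Swift; the title and message strings are illustrative rather than Apple’s exact copy, but a standard UIAlertController with a secure text field is essentially all it takes to produce something visually indistinguishable from the system prompt.

```swift
import UIKit

// A minimal sketch of the spoof Krause describes: an ordinary in-app alert
// styled to look like the system's Apple ID password prompt. The strings
// below are illustrative, not Apple's exact copy.
func presentFakeAppleIDPrompt(from viewController: UIViewController) {
    let alert = UIAlertController(
        title: "Sign In to iTunes Store",
        message: "Enter the Apple ID password for \"user@example.com\".",
        preferredStyle: .alert
    )
    alert.addTextField { field in
        field.placeholder = "Password"
        field.isSecureTextEntry = true
    }
    let signIn = UIAlertAction(title: "Sign In", style: .default) { _ in
        // A malicious app would harvest the text field's contents here.
    }
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
    alert.addAction(signIn)
    viewController.present(alert, animated: true)
}
```

    Krause’s suggested Settings-app fix works precisely because an app can fake a dialog but can’t fake the Settings app itself.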

    via Felix Krause

    Apple General Counsel Bruce Sewell Retires, to be Replaced by Katherine Adams (Oct 6, 2017)

    Apple has announced that its long-standing general counsel, Bruce Sewell, is retiring and will be replaced in the role by Katherine Adams, who joins Apple from a similar role at Honeywell. Normally, the departure of the general counsel at a tech firm wouldn’t be something I’d cover, but this is noteworthy for two reasons. Firstly, Sewell enjoyed a rather higher profile than most general counsels do over the last couple of years because he was a key figure in Apple’s fight with the FBI, among other things, and of course Apple’s lawsuits against Samsung and more recently Qualcomm have also been fairly high profile. Few corporate lawyers get to implement company strategy quite as directly as Sewell did during his time at Apple, especially with regard to privacy. Adams will obviously take over the Qualcomm case and others Sewell was overseeing, along with carrying the mantle of protecting privacy in the context of law enforcement. Secondly, the fact that a woman is replacing a man on Apple’s executive team means that it now has two out of eleven members who are women. As I noted a month ago, the next tier down of eight executives is evenly split, but until now Angela Ahrendts has been the lone woman on the executive team. It’s good to see that start to change, and I wonder whether other executives who move on from those senior ranks in the coming years will likewise be replaced by women.

    via Apple

    Apple Updates Privacy Site to Reflect Face ID and New Differential Privacy Uses (Sep 27, 2017)

    As new versions of Apple’s operating systems and new iPhone hardware roll out, Apple has updated its website’s privacy section to reflect some of the recent changes, and especially to deal with questions users may have about the Face ID feature on the upcoming iPhone X. The site opens with big-picture statements about Apple’s commitment to privacy, beginning with the assertion that “At Apple, we believe privacy is a fundamental human right,” and moves on to more detailed descriptions of Apple’s approach. In a nutshell, the policy described there is that Apple isn’t interested in your personal data, enables you to determine with whom to share it, and provides tools for you to protect your information and devices. Apple also addresses its use of differential privacy, which has been in the news lately for a couple of reasons: a recent study asserting that it’s a weaker privacy protection than Apple says, and changes to Safari data gathering in macOS High Sierra.

    For Apple, the key is that it has no reason to infringe on its users’ privacy, because its business model is best served by protecting that privacy rather than gathering data on its users. That’s a meaningful differentiator for at least some Apple customers, and reinforcing these values will be important to them, but for many other customers Apple, Google, Microsoft, and other companies’ privacy policies are not a matter of significant moment. That could of course change in time as these companies have potential access to more and more personal data including health data, but for now the surveys I’ve seen suggest that trust levels are broadly similar between big companies and most people don’t avoid companies like Google because of their business models and approach to data gathering.

    via Axios

    Researchers Claim Apple’s Differential Privacy Approach is Inadequate (Sep 18, 2017)

    Wired reports on a third-party study which claims that Apple’s approach to differential privacy – the method Apple says it uses to obfuscate individuals’ data when uploading it to the cloud – is inadequate to really protect those users’ privacy. The study dug into Apple’s code and on that basis made claims about the degree to which Apple adds noise to the data, the single biggest factor in determining how well an individual’s private information is obscured. The authors claim that Apple’s approach adds far too little noise to preserve privacy, while Apple has pushed back, saying that the researchers’ analysis assumed it treats all data the same way and that aggregating data across multiple categories would reveal more about users than looking at single data points – assertions Apple disputes.

    One of the most telling lines in the article has one of the researchers saying that differential privacy is based on the assumption that companies will always behave badly, something Apple would clearly dispute too – it prides itself on protecting users’ privacy, generally doesn’t use business models which require it to collate data about users to target advertising, and requires users to opt in to any of this data gathering in the first place. As such, some of the assumptions being made by the researchers may be reasonable in general but less applicable to Apple than to other companies. The fundamental issue here, though, is that Apple isn’t transparent about its approach, something I would guess it would attribute to competitive sensitivity, but which – like all company claims about privacy – requires users to take many of its privacy claims on trust. Whether you’re OK with Apple’s approach should therefore depend less on claims like those made by these third-party researchers and more on whether you trust Apple overall when it comes to privacy. Surveys I’ve been involved with have generally shown high levels of trust on that point among Apple users and the population in general.
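
    For readers unfamiliar with the mechanics at issue, here’s a rough sketch of the concept; this is a generic textbook Laplace-noise mechanism rather than Apple’s actual implementation (which operates on hashed sketches of usage data), and the epsilon values are purely illustrative.

```swift
import Foundation

// A minimal sketch of the core idea in differential privacy: the amount of
// random noise added to a reported value is what hides any one individual's
// data. A smaller epsilon means more noise and stronger privacy; the
// researchers' complaint is essentially that Apple's effective epsilon is
// too large. This is a generic mechanism, not Apple's code.
func laplaceNoise(scale: Double) -> Double {
    // The difference of two i.i.d. exponential draws is Laplace-distributed.
    let e1 = -log(Double.random(in: Double.ulpOfOne..<1))
    let e2 = -log(Double.random(in: Double.ulpOfOne..<1))
    return scale * (e1 - e2)
}

func privatize(_ trueValue: Double, sensitivity: Double, epsilon: Double) -> Double {
    // Noise scale is sensitivity/epsilon: a large epsilon shrinks the noise,
    // so the reported value stays close to the truth and reveals more.
    return trueValue + laplaceNoise(scale: sensitivity / epsilon)
}

// Example: reporting whether a user typed a given emoji (a 0-or-1 count).
print(privatize(1.0, sensitivity: 1.0, epsilon: 1.0))   // heavily noised
print(privatize(1.0, sensitivity: 1.0, epsilon: 16.0))  // barely noised
```

    The dispute, in these terms, is over how much noise Apple’s implementation actually adds in practice.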

    via WIRED

    Apple Partners with Chinese Company for iCloud to Comply with New Regulations (Jul 12, 2017)

    This content requires a subscription to Tech Narratives.

    Apple News Reportedly Readying Ad Platform Integration for Publishers (Jul 5, 2017)

    This content requires a subscription to Tech Narratives.

    Apple Announces Developer Preview of Business Chat for iMessage Customer Service (Jun 9, 2017)

    Apple didn’t mention Business Chat explicitly during its WWDC keynote on Monday, but details emerged during the week, and on Friday morning it held a session detailing the service for developers. What we know now is that Business Chat is an equivalent to Facebook’s Messenger for business, allowing businesses to perform customer service tasks through iMessage. It won’t launch publicly until next year, but Apple is announcing a developer preview and all the tools necessary for businesses to create customer interactions using iMessage. The platform is pretty fully featured, offering not just text messaging but payments through Apple Pay, pickers for time slots, products, and the like, and integration with custom apps through the iMessage apps platform (see the sketch below). Between this and the various other changes Apple has announced around iMessage over the past year, it’s evolving iMessage from a mere app into much more of a platform, very much along the lines I outlined in this article I wrote early last year. I think that’s super smart, and one of the best things about it from a customer perspective is that Apple isn’t doing any of this to drive new revenues or push advertising or any of the other things others in this space – notably Facebook – are doing. Apple is very aware of how personal a space iMessage is, and will prevent businesses from ever sending unsolicited messages – every interaction will be initiated by the user, from the first message onwards. The platform looks clever, and giving developers and companies lots of time to implement it should mean that by the time it launches to the public next year, it’s really effective.
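
    As a rough illustration of the building blocks involved, here’s a minimal sketch of the iMessage apps surface that Business Chat plugs into. Business Chat’s own APIs weren’t public at the time of writing, so this simply shows an ordinary iMessage app extension inserting a templated message – the kind of thing a business’s time-slot picker might produce – with illustrative strings throughout.

```swift
import Messages

// A minimal iMessage app extension sketch (not the Business Chat API itself):
// inserting a templated message into the active conversation, e.g. to confirm
// a time slot a customer picked. All strings here are illustrative.
class MessagesViewController: MSMessagesAppViewController {
    func insertTimeSlotConfirmation(slot: String) {
        guard let conversation = activeConversation else { return }

        let layout = MSMessageTemplateLayout()
        layout.caption = "Appointment requested"
        layout.subcaption = slot  // e.g. "Tuesday 10:30 AM"

        let message = MSMessage()
        message.layout = layout
        conversation.insert(message) { error in
            if let error = error {
                print("Failed to insert message: \(error)")
            }
        }
    }
}
```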

    via TechCrunch (see also Apple’s developer page for Business Chat and the WWDC session on Business Chat)

    ★ Apple is Developing a Dedicated AI Chip (May 26, 2017)

    This content requires a subscription to Tech Narratives.

    Google Develops Federated Machine Learning Method Which Keeps Personal Data on Devices (Apr 6, 2017)

    This is an interesting new development from Google, which says it has created a new machine learning method combining cloud and local elements: personal data stays on devices, but the lessons learned from local training are fed back to the cloud, so that many devices operating independently can collectively improve the model they’re all working on. This would be better for user privacy as well as for efficiency and speed, and importantly Google is already testing the approach on a commercial product, its Gboard Android keyboard. It’s unusual to see Google focusing on a device-level approach to machine learning, as it has typically majored on cloud-based approaches, whereas it’s Apple which has been more focused on device-based techniques. Interestingly, some have suggested that Apple’s approach limits its effectiveness in AI and machine learning, whereas this new technique from Google suggests a sort of best of both worlds is possible. That’s not to say Apple will adopt the same approach, and indeed it has favored differential privacy as a solution for using data from individual devices without attributing it to specific users. But this is a counterpoint both to the usual narrative about Google sacrificing privacy to data gathering and AI capabilities, and to the narrative about device-based AI approaches being inherently inferior.
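
    For those curious about the mechanics, here’s a highly simplified sketch of the federated averaging idea Google describes; the linear model, single gradient step, and function names are illustrative stand-ins of my own, not Google’s actual training setup.

```swift
import Foundation

// A toy model: a linear predictor parameterized by a weight vector.
struct Model { var weights: [Double] }

// On-device step: compute an update to the shared model from local data.
// Only this update leaves the device; the raw examples never do.
func localUpdate(global: Model,
                 localData: [(x: [Double], y: Double)],
                 learningRate: Double) -> [Double] {
    var delta = [Double](repeating: 0, count: global.weights.count)
    for (x, y) in localData {
        var prediction = 0.0
        for i in x.indices { prediction += global.weights[i] * x[i] }
        let error = prediction - y
        // Gradient step for squared error, accumulated into the update.
        for i in x.indices { delta[i] -= learningRate * error * x[i] }
    }
    return delta
}

// Server step: average the updates from many devices into the shared model,
// which is then pushed back out for the next round of local training.
func federatedAverage(global: Model, updates: [[Double]]) -> Model {
    var weights = global.weights
    for update in updates {
        for i in weights.indices {
            weights[i] += update[i] / Double(updates.count)
        }
    }
    return Model(weights: weights)
}
```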

    via Google