Narrative: Apple Wins on Privacy
Each narrative page (like this one) opens with an introduction describing and evaluating the narrative, followed by all the posts on the site tagged with that narrative. Scroll down beyond the introduction to see the posts.
Narrative: Apple Wins on Privacy (Jan 24, 2017)
Wired reports on a third-party study which claims that Apple’s approach to differential privacy – the method Apple says it uses to obfuscate individuals’ data when uploading it to the cloud – is inadequate to really protect those users’ privacy. The researchers dug into Apple’s code and on that basis make claims about how much noise Apple adds to the data, the single biggest factor in determining how well an individual’s private information is obscured. They conclude that Apple’s implementation adds far too little noise to preserve privacy. Apple has pushed back, saying the study assumed both that it treats all data the same way and that aggregating data across multiple categories would reveal more about users than looking at single data points, assumptions Apple disputes.
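To see why the amount of noise is the crux of the dispute, here’s a purely illustrative sketch of the Laplace mechanism, the textbook differential privacy technique. Apple hasn’t published its implementation, so this isn’t Apple’s code; the function names are my own. The key idea is that the noise scale is inversely proportional to the privacy parameter epsilon, so a larger epsilon (the value the researchers say Apple effectively uses) means less noise and weaker privacy.

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0):
    """Return a differentially private version of a count.

    Smaller epsilon -> larger noise scale -> stronger privacy. The
    researchers' critique is essentially that Apple's effective epsilon
    is large enough that very little noise gets added.
    """
    scale = sensitivity / epsilon
    return true_count + laplace_noise(scale)
```

With a small epsilon like 0.5, a true count of 100 might come back anywhere from roughly 95 to 105; with a large epsilon like 50, the reported value is almost exactly the true one, which is the kind of weak obfuscation the study alleges.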
One of the most telling lines in the article has one of the researchers saying that the differential privacy approach is based on the assumption that companies will always behave badly, something Apple would clearly dispute too: it prides itself on protecting users’ privacy, generally avoids business models which require it to collate data about users to target advertising, and requires users to opt in to this data gathering in the first place. As such, some of the researchers’ assumptions may be reasonable in general but less applicable to Apple than to other companies. The fundamental issue here, though, is that Apple isn’t transparent about its approach, something I would guess it attributes to competitive sensitivity, but which, like all company claims about privacy, requires users to take its claims on trust. Whether you’re OK with Apple’s approach should therefore depend less on claims like those made by these third-party researchers and more on whether you trust Apple overall when it comes to privacy. Surveys I’ve been involved with have generally shown high levels of trust on that point among Apple users and the population in general.
Apple didn’t mention Business Chat explicitly during its WWDC keynote on Monday, but details have emerged during the week, and on Friday morning it held a session detailing the service for developers. What we know now is that Business Chat is an equivalent to Facebook Messenger’s business offering, allowing businesses to handle customer service tasks through iMessage. It won’t launch publicly until next year, but Apple is announcing a developer preview and all the tools businesses need to build customer interactions in iMessage. The platform is pretty fully featured, offering not just text messaging but payments through Apple Pay; pickers for time slots, products, and the like; and integration with custom apps through the iMessage apps platform. Between this and the various other changes Apple has announced around iMessage over the past year, it’s evolving iMessage from a mere app into much more of a platform, very much along the lines I outlined in this article I wrote early last year. I think that’s super smart, and one of the best things about it from a customer perspective is that Apple isn’t doing any of this to drive new revenues or push advertising, as others in this space, notably Facebook, are doing. Apple is very aware of how personal a space iMessage is, and will prevent businesses from ever sending unsolicited messages: every interaction will be initiated by the user, from the first onwards. The platform looks clever, and giving developers and companies plenty of lead time should mean that by the time it launches publicly next year, it’s really effective.
★ Apple is Developing a Dedicated AI Chip (May 26, 2017)
Google Develops Federated Machine Learning Method Which Keeps Personal Data on Devices (Apr 6, 2017)
This is an interesting new development from Google, which says it has created a new method for machine learning which combines cloud and local elements in a way which keeps personal data on devices but feeds the things learned from training back to the cloud, such that many devices operating independently can collectively improve the model they’re all working on. This would be better for user privacy as well as efficiency and speed, which would be great for users, and importantly Google is already testing this approach on a commercial product, its Gboard Android keyboard. It’s unusual to see Google focusing on a device-level approach to machine learning, as it has typically majored on cloud-based approaches, while Apple has focused more on device-based techniques. Interestingly, some have suggested that Apple’s approach limits its effectiveness in AI and machine learning, whereas this new technique from Google suggests a sort of best of both worlds is possible. That’s not to say Apple will adopt the same approach, and indeed it has favored differential privacy as a solution for using data from individual devices without attributing it to specific users. But this is a counterpoint both to the usual narrative about Google sacrificing privacy for data gathering and AI capabilities and to the narrative that device-based AI approaches are inherently inferior.
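The mechanics Google describes can be sketched in miniature. This is a toy illustration of the federated averaging idea, not Google’s actual code: each client trains on its own private data locally, and only the updated model parameter (here a single weight for y = w·x) and a sample count ever reach the server, which averages the updates. All names and the one-parameter model are my own illustrative choices.

```python
# Toy sketch of federated averaging for a one-parameter linear model
# (y = w * x). Raw data never leaves the client; only the locally
# trained weight and the client's sample count go to the server.

def local_update(w, data, lr=0.01, epochs=5):
    """Run a few steps of gradient descent on one client's private data."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, client_datasets):
    """One round: clients train locally; the server averages the
    resulting weights, weighted by each client's sample count."""
    updates = [(local_update(w_global, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Three clients whose private data all follow y = 3x
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0)],
    [(1.5, 4.5), (2.5, 7.5), (4.0, 12.0)],
]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
# After enough rounds, w converges to 3.0 without the server ever
# seeing any client's (x, y) pairs.
```

The design point worth noticing is that the server only ever handles aggregated model parameters, which is what makes the privacy claim plausible; Google’s production version adds further protections on top of this basic scheme.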
Apple hires Jonathan Zdziarski, an active forensics consultant & security researcher in the iOS community – 9to5Mac (Mar 14, 2017)
Zdziarski was in the news a lot a year ago, when Apple was fighting the FBI over the iPhone used by the San Bernardino shooter, because he was frequently quoted and cited as an expert who backed Apple’s stance. As such, it’s not altogether surprising that he should end up at Apple – he’s been both one of its staunchest supporters around some security and privacy issues and someone who has discovered vulnerabilities in its code. On the one hand, that makes him a useful person to have inside the company – this hire feels a lot like Apple’s hire of Anand Shimpi, another prominent outside expert who was brought inside – but Apple will lose the benefit of having a vocal independent advocate on these issues. It’s also interesting to note Zdziarski’s comments about his hiring and why he’s joining Apple – he cites its privacy stance, which is of course closely tied to security concerns, as a strong motivating factor.
Hacker Steals 900 GB of Cellebrite Data – Motherboard (Jan 12, 2017)
Cellebrite was in the news about nine months ago because Bloomberg reported it was the security firm the FBI used to hack the San Bernardino shooter’s iPhone after Apple refused to help, though the Washington Post contradicted those reports. Whether or not its technology was used in that particular case, this is exactly the sort of work Cellebrite regularly does for US and other government agencies, and it appears that it has itself now been hacked. It’s not clear that the hack goes beyond some user data, though there’s a vague reference to technical data in the article, but this sort of thing reinforces the sense that tools for breaking encryption or other security technologies, even those built for apparently noble reasons, can never be deemed 100% safe from being stolen themselves. That, of course, was one of several arguments Apple made in the FBI case.
Given that Apple argued precisely that security backdoors almost always make their way into the hands of evildoers, this news is great validation of Apple’s refusal to cooperate with the FBI early last year, even if it’s a private firm rather than the government that has been hacked in this case; indeed, that seems to have been the hacker’s motivation. It’s also worrying from an Apple perspective that a provider like Cellebrite should have had such lax security that a hacker could breach its systems and access these tools, assuming the claims made here are in fact legitimate.