There has been a lot of talk about Tim Cook (Chief Executive Officer of Apple Inc.), his data privacy policies at Apple, and his push for better consumer data privacy. For example, there’s this from a June 10, 2022 article by Kif Leswing for CNBC,
Key Points
- Apple CEO Tim Cook said in a letter to Congress that lawmakers should advance privacy legislation that’s currently being debated “as soon as possible.”
- The bill would give consumers protections and rights dealing with how their data is used online, and would require that companies minimize the amount of data they collect on their users.
- Apple has long positioned itself as the most privacy-focused company among its tech peers.
…
Apple has long positioned itself as the most privacy-focused company among its tech peers, and Cook regularly addresses the issue in speeches and meetings. Apple says that its commitment to privacy is a deeply held value by its employees, and often invokes the phrase “privacy is a fundamental human right.”
It’s also strategic for Apple’s hardware business. Legislation that regulates how much data companies collect or how it’s processed plays into Apple’s current privacy features, and could even give Apple a head start against competitors that would need to rebuild their systems to comply with the law.
…
More recently, with rising concerns regarding artificial intelligence (AI), Apple has rushed to assure customers that their data is still private. From a June 10, 2024 article by Kyle Orland for Ars Technica, Note: Links have been removed,
Apple’s AI promise: “Your data is never stored or made accessible to Apple”
And publicly reviewable server code means experts can “verify this privacy promise.”
With most large language models being run on remote, cloud-based server farms, some users have been reluctant to share personally identifiable and/or private data with AI companies. In its WWDC [Apple’s Worldwide Developers Conference] keynote today, Apple stressed that the new “Apple Intelligence” system it’s integrating into its products will use a new “Private Cloud Compute” to ensure any data processed on its cloud servers is protected in a transparent and verifiable way.
“You should not have to hand over all the details of your life to be warehoused and analyzed in someone’s AI cloud,” Apple Senior VP of Software Engineering Craig Federighi said.
…
Part of what Apple calls “a brand new standard for privacy and AI” is achieved through on-device processing. Federighi said “many” of Apple’s generative AI models can run entirely on a device powered by an A17+ or M-series chip, eliminating the risk of sending your personal data to a remote server.
When a bigger, cloud-based model is needed to fulfill a generative AI request, though, Federighi stressed that it will “run on servers we’ve created especially using Apple silicon,” which allows for the use of security tools built into the Swift programming language. The Apple Intelligence system “sends only the data that’s relevant to completing your task” to those servers, Federighi said, rather than giving blanket access to the entirety of the contextual information the device has access to.
…
But you don’t just have to trust Apple on this score, Federighi claimed. That’s because the server code used by Private Cloud Compute will be publicly accessible, meaning that “independent experts can inspect the code that runs on these servers to verify this privacy promise.” The entire system has been set up cryptographically so that Apple devices “will refuse to talk to a server unless its software has been publicly logged for inspection.”
While the keynote speech was light on details [emphasis mine] for the moment, the focus on privacy during the presentation shows that Apple is at least prioritizing security concerns in its messaging [emphasis mine] as it wades into the generative AI space for the first time. We’ll see what security experts have to say [emphasis mine] when these servers and their code are made publicly available in the near future.
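The most technically interesting claim in that excerpt is the last one: that devices will cryptographically refuse to talk to a server whose software hasn’t been publicly logged. In essence, that is a transparency-log check combined with data minimization. Here’s a minimal Swift sketch of the general idea, under my own assumptions; every name in it (ServerAttestation, TransparencyLog, sendTaskData) is hypothetical and illustrative, not Apple’s actual Private Cloud Compute interface.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the attestation idea described above. The client
// computes a measurement (SHA-256 digest) of the server's software image
// and refuses to send anything unless that measurement appears in a
// public, append-only transparency log. None of these names are Apple's.

struct ServerAttestation {
    let softwareImage: Data  // the server's signed software bundle

    // Hex-encoded SHA-256 digest of the software image
    var measurement: String {
        SHA256.hash(data: softwareImage)
            .map { String(format: "%02x", $0) }
            .joined()
    }
}

struct TransparencyLog {
    // In a real deployment this would be fetched from, and verified
    // against, a publicly auditable log; here it is just a set of
    // known-good measurements.
    let publishedMeasurements: Set<String>

    func contains(_ measurement: String) -> Bool {
        publishedMeasurements.contains(measurement)
    }
}

enum AttestationError: Error {
    case softwareNotPubliclyLogged
}

func sendTaskData(_ payload: Data,
                  to server: ServerAttestation,
                  against log: TransparencyLog) throws {
    // Refuse to talk to the server unless its software has been logged.
    guard log.contains(server.measurement) else {
        throw AttestationError.softwareNotPubliclyLogged
    }
    // ...proceed with an encrypted request carrying only the
    // task-relevant payload, never the device's full context.
    print("Measurement verified; sending \(payload.count) bytes.")
}
```

The design point the sketch tries to capture is that the client, not the server, enforces the policy: if the operator swaps in unlogged software, the measurement changes and the request never leaves the device. Whether Apple’s production system actually holds to that property is exactly what Orland says outside experts will have to verify.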
Orland’s caution/suspicion would seem warranted in light of some recent research from scientists in Finland. From an April 3, 2024 Aalto University press release (also on EurekAlert), Note: A link has been removed,
‘Privacy. That’s Apple,’ the slogan proclaims. New research from Aalto University begs to differ.
Study after study has shown how voluntary third-party apps erode people’s privacy. Now, for the first time, researchers at Aalto University have investigated the privacy settings of Apple’s default apps; the ones that are pretty much unavoidable on a new device, be it a computer, tablet or mobile phone. The researchers will present their findings in mid-May at the prestigious CHI conference [ACM CHI Conference on Human Factors in Computing Systems, May 11, 2024 – May 16, 2024 in Honolulu, Hawaii], and the peer-reviewed research paper is already available online.
‘We focused on apps that are an integral part of the platform and ecosystem. These apps are glued to the platform, and getting rid of them is virtually impossible,’ says Associate Professor Janne Lindqvist, head of the computer science department at Aalto.
The researchers studied eight apps: Safari, Siri, Family Sharing, iMessage, FaceTime, Location Services, Find My and Touch ID. They collected all publicly available privacy-related information on these apps, from technical documentation to privacy policies and user manuals.
The fragility of the privacy protections surprised even the researchers. [emphasis mine]
‘Due to the way the user interface is designed, users don’t know what is going on. For example, the user is given the option to enable or not enable Siri, Apple’s virtual assistant. But enabling only refers to whether you use Siri’s voice control. Siri collects data in the background from other apps you use, regardless of your choice, unless you understand how to go into the settings and specifically change that,’ says Lindqvist.
Participants weren’t able to stop data sharing in any of the apps
In practice, protecting privacy on an Apple device requires persistent and expert clicking on each app individually. Apple’s help falls short.
‘The online instructions for restricting data access are very complex and confusing, and the steps required are scattered in different places. There’s no clear direction on whether to go to the app settings, the central settings – or even both,’ says Amel Bourdoucen, a doctoral researcher at Aalto.
In addition, the instructions didn’t list all the necessary steps or explain how collected data is processed.
The researchers also demonstrated these problems experimentally. They interviewed users and asked them to try changing the settings.
‘It turned out that the participants weren’t able to prevent any of the apps from sharing their data with other applications or the service provider,’ Bourdoucen says.
Finding and adjusting privacy settings also took a lot of time. ‘When making adjustments, users don’t get feedback on whether they’ve succeeded. They then get lost along the way, go backwards in the process and scroll randomly, not knowing if they’ve done enough,’ Bourdoucen says.
In the end, Bourdoucen explains, the participants were able to take one or two steps in the right direction, but none succeeded in following the whole procedure to protect their privacy.
Running out of options
If preventing data sharing is difficult, what does Apple do with all that data? [emphasis mine]
It’s not possible to be sure based on public documents, but Lindqvist says it’s possible to conclude that the data will be used to train the artificial intelligence system behind Siri and to provide personalised user experiences, among other things. [emphasis mine]
Many users are used to seamless multi-device interaction, which makes it difficult to move back to a time of more limited data sharing. However, Apple could inform users much more clearly than it does today, says Lindqvist. The study lists a number of detailed suggestions to clarify privacy settings and improve guidelines.
For individual apps, Lindqvist says that the problem can be solved to some extent by opting for a third-party service. For example, some participants in the study had switched from Safari to Firefox.
Lindqvist can’t comment directly on how Google’s Android works in similar respects [emphasis mine], as no one has yet done a similar mapping of its apps. But past research on third-party apps does not suggest that Google is any more privacy-conscious than Apple [emphasis mine].
So what can be learned from all this – are users ultimately facing an almost impossible task?
‘Unfortunately, that’s one lesson,’ says Lindqvist.
I have found two copies of the researchers’ paper. There’s a PDF version on Aalto University’s website that bears this caution,
This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail.
Here’s a link to and a citation for the official version of the paper,
Privacy of Default Apps in Apple’s Mobile Ecosystem by Amel Bourdoucen and Janne Lindqvist. CHI ’24: Proceedings of the CHI Conference on Human Factors in Computing Systems, May 2024, Article No.: 786, Pages 1–32. DOI: https://doi.org/10.1145/3613904.3642831 Published: 11 May 2024
This paper is open access.