iPhone Face ID: Privacy issues we should worry about

On: September 25, 2017
By Amber de Zeeuw


On September 12th, 2017, the new iPhone X was introduced by CEO Tim Cook during an Apple event at the Steve Jobs Theater at Apple Park. Due to its features, this smartphone is considered the most radical since the launch of the first iPhone in 2007. The iPhone X is the first iPhone with an edge-to-edge screen, and it also supports wireless charging. Presumably the most noticeable change in design is the disappearance of the characteristic home button. To unlock the new iPhone X, Face ID technology is used instead of Touch ID, enabled by the TrueDepth camera.

According to Apple, the Face ID technology is not simply to be understood as an image recognition system. Apple states that it will not allow a similar-looking person, individuals wearing masks, or two-dimensional pictures to unlock the iPhone, as Face ID is built on technology that works in three dimensions. This technology allows for the recognition of facial features at an extremely high level of detail. As stated by Apple, the technology takes into account the changing nature of facial details. Face ID is therefore able to keep recognising faces as time passes and when users, for example, change their hairstyles or eyebrows, or undergo plastic surgery.

The new iPhone X is not for sale yet, but there are already concerns about the Face ID technology. Does Apple collect these unique so-called faceprints? If so, what can it do with them? What about user privacy?

Apple: “Don’t Worry!”

Apple’s Senior Vice President of Software Engineering, Craig Federighi, is convinced that concerns about privacy issues regarding the new unlocking system are unnecessary. In a recent interview with TechCrunch he went through some of the common doubts concerning Face ID technology and explicitly stated:

“We do not gather customer data when you enroll in Face ID, it stays on your device, we do not send it to the cloud for training data.”

So according to Federighi, a faceprint will never leave the iPhone. He states that the data is encrypted and protected by the Secure Enclave as a mathematical model, which therefore cannot be reverse-engineered back into a ‘model of a face’. However, future Face ID users should not take their privacy for granted based on this interview alone.
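The idea of a stored representation that cannot be turned back into a face is, in spirit, a one-way function. Apple has not published how Face ID’s mathematical model works, and real biometric matching must tolerate variation that a simple hash cannot, so the following Python sketch is only an analogy: it shows what “stored on the device, but not reversible” means in the simplest possible case.

```python
import hashlib

# Hypothetical illustration: store only a one-way SHA-256 digest of the
# enrolled data, never the raw scan. The digest cannot be inverted back
# into the original input.
def enroll(face_features: bytes) -> bytes:
    return hashlib.sha256(face_features).digest()

def matches(stored_digest: bytes, candidate: bytes) -> bool:
    # Re-derive the digest from the candidate and compare.
    return hashlib.sha256(candidate).digest() == stored_digest

template = enroll(b"example-face-scan")
print(matches(template, b"example-face-scan"))  # True
print(matches(template, b"someone-else"))       # False
```

Again, this is not how Face ID actually matches faces; it only illustrates why keeping a one-way, on-device representation is a stronger privacy posture than storing images.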

Why we should not take our privacy for granted

Apple has absolute power

Apple’s worldwide annual revenue in 2016 totaled 215 billion dollars, and in the same year CEO Tim Cook announced that Apple had sold its billionth iPhone. Without a doubt, Face ID technology will be widely used in the future. With potentially millions of people using it, Apple will have full control over the data behind a powerful tool.

Although Apple representative Craig Federighi assured that faceprints will never leave the iPhone X, this does not mean it cannot happen at some point in the future: Apple’s current promise is no absolute assurance. Moreover, there may be ways of using faceprint data that have not been discussed yet.

Encryption is always breakable

In 2013 Edward Snowden leaked thousands of classified NSA documents to journalists. The published documents revealed details about global surveillance run by the NSA, the National Security Agency of the United States Department of Defense. These so-called Snowden revelations raised security and privacy awareness (Ferreira et al. 8). Cybercrime is considered a serious threat, and thanks to Snowden consumers seem more aware of how far government agencies are willing to go with surveillance.

As mentioned before, Apple representative Craig Federighi states that faceprint data is encrypted and cannot be reverse-engineered back into a model of a face. However, scholars disagree on whether encryption can be broken or not. Some think it is not possible to create a perfect encryption method, as there will always be hackers or government agencies finding ways to break it (Assante; Chau 8). In their view, perfect security for personal data does not exist.

However, Morten Bay suggests:

“with the emergence of quantum computing, encryption is becoming so advanced that unbreakable, impenetrable encryption may very well become a reality. Or, the time between new encryption technology becoming available and that same technology getting hacked, will become longer and longer.”

According to Bay, hacking advanced encryption is possible, but it may take a very long time: with the processing power currently at hand, it may take longer than a lifetime to break. Encrypted Face ID data is therefore unlikely to be hacked any time soon, but with newer and stronger computers it may well become possible in the future.
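To make “longer than a lifetime” concrete, here is a back-of-the-envelope calculation. The guess rate of a trillion attempts per second is an assumed, deliberately generous figure, not a measured one, and brute force is only the crudest attack; real breaks usually exploit implementation flaws instead. Still, the arithmetic shows the scale Bay is pointing at for a 128-bit key.

```python
# Rough estimate: expected time to brute-force a 128-bit key.
keyspace = 2 ** 128                 # number of possible 128-bit keys
guesses_per_second = 10 ** 12       # assumption: a trillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365

# On average an attacker searches half the keyspace before succeeding.
expected_years = keyspace / (2 * guesses_per_second * seconds_per_year)
print(f"{expected_years:.1e} years")  # on the order of 10**18 years
```

Even with these generous assumptions, the expected search time dwarfs the age of the universe, which is why the realistic worry is not brute force but future hardware, new algorithms, or leaks of the decrypted data itself.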

So What?

Information about faces can contain a lot of personal information

Popular companies like Facebook (which also owns Instagram and WhatsApp) and Google can store and sell users’ data for as long as they want. Without knowing what companies use their information for, consumers are unable to delete or alter personal data (Tsesis 105). Once faceprint data leaves the phone, whether uploaded to the cloud or stolen by hackers, Face ID users no longer have control over their faceprint. According to Katie Shilton, privacy can best be understood as the ability to understand, choose and control what personal information you want to share, with whom and for how long (50). Powerlessness over personal online data conflicts with the right to privacy.

Once the Face ID system is enabled, the iPhone X can become a technology through which users are spied on without noticing. Information about faces can reveal a lot of personal information, such as age, gender, race, and even emotions. Face ID could recognise these emotions, and this information could, for example, be combined with on-screen content like advertisements and websites. Face ID technology might also ‘read’ the environment of the iPhone’s user and thus be aware of the user’s specific living conditions.

Decrypted Face ID data can present some unique privacy issues. Omer Tene and Jules Polonetsky state:

“Protecting privacy becomes harder as information is multiplied and shared ever more widely among multiple parties around the world. As more information regarding individuals’ health, financials, location, electricity use and online activity percolates, concerns arise about profiling, tracking, discrimination, exclusion, government surveillance and loss of control” (251).

This quote by Tene and Polonetsky tells us that individuals’ information is already being widely shared among multiple parties around the world. Highly detailed decrypted Face ID data can be combined with other kinds of available information about the consumer and this can be dangerous for certain groups of people.

For example, recent studies show that artificial intelligence can now identify a person’s sexual orientation by analysing photos of their face. Relying on about 300,000 images of men and women downloaded from a popular American dating website that makes its profiles public, researchers created an algorithm to predict sexuality. Highly detailed Face ID data could be combined with this ‘sexuality algorithm’ to infer something about the sexual orientation of Face ID users. In dozens of countries homosexuality is punishable by law, so decrypted faceprints could be used by governments to detect homosexuals. If locations are linked to the faceprints and the iPhone X, it would not be too hard to find potential homosexuals.

For now, if we believe Apple representative Craig Federighi, faceprints will be safely encrypted and protected by the Secure Enclave. Face ID is a powerful tool, and therefore it is important that faceprints never leave the iPhone X.

Please note that this blog post was not written to discourage potential buyers from purchasing the new iPhone X and using Face ID. The main argument I have tried to make is about privacy awareness: in the wrong hands, our data can be used for (unethical) causes we might not be aware of.


Bay, Morten. “The ethics of unbreakable encryption: Rawlsian privacy and the San Bernardino iPhone.” First Monday, firstmonday.org/ojs/index.php/fm/article/view/7006/5860. Accessed 24 Sept. 2017.

Chau, Jacqui. “Application security – it all starts from here.” Computer Fraud & Security, vol. 2006, no. 6, 2006, pp. 7–9., doi:10.1016/s1361-3723(06)70366-9.

Ferreira, Denzil, et al. “Securacy.” Proceedings of the 8th ACM Conference on Security & Privacy in Wireless and Mobile Networks – WiSec 15, 2015, doi:10.1145/2766498.2766506.

Shilton, Katie. “Four Billion Little Brothers?” Queue, vol. 7, no. 7, Jan. 2009, doi:10.1145/1594204.1597790.

Tene, Omer, and Jules Polonetsky. “Big Data for All: Privacy and User Control in the Age of Analytics.” Northwestern Journal of Technology and Intellectual Property, vol. 11, no. 5, 2013, pp. 238–273.

Tsesis, Alexander. “The Right to be Forgotten and Erasure: Privacy, Data Brokers, and the Indefinite Retention of Data.” Wake Forest Law Review, vol. 2014, no. 48, 2014, pp. 433–484.
