iPhone Face ID: Privacy issues we should worry about
On September 12th, Apple introduced the iPhone X, pronounced "iPhone ten". This new smartphone is considered the most radical since the launch of the first iPhone in 2007. The iPhone X features an edge-to-edge screen and supports wireless charging. The characteristic home button has disappeared, as has the ability to unlock the iPhone using Touch ID. Instead, the iPhone X can be unlocked with Face ID, a technology enabled by the TrueDepth camera.
Because it works in three dimensions, Face ID is not simply an image recognition system. It does not allow a similar-looking person, a mask, or a picture to unlock the iPhone. The system looks at your entire face and is able to recognize features at an extremely high level of detail. It also updates your facial details over time: Face ID will still recognize a face when its appearance has changed, even if the user changes his or her hairstyle or eyebrows, or undergoes plastic surgery.
For a more extensive, in-depth understanding of the new Apple products, and in particular the new iPhone X and its features, the following video by The Verge is recommended.
The new iPhone X is not even out yet, and there are already some concerns. Does Apple collect these unique so-called faceprints? If so, what can it do with them? What about user privacy?
Apple: “Don’t Worry!”
Apple’s Senior Vice President of Software Engineering, Craig Federighi, is convinced that we should not worry about privacy issues concerning the new unlocking system. In an interview with TechCrunch he went through some of the common doubts concerning Face ID and stated very explicitly:
“We do not gather customer data when you enroll in Face ID, it stays on your device, we do not send it to the cloud for training data.”
So according to Federighi, a faceprint will never leave the iPhone. The data is encrypted and protected by the Secure Enclave as a mathematical model. He states this model cannot be reverse-engineered back into a ‘model of a face’.
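Apple has not published how this mathematical model works, but the property Federighi describes resembles that of a one-way function. As a rough analogy only (a cryptographic hash, not Apple’s actual scheme, and with hypothetical stand-in data), the sketch below shows what "cannot be reverse-engineered" means in practice: the stored value is easy to compute from the input, but there is no practical way back.

```python
import hashlib

# Hypothetical stand-in for biometric measurements; Face ID's real
# representation is undocumented and far more complex.
faceprint = b"example biometric measurements"

# A cryptographic hash is a one-way function: easy to compute forward,
# infeasible to invert.
digest = hashlib.sha256(faceprint).hexdigest()

print(digest)  # the stored representation, a 64-character hex string
# Recovering `faceprint` from `digest` is only possible by guessing
# inputs and comparing outputs, which is computationally infeasible.
```

The analogy is loose (Face ID must tolerate small changes in a face, which a hash does not), but it illustrates why a stored model need not be convertible back into an image of a face.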
Why we should not take our privacy for granted
Future Face ID users should not take their privacy for granted. There are multiple factors users have to take into consideration.
Apple has absolute power
Apple’s worldwide annual revenue in 2016 totaled 215 billion dollars, and in the same year the company’s CEO Tim Cook announced that Apple had sold its billionth iPhone. Without a doubt, Face ID will be a widely used system. With millions of people using it, Apple will have control over a powerful tool.
Although Apple representative Craig Federighi assured that faceprints will never leave the iPhone X, this does not mean it cannot happen at some point in the future. Apple’s promise is no absolute assurance. Besides, there might be other ways faceprint data is used that have not been discussed yet.
Encryption is always breakable
In 2013 Edward Snowden leaked thousands of classified NSA documents to journalists. Publication of the leaked documents revealed the details of global surveillance run by the NSA, the National Security Agency of the United States Department of Defense. The so-called ‘Snowden revelations’ of 2013 raised security and privacy awareness (Ferreira et al. 8). Cybercrime is considered a serious threat, and thanks to Snowden consumers seem to be more aware of how far government agencies may be willing to go with surveillance.
As mentioned before, Apple representative Craig Federighi states that faceprint data is encrypted and cannot be reverse-engineered back into a model of a face. Scholars disagree on whether encryption can be broken or not, resulting in a seemingly endless discussion.
Some think it is not possible to create a perfect encryption method, because there will always be hackers or government agencies finding ways to break it (Assante; Chau 8). In their opinion, perfect security for personal data does not exist.
However, Morten Bay suggests:
“with the emergence of quantum computing, encryption is becoming so advanced that unbreakable, impenetrable encryption may very well become a reality. Or, the time between new encryption technology becoming available and that same technology getting hacked, will become longer and longer.”
According to Bay, hacking advanced encryption is possible, but it may take a considerably long time. With the processing power at hand it may take longer than a lifetime to break the encryption. Encrypted Face ID data will not be hacked anytime soon, but with newer and stronger computers it may become possible at some point in the future.
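A back-of-envelope calculation shows why "longer than a lifetime" is, if anything, an understatement for today’s encryption. The sketch below assumes a 256-bit key (the size used by modern ciphers such as AES-256) and a generously optimistic attacker testing a trillion keys per second; the attack rate is an illustrative assumption, not a measured figure.

```python
# Brute-forcing a 256-bit key: try every possible key until one works.
guesses = 2 ** 256                   # number of possible 256-bit keys
rate = 10 ** 12                      # assumed: one trillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365

years = guesses / rate / seconds_per_year
print(f"{years:.2e} years to exhaust the key space")
```

The result is on the order of 10^57 years, vastly longer than the age of the universe. This is why, as Bay suggests, breaking such encryption depends on fundamentally new technology (such as quantum computing or flaws in the algorithm itself) rather than on raw computing speed.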
Popular companies like Facebook (which also owns Instagram and WhatsApp) and Google can store and sell users’ data for as long as they want. Without knowing what companies use their information for, consumers are unable to delete or alter personal data (Tsesis 105). Once faceprint data leaves the phone, by being uploaded to the cloud or by being hacked, Face ID users will no longer have control over their faceprint. According to Katie Shilton, privacy can best be understood as the ability to understand, choose and control what personal information you share, with whom, and for how long (50). Powerlessness over personal online data conflicts with the right to privacy.
Once Face ID is enabled, the iPhone X can become a technology through which users are spied on without noticing. Information about faces can contain a lot of personal information, such as age, gender, and race, but also emotions. Face ID can recognize these emotions, and this information can, for example, be combined with on-screen content like advertisements and websites. Face ID technology might also ‘read’ the environment of the iPhone’s user, making it aware of the user’s specific living conditions.
Decrypted Face ID data can present some unique privacy issues. Omer Tene and Jules Polonetsky state:
“Protecting privacy becomes harder as information is multiplied and shared ever more widely among multiple parties around the world. As more information regarding individuals’ health, financials, location, electricity use and online activity percolates, concerns arise about profiling, tracking, discrimination, exclusion, government surveillance and loss of control” (251).
This quote by Tene and Polonetsky tells us that individuals’ information is already being widely shared among multiple parties around the world. Highly detailed decrypted Face ID data can be combined with other kinds of available information about the consumer, and this can be dangerous for certain groups of people.
For example, recent studies show that artificial intelligence can now identify a person’s sexual orientation by analysing photos of their face. Relying on about 300,000 images of men and women downloaded from a popular American dating website, which makes its profiles public, researchers created an algorithm to predict sexuality. Highly detailed Face ID data could be combined with this ‘sexuality algorithm’ to infer something about the sexuality of every Face ID user. In dozens of countries homosexuality is punishable by law, and decrypted faceprints could therefore be used by governments to detect homosexuals. If locations are linked to the faceprints and the iPhone X, it should not be too hard to find potential homosexuals.
For now, if we believe Apple representative Craig Federighi, faceprints will be safely encrypted and protected by the Secure Enclave. Face ID is a powerful tool, and therefore it is important that faceprints never leave the iPhone X.
Please note that this blog post was not written to discourage potential buyers from purchasing the new iPhone X and using Face ID. The main argument I tried to make in this blog post concerns privacy awareness. In the wrong hands, our data can be used for (unethical) causes we might not be aware of.
Bay, Morten. “The ethics of unbreakable encryption: Rawlsian privacy and the San Bernardino iPhone.” First Monday, firstmonday.org/ojs/index.php/fm/article/view/7006/5860. Accessed 24 Sept. 2017.
Chau, Jacqui. “Application security – it all starts from here.” Computer Fraud & Security, vol. 2006, no. 6, 2006, pp. 7–9., doi:10.1016/s1361-3723(06)70366-9.
Ferreira, Denzil, et al. “Securacy.” Proceedings of the 8th ACM Conference on Security & Privacy in Wireless and Mobile Networks – WiSec 15, 2015, doi:10.1145/2766498.2766506.
Shilton, Katie. “Four Billion Little Brothers?” Queue, vol. 7, no. 7, Jan. 2009, doi:10.1145/1594204.1597790.
Tene, Omer, and Jules Polonetsky. “Big Data for All: Privacy and User Control in the Age of Analytics.” Northwestern Journal of Technology and Intellectual Property, vol. 11, no. 5, 2013, pp. 238-273.
Tsesis, Alexander. “The Right to be Forgotten and Erasure: Privacy, Data Brokers, and the Indefinite Retention of Data.” Wake Forest Law Review, vol. 2014, no. 48, 2014, pp. 433-484.