C'mon Apple, you as well!? – The iOS 12 Update will calculate your trust score

By: Luisa Zap
On: September 24, 2018

September has been an important month for Apple. The new iPhones have launched, and the new iOS update has managed to overshadow much of the excitement among Apple fans. One month ago, Facebook demonstrated a new way of tracking people by assigning users trust ratings, and Apple is now following in its footsteps. Since September 17, everyone with an iPhone has been able to install the latest update, which brings several privacy improvements along with some controversial new changes (Gibbs).
How can users' privacy be protected in a digital world where the infrastructure and storage of information become invisible and the term 'Reality Mining' gains importance? These and many other questions occupy politicians, activists, scientists, and users on a daily basis when they assess Big Data. With every new update, app, and invention, privacy issues have to be addressed all over again.

The Launch of Trust Scores

For many years, talking about Apple was never a conversation about Big Data. The company was known for focusing on design and hardware and for being secretive about its data infrastructure. That, however, did not last long (Marr 285). Nowadays Apple competes with tech giants such as Google and Microsoft through software like Siri, which is primarily powered by Big Data, and collecting data to build algorithms that personalize services and recognize unusual behavior has become crucial to that business (Marr 256). Accordingly, once the iOS 12 update is installed, the device sends information about the number of calls and emails to Apple, which then decides, based on an encrypted score, whether it can trust you with transactions or other significant activities on your phone.

Figure 1: Photo by Bernard Hermant

“To help identify and prevent fraud, information about how you use your device, including the approximate number of phone calls or emails you send and receive, will be used to compute a device trust score when you attempt a purchase. The submissions are designed so Apple cannot learn the real values on your device. The scores are stored for a fixed time on our servers” (Apple).

On the one hand, the Californian tech giant argues that it uses the data exclusively to recognize fraudulent behavior when the user makes payments, and it claims to store the information only for a limited period of time (Keach). On the other hand, Apple has never announced how long the personal data will be stored or how your trust score is actually derived from the number of calls you make and receive. Apple also gives contradictory information. The company says the data it receives will not contain any content of emails or phone calls, yet Apple TVs received the anti-fraud update as well, and TVs do not make phone calls, nor would a normal user write and receive emails on one (Cuthbertson). So what kind of data are they actually monitoring?
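Apple has not published how the score is computed, so any reconstruction is guesswork. Still, the claim that the submissions are "designed so Apple cannot learn the real values on your device" hints at counts being perturbed locally before they leave the phone. The short Swift sketch below is purely hypothetical: the noise mechanism, the weighting, and the names noisyCount and deviceTrustScore are invented for illustration and are not Apple's implementation.

```swift
import Foundation

// Purely hypothetical sketch: Apple has not disclosed how the iOS 12
// "device trust score" works. This only illustrates how a usage count
// could be perturbed on-device so the server never sees the real value.

/// Adds Laplace-distributed noise to a raw count (a common local
/// differential-privacy technique) before it leaves the device.
func noisyCount(_ rawCount: Int, scale: Double = 5.0) -> Double {
    let u = Double.random(in: -0.499...0.499)                      // uniform sample
    let noise = -scale * (u < 0 ? -1.0 : 1.0) * log(1 - 2 * abs(u))
    return Double(rawCount) + noise
}

/// A toy "trust score": steadier call and mail activity nudges the score
/// up. The weighting and the 0-100 range are invented for illustration.
func deviceTrustScore(approxCalls: Double, approxEmails: Double) -> Double {
    let activity = log(1 + max(approxCalls, 0)) + log(1 + max(approxEmails, 0))
    return min(100, activity * 10)
}

// The device would only ever submit noisy aggregates, never the raw counts.
let score = deviceTrustScore(approxCalls: noisyCount(42),
                             approxEmails: noisyCount(120))
print("Submitted trust score: \(score)")
```

Even under a scheme like this, the question above stands: noisy or not, call and email frequencies are still behavioral data leaving the device.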

Trying to Do the Right Thing

Apple CEO Tim Cook introduced the user-rating system quietly and with little fanfare, even though he has been fully committed to transparency in the past (Palmer). He says the trust score will be protected from being sold to third parties. Nevertheless, the company already collects a tremendous amount of personal data through iCloud and iTunes, which has already tested the boundaries of human rights. The big adjustment now is the additional data about offline behavior, such as phone calls, that will determine whether the payments you make are legitimate. The Apple product and the Apple data warehouse are constantly exchanging knowledge.
The invention of mobile phones induced a flood of data sets, which opened up a discourse on Reality Mining. “Reality mining is the collection and analysis of machine-sensed environmental data pertaining to human social behavior, with the goal of identifying predictable patterns of behavior” (Wikipedia).
Nathan Eagle, assisted by the MIT Human Dynamics Lab, was a pioneer in investigating Reality Mining by tracking people through their personal mobile phones; the outcome was the first mobile data set of personal behavior. According to him, the rapid evolution of smartphones provides ever more opportunities to “collect a much larger dataset on human behavior” (255). Where earlier Reality Mining studies were based on “app installation behavior” (Frey et al.), Apple is now doing Reality Mining based on call and email behavior, in addition to its previous data collection sources.
In another publication, Eagle and Greene held on to the idea that one of the “applications of Reality Mining is tracking and predicting disease and epidemics” (143), a positive approach at first glance. This idea can also be applied to Apple's intentions: with an update that communicates back to the warehouse, untrustworthy transactions can be predicted and 'diseases' can be fixed. However, those predictions might be wrong, since human nature is very complex and abstract data can rarely guide one to the reality on the ground (Eagle and Greene 144).
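To make the epidemic analogy concrete: in its simplest form, "identifying predictable patterns of behavior" means flagging a reading that falls far outside a device's own history. The Swift sketch below illustrates only that general idea, not Apple's method; the feature (weekly call counts), the z-score test, and the threshold are assumptions made up for the example.

```swift
import Foundation

// Minimal illustration of pattern-based anomaly detection: flag an
// observation that deviates strongly from the device's own baseline.
// The feature and the threshold are invented examples, not Apple's method.

/// Returns true when `latest` lies more than `threshold` standard
/// deviations away from the mean of the historical values.
func looksAnomalous(history: [Double], latest: Double, threshold: Double = 3.0) -> Bool {
    guard history.count > 1 else { return false }
    let mean = history.reduce(0, +) / Double(history.count)
    let variance = history.map { ($0 - mean) * ($0 - mean) }
                          .reduce(0, +) / Double(history.count - 1)
    let stdDev = variance.squareRoot()
    guard stdDev > 0 else { return latest != mean }
    return abs(latest - mean) / stdDev > threshold
}

// Example: weekly call counts for one device, then a sudden spike.
let weeklyCalls: [Double] = [38, 42, 35, 40, 44, 39]
print(looksAnomalous(history: weeklyCalls, latest: 41))   // false - a normal week
print(looksAnomalous(history: weeklyCalls, latest: 400))  // true  - flag for review
```

The sketch also makes the caveat tangible: a sudden spike in calls could just as easily be a family emergency as fraud, which is exactly why abstract data rarely maps cleanly onto the reality on the ground.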

Figure 2: Photo by Franki Chamaki

We Trust, They Do Not

Some might compare the new update to a shift towards a Social Credit System as it is branded in China and portrayed in one of the big Netflix shows (Hopkins). One important difference, though, is that in China the user decides what information to type into the app manually. As far as consumers know, Apple is currently taking a massive amount of sensitive information for the 'trust score' with little transparency toward the user. There is no place on the device where you can look up the score to check how high or low it might be. In addition, not knowing how long the information is kept, or why tvOS devices have a 'trust score' as well, makes iPhone owners listen more attentively. It is clear in principle that, in order to protect and improve people's safety, organizations will collect data about users to detect scammers and fraudsters. But to identify those 'bad guys', they will always need to keep an eye on the 'behaving people'.

Figure 3: Photo by Henrik Dønnestad

“People have pretty varying opinions of how or when their data can be shared and used by companies. Most Terms of Service agreements, however, specify the use and collection of data, and by agreeing to them, most of us agree to the ‘If I make information public, it is okay for anyone to use that data’ statement” (van Rijmenam).

But even if the use and collection of data are not specified, we agree to the terms, don't we?


References:

Apple. “iTunes Store & Privacy.” Apple Support, 17 Sept. 2018, https://support.apple.com/en-us/HT208477.

Bamberger, Kenneth A., and Deirdre K. Mulligan. Privacy on the Ground: Driving Corporate Behavior in the United States and Europe. The MIT Press, 2015.

Cuthbertson, Anthony. “Apple Is Quietly Giving People Black Mirror-Style ‘Trust Scores’ Using Their iPhone Data.” The Independent, 20 Sept. 2018, https://www.independent.co.uk/life-style/gadgets-and-tech/news/apple-trust-score-iphone-data-black-mirror-email-phone-fraud-a8546051.html.

Eagle, Nathan, and Kate Greene. Reality Mining: Using Big Data to Engineer a Better World. The MIT Press, 2014.

Eagle, Nathan, and Alex (Sandy) Pentland. “Reality Mining: Sensing Complex Social Systems.” Personal and Ubiquitous Computing, vol. 10, no. 4, May 2006, pp. 255–68. Crossref, doi:10.1007/s00779-005-0046-3.

Frey, Remo Manuel, et al. Reality-Mining with Smartphones: Detecting and Predicting Life Events Based on App Installation Behavior. p. 10.

Gibbs, Samuel. “iOS 12: Everything You Need to Know About New iPhone Features.” The Guardian, 5 June 2018, https://www.theguardian.com/technology/2018/jun/05/ios-12-iphone-new-operating-system-everything-you-need-to-know-notifications-privacy-emoji.

Hopkins, Matt. “Apple’s Fighting Fraud By Giving You A ‘Trust Score’ Based On Calls & Emails.” Pedestrian TV, 21 Sept. 2018, https://www.pedestrian.tv/tech/apple-assigning-trust-scores-black-mirror/.

Keach, Sean. “Apple Gives You a TRUST Rating – and It’s Based on Your Phone Call and Email Habits.” The Sun, 20 Sept. 2018, https://www.thesun.co.uk/tech/7303020/apple-trust-score-phone-calls-emails/.

Marr, Bernard. Big Data in Practice: How 45 Successful Companies Used Big Data Analytics to Deliver Extraordinary Results. Wiley, 2016.

Palmer, Annie. “Apple Will Track Calls and Emails in iOS 12 to Fight Fraud.” Mail Online, 19 Sept. 2018, https://www.dailymail.co.uk/sciencetech/article-6186097/Apples-iOS-12-update-gives-users-trust-score-tracking-calls-emails.html.

“Reality Mining.” Wikipedia, 10 Sept. 2018, https://en.wikipedia.org/w/index.php?title=Reality_mining&oldid=858879336.

van Rijmenam, Mark. “Big Data Ethics: How Does It Affect Your Privacy?” Datafloq, https://datafloq.com/read/big-data-ethics-affect-your-privacy/241. Accessed 22 Sept. 2018.


Figure 1: Unsplash. Sign, Round, Red and Ring HD Photo by Bernard Hermant (@bernardhermant) on Unsplash. https://unsplash.com/photos/OLLtavHHBKg. Accessed 23 Sept. 2018.

Figure 2: Unsplash. Data Has A Better IDEA Photo by Franki Chamaki (@franki) on Unsplash. https://unsplash.com/photos/1K6IQsQbizI. Accessed 23 Sept. 2018.

Figure 3: Unsplash. Neon Pop Photo by Henrik Dønnestad (@spaceboy) on Unsplash. https://unsplash.com/photos/4UwRjnnWt90. Accessed 23 Sept. 2018.
