Social Media Discrimination: TikTok as a platform that covertly hid disabled people’s content.
What is TikTok?
In the 21st century, we have witnessed the birth and rise of a diverse range of new social media platforms, all of which aim to bring people together and let them share themselves, their views, and their dreams. The latest addition to this group of networks is the social media giant TikTok, which gained success quickly and now has over one billion users worldwide. TikTok was launched in 2016 by the Chinese company ByteDance and merged with the lip-synching application Musical.ly in 2017 to become the platform it is now (Anderson). It has since attracted users who publish short videos, lasting from three to 60 seconds, in which they most often dance, sing, or share stories.
What are users able to find on their main feeds?
TikTok does not present content primarily based on the accounts a user follows; instead, the feed is heavily controlled by its algorithm (Anderson). This means users cannot make a precise choice about what to see or not to see, as they can on Instagram, for instance. TikTok relies on an automated content-recommendation system: users scroll through an endless stream of videos, and with an upward swipe they can skip a video they do not enjoy. Every time they do so, the app’s algorithm adapts what is served up next (S). This controlled way of presenting content is problematic not only because users have no real possibility of curating their main feeds, and therefore the views and opinions they see, but also because TikTok had officially been moderating videos in controversial ways up until September 2019 (Köver and Reuter).
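To make this feedback loop concrete, the mechanism described above can be sketched as a toy recommender that raises or lowers a topic's weight depending on whether a video is watched or skipped. This is a purely hypothetical illustration; none of the names, weights, or update rules below reflect TikTok's actual system.

```python
import random
from collections import defaultdict

# Hypothetical sketch of a swipe-driven feed: skipping a video lowers
# the weight of its topic, watching it raises the weight, so the next
# sample is biased toward topics the user has engaged with.
class Feed:
    def __init__(self, videos):
        self.videos = videos                      # list of (title, topic)
        self.weights = defaultdict(lambda: 1.0)   # topic -> preference weight

    def next_video(self):
        # Sample a video with probability proportional to its topic weight.
        w = [self.weights[topic] for _, topic in self.videos]
        return random.choices(self.videos, weights=w, k=1)[0]

    def swipe_up(self, video):
        # An upward swipe (skip) halves the topic's weight (assumed factor).
        _, topic = video
        self.weights[topic] *= 0.5

    def watch(self, video):
        # Watching to the end boosts the topic's weight (assumed factor).
        _, topic = video
        self.weights[topic] *= 1.5
```

A skipped dance clip would thus make further dance clips less likely, while a watched cooking clip would make cooking content more likely, which is the "adapts what is served up next" behaviour the paragraph describes.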
To what extent was the content moderation controversial?
A leaked extract from TikTok’s rulebook contains a section on a form of content moderation called “imagery depicting a subject highly vulnerable to cyberbullying”. The network had ordered its moderators to look out for particular characteristics of its users. The features moderators were told to keep an eye out for were:
- disabled people
- people with facial disfigurements
- people with other “facial problems” such as a birthmark or squint
- Down’s syndrome
- autism
As this list shows, TikTok was instructing its workers to look out for people with any kind of disability. This differentiation constitutes discriminatory behaviour against people with disabilities, as it separates ‘healthy’ people from ‘disabled’ people (Mattila and Papageorgiou).
According to the social media giant, however, this discrimination against disabled people had a precise reason: to protect users from cyberbullying and from hate directed towards them (Kelion). In fact, its guidelines assert that such users are “susceptible to bullying or harassment based on their physical or mental condition”. It appears that the platform intended simply to hide their content in order to prevent these issues from arising.
The website Netzpolitik, which was the first to publish this scandal, stated that the moderators were instructed to limit the viewership of designated users’ posts. Furthermore, if the moderators believed a user to be particularly vulnerable, they were to prevent that user’s content from appearing on the main feed at all. Perhaps to make this process less obvious to general users, the platform still allowed such videos to reach between 6,000 and 10,000 views before hiding them. After hitting this number, the videos were put into the ‘not recommended’ section, which ensured that they had essentially no reach anymore. Additionally, these users were not only barely visible in general; they were also prevented from sharing their content with the world, as their videos were restricted to viewers in the country where they were published.
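The reported rules can be paraphrased as a simple decision procedure. The following is only a reconstruction for illustration, assuming hypothetical names and using the reported 6,000–10,000 view range (its upper bound here); it is not TikTok's actual code or policy wording.

```python
from dataclasses import dataclass

VIEW_CAP = 10_000  # upper bound of the reported 6,000-10,000 view threshold

@dataclass
class User:
    flagged_vulnerable: bool = False        # on the leaked "vulnerable" list
    particularly_vulnerable: bool = False   # moderator's stricter judgement

@dataclass
class Video:
    views: int = 0

def moderate(video: Video, user: User) -> str:
    """Return the feed treatment the reporting describes (reconstruction)."""
    if not user.flagged_vulnerable:
        return "normal distribution"
    if user.particularly_vulnerable:
        return "excluded from the main feed"     # per the leaked rules
    if video.views >= VIEW_CAP:
        return "moved to 'not recommended'"      # reach effectively ends
    return "shown only in the upload country"    # no international reach
```

The sketch makes the two-tier nature of the policy visible: flagged users lost international reach immediately, and lost domestic reach once the view cap was hit, while "particularly vulnerable" users never reached the main feed at all.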
What can be taken away from TikTok’s method of protecting the more vulnerable users?
First of all, people with disabilities are being hidden from the public at a time when they already struggle to be represented in the public sphere, and in the media especially. They have been separated from the general public for a long time, and social media could actually have been a way of equalizing societal classes and groups: anyone can sign up and show who they are, where they come from, and what they are interested in. TikTok actively worked against this and separated disabled people once again. While they want to stand up for themselves and mingle with everybody, the platform singled them out and put them into a bubble they never desired to be in. These users were, in effect, shadow-banned: their reach was deliberately suppressed, and, most concerning of all, the community of disabled people was completely unaware of this practice. This method of ‘protecting’ them is not just a form of censorship and discrimination; because they were unaware of it, they also had no possibility of resisting it or resolving the situation.
To put this discourse into perspective, the question of who must be held responsible for cyberbullying is not about the victims but about the trolls. It is the trolls who need to be censored and punished for what they do to others on these platforms. In fact, as Ellcessor explains, “physical impairments are not the cause of disability, but a society that cannot accommodate physical difference produces disability as an experience of oppression” (3).
What TikTok has to say…
Representatives of the platform spoke out about this issue and admitted their fault. They stated that this discriminatory, active hiding was only a temporary solution to fight cyberbullying (Köver and Reuter). The company told Netzpolitik’s journalists that the “technology for identifying bullying has been further developed and users are encouraged to interact positively with each other.” Yet it is not known how, and to what extent, this has changed conditions on TikTok for the affected users. Whether communities and associations of and for disabled people have been consulted has not been disclosed.
“We encourage users to celebrate what makes them unique, while finding a community that does the same. We deeply value that our users come from a huge breadth of nationalities and cultures, and we take into account the cultural norms and local regulations of the countries we operate in.” (TikTok Community Guidelines)
This quote was taken from TikTok’s community guidelines. After reading about the platform’s past discrimination policy, these words come across as deeply ironic, and it is hard to believe that the platform now stands for inclusion and deeply cares about every single community. Nonetheless, people with disabilities, having been treated in controversial ways throughout history, have learned to fight. Even without the support of platforms like TikTok, they have learned to stand up for who they are and are all the more likely to take collective action, supported by a salient group identity (Hogg and Abrams).
Bibliography:
- Anderson, Katie Elson. “Getting Acquainted with Social Networks and Apps: It Is Time to Talk about TikTok.” Library Hi Tech News, vol. 37, no. 4, 2020, pp. 7–12.
- “Community Guidelines.” TikTok, 2020, www.tiktok.com/community-guidelines.
- Ellcessor, Elizabeth. Restricted Access: Media, Disability, and the Politics of Participation. NYU Press, 2016.
- Hogg, Michael, and Dominic Abrams. Social Identifications: A Social Psychology of Intergroup Relations and Group Processes. Routledge, 1988.
- Kelion, Leo. “TikTok Suppressed Disabled Users’ Videos.” BBC News, 3 Dec. 2019, www.bbc.com/news/technology-50645345.
- Köver, Chris, and Markus Reuter. “Discrimination – TikTok Curbed Reach for People with Disabilities.” Netzpolitik.org, 2 Dec. 2019, netzpolitik.org/2019/discrimination-tiktok-curbed-reach-for-people-with-disabilities/.
- Mattila, Mikko, and Achillefs Papageorgiou. “Disability, Perceived Discrimination and Political Participation.” International Political Science Review, vol. 38, no. 5, Nov. 2017, pp. 505–519.
- S, M. “Why Worry about TikTok?” The Economist, 6 May 2020, www.economist.com/the-economist-explains/2020/05/06/why-worry-about-tiktok.
- Smith, Adam. “TikTok Censored ‘Ugly, Poor, or Disabled’ People to Attract More Users.” PCMag via Medium, 19 Mar. 2020, medium.com/pcmag-access/tiktok-censored-ugly-poor-or-disabled-people-to-attract-more-users-1b8166f0b1b9.