Monopolies of Knowledge Facilitated by Black-Box Algorithms

September 23, 2018
By Holly Foxton

Whether you search for something on Google or Twitter, or scroll through an Instagram or Facebook feed, a vast amount, if not a majority, of online content is presented to users via algorithms. These algorithms, whose code is usually written by employees of these companies, present the medium’s content to a user based on a huge variety of factors: broadly, input from the user (such as settings, activity and/or preferences), combined with judgments built into the programming of the algorithm itself. This latter component of an algorithmic ranking system will usually take the form of a ‘black-box’ algorithm, meaning “a system whose workings are mysterious; we can observe its inputs and outputs, but we cannot tell how one becomes the other” (Pasquale 3). As such, “interlocking technical and legal prohibitions prevent anyone outside such a company from understanding fundamental facts about it” (Pasquale 8). This, arguably, grants these technology giants a significant source of power, as can be interpreted from the work of Harold Innis, a Canadian political economy scholar who died in 1952 (Heyer 1).
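
To make the black-box distinction concrete, the sketch below, written in Python purely for illustration and not drawn from any platform’s actual code, shows a hypothetical feed-ranking function. The posts going in and the ordered feed coming out are observable; the weights and scoring logic that decide the order are not.

# Toy illustration only: a hypothetical feed-ranking function, not any
# platform's real code. Only the inputs (posts, user signals) and the
# output (an ordered feed) are observable from outside; the internal
# weights and scoring logic stay hidden inside the 'black box'.

INTERNAL_WEIGHTS = {            # proprietary values a user never sees
    "recency": 0.4,
    "engagement": 0.35,
    "advertiser_value": 0.25,
}

def score(post, user):
    """Combine user-derived signals with the platform's hidden weights."""
    signals = {
        "recency": 1.0 / (1.0 + post["age_hours"]),
        "engagement": user["affinity"].get(post["author"], 0.0),
        "advertiser_value": post.get("promoted_value", 0.0),
    }
    return sum(INTERNAL_WEIGHTS[k] * v for k, v in signals.items())

def rank_feed(posts, user):
    """Observable behaviour: posts go in, an ordered feed comes out."""
    return sorted(posts, key=lambda p: score(p, user), reverse=True)

Even in this toy form, the values that ultimately determine what a user sees are invisible to anyone outside the function.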

The most frequently cited aspect of Innis’ contribution to media and communication theory is his observation that mediums tend to be biased towards either space or time. He posited that mediums which were “more durable in character” possessed a time bias and historically supported hierarchical societies that held tradition and religious values in esteem (Innis Bias 7). This stands in contrast to societies that predominantly communicated through space-biased mediums, which, being lighter and easier to transport, facilitated political aims and the growth of empires (Bias 7). According to Innis, whichever power structure a civilisation’s dominant medium supported would grant those institutions power, as they would possess control over the information and knowledge the medium carried.

Innis lived and wrote in an era before even television was a staple of most homes, and as such his theories were built around pre-digital communication mediums. In the years since his death, a number of works have attempted to apply his concepts of time and space bias to society’s current dominant modes of communication (e.g. Frost; Corosky). However, today’s complex and turbulent media environment offers little clear guidance as to what the dominant medium even is, let alone whether it possesses a time or space bias. Is the dominant medium social media, where an ever-increasing proportion of Western society obtains its news (Shearer and Gottfried)? Or is it, more broadly, ‘the web’ or ‘the internet’?

It is possible to read a time bias into the nature of social media and the internet, given their decentralised, peer-to-peer characteristics (Frost 15) and the fact that content posted online does not decay at the speed that paper might. On the other hand, users’ ephemeral consumption practices on some platforms (e.g. the ten-second to twenty-four-hour stories on Snapchat and Instagram, or the constantly updated Facebook News Feed) suggest inattention to the technology’s capacity for temporal longevity. In addition, the internet’s ability to transport packets of information across the world practically instantly is certainly reminiscent of a space bias. The point being made here is that today’s media environment does not fit neatly into the defining characteristics Innis specified. As a result, it is not immediately clear which faction of society stands to gain power from the medium, in the way that time-biased mediums supported religious power and space-biased mediums supported political power. It may, therefore, be valuable to use Innis’ “monopolies of knowledge” concept to interpret where power lies in the new media environment, an angle that has thus far been seemingly neglected.

A Complex System of Writing

Innis proposed that a “monopoly or oligopoly of knowledge” develops when a dominant medium becomes too heavily weighted in its time or space bias, and when the “complex system of writing” that composes the medium’s content becomes inaccessible to the majority of the civilisation (Innis Bias 4). The medium’s language then falls into the “possession of a special class” that has mastery over it (Innis Bias 4). There are clear parallels between this concept and the impenetrability of the black-box algorithms through which users now consume media online.

It is crucial to emphasise the significance these black-box algorithms have in dictating “which information will appear in our infosphere, how many and which of your friends will see your posts, what kind of content will become part of your reality and what will be censored or deleted” (ShareLab). One study by Kulshrestha et al. investigated bias in Twitter’s search result ranking algorithm and found that undecided voters could potentially be swayed by the biases in those rankings. Whilst the researchers were able to discern a number of user input factors that contributed to the search results, their findings were limited because the black-box algorithm prevented them “from being able to pin-point the exact feature(s) of the algorithm which might be leading to the bias being introduced in the search results” (429). If this is the precedent set as the internet and social media become dominant mediums, there is reason to question how assured the power relations in today’s society are. The elite that holds authority over this monopoly of knowledge is, by Innis’ analysis, in a precarious position, as an inability to expand access to the “complex system of writing” has typically led to the fall or limitation of its domain (e.g. Innis Empire 28-29).
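
As a rough illustration of what such an external audit can and cannot see, the Python sketch below captures the general idea only; it is not the authors’ actual methodology, and the numeric ‘lean’ labels are assumed to come from some separate classifier. The lean of the ranked output can be compared with the lean of the candidate input set, because inputs and outputs are the only observable quantities.

# Rough sketch of the general idea behind a black-box ranking audit,
# loosely inspired by (but not reproducing) Kulshrestha et al.'s method.
# Each item carries a hypothetical political-lean score in [-1, +1],
# assumed to come from some external classifier.

def mean_lean(items):
    """Average lean of the candidate (input) set."""
    return sum(item["lean"] for item in items) / len(items)

def ranked_lean(ranked_items):
    """Lean of the ranked output, weighting top positions more heavily."""
    weights = [1.0 / (rank + 1) for rank in range(len(ranked_items))]
    weighted = sum(w * item["lean"] for w, item in zip(weights, ranked_items))
    return weighted / sum(weights)

def ranking_bias(candidates, ranked_output):
    """Bias attributable to the (unobservable) ranking step itself:
    how far the output's lean drifts from the input's lean."""
    return ranked_lean(ranked_output) - mean_lean(candidates)

The point of the exercise is the limit it runs into: the difference can be measured, but nothing in the observable data says which internal feature of the ranker produced it.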

In the last year, calls have been made from within the AI industry for greater transparency in the black-box algorithms that influence a number of government decisions. The experts claim it is possible to “disclose information about systems and their performance without disclosing their code,” indicating that these systems need not remain completely closed in order to protect intellectual property (Simonite). If the black-box algorithms at the core of today’s popular communication technologies remain this opaque, and if Innis’ remark that “history tends to repeat itself but in the changing accents of the period in which it is written” (Innis Bias 61) is to be believed, then it may be critical to question the legal impenetrability of these ubiquitous algorithms’ code in the pursuit of a stable society.
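
A deliberately simple sketch of the kind of disclosure the experts describe might look like the following hypothetical Python example (not any agency’s real reporting tool): publishing how a system’s recorded decisions break down across groups requires only its inputs and outputs, not its code.

from collections import defaultdict

def disclosure_report(decisions):
    """Hypothetical transparency report: approval rates per group,
    computed from a system's recorded inputs and outputs alone.
    `decisions` is an iterable of dicts like {"group": "A", "approved": True};
    no access to the proprietary model is needed."""
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for decision in decisions:
        totals[decision["group"]] += 1
        approvals[decision["group"]] += int(decision["approved"])
    return {group: approvals[group] / totals[group] for group in totals}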

Bibliography

Corosky, Gregory. ‘Global Public Discourse in the Internet Age’. McGill Journal of Political Studies, 2013, pp. 52–65.

Frost, Catherine. ‘How Prometheus Is Bound: Applying the Innis Method of Communications Analysis to the Internet’. Canadian Journal of Communication, vol. 28, no. 1, Jan. 2003. CrossRef, doi:10.22230/cjc.2003v28n1a1338.

Heyer, Paul. Harold Innis. Rowman & Littlefield, 2003.

Innis, Harold. Empire and Communications. Oxford University Press, 1950.

—. The Bias of Communication. University of Toronto Press, 1951.

Kulshrestha, Juhi, et al. ‘Quantifying Search Bias: Investigating Sources of Bias for Political Searches in Social Media’. Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, ACM, 2017, pp. 417–32.

Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2015.

ShareLab. Immaterial Labour and Data Harvesting: Facebook Algorithmic Factory (1). 2016, https://labs.rs/en/facebook-algorithmic-factory-immaterial-labour-and-data-harvesting/.

Shearer, Elisa, and Jeffrey Gottfried. News Use Across Social Media Platforms 2017. Pew Research Center, 2017, http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/.

Simonite, Tom. ‘AI Experts Want to End “Black Box” Algorithms in Government’. Wired, Oct. 2017, https://www.wired.com/story/ai-experts-want-to-end-black-box-algorithms-in-government/.
