Book Review: The Filter Bubble by Eli Pariser
What is the Filter Bubble?
The Filter Bubble refers to the personalization processes taking place on the Web, which shape what content you see and, more importantly, what content you don’t get to see. Big players like Google and Facebook feed you what they think you want, based on secret profiles of you to which you have no access. In short, you may end up in a bubble of fabricated interests, built on a personal profile over which you have no control and which you cannot correct. This book addresses these personalization processes and their possible future implications, ranging from genuinely helpful to downright harmful.
But who better to explain the contents of a book than its author? Take a nine-minute break and watch this excellent TED talk by the author of The Filter Bubble, Eli Pariser. After that, continue reading below for a more in-depth analysis and my personal view of the book’s content.
The Mosaic of Subcultures
Pariser restates the main point of his book very clearly by drawing an analogy between the Web on the one hand and urban planning, design and programming on the other. Drawing from the book A Pattern Language, Pariser describes two models of metropolises: the “heterogeneous city” and the “city of ghettos”. The heterogeneous city offers plenty of variety, but that variety comes as an undifferentiated mass: because it is all around you, every part of the city looks the same. The city of ghettos, on the other hand, consists of small worlds of single subcultures without connections between them, resulting in stagnation and intolerance. Pariser argues that the Web should be like the third possibility the authors of A Pattern Language describe: “a happy medium between closed ghettos and the undifferentiated mass of the heterogeneous city”, the so-called mosaic of subcultures.

You can think of the Web as a heterogeneous city in the pre-indexing era (before directories and search engines): a place full of different information, but with no one helping you find it. The Web as it may become under the effects of the Filter Bubble could be seen as the city of ghettos, made up of closed worlds. The Web should instead become a mosaic of subcultures in order to fulfill the two premises of human life: “a person can only fully become him- or herself in a place where he or she ‘receives support for his idiosyncrasies from the people and values which surround him’” and “you have to see lots of ways of living in order to choose the best life for yourself.” The mosaic of subcultures lets its citizens experience a wide array of cultures and helps people find their way to the cultures in which they feel at home.
Escaping the City of Ghettos
In his final chapter, Pariser offers possible solutions for escaping the haunting effects of the Filter Bubble, categorized into actions for individuals, companies and governments. The proposed solutions for individuals are well thought out, but may not be that effective. The author proposes breaking the habit of visiting the same websites each day by deliberately stretching your online interests (that is, visiting websites you would not normally visit). Next, he talks about erasing or even disabling cookies; but naturally, many websites require cookies to function properly, and even with cookies disabled, plenty of information about you is still available simply through your location, browser, operating system and so on (see the sketch below). As a third option, he suggests choosing websites that give you more control over, and visibility into, their filtering and personalization policies. However, this means giving up on Facebook. His fourth point actually goes beyond the problems of the Filter Bubble: recognizing that we live in an increasingly algorithmic society, and that we therefore all need to develop a basic level of algorithmic literacy. I believe this to be a very valid point, but it is also a tier-two solution: you first need a more than basic level of digital literacy (which I believe too many people still lack) before you can build up any level of algorithmic literacy.
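To illustrate why disabling cookies offers so little protection, here is a minimal sketch of the idea behind browser fingerprinting. This is my own illustration, not something from the book, and all attribute names and values are made up for the example: a handful of individually innocuous attributes, combined, can identify a browser almost uniquely.

    import hashlib

    def fingerprint(attributes: dict) -> str:
        """Combine innocuous browser attributes into one stable identifier."""
        # Sort the keys so the same browser always produces the same value.
        canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

    # No cookies involved, yet the combination is already highly distinctive.
    visitor = {
        "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "language": "en-US",
        "screen": "1920x1080",
        "timezone": "Europe/Amsterdam",
    }
    print(fingerprint(visitor))  # same hash on every cookie-less visit

Real trackers combine many more signals than these four, which is exactly why cookie hygiene alone does not pop the bubble.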
The proposed actions for companies are straightforward: they should be more transparent about their filtering systems, starting by clearly showing where and when personalization filters are at work. They should also provide clear information about how these systems work, in order to give people control over their own filtering profiles. Next, companies should focus on optimizing for different variables, including the active promotion of public issues and the cultivation of citizenship. This is in itself quite a problematic proposal, because it amounts to ‘hard coding’ public issues into personalization; and who decides how valuable a given public issue is? Clearly, this solution is not without problems of its own. Rounding things off, Pariser states that personalization algorithms should expose people to topics outside their normal interests and experience, sacrificing some optimization in the process.
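What might “exposing people to topics outside their normal interests” look like in practice? A minimal sketch follows; it is my own illustration, not any real feed’s algorithm, and names like out_of_profile and explore_ratio are invented. The idea is simply to reserve a fixed share of recommendation slots for items the user’s profile would normally filter out.

    import random

    def recommend(personalized, out_of_profile, n=10, explore_ratio=0.2):
        """Fill most slots from the personalized ranking, but reserve a
        fixed share for items outside the user's inferred interests."""
        n_explore = max(1, int(n * explore_ratio))
        picks = personalized[: n - n_explore]
        # Draw the remaining slots from topics the filter would normally hide.
        picks += random.sample(out_of_profile, min(n_explore, len(out_of_profile)))
        random.shuffle(picks)  # don't bury the unfamiliar items at the bottom
        return picks

Even this toy version makes the trade-off visible: every slot given to an out-of-profile item is a slot taken away from whatever metric the ranking was optimized for, which is precisely the sacrifice Pariser asks companies to accept.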
Finally, he proposes a set of actions for governments (and citizens). Governments should require companies to give Web users control over their personal information. Pariser summarizes the 1973 Fair Information Practices, which give clear instructions about the use of personal data: you should know who has your personal data, what it includes, and how and for what purpose it is used; the data should be kept secure; and if it happens to be incorrect, you should be able to correct it. Sadly, these principles are not enforced in today’s world, even though personal data is being used for all kinds of purposes. To give this problem more weight, Pariser proposes thinking of personal information as a kind of property. Finally, he argues that an agency might be put in place to oversee the use of personal information.
What You Want, Whether You Want It or Not
Whether you agree with Pariser’s scenarios and solutions or not, he gives one striking example of how the Filter Bubble might influence your every move. When enough data about you has been collected, companies may be able to construct your “persuasion profile”: a model of which arguments you respond to, what your interests are, even what mood you are in. With such a system in place, companies can target your weak spots (including your compulsive buying patterns). This could be used for good, for example by saying exactly the right things to get you to eat more healthily or run that extra mile during your workout. But consider the malicious use: companies will be able to manipulate you by targeting your weak spots at exactly the right time, thanks to their knowledge of your mood. And since nothing requires companies to disclose when and where they are using persuasion profiling, you might never notice that the decision you just made was never really yours to make.
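To make the idea concrete, here is a toy sketch of persuasion profiling. It is entirely my own illustration, not anything described in the book: record which kinds of appeal a user has actually responded to, then always lead with the most effective one.

    from collections import defaultdict

    class PersuasionProfile:
        """Toy model: learn which appeal types a user responds to."""

        def __init__(self):
            self.shown = defaultdict(int)
            self.responded = defaultdict(int)

        def record(self, appeal, did_respond):
            self.shown[appeal] += 1
            self.responded[appeal] += int(did_respond)

        def best_appeal(self):
            # The appeal with the highest observed response rate so far.
            return max(self.shown, key=lambda a: self.responded[a] / self.shown[a])

    profile = PersuasionProfile()
    profile.record("scarcity", True)       # "Only 2 left in stock!"
    profile.record("social_proof", False)  # "10,000 people bought this."
    profile.record("scarcity", True)
    print(profile.best_appeal())  # 'scarcity' -- this user's weak spot

A real system would fold in mood and timing signals on top of this, which is exactly what makes Pariser’s scenario so unsettling.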
The Filter Bubble at the Book Depository
Audio book of the Filter Bubble
All the latest news from the book’s website
10 Ways to Pop Your Filter Bubble
Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. New York: The Penguin Press, 2011.