User feedback and the automated News Feed

October 9, 2013
By Serkan Yildizeli



The Facebook EdgeRank

Recently, Facebook announced a change, or rather a ‘tweak’, to its News Feed ads algorithm. The goal of this update, which will take effect in the coming weeks, is to show users more relevant ads in their News Feed. The change can be seen as part of a more general objective to bring “the right content to the right people at the right time” [1].

As the title of the Newsroom announcement implies, this tweak clearly concerns the ads that are shown to users. Besides making ads ‘more relevant’, the ‘improvement’ should thus also enhance their effectiveness. As Hong Ge, Engineering Manager of News Feed Ads, puts it: “[t]o choose the right ad, we listen to both people and marketers”. In this way, Facebook is supposedly trying to improve both the user experience and advertising effectiveness. However, emphasizing the marketing affordances of the platform could also drive users away from the service.

Through this News Feed update, the company states, the relevancy and quality of the ads people see will improve. User feedback is emphasized in this process and forms a key input: when users hide and/or report certain ads, the News Feed algorithm will show them fewer irrelevant ads. The algorithm currently chooses between thousands of candidate ads by looking at things such as a user’s interests and Page likes.
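The interplay of positive signals (interests, Page likes) and negative feedback (hides, reports) can be caricatured in a few lines of code. Facebook has never published its actual ranking formula, so every name, signal and weight below is invented purely for illustration:

```python
# Hypothetical sketch of feedback-weighted ad selection. The real News Feed
# ads algorithm is unpublished; these signals and weights are invented.

def score_ad(ad, user):
    """Score an ad: interest overlap plus Page affinity, minus feedback penalty."""
    interest_match = len(ad["topics"] & user["interests"])
    page_affinity = 1 if ad["page"] in user["liked_pages"] else 0
    # Hides/reports act as negative feedback: topics the user has hidden
    # push similar ads down the ranking.
    penalty = 2 * len(ad["topics"] & user["hidden_topics"])
    return interest_match + page_affinity - penalty

def rank_ads(ads, user, n=3):
    """Pick the n highest-scoring ads from the candidate pool."""
    return sorted(ads, key=lambda ad: score_ad(ad, user), reverse=True)[:n]

user = {"interests": {"football", "music"},
        "liked_pages": {"NikeFootball"},
        "hidden_topics": {"gambling"}}
ads = [{"page": "CasinoX", "topics": {"gambling"}},
       {"page": "NikeFootball", "topics": {"football"}},
       {"page": "RecordStore", "topics": {"music"}}]

print(rank_ads(ads, user, n=2))  # the hidden-topic ad falls to the bottom
```

In this toy model the hide is simply another weight in the same scoring function, which is exactly what makes user feedback “a key input” rather than a separate mechanism.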

The News Feed as a (manually managed) filter bubble

The platform specificities of Facebook that take into account user feedback on ads, interests and Page likes are then used to show relevant content to users. This mechanism, and in particular its continuously updated algorithm, can be related to the filter bubble, a term coined by internet activist Eli Pariser in his book of the same name [2]. Pariser defines the concept as follows:

“The new generation of Internet filters looks at the things you seem to like—the actual things you’ve done, or the things people like you like—and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information” [3].

A prediction engine like EdgeRank algorithmically determines which stories appear first to users. By manually managing and giving feedback on content, the user can, to a certain extent, influence this one-sidedness and automation of the system. People then have to hand-select and manually manage their ad hides and reports. Intervening in Facebook’s recommendation/prediction system in this way gives the user a certain empowerment to improve the relevancy of the presented content. Still, this empowerment remains illusory until Facebook has fully rolled out the tweak and it becomes clear whether its functioning actually enhances the user experience.
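Pariser’s “prediction engine” and the manual pushback described above can be sketched as a toy feedback loop. This is purely a thought experiment: EdgeRank’s real signals and weights were never public, and all names and numbers here are invented:

```python
# Toy model of a self-reinforcing prediction engine (illustrative only).
from collections import defaultdict

class ToyFeed:
    """A caricature of a filter-bubble-style recommendation loop."""

    def __init__(self):
        # Every topic starts with the same neutral weight.
        self.weights = defaultdict(lambda: 1.0)

    def click(self, topic):
        """Engagement reinforces a topic, so the feed drifts toward it."""
        self.weights[topic] *= 1.5

    def hide(self, topic):
        """Manual feedback lets the user push back against the loop."""
        self.weights[topic] *= 0.25

    def rank(self, topics):
        """Order candidate stories by learned topic weight."""
        return sorted(topics, key=lambda t: self.weights[t], reverse=True)

feed = ToyFeed()
for _ in range(3):
    feed.click("cats")              # repeated clicks inflate one topic
print(feed.rank(["news", "cats"]))  # the clicked topic now comes first
feed.hide("cats")                   # a single hide deflates it again
print(feed.rank(["news", "cats"]))
```

The loop only narrows if the user never intervenes; in this toy model one hide outweighs three clicks, which is exactly the kind of manual management of hides and reports the paragraph above describes.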

Criticism

Facebook has been criticized for offering its users “too much candy, and not enough carrots” [4], but several academics and other influential thinkers criticize the filter bubble as a concept. In a study by Kartik Hosanagar et al., the authors stress that the filter bubble is ultimately “an assumption.” Their analysis shows that the filter bubble does not have a “narrow-minded and hyper-personalized” aspect. Instead, they argue, “personalization appears to be a tool that helps users widen their interests” [5].


Harvard law professor Jonathan Zittrain argues that personalization isn’t really doing what Pariser says it is. Zittrain studies web censorship, and states that the effect of a filter bubble is overestimated. “The effects of search personalization have been light”, he writes [6].

Google

Like Facebook, Google has received criticism for its detailed user profiles, built from huge amounts of data collected on people’s online activities. Those profiles are said to be used by search engines to provide personalized search results [7]. However, comparing the Google results of several users has not shown much difference between one user and another [8]. Jacob Weisberg notes that “there were only minor discrepancies in (…) these queries” [9]. Google stresses that the company promotes variety and says it limits personalization where it should. The company even allows users to turn off search history personalization [10].

Marshall Kirkpatrick, a lead blogger for TechCrunch, argues that the filter bubble might even be positive. “If the Facebook vision of ‘instant personalization’ comes true (…) you’ll be shown first and foremost content on topics that you have expressed an interest in already, which is described in the same ways you describe your interests and that is deemed by people you trust.” The filter bubble may lead to the ability to “deep dive into specialized news and analysis on the topics that are most important to you (…) making them easier to discover than ever before.”

Bibliography

[1] ‘News Feed FYI: A Window Into News Feed’, Facebook, August 6th 2013 https://www.facebook.com/facebookforbusiness/news/News-Feed-FYI-A-Window-Into-News-Feed

[2] Pariser, Eli. The Filter Bubble: What The Internet Is Hiding From You. New York: Penguin Books Ltd, 2011.

[3] Pariser, Eli. The Filter Bubble: What The Internet Is Hiding From You. New York: Penguin Books Ltd, 2011: 10.

[4] ‘Facebook, Google Giving Us Information Junk Food, Eli Pariser Warns’, The Huffington Post, March 7th 2011. http://www.huffingtonpost.com/2011/03/07/eli-pariser-facebook-google-ted_n_832198.html

[5] Hosanagar, Kartik et al. ‘Will the global village fracture into tribes? Recommender systems and their effects on consumers’. NET Institute working papers series, October 27th 2012. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1321962

[6] O’Connor, Rory. Friends, Followers and the Future: How Social Media are Changing Politics, Threatening Big Brands and Killing Traditional Media. San Francisco: City Lights Books, 2012.

[7] Speretta, Mirco and Susan Gauch. ‘Misearch’. Computer Society. http://www.csce.uark.edu/~jarora/misearch.pdf

[8] ‘Your Results May Vary: Will the Information Superhighway Turn into a Cul-de-Sac Because of Automated Filters?’, The Wall Street Journal. http://online.wsj.com/article/SB10001424052748703421204576327414266287254.html

[9] Weisberg, Jacob. ‘Bubble Trouble: Is Web Personalization Turning Us into Solipsistic Twits?’, Slate, June 2011. http://www.slate.com/articles/news_and_politics/the_big_idea/2011/06/bubble_trouble.html

[10] Turn off search history personalization, Google https://support.google.com/accounts/answer/54048?hl=en
