HW #3 – Escaping from the Filter Bubble

    Web personalization provides users with many benefits and has become an irreversible trend in the information society. However, it also constrains the scope of personal thinking and limits users’ exposure to a diversity of information. In chapter 8 of The Filter Bubble, the author, Eli Pariser, proposes several solutions for users of filtering systems to escape the filter bubble. He argues that individuals should try not to confine themselves within personalization algorithms and should use the internet autonomously; that companies should be more transparent about their filtering policies and their use of personal information; and that the government should enforce more exhaustive regulation and legislation concerning companies’ use of personal information.

    The weaker counteractions against the filter bubble are the companies’ measures, including disclosure of their filtering algorithms and of how gathered information is used. Through these solutions, users would gain more control over their personal information and over personalization itself. Disclosing the filtering algorithm would in fact be the strongest method if it were more realistic, but companies are unlikely to change policies that might put both social and economic benefits at risk. Pariser states that “Whether or not it makes the filterers’ products more secure or efficient, keeping the code under tight wraps does do one thing: It shields the companies from accountability for the decisions they’re making” (230). For these solutions to work, companies would first have to admit that they personalize each user of the filtering system using personal information gathered without adequate consent. However, Pariser says that “There are strong reasons in some cases for businesses not to do so. But they’re mostly commercial reasons, not ethical ones” (232). He also suggests that company engineers can “solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience.” But the decision about how new topics are selected would still rest in the hands of the companies. Furthermore, to what extent would personal information be used to decide what counts as “new”? Such a system would still lack transparency about how the filtering works and would not solve the privacy issues.

Most importantly, companies will never adopt new policies on their filtering systems without users’ or consumers’ demands for change. Pariser states that “Corn syrup vendors aren’t likely to change their practices until consumers demonstrate that they’re looking for something else” (222). Therefore, the strongest resistance to the filter bubble could start with individuals’ simple actions. The most effective method is to educate ourselves by learning the basics of programming and of how filtering systems work. If you know your enemy and yourself, you can win every battle. Merely “stop[ping] being a mouse” (223) by broadening our interests is not enough. Once we have a clearer understanding of filtering algorithms, we are more likely to identify the system’s weaknesses and problems. With deeper knowledge, we are also less vulnerable to the “tyranny of defaults” (226). Additionally, just as Pariser prefers Twitter over Facebook, we can choose internet services that offer more open and transparent filtering. Pariser notes that what individuals can do is of “limited use unless the companies that are propelling personalization forward change as well” (229). This is why it is so important that we educate ourselves and raise our voices, so that companies acknowledge that their consumers are concerned about filtering and demand more transparency. If we want to get out of the bubble, we must first know how the bubble is built.

McLuhan once said that “We shape our tools and thereafter our tools shape us.” We do not want to lose control over what we have created for our own benefit. Living in a technology-based society, the filter bubble is just another hurdle we have to overcome. As individuals, we should think more deeply about personalization and filtering systems, so that we can convince large companies to reveal their filtering algorithms, push for legislation regarding companies’ use of users’ personal information, and finally use our technology freely without fear of the filter bubble.