HW#3 Your Filter Bubble

In chapter 8 of The Filter Bubble, Eli Pariser outlines several ideas that may help mitigate the effects of personalization. He explains the actions individual users can take, as well as what companies, the government, and individuals as citizens can do to combat the rise of the filter bubble. While many of the ideas Pariser presents for slowing the spread of the filter bubble are respectable, some are better and more realistic than others.

"Stop being a mouse" (223) is probably the best, simplest, and hardest idea Pariser offers to put into action. It rests on the assumption that we are creatures of habit, that "we all kinda do the same thing over and over again most of the time. And jumping out of that recursion loop is not easy to do" (223). Pariser admits that even he is "pretty mouselike" (223) in his information habits. It's hard to break habits and routines, since we like the comfort and ease that come from familiarity. By actively diversifying how and what you spend time doing on the internet, you make it harder for the algorithms to pigeonhole your profile. This may be the best method for offsetting the effects of personalization, but it's not very realistic. Most people use the internet sparingly, catching up only on the things that matter most to them; they will not spend their time reading about or searching for a topic they're only mildly interested in, even if they consider it important. Breaking the loop is only possible if you make a conscious choice to be critical and inquisitive, and to not be afraid of feeling uncomfortable about what you read or see. This is not only good for deflecting the negative consequences of personalization; it's also a good way of becoming a more well-rounded person.

One of the weaker ideas presented by Pariser, I felt, was the "fully algorithmic solutions," which take into account everyone's opinion of what they believe is important and should be seen. Even though I like the idea of bringing personalization to the public eye and putting it in the user's hands, I can't help but think you're just creating your own filter bubble. This doesn't leave much room for you to be exposed to different things. I feel that most people don't know what they want, only what they think they should want. I also feel that, given the way our media works and how we consume it, the general populace is not equipped with the skills to discern important news from news that isn't. I can only see problems with an "important" button, as Pariser mentions on page 235. It reminds me of the Kony 2012 campaign, which was essentially a viral video that oversensationalized the severity and importance of a war criminal whose crimes were relatively old news.

The most realistic idea presented by Pariser was that "the engineers of the filter bubble…can solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience" (235). I was very fond of this idea because it integrates exposure into your everyday life with very little effort from the user. There's a service called StumbleUpon which, in its early stages, did exactly this: you would click a button and a random web page would appear. Recently, though, they have adopted an algorithmic method to determine which websites you are exposed to, probably based on clicks, your own predefined interests, how long each user stays on a given page, and user ratings of websites. That is to be expected, since it is a business.
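The "solve for serendipity" approach Pariser describes can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (the function, field names, and `serendipity_rate` parameter are my own inventions, not any real service's API): a recommender that usually serves pages matching the user's stated interests, but with some fixed probability deliberately serves a page from outside them.

```python
import random

def recommend(user_interests, catalog, serendipity_rate=0.2, rng=None):
    """Pick a page to show the user.

    Usually returns a page matching the user's interests, but with
    probability `serendipity_rate` deliberately returns a page from
    outside them -- the "solving for serendipity" idea in miniature.
    """
    rng = rng or random.Random()
    familiar = [p for p in catalog if p["topic"] in user_interests]
    unfamiliar = [p for p in catalog if p["topic"] not in user_interests]
    if unfamiliar and (not familiar or rng.random() < serendipity_rate):
        # Deliberate exposure to a topic outside the user's bubble.
        return rng.choice(unfamiliar)
    return rng.choice(familiar)
```

The point of the sketch is that serendipity need not be left to chance: a designer can budget for it explicitly, tuning `serendipity_rate` to decide how often the filter steps outside the user's established profile.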

If you want things to change, you should look to yourself first and ask whether you're living that change. As Mahatma Gandhi put it, "Be the change you want to see in the world." Ultimately, the extent of your filter bubble is decided by you, online or off.