Category Archives: Homework

HW3

Chapter 8 of The Filter Bubble talks about solutions to the problems explained throughout the book. These solutions have both positives and negatives; with some I agree, and with one I do not.

The idea of “stop being a mouse” is, in my opinion, not the best, and I definitely do not agree with it, because it suggests that people should change their opinions and wants. Given the wide variety of likes that people have, asking them to change is simply not feasible.

The idea of more transparent filtering systems is better, but still, in my opinion, not the best there could be. Many companies are already very transparent, and that alone doesn’t help. Making the filter bubble itself transparent isn’t much different: people knowing it’s there doesn’t eliminate the problem.

The best solution is the government requiring companies to give us, the people, control over our personal information. This gives us more control over what is out there, and it makes sense: we know more about ourselves than anyone or any company ever could.

Assignment #3: Solutions

Identify what you think is the strongest idea and the weakest idea that he has for combating the filter bubble and the excesses of personalization.

In the last chapter of The Filter Bubble, ‘Escape from the City of Ghettos’, Eli Pariser provides solutions to help ‘combat the filter bubble and the excesses of personalization’. Pariser offers an array of recommendations on what ‘we’ as individuals should do, what companies should do, and lastly what the government should do, so as to help let some air out of the growing bubble.

Pariser provides a great number of good ideas, but I felt that none of the recommendations, sound as they are, would work effectively on its own. It is impossible to bring about visible change by picking and implementing only one of his solutions. I believe that if individuals, companies, and the government work cohesively with one another, we will be able to achieve some sort of result.

What Individuals Can Do: The most critical thing a person can do is simply to be cautious and alert. We are becoming lazier and more impatient by the day, and almost always agree to anything without even reading the agreement. This only makes it easier for companies to trap us right in their paws. For example, while using Twitter, unless ‘you go out of your way to lock your account, everything you do is public to everyone’ (225). Therefore, the best advice Pariser gives us as individuals is to ‘change our own habits’ (222) first, by being more careful and putting more time and effort into reading the rules and regulations online.

What Companies Can Do: The biggest responsibility undoubtedly falls on the companies that are entrusted with millions of people’s personal information. Companies like Google, Facebook, and Twitter, to name a few, need to take a huge step forward to understand and ‘realize their responsibilities’ (229). As Larry Lessig puts it, ‘a political response is possible only when regulation is transparent’ (229); therefore, companies need to be more public-friendly and not keep their code under tight wrap. Doing so only shields ‘companies from accountability for the decisions they are making, because the decisions are difficult to see from the outside’ (230). So Pariser encourages companies to opt for open systems and take responsibility for their actions.

What Governments and Citizens Can Do: Almost all companies work with the main objective of earning profits rather than genuinely serving and doing good for the people. It is therefore sometimes risky to leave problems of huge magnitude ‘in the hands of private actors with profit seeking motives’ (237). This is where the government makes use of its status and brings into play rules and regulations that keep such companies from trampling over their customers. An unsettling example I came across was that while ‘it is illegal to use Brad Pitt’s image to sell a watch without his permission, Facebook is free to use your name to sell one to your friends’ (239).

The weakest idea Pariser recommends would be one of his solutions for individuals. Pariser advises us to ‘stop being a mouse’ (223): routinely checking only certain sites allows the network to track us more easily, so he advises us to do otherwise. Despite Pariser making perfect sense, this would be difficult to put into practice. Firstly, I check my email every morning. Does that mean I should stop checking my email, or should I open five email accounts just to throw the network off my trail? Then comes the dilemma of when I should check my email. If checking it every morning, when I am free, is not a good idea, then when is a good time? Also, after checking my email, I check the weather report. I check a certain website because that website gives accurate information; it would be risky for me to check a different weather report every day. Even though I agree with Pariser that companies can identify users by the way they routinely check certain websites, his advice does not work that well.

“Sorry, the system’s down”

This morning was a complete and utter disaster. Every morning that I have classes, I take the Long Island Rail Road to get to class on time. Today, when I arrived at the station, I found out that their system was down and could not process credit or debit transactions. Although I usually have cash on me, today I didn’t, because I was running late and didn’t have time to stop at the bank. So I was pretty much stranded at the LIRR station with nowhere to go. How ironic.

This got me thinking about how much we really do rely on technology. To place important phone calls, to operate the trains and cars we use, to make simple transactions, everything is powered by technology. If one little thing goes wrong, it can reconfigure our entire day or quite possibly our life. This might seem like a bit of a stretch, but just think about it. How much do you rely on transportation? Probably more than you think.

In fact, people today rely on technology for things that we could do for ourselves. For example, before the development of the GPS people got around just fine. They would drive from the east coast to the west coast with just a map. Now I am a sensible person; I acknowledge that technology has helped people in numerous ways, but perhaps we have become too dependent on it. What do you think? Does the good outweigh the bad?

 

Lorenzo~HW3 Solutions to the Filter Bubble

Throughout the years, the internet as we know it has been, and still is, rapidly changing in ways unforeseeable by its users. The reason many do not realize this is the consistent rate at which these programs innovate: internet users are so used to seeing changes on websites and webpages that they do not really bother looking for the consequences that might be at hand. Writer and political activist Eli Pariser brought this to his audience’s attention in his book The Filter Bubble. Pariser states that the Filter Bubble is made up of the internet’s personalization, which builds its own perception of each user using cookies and algorithms. In the chapters of this book he lays out all of the undesirable effects of the Filter Bubble, but in the last chapter he suggests different solutions to this problem of personalization on the web. One of these solutions stands out above the others, while another seems impossible to achieve.
As humans, we tend to be repetitive in our day-to-day activities without even knowing it. We wake up every day following the same procedures, sometimes arranged in different orders, but with the same objectives and routine. This same tendency shadows how we use the online atmosphere. For example, one person might go online to check Facebook, Twitter, and Instagram, then check the NY Times website for top stories, and that will be their main online sequence of events for a long time. This is the problem Eli Pariser calls the “mousetrap,” and he addresses it with the solution “stop being a mouse” (223). He says, “Most of us are pretty mouselike in our information habits,” meaning that we tend to circle around the same information (the mousetrap) because of our natural habit of repetition (223). This happens because it is convenient for us to stay in that circle, the Filter Bubble, and we do not like being forced out of this scheduled routine of gathering information. As in the example above, that person would be unwilling to use another news website because he or she is already absorbed by the original source of information within the Filter Bubble. If we stop being mice, we can broaden our horizons by using different domains and databases to retrieve information. Using more sources would benefit us greatly because of the different perspectives we would be drawing information from.
We have seen what I thought was a strong solution to the Filter Bubble; now here is what I think is the weakest solution that Eli Pariser mentions. I do not think that algorithmic solutions would put out this fire of personalization on the web. He uses the example, “Why not rely on everyone’s idea of what’s important?” (235). What he means is, in regard to the Facebook “Like” button, why not add another component alongside it, an “Important” button? This would clarify the difference between what individuals like and what they think is important. My reaction is that instead of dousing the fire (personalization) with water, this idea would actually do the opposite. It would be like adding gasoline or lighter fluid, because it adds more personalization by recording what we really think is important. This would still push us deep into the Filter Bubble, probably deeper than before. Although some of the algorithms Pariser talks about may open people’s eyes to different views, they might also strengthen the beliefs people already hold inside the Filter Bubble.
In conclusion, there are some solutions to the problem of the Filter Bubble that Eli Pariser has brought to our attention. However, it all depends on each person’s awareness of their personalized internet interface and how they want to address the issue of being lulled by the Filter Bubble and the information within it. Because it is so convenient in today’s age of personalization and post-materialistic views, people do not mind getting exactly the information they want, as quickly and as tailored to their preferences as possible.

HW #3 – Escaping from the Filter Bubble

    Web personalization, or customization, provides users many benefits and has become an irreversible trend in the information society. However, this trend constrains the scope of personal thinking and limits users’ exposure to a diversity of information. In chapter 8 of The Filter Bubble, the author, Eli Pariser, proposes several solutions for users of filtering systems to escape from the filter bubble. He argues that individuals should try not to confine themselves within personalization algorithms and should use the internet autonomously; companies should enhance transparency in their filtering policies and their use of personal information; and finally, the government should enforce more exhaustive regulation and legislation concerning companies’ use of personal information.

    The weaker counteractions against the filter bubble are the companies’ actions, including the disclosure of filtering algorithms and of how gathered information is used. Through these solutions, users would have more control and power over their personal information and personalization. To be honest, disclosing the filtering algorithm would be the strongest method if it were more realistic, but it is unlikely that companies would change policies in ways that might risk both social and economic benefits. Pariser states that “Whether or not it makes the filterer’s products more secure or efficient, keeping the code under tight wraps does do one thing: It shields the companies from accountability for the decisions they’re making” (230). To make these solutions work, companies would first have to admit that they are personalizing each user of the filter bubble using personal information without adequate consent from users. However, Pariser says that “There are strong reasons in some cases for businesses not to do so. But they’re mostly commercial reasons, not ethical ones” (232). He also suggests that company engineers can “solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience”. But the decision on how new topics are selected is still in the hands of companies. Furthermore, to what extent would personal information be used to decide what is “new” or not? Such a system would still lack transparency about how the filtering system works and would not solve the privacy issues.

Most importantly, companies’ new policies on filtering systems will never happen without users’ or consumers’ demands for change. Pariser states that “Corn syrup vendors aren’t likely to change their practices until consumers demonstrate that they’re looking for something else” (222). Therefore, the strongest resistance to the filter bubble could start with individuals’ simple actions. The most effective method is to educate ourselves by learning the basics of programming and how filtering systems work. If you know your enemy and yourself, you can win every battle. “Stop being a mouse” (223) by broadening our interests is not enough. Once we have a better and clearer understanding of filtering algorithms, we are more likely to address the weaknesses and problems of the system. Also, with deeper knowledge, we are less vulnerable to the “tyranny of defaults” (226). Additionally, just as Pariser prefers Twitter over Facebook, we can choose internet services that provide a more apparent and open filtering system. Pariser says that what individuals can do is of “limited use unless the companies that are propelling personalization forward change as well” (229). And this is why it is so important that we educate ourselves and raise our voices to let companies know that their consumers are concerned about the filtering system and demand more transparency. Most importantly, if we want to get out of the bubble, we should first know how the bubble is built.

McLuhan once said that “We shape our tools and thereafter our tools shape us”. We don’t want to lose control of what we’ve created for our benefit. As we live in a technology-based society, the filter bubble is just another hurdle we have to overcome. As individuals, we should give stronger and more profound consideration to personalization and filtering systems, so that we can convince large companies to reveal their filtering algorithms, hopefully resulting in legislation regarding companies’ use of users’ personal information – and finally, so that we can use our technology freely without fear of the filter bubble.

HW#3 Your Filter Bubble

In chapter 8 of The Filter Bubble, Eli Pariser outlines a few ideas that may help mitigate the effects of personalization. He explains the actions individual users can take as well as what companies, the government, and individuals as citizens can do to combat the rise of a filter bubble. While many of the ideas presented by Pariser to lessen the propagation of a filter bubble are respectable, some are better and more realistic than others.

“Stop being a mouse” (223) is probably the best, simplest, and hardest idea Pariser offers to put into action. It runs on the assumption that we are creatures of habit, that “we all kinda do the same thing over and over again most of the time. And jumping out of that recursion loop is not easy to do” (223). Pariser admits that even he is “pretty mouselike” (223) in his information habits. It’s hard to break habits and routines, since we like the comfort and ease that come from familiarity. By actively diversifying how and what you spend time doing on the internet, you make it harder for the algorithms to “pigeonhole” your profile. This may be the best method for offsetting the effects of personalization, but it’s not very realistic. Generally, people use the internet sparingly, to catch up on the things that are most important to them. Most people will not spend their time reading or searching for a topic that they’re only sort of interested in, even if they find it important. This works only if you make a conscious choice to be critical, inquisitive, and unafraid of feeling uncomfortable about what you read or see. That is not only good for deflecting the negative consequences of personalization; it’s also a good way of becoming a more well-rounded person.

One of the weaker ideas presented by Pariser, I felt, was the “fully algorithmic solutions.” These take into account everyone’s opinion of what they believe is important and should be seen. Even though I like the idea of bringing personalization into the public eye and putting it in the user’s hands, I can’t help but think you’re just creating your own filter bubble. This doesn’t leave much room for you to be exposed to different things. I feel that most people don’t know what they want, but rather what they should want. I also feel that, given the way our media works and how we consume it, the general populace is not equipped with the right set of skills to discern important news from news that isn’t. I can only see problems with an “important” button, as Pariser mentions on page 235. It reminds me of the Kony 2012 campaign, which was essentially a viral video that oversensationalized the severity and importance of a relatively old war criminal.

The most realistic idea presented by Pariser was that “the engineers of the filter bubble…can solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience” (235). I was very fond of this idea because it integrates exposure into your everyday life with very little effort from the user. There’s a service called StumbleUpon which, in its early stages, did exactly this: you would click a button and a random web page would appear. Recently, though, they have adopted an algorithmic method to determine which websites you are exposed to, probably based on clicks, your own predefined interests, how long each user stays on a certain page, and user ratings of websites; this is to be expected, since it is a business.

If you want things to change you should look towards yourself first and ask whether or not you’re living that change. As Mahatma Gandhi put it “Be the change you want to see in the world.” Ultimately the extent of your filter bubble is decided by you, online or off.

Chapter 8 Ideas. HW

HW #3

Strongest / Weakest Ideas

In this modern information age, relying on technology has become second nature to most of us. Leaving our houses with a sense of security in having Google Maps on our phones, constantly having phone cameras ready to capture fascinating sights, keeping Facebook and Twitter apps always prepared for a new status update, and many similar actions are now part of our everyday life. As time goes on and reliance on the internet gains more and more importance, we can’t help but neglect its main problem: filtered search results falsely identifying our personalities and creating inaccurate snapshots of who we are and what we want. In the final chapter of The Filter Bubble, Eli Pariser summarizes the problems with privacy on the internet and discusses several solutions and actions people can take to avoid becoming “bubbled in.”

The strongest point Pariser makes about fighting the filtering process is when he addresses corporations, urging engineers to “solve for serendipity by designing filtering systems.” I completely agree with this stance, because users of the internet are simply consumers: they purchase the product and take it as it is, reaping the benefits of what it offers. Users can’t necessarily identify the hidden issues within the product and consequently cannot combat them. In the case of search engines specifically, the masterminds behind the algorithms have the power to alter the way we search by blurring the lines of the “bubble” we enter when viewing personalized results. Pariser admits that having less personalization in our results might decrease the popularity of search engines, because a “personalization system with an element of randomness will (by definition) get fewer clicks.” While this may be true, times are constantly changing, and what we rely on today may be replaced by the newest trend in research and technology. Currently, the problems with filtered results are less well known and don’t yet concern many users, but as Pariser explained with his example of the increased attractiveness of newspapers, the way people search is bound to change. In my opinion, this idea is particularly strong because it is human nature to follow trends, and if corporations and engineers make alterations to the search process, the public is bound to follow and enjoy search results and advertisements with much more diversity.

The weakest point made by Pariser has to do with personally breaking out of our habits. As previously mentioned, I believe it is human nature to follow trends. There are top engineers and IT leaders at different corporations who hold the key to less personalization and decreased bias in what we view on the web. Consumers alone have minimal power to broaden their interests when they are constantly pressured with what the web believes they want to see. Pariser states that “just by stretching your interests in new directions, you give the personalizing code more breadth to work with.” I completely agree with this statement; however, it needs to be taken into consideration that habits aren’t easily broken and we can’t control ourselves so effortlessly. The search results we use and the ads we view aren’t there by choice, and searching for multiple topics just to cause a sort of “confusion” to the algorithm is almost impossible. For my part, I have experienced personalized ads based on a shoe shopping website I visited a very long time ago. Although I have visited many other websites since then, ads based on that one specific search continue showing up in the sides of my browser, and there is no way for me to control that aspect of my internet experience. Pariser does suggest regularly deleting cookies, but I believe there needs to be a much more concrete solution to this filtering problem than such a simple action, which only takes care of a small part of this huge problem.

HW#3 – Outsmarting The Personalization Algorithms

In chapter 8 of The Filter Bubble by Eli Pariser, the author provides suggestions about ways we can address the problems of the filter bubble that widely exists today in our daily lives. Eli Pariser explains to us that there are three primary categories for which we can respond to the effects of the filter bubble. The first are actions from companies, the second are actions from the government, and the third are actions from individuals.

Actions from companies are, in theory, the best solution for addressing the concerns relating to personalization and privacy. If companies were transparent in their motives and admitted their use of personalization, then users would be knowledgeable about the effects of using search engine X (Google) versus search engine Y (DuckDuckGo). In the latter, the search engine provider clearly and explicitly states that it does not track your behavior and does not filter your searches, whereas in the former no such details are clearly visible to the user; by clearly, I mean without having to move heaven and earth to find the disclaimer notice. “A visitor to a personalized news site could be given the option of seeing how many other visitors were seeing which articles…of course, this requires admitting to the user that personalization is happening in the first place, and there are strong reasons in some cases for businesses not to do so. But they’re mostly commercial reasons, not ethical ones” (Pariser 232). So in a perfect world, if companies were transparent in their behavior, their users would at least know what other users are engaging in and what sources others have visited. However, the world is not perfect, and companies act in their own best interests and in the interests of their stakeholders, hence forgoing the interests of their customers (the users).

Actions from government are the weakest approach to addressing the issues relating to personalization and privacy. I have been registered on the Do Not Call list for the past four years or so; however, in the past three years I have been bombarded with countless calls from telemarketers and surveyors. One might ask whether the government is really monitoring these lists or whether it has just given up and moved on to more important matters. This is why I believe the Do Not Track list is the weakest method for addressing the problems of privacy and personalization. “But Do Not Track would probably offer a binary choice – either you’re in or you’re out – and services that make money on tracking might simply disable themselves for Do Not Track list members…And as a result, the process could backfire – ‘proving’ that people don’t care about tracking, when in fact what most of us want is more nuanced ways of asserting control” (Pariser 238). If you use Facebook daily, fall into the category of being a mouse (visiting the same site daily some X number of times), and have opted into the Do Not Track list, the worst of the worst may occur: you may be blocked from using Facebook. Who are we to say that they cannot ban, lock, or disable your account for not “complying” with their user policy agreement? As Pariser says, “most companies reserve the right to change the rules of the game at any time” (Pariser 239). Facebook could have the right to disable your account for noncompliance with its policies. Let’s also not forget that if users exhibit the mouselike behavior Pariser outlines in the chapter, it is most probable that the users in our Facebook example would continue to use Facebook rather than opt into the Do Not Track list. If users need their daily digest of their Facebook news feed, they will most likely continue to use it even if that means having Facebook track their every move.
Think of smoking for a moment: people do it even though they know it is bad for their health, yet why do they continue to smoke? Because it is ADDICTING. This is exactly why, as much as we love using Facebook and Google, we will continue to use their services even though they are tracking us; without them, we would essentially be going through “withdrawal.” Just as in the smoking example, in essence we are sacrificing our health (our privacy) for a moment of satisfaction and pleasure: using Facebook, Google, etc.

Now the only solution left for defending ourselves against the effects of the filter bubble is action from individuals. “In courts around the world, information brokers are pushing this view – everyone’s better off if your online life is owned by us” (Pariser 239). This may be the most frightening statement posed to users of the internet. It is as though they are saying that the cyber world should be governed by a dictator or an oligarchy. The internet is supposed to be a level playing field for all users, a world where users are granted certain rights (rights to privacy). But again, this is not a perfect world, and users continue to make the same mistake of not knowing what information is being collected from them until it is too late. We have to ask ourselves: who knows me better than I do? No one. This is why, in order to defend ourselves against the filter bubble, we need to take action into our own hands and not leave it to the government and companies; if users truly want control over their activity on the internet, then users should be responsible for that control.

It is my strong belief that being educated and knowledgeable about the filter bubble and the existence of such privacy and personalization concerns is one of the most important factors in being able to address these issues. And this is why I believe that the strongest idea for addressing the problems of the filter bubble is to develop algorithmic literacy. “It doesn’t take long to become literate enough to understand what most basic bits of code are doing” (Pariser 229). If Facebook tells us transparently that it is collecting our information for reasons X and Y, that still doesn’t tell us how and what type of data is being collected. The most effective way for users to understand how Google filters our search results, or how Facebook filters our news feed, is to understand the fundamentals of how their algorithms work. The personalization algorithms that companies use are all similar in the way the server (the host) sends information from users into a database (where all your data is stored) based on set patterns of behavior. We can begin to understand how such an algorithm is formulated by first learning the language of code.

By understanding how code works, and what a certain line of code does, you ultimately get a behind-the-scenes, all-access pass to how a company’s personalization algorithm works. “We need to recognize that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different” (Pariser 227). If everyone knew how to code, we could in reality alter our behavior to counteract the code that is already in place. To put it simply, if Google’s personalization algorithm does X, then we the users should do Y in order to avoid falling into the personalization trap. For example, when we enter a query into the Google search bar, instead of clicking on the top results, or even the first page for that matter, we could go to the second page and start clicking on those links; that way we can avoid being trapped by the personalization algorithm, because we can strongly assume that the top results will be 99% tailored to each individual based on previous behavioral preferences. By understanding code we can understand more of our behavior on the internet; we can begin to understand how a site works and how data is gathered, retrieved, stored, and assimilated. If users don’t want to sit in the passenger seat anymore, letting companies personalize their preferences, then they should start sitting in the driver’s seat – take control of your behavior, acknowledge that tracking exists, and admit that it is unavoidable to a certain extent, because even if you select the second page of a Google search result every time, Google may start personalizing that newly altered behavior of yours.
However, by recognizing that such algorithms exist to impinge on your privacy and create this filter bubble of yours, we can at least avoid it for a period of time. Understanding the code behind how an algorithm is executed is crucial, because code is simply lines of text that do either X or Y. Code is deterministic and will not do something that isn’t defined within its lines. So if we know the basics of how a generic personalization algorithm works, then we (the users) have a chance to take advantage of this knowledge and alter our behavior to avoid being filtered – we can outsmart the personalization algorithms!
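The real ranking systems at Google or Facebook are proprietary, so no one outside those companies can show their actual code. But the pattern this essay describes, a server recording each user's behavior and re-ranking results around it, can be sketched in a few lines. Everything below (the class name, the fields, the sample data) is invented purely for illustration:

```python
# Toy sketch of a click-count "personalizer" -- NOT any company's real
# algorithm, just the general pattern of storing behavior and re-ranking.
from collections import Counter

class ToyPersonalizer:
    def __init__(self):
        # Stand-in for the "database where all your data is stored":
        # a per-user tally of clicks, keyed by topic.
        self.clicks = Counter()

    def record_click(self, topic):
        self.clicks[topic] += 1

    def rank(self, results):
        # Results matching past clicks float to the top. This is the
        # "mousetrap": yesterday's behavior narrows tomorrow's results.
        return sorted(results, key=lambda r: -self.clicks[r["topic"]])

p = ToyPersonalizer()
for _ in range(5):
    p.record_click("sports")  # the user keeps clicking sports stories

results = [{"title": "Election news", "topic": "politics"},
           {"title": "Game recap", "topic": "sports"}]
print([r["title"] for r in p.rank(results)])  # the sports story now ranks first
```

Even this ten-line version makes the essay's point concrete: the "outsmarting" strategies (clicking second-page links, diversifying topics) work by feeding the tally different counts, which is exactly why the author warns that the system may simply learn the new behavior too.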

HW # 3: Combating the Filter Bubble by Camille Hart

Chapter 8 of The Filter Bubble delineates a host of possible actions individuals, the government, and companies can take to combat the filter bubble and the downsides of personalization. One of the first ideas Pariser puts forth is for individuals to stop being a mouse: to drop the routine of checking the same three or four websites a day and start "stretching your interests in new directions," which gives the "personalizing code more breadth to work with" (Pariser 223). I thought this recommendation was the least likely to stop the filter bubble because it is impractical. Pariser urges people to try new things that they otherwise would not be interested in, so that the filtering code won't over-personalize you and ascribe overly specific information to you.

I don't agree with Pariser because he is asking people to change their interests. He says, "Someone who shows interest in opera and comic books and South African politics and Tom Cruise is harder to pigeonhole than someone who just shows interest in one of those things" (Pariser 223-224). And while this may be true, the obvious truth is that people may not have such diverse interests, and therefore would not search for them. Pariser's suggestion to stop being a mouse is impractical and would prove least effective in combating the filter bubble.

Another idea Eli Pariser puts forth is for companies and new filterers to "start making their filtering systems more transparent to the public, so that it's possible to have a discussion about how they're exercising their responsibilities in the first place" (Pariser 229). I think this is a good proposition, but it is not quite the best. "There's plenty that the companies that power the filter bubble can do to mitigate the negative consequences of personalization, but ultimately, some of these problems are too important to leave in the hands of private actors with profit-seeking motives. That's where governments come in" (Pariser 237).

The best idea Pariser presents to combat the filter bubble is for the government to require "companies to give us real control over our personal information" (Pariser 238). I think this is the best solution because only we truly know our interests, inspirations, dreams, and beliefs, whether we decide to share them with the world or not. We know what's best for us; therefore we should be the sole determinants of our own personalization. No algorithm will ever understand who we really are.

HW, Filter Bubble Chapter 8

For the first seven chapters of The Filter Bubble, we learn about all of the harmful consequences that occur when we are trapped in the bubble.  Over-personalization of the Internet puts us on isolated cyber-islands, where we are denied the information we need to hear.  Rather than being a place of serendipitous learning, the web has become a scene that is painted for each one of us, and usually incorrectly.  In chapter 8, Pariser lays out his plan of action for popping this bubble of isolation.  Dividing the responsibilities among what government, corporations, and individuals should do, he shows us how we can return to a more balanced cyber experience.  Two of these ideas stand out: one an extremely clever and easy way to address the Filter Bubble, and one that is weak and impossible to enforce.

Why do we get trapped in the Filter Bubble in the first place?  "Most of us are pretty mouselike in our information habits" (223).  What this means is that we go to the same three or four websites multiple times a day.  Every time we open the computer, we follow the identical routine of visiting the same string of pages, and we rarely add a new page to the rotation.  This makes it very easy for web services and data aggregators to zero in on our habits and deliver the information they think we'll consume.  Driven by advertising revenue, they form an image of who they think we are and subsequently bombard us with information they think we want.  Pariser's solution to this problem is extremely simple.

As individuals, the main way we can lessen the power of the Filter Bubble is to stop being mice. By breaking our little routines, we become impossible to pigeonhole, impossible to freeze into an image of what Internet companies think we are.  Visiting a large variety of websites about wildly different subjects expands our Filter Bubbles until they are so large that we are hardly in an isolation bubble at all.  The reason this is such an effective solution is how easy it is to accomplish.  As we'll see, some other solutions require complicated government regulation or the voluntary cooperation of a multi-billion-dollar corporation.  By contrast, it is extremely easy to quit being a mouse.  All we have to do is read about a basketball game, a mathematical discovery, elections in the Philippines, and who Taylor Swift is dating this week.

Unfortunately, not all of Pariser's solutions are as strong as "Stop being a mouse" (223).  Although it is relatively simple for individuals to work on breaking out of the Filter Bubble, it is extremely difficult to get big Internet companies to change their policies.  Driven by enormous advertising revenues, companies such as Google and Facebook won't change just because they should.  This is especially true of one of the solutions Pariser promotes: that Internet companies should become more transparent, using the history of newspapers as a model.

Pariser writes, "Google and Facebook and other new media giants could draw inspiration from the history of newspaper ombudsmen" (230).  At a newspaper, an ombudsman answers the public's challenges to the paper's reporting.  Ombudsmen were put in place in the 1970s to boost the credibility of newspapers in the eyes of the public and create an atmosphere of transparency.  There are several reasons why this is not a workable model for Internet companies.  First, Internet companies are not facing any loss of business because of transparency issues: in general, everything from Google News is relatively trusted, and Google's profits are strong and consistent.  Second is the manner in which the public interacts with digital media.  In the past, newspapers told everyone the news; there was no interaction, and if there was questioning, it could not be addressed until days later.  Today on the web, we interact with our news.  We can comment on a questionable Yahoo News story, or choose not to share a friend's news story on Facebook.

In general, it is up to us to solve for the Filter Bubble.  Although Pariser calls for corporate and government actions, it is our personal responsibility to not get caught up in a Filter Bubble.  No corporation is going to voluntarily change its policies until the revenues begin to dry up.  Instead, we should force them to not pigeonhole us, all while consuming diverse news at the same time.

Work Cited

Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. New York: Penguin Group, 2011. Print.

A Little Something to Brighten Up Our Day

I am not a Tumblr user, nor am I a cat lady, but when I stumbled upon this article I thought to myself, why not share something pleasant with everyone?

Once again our Internet-crazed society has found something to obsess over. Whether it's a catchy song like "Gangnam Style" or a popular dance like "the Harlem Shake," there is always a new trend, and the latest Tumblr sensation matches cats with their male-model lookalikes: Des Hommes et des Chatons, http://deshommesetdeschatons.tumblr.com/.

Check it out.

Did we really become this lazy?

Ever since we watched the videos in class about Google Glass, I have been fascinated with the idea but even more obsessed with the parodies. Who would honestly have thought that technology would evolve the way it has over the past decade? I have a little niece who is almost 3 years old, and she already knows how to use a cellphone, an iPad, you name it. Adolescents now know how to use these devices better than most adults do. It frightens me to contemplate what our future generations are being raised on, technology-wise. Google Glass may be one of the most advanced technologies created so far, but right now I am not warming to the idea. I stumbled upon this article http://www.wired.com/gadgetlab/2013/04/code-in-google-glass-features/ which summarizes some of the things Google is trying to accomplish: "Google is working on a handful of cool new features for its smart frames, including two-finger touch-to-zoom and winking to take a photo." Really? I don't know if I am being skeptical, or what the youngsters these days would call "old school," but I just cannot wrap my mind around how lazy we may become when even the simplest things, like taking pictures with our hands, are taken away from us.

Homework #2 on the Final Chapter in the Filter Bubble

Due April 25 (by the start of class)

Consider the suggestions that Pariser makes in the final chapter (8) of his book about ways that we can address the problems of the filter bubble. Then, on the course blog, write a long post (1-2 pages) in which you identify what you think is the strongest idea and the weakest idea that he has for combating the filter bubble and the excesses of personalization. Fully explain and defend the two ideas you select and use quoted material from the book as part of your argument. You will be graded on the quality of your defense, the creativity in your writing, and the polish you put into your writing (i.e., you don't want to lose points on this because of typos or grammar and spelling errors).

Congrats we are Behavioral Study Lab Rats!

http://vator.tv/news/2013-04-13-google-picks-up-behavioral-sensing-company-behavio

Yay! So, in order to upgrade Google Glass and make it practical, Google has acquired the company Behavio, which uses people's phone signals to track human behavior and then make an "educated determination" to predict what we will do next. I used to think all this technology existed to make our lives easier, right up until I realized what is actually going on. I kind of want to escape from this world where I feel more like a pawn and a lab rat than a person with feelings, thoughts, and ideas. Seriously, Behavio seems poised to make our filter bubbles even more solid and impossible to break out of, with nothing new able to break in.

Thoughts anyone? Am I overreacting?

My digital identity after I die? You gotta be kidding…

http://www.computerworld.com/s/article/9238354/Google_lets_users_plan_their_digital_afterlife

“You can tell us what to do with your Gmail messages and data from several other Google services if your account becomes inactive for any reason.”

“We hope that this new feature will enable you to plan your digital afterlife — in a way that protects your privacy and security — and make life easier for your loved ones after you’re gone,”

Um, okay, really? This really got to me. It's not enough that they want to control what we see, what we buy, and who sees us while we're still kicking; now they want to control our "data" after we die? This all seems a little absurd to me. Reading this article doesn't even make me feel human anymore, just another line of 0's and 1's in their computers. What if I don't want anything to happen to my emails when I die, and I don't want an automatic response sent to my friends, family, coworkers, and employer?

“Just in case you’re inactive and not actually dead, Google is set to send you a warning via an email to a secondary address, and a text message to your cellphone.”

How nice of them. In all honesty, I would love to get a text from Google saying, "If you are alive please reply by texting 'ALIVE' back to this number." But what if I am away on a lengthy nine-month sabbatical and change my number? Then according to Google I'm dead? Woohoo, good to know.

Is anyone going to use this feature? Does it actually make anyone happy to know Google offers this?

Google, Samsung, Apple? Now, Soon, Later?

http://www.forbes.com/sites/anthonykosner/2013/04/14/phone-plan-google-now-samsung-soon-or-apple-later/?partner=yahootix

This article I came across seemed rather interesting. Being obsessed with my Samsung Galaxy Note 2, I am not even remotely excited about any of the phones listed. When getting the Note 2, I saw the Nexus and was not impressed; the Galaxy S4 that is coming out isn't all that exciting either and seems open to a lot of glitches; and finally, the iPhone is extremely played out. As an Apple veteran I can honestly say the iPhones no longer excite me; on the contrary, since converting to Samsung I get easily irritated by the limitations of what Apple offers. Anyway, I thought it was interesting that the Google phone wasn't selling as much and that people are waiting for the other two.

Thoughts? Would you wait for the Galaxy S4 or the iPhone "5S"/"6," or go with the Google phone?

Facebook Refines Ad Targeting

Based on what I have read in this article, it is clear that Facebook is simply trying to increase its advertising revenue over last year. Facebook has rolled out a new way to advertise and market products via Facebook. By partnering with data companies that track online and offline purchase behavior (Acxiom, BlueKai, Epsilon, and Datalogix), Facebook's "partner categories" can predict what consumers purchase most and what they would buy again, based on previous purchases. Simply by swiping a credit card or entering its 16 digits, we are handing a great deal of information to companies without even knowing it. Talk about an invasion of privacy. This relates heavily to Pariser's book, seeing as social networks are finding more and more concealed and profitable ways to obtain personal information. From a marketing perspective, I would say this is a clever and novel way to increase revenue. But as someone who uses Facebook on a daily basis, I would say it invades our privacy even more than Facebook already does.

Any thoughts?

Source: http://bits.blogs.nytimes.com/2013/04/10/facebook-refines-ad-targeting/

Online Commenters Identify Criminal

So, as many of you have heard, 56-year-old Dina Perez was mugged at approximately 2:40 a.m. on an F train in the NYC subway. 21-year-old Aidan Folan was arrested for the brutal mugging and robbery after being connected to the assault through social media.

Allegedly, Aidan Folan was wearing the Alpha Di Delta fraternity sweater he was caught on camera in when he assaulted Perez. He claims that his APD sweater was stolen the night of the assault and that he is being framed. Online commenters from Gawker linked the person in the video to Folan's Facebook account as soon as the video was released, identifying him by the pledge name printed on the back of the sweater, "Stugotz," which is Italian slang for "balls." Numerous photos on Facebook show Folan wearing the sweater. Following the identification, a number of commenters flooded Folan's Facebook account, calling him a loser, etc.

According to his Facebook account, Folan graduated from St. Francis College with a degree in broadcast journalism and has worked as a counselor at the Center for Family Life in Sunset Park. I don't know why someone would rob a woman while wearing a sweater that so easily identifies him. It is either pretty fishy or just plain stupid. Maybe he was out of his mind, or maybe he was framed. What implications do social media and online commenters have for the future of crime? Let me know what you think.

You can find this article at: http://www.huffingtonpost.com/2013/04/03/aidan-folan-arrested-for-subway-mugging-fraternity-sweatshirt_n_3009024.html?ir=New+York

“The online ad business is what we would call a ‘dark market'”

Are online advertisements a part of the “dark market?” And would companies go to any lengths to reach their online target audience? An article titled “U.S. Army, Target, others advertising on pirate sites” explores the efforts of putting ads on illegal websites, while the advertisers themselves continue to be the most trusted and respected firms and organizations in the world.

In this information age, online advertisements are one of the most effective ways to reach the right viewers. Ads matched to our interests, pop-ups, and sponsored search results are nothing new to us: we can guess what the company is trying to tell us, and we no longer question these occurrences, perceiving them to be a normal part of the internet. Seeing an ad for a shoe company you often buy from, or an offer for a magazine subscription you would be interested in, doesn't alarm us: we trust the firms being advertised. This article discusses the issue that arises when that trust is no longer there.

Ads from reliable organizations such as the National Guard, Windows 8, Allstate, AT&T, Chevrolet, Neiman Marcus, and Wal-Mart consistently show up on illegal piracy websites. With lots of finger-pointing going on, no culprit for this incident has been found. Maybe the Ad Council, responsible for ad distribution, is to blame; maybe the firms themselves are, attempting to squeeze extra profit out of reaching new customers on these sites. One thing is certain: when the head of the Transparency Project, which aims to eliminate ads on piracy sites, exclaims that you don't know "where the ads are coming from, where they're going and how they're accounted for," you no longer even consider clicking on them.

Full Article: http://www.csoonline.com/article/730916/u.s.-army-target-others-advertising-on-pirate-sites

“Harlem shake” turns into “Suspension Date”

I know you have all heard of the "Harlem Shake" and have seen at least one or two videos of the dance phenomenon. After what I am about to share with you, I hope you have not made any yourself. According to Hayley Tsukayama of the Washington Post, 100 students around the United States have been suspended for posting their own versions of the Harlem Shake on YouTube or other social sites. In her article, "'Harlem Shake' videos lead to school suspensions," she reports that these students were suspended because some school districts believed the videos showed inappropriate dancing. The National Coalition Against Censorship (NCAC) found the suspensions ridiculous because the videos are simply a form of self-expression. Joan Bertin, the NCAC's director, says, "It seems a rather disproportionate response by educators to something that, at most, I would characterize as teenage hijinks." In Eli Pariser's book The Filter Bubble, this would be categorized as post-materialism at its best: as post-materialists, we feel the urge to satisfy our self-image by expressing who we are through different behaviors and actions. Bertin also says, "With more forms of expression, there are more reasons to engage in censorship if the people in charge are uncomfortable with forms of expression that younger generations are using," which I find very true, because some of the things posted online are outrageous.

  1. Do you think the Harlem Shake video is really that bad?
  2. How do you feel about higher authority taking action against online content that people post?