Underground Cell Service: Good or Bad

I came across the article “Underground Cellphone Service Expands, but Some Call for Quiet,” and my first thought was: that’s strange, who wouldn’t be happy about this? Personally, I think this is great. If people are able to communicate underground, it could increase security and lessen crime. Riders would be able to call 911, and potential thieves, muggers, and perverts would realize this as well. Secondly, if you need to get in contact with someone, you can. More than once I’ve forgotten to send a text or place a call, and by the time I remembered it was already too late because I was beyond the turnstiles. Also, if the train is delayed, people could let others know they will be running late. The list goes on and on.

Potential objectors to this expansion might argue that train rides will become even less pleasant now that people can talk on the phone. But I think this is an easy fix: wear headphones. If you look for the solace of a quiet ride home on a busy train, that’s your mistake. Why should everyone else have to suffer? The benefits of expanding cellphone service certainly outweigh the downsides. And quite frankly, I think it’s about time they did this. They keep increasing the fare, so it’s nice to see some improvements. The United States has the money and capability to make these changes to the MTA. We spend money on wars and on everything else everywhere, so I think this improvement is long overdue.

Check out the link and let me know what you think. Does the good outweigh the bad, or not?

http://www.nytimes.com/2013/04/26/nyregion/30-more-new-york-subway-stations-get-cellphone-service.html?ref=technology&_r=0

HW #3

In Chapter 8 of The Filter Bubble, Pariser makes a strong point when he says, “Meanwhile, in the city of ghettos, some people get trapped in the small world of a single subculture that really doesn’t represent who they are.” The online identity the internet assigns a person, based on browsing history and what they look at online, traps them in one specific subculture that is difficult to escape, like a ghetto. Pariser draws on Christopher Alexander’s book, which is about a new way of thinking, and Pariser shows throughout his own book that he too is a believer in new thinking. One idea Pariser and Alexander share is the “Mosaic of Subculture,” which describes a happy medium: a person is fully happy when he receives support from and values those around him, and when he can see the various lifestyles on offer before settling into his own. Alexander uses this idea to contrast the ghettos with the upper and middle classes, while Pariser relates it to the way people see themselves in their online world. We are comfortable in small, protected spaces that make us feel at home, and some of the websites we use take this knowledge and use it against us.

A weaker idea comes when Pariser begins to talk about Facebook’s power and its shifting privacy settings not being ideal for the average person. I don’t think he should have spoken so poorly about a company and then, directly after, mention, “Facebook’s PR department didn’t return my emails requesting an interview (perhaps because MoveOn’s critical view of Facebook’s privacy practices is well known).” This suggests his comments are colored by his prejudice against a company that wouldn’t talk to him, and one he had already singled out in his first chapter. His ideas about giving people control and programming our own devices can’t be linked well to Facebook, since he has already told us that Facebook didn’t cooperate with him.

Popping The Filter Bubble

Chapter 8 of Eli Pariser’s The Filter Bubble focuses on what can be done to eliminate, or at least reduce, the personalization of the internet. His suggestions are targeted at three major groups: individuals, companies, and governments and citizens. Some of the suggestions seem feasible and rest on individual effort, while others are near impossible due to conflicting interests, lack of incentive, or absence of regulation.

The strongest idea that Pariser has for popping the filter bubble is “Stop being a mouse.” (P223) This suggestion is directed at all individuals and can be successfully carried out by the sole efforts of an individual. The mousetraps currently set on the internet are very efficient, given that the average internet user is interested in merely a handful of websites, sources, and topics, thereby putting themselves right in the “bubble,” or mousetrap. This does not necessarily mean individuals are narrow-minded with few interests; that small loop of websites might serve a purpose such as business processes, and in business time is money, so there is no time to take the long route. Even on a personal-interest level, “habits are hard to break” (P223), but extending your interests into new territory increases the playing field of your personalization algorithms and discourages being closed into a small filter bubble.

The weakest idea that Pariser has for popping the filter bubble is the FTC’s proposed Do Not Track list (P238). At some point we have all attempted “private browsing” or rejected cookies, only to realize that many websites and applications do not work as they should, or may deny access altogether. eBay, Amazon, Facebook, and Google Dashboard are not the same without personalization, which is therefore a must to provide. The Do Not Track list would “offer a binary choice–either you’re in or you’re out” (P238). So if we join the DNT list, it seems that websites that use personalization will not work properly, or not at all. If Google no longer collects our personal info in exchange for free tools and email, will we end up with a monthly email subscription and pay-per-Google-search? The Do Not Track idea works for phone numbers because there is no exchange going on; it eliminates the spam while maintaining functionality. But on the internet one cannot always expect something for nothing: in exchange for data, individuals receive a lot of the tools and convenience that make the internet what it is today.


AugCog

I presented on AugCog, the monocle-like device that would be something like an advanced version of Google Glass. OkCupid’s owner wanted to develop his own version, which would let us walk into a bar and know which people are a good match for us. That sounds fantastic: we would not have to waste our time talking to the wrong person. But maybe it is because we spoke to that wrong person that we know how special the new person is. I think the more we eliminate, the less appreciation we have for the things we like. I also had a previous post about searching the internet and how, because we do less reading, we are less intelligent. I don’t believe a computer can better tell us whether we have a connection with someone else. We know what we like; we often just get lazy and don’t want to spend time looking for it. I came across this article, which reiterates some of the cues that are right out in front of us.

http://elitedaily.com/elite/2013/will-she-bang-you/

Chapter 8- Esu

In The Filter Bubble, Pariser focuses on many of the problems that derive from personalization. The complexity of the internet makes it very difficult even to understand that information is being personalized, and more difficult still to control it.

As Pariser says about defaults, “If people will let defaults determine the fate of our friends who need lungs and hearts, we’ll certainly let them determine how we share information a lot of the time” (p.224). We are psychologically lazy, and that gives internet companies the ability to take advantage of us. We will not go out of our way to become untraceable until it becomes an immediate problem in our lives. Pariser’s solution to online tracking is to delete your cookies or to use only websites that are transparent. But that is not the default on our browsers, and I still would not know how to go about deleting them. To believe that the average person is going to research and learn how to accomplish this is not that logical.
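Pariser’s cookie-deleting advice makes more sense once you see what a tracking cookie actually does. Here is a toy sketch, purely illustrative and not any real company’s system, of how a cookie ID lets a site build up a profile across visits, and why clearing cookies resets that profile to a blank slate:

```python
# Toy model of cookie-based tracking: the server keys a profile on a
# cookie ID it hands to the browser; deleting the cookie severs the
# link to the old profile, so personalization starts over.

import uuid

profiles = {}  # server-side store: cookie ID -> pages visited

def visit(cookie, page):
    """Simulate one page view; issue a cookie if the browser has none."""
    if cookie is None:               # no cookie = looks like a brand-new visitor
        cookie = str(uuid.uuid4())
    profiles.setdefault(cookie, []).append(page)
    return cookie                    # the browser stores this for next time

cookie = visit(None, "news/politics")
cookie = visit(cookie, "news/politics")
cookie = visit(cookie, "sports")
print(profiles[cookie])              # the site now "knows" our habits

cookie = None                        # user clears their cookies
cookie = visit(cookie, "news/politics")
print(len(profiles[cookie]))         # fresh ID, profile starts over: 1
```

Note that this also shows the limit of the advice: the old profile still exists on the server, and the moment you log in again the site can re-link you to it.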

The best solution that Pariser mentions in The Filter Bubble is government intervention in companies’ online practices. It took the government many years to develop regulations for news corporations. The internet has not been around for that long, but it is definitely time for governmental regulation. Pariser mentions that “the U.S. Federal Trade Commission is proposing a Do Not Track list, modeled after the highly successful Do Not Call list.” I remember the Do Not Call list, and I can confirm that it was successful. As I said before, people are psychologically lazy. It was not difficult to add yourself to the list, and it became automatic. If the same can be done for the internet, it will be a viable solution to some of the issues we face with our online presence.
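For context, the Do Not Track proposal eventually shipped in browsers as a simple HTTP request header (`DNT: 1`), though honoring it has always been voluntary. A minimal sketch of the server side, with the function name being my own invention:

```python
# Sketch of a server checking the Do Not Track preference.
# The DNT header is real browser behavior; honoring it is voluntary,
# which is exactly why the regulation argument matters.

def should_track(request_headers):
    """Return False when the visitor has opted out via 'DNT: 1'."""
    # HTTP header names are case-insensitive, so normalize first.
    headers = {name.lower(): value for name, value in request_headers.items()}
    return headers.get("dnt") != "1"

print(should_track({"User-Agent": "ExampleBrowser"}))  # no preference set
print(should_track({"DNT": "1"}))                      # visitor opted out
```

The one-line opt-in matches the “people are psychologically lazy” point: the cost to the user is nearly zero, and everything then hinges on whether sites are required to respect the header.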

HW3

Chapter 8 of The Filter Bubble talks about solutions to all of the problems that were explained throughout the book. In these solutions we see both positives and negatives, some with which I agree and one with which I do not.

The idea of “stop being a mouse,” in my opinion, isn’t the best, and I definitely do not agree with it, because it implies that people should change their opinions and wants. Given the wide variety of likes that people have, asking them to change is just not feasible.

The idea of more transparent filtering systems is better, but still, in my opinion, not the best there could be. There are already many companies that are very transparent, and it doesn’t help. Making the filter bubble transparent isn’t that much different: people knowing it’s there doesn’t eliminate the problem.

The best solution is the government requiring companies to give us, the people, control over our personal information. This gives us more say over what is out there, and it makes sense, because we know more about ourselves than anyone or any other company ever could.

Assignment#3. Solutions

Identify what you think is the strongest idea and the weakest idea that he has for combating the filter bubble and the excesses of personalization.

In the last chapter of The Filter Bubble, “Escape from the City of Ghettos,” Eli Pariser provides solutions to help combat the filter bubble and the excesses of personalization. Pariser offers an array of recommendations on what we as individuals should do, what companies should do, and lastly what the government should do, so as to help let the air out of the growing bubble.

Pariser provides a great number of good ideas, but I felt that none of the recommendations, however good its reasons, would work efficiently on its own. It is impossible to bring about and see change if we pick and implement only one of his solutions. I believe that if we work cohesively with one another, then we will be able to achieve some sort of result.

What Individuals Can Do: The most critical thing a person can do is simply be cautious and alert. We are becoming lazier and more impatient by the day, and almost always agree to anything without even reading the agreement. This only makes it easier for companies to trap us right in their paws. For example, while using Twitter, unless ‘you go out of your way to lock your account, everything you do is public to everyone’ (225). Therefore, the best advice Pariser gives us as individuals is to ‘change our own habits’ (222) first, by being more careful and taking more time and effort to look into the rules and regulations online.

What Companies Can Do: The biggest responsibility undoubtedly falls on the companies that are entrusted with millions of people’s personal information. Companies like Google, Facebook, and Twitter, to name a few, need to take a huge step forward to understand and ‘realize their responsibilities’ (229). As Larry Lessig puts it, ‘a political response is possible only when regulation is transparent’ (229); therefore, companies need to be more public-friendly and not keep their code under tight wraps. Doing so only shields ‘companies from accountability for the decisions they are making, because the decisions are difficult to see from the outside’ (230). So Pariser encourages companies to opt for open systems and take responsibility for their actions.

What Governments and Citizens Can Do: Almost all companies work with the main objective of earning profits rather than genuinely serving and doing good for people. It is therefore risky, at times, to leave problems of huge magnitude ‘in the hands of private actors with profit seeking motives’ (237). This is where the government can use its status to bring into play rules and regulations that keep such companies from trampling over their customers. An unsettling example I came across: while ‘it is illegal to use Brad Pitt’s image to sell a watch without his permission, Facebook is free to use your name to sell one to your friends’ (239).

The weakest idea Pariser recommends is one of his solutions for individuals. Pariser advises us to ‘stop being a mouse’ (223): routinely checking only certain sites allows the network to track us more easily, so he advises us to do otherwise. Though Pariser makes perfect sense, this would be difficult to put into practice. Firstly, I check my email every morning. Does that mean I should stop checking my email, or open five email accounts just to throw the network off my trail? Then comes the dilemma of when I should check my email: if checking it every morning, when I am free, is not a good idea, then when is? After checking my email, I check the weather report, and the reason I check a certain website is that it actually gives accurate information. It would be risky for me to check a different weather site every day. So even though I agree with Pariser about how companies can identify users by the way they routinely check certain websites, his advice does not work that well.

“Sorry, the system’s down”

This morning was a complete and utter disaster. Every morning that I have classes, I take the Long Island Rail Road to get to class on time. Today, when I arrived at the station, I found out that their system was down and could not process credit or debit transactions. Although I usually have cash on me, today I didn’t, because I was running late and didn’t have time to stop at the bank. So I was pretty much stranded at the LIRR station with nowhere to go. How ironic.

This got me thinking about how much we really do rely on technology. To place important phone calls, to operate the trains and cars we use, to make simple transactions, everything is powered by technology. If one little thing goes wrong, it can reconfigure our entire day or quite possibly our life. This might seem like a bit of a stretch, but just think about it. How much do you rely on transportation? Probably more than you think.

In fact, people today rely on technology for things that we could do for ourselves. For example, before the development of the GPS people got around just fine. They would drive from the east coast to the west coast with just a map. Now I am a sensible person; I acknowledge that technology has helped people in numerous ways, but perhaps we have become too dependent on it. What do you think? Does the good outweigh the bad?


Lorenzo~HW3 Solutions to the Filter Bubble

Throughout the years, the internet as we know it has been, and still is, rapidly changing in ways unforeseeable by its users. Many do not realize this because of the consistent rate at which these programs innovate: internet users are so used to seeing changes on websites and webpages that they do not think to look for the consequences that might be at hand. Writer and political activist Eli Pariser brought this to his audience’s attention in his book The Filter Bubble. Pariser states that what creates the filter bubble is the internet’s personalization, which builds its own perception of each user by using cookies and algorithms. Over the course of the book he weeds out all of the undesirable effects of the filter bubble, and in the last chapter he suggests different solutions to this problem of personalization on the web. One of these solutions has dominance over the others, while another seems impossible to achieve.
As humans, we tend to be repetitive in our day-to-day activities without even knowing it. We wake up every day following the same procedures, sometimes arranged in different orders, but with the same objectives and routine. This same tendency carries over to how we use the online world. For example, one person might go online to check Facebook, Twitter, and Instagram, then check the NY Times website for top stories, and that will be their main online sequence of events for a long time. This is the problem Eli Pariser calls the “mousetrap,” and he addresses it with the solution “Stop being a mouse.” (223) He says, “Most of us are pretty mouselike in our information habits,” meaning that we tend to circle around the same information, the mousetrap, because of our natural habit of repetition. (223) This happens because it is convenient for us to stay in that circle called the filter bubble, and we do not like being forced out of this scheduled routine of gathering information. Like the person in my example above, we are unwilling to use another news website because we are already well served by our original source of information within the filter bubble. If we stop being a mouse, we can broaden our horizons by using different domains and databases to retrieve information. The more sources we use, the more we benefit, because of the different perspectives we are drawing information from.
We have seen what I thought was a strong solution to the filter bubble; now here is what I think is the weakest solution Eli Pariser mentions. I do not think algorithmic solutions would put out this fire of personalization on the web. He uses the example, “Why not rely on everyone’s idea of what’s important?” (235) What he means, in regard to the Facebook “Like” button, is: why not add another component, an “Important” button? This would clarify the difference between what individuals like and what they think is important. My reaction is that instead of dousing the fire (personalization) with water, this idea would actually do the opposite. It would be like adding more gasoline or lighter fluid, because it adds on more personalization by revealing what we really think is important. This would push us deeper into the filter bubble, probably deeper than before. Although some of the algorithms Pariser talks about may open people’s eyes to different views, they might also strengthen the beliefs people already hold inside the filter bubble.
In conclusion, there are solutions to the problem of the filter bubble that Eli Pariser has brought to our attention. However, it all depends on a person’s awareness of their personalized internet interface, and on how they want to address the issue of being lulled by the filter bubble and the information within it. Because it is so convenient, in today’s age of personalization and post-materialistic views, people do not mind getting exactly the information they want, as quickly and as tailored to their preferences as possible.

HW #3 – Escaping from the Filter Bubble

    Web personalization, or customization, provides users many benefits and has become an irreversible trend in the information society. However, this trend constrains the scope of personal thinking and limits users’ exposure to a diversity of information. In chapter 8 of The Filter Bubble, the author, Eli Pariser, proposes several solutions for users of filtering systems who want to escape from the filter bubble. He argues that individuals should try not to confine themselves within personalization algorithms and should use the internet autonomously; that companies should be more transparent about their filtering policies and their use of personal information; and finally that the government should enforce more exhaustive regulation and legislation concerning companies’ use of personal information.

    The weaker counteractions against the filter bubble are the ones asked of companies, including disclosure of the filtering algorithms and of how gathered information is used. Through these solutions, users would have more control and power over their personal information and personalization. To be honest, disclosing the filtering algorithm would be the strongest method if it were more realistic, but it is unlikely that companies would change policies in ways that might risk both social and economic benefits. Pariser states that “Whether or not it makes the filterer’s products more secure or efficient, keeping the code under tight wraps does do one thing: It shields the companies from accountability for the decisions they’re making” (230). To make these solutions work, companies would first have to admit that they are personalizing each user of the filter bubble, using personal information without adequate consent from users. However, Pariser says, “There are strong reasons in some cases for businesses not to do so. But they’re mostly commercial reasons, not ethical ones” (232). He also suggests that company engineers can “solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience.” But the decision on how new topics are selected would still be in the hands of companies. Furthermore, to what extent would personal information be used to decide what is “new” or not? Such a system would still lack transparency about how the filtering works and would not solve the privacy issues.

Most importantly, companies will never adopt new filtering policies without users’ or consumers’ demands for change. Pariser states that “Corn syrup vendors aren’t likely to change their practices until consumers demonstrate that they’re looking for something else” (222). Therefore, the strongest resistance to the filter bubble starts with individuals’ simple actions. The most effective method is to educate ourselves by learning the basics of programming and of how filtering systems work. If you know your enemy and yourself, you can win every battle. “Stop being a mouse” (223) by broadening our interests is not enough. Once we have a clearer understanding of filtering algorithms, we are more likely to spot the weaknesses and problems of the system, and with deeper knowledge we are less vulnerable to the “tyranny of defaults” (226). Additionally, just as Pariser prefers Twitter over Facebook, we can choose services that provide more open and apparent filtering systems. Pariser says that what individuals can do is of “limited use unless the companies that are propelling personalization forward change as well” (229). And this is why it is so important that we educate ourselves: so we can raise our voices, let companies know that their consumers are concerned about the filtering system, and demand more transparency. Most importantly, if we want out of the bubble, we should first know how the bubble is built.

McLuhan once said, “We shape our tools and then our tools shape us.” We don’t want to lose control of what we created for our own benefit. As we live in a technology-based society, the filter bubble is just another hurdle we have to overcome. As individuals, we should give stronger and deeper consideration to personalization and filtering systems, so that we can convince large companies to reveal their filtering algorithms, hopefully resulting in legislation on companies’ use of users’ personal information, and finally, so that we can use our technology freely, without fear of the filter bubble.

Yelp for the DMV?

There’s a great interview in last month’s Fortune Magazine with the Lieutenant Governor of California, Gavin Newsom. He has taken his experience from the restaurant industry and wants to apply it to running a government. More specifically, he discusses how Yelp changed the restaurant industry. Restaurants went from serving customers however they wanted to fearing bad reviews online. This meant that diners became participants in a restaurant’s success, rather than the subjects of a restaurant’s desires. Newsom thinks this is applicable to government because today’s American citizen is more a subject of government than a participant in it. He says that things are done to us, not for us.

I think this is a brilliant idea, but one that is difficult to implement. It is smart because many Americans think that government can solve issues by throwing tax dollars at them. This is not necessarily true, as some issues require better and smarter solutions, not gobs of money. Also, there should be more accountability in government services, and it could be achieved in a Yelp-like way. For example, the DMVs in the NYC area could all be reviewed online by users, and then rewards and punishments could be distributed accordingly.

The issue is that it is extremely hard to implement on a large scale. Sure, it’s easy to review the service at the DMV, or how clean your local county park is. But what happens if the president or Congress gets bad online reviews? Do we just kick them out? So in general, I think that Newsom’s idea is great for small government services, but far from revolutionary. Thoughts?

HW#3 Your Filter Bubble

In chapter 8 of The Filter Bubble, Eli Pariser outlines a few ideas that may help mitigate the effects of personalization. He explains the actions individual users can take as well as what companies, the government, and individuals as citizens can do to combat the rise of a filter bubble. While many of the ideas presented by Pariser to lessen the propagation of a filter bubble are respectable, some are better and more realistic than others.

“Stop being a mouse” (223) is probably the best, simplest, and hardest idea Pariser offers to put into action. It rests on the assumption that we are creatures of habit: that “we all kinda do the same thing over and over again most of the time. And jumping out of that recursion loop is not easy to do.” (223) Pariser admits that even he is “pretty mouselike” (223) in his information habits. It’s hard to break habits and routines, since we like the comfort and ease that come with familiarity. By actively diversifying how and on what you spend your time on the internet, you make it harder for the algorithms to pigeonhole your profile. This may be the best method for offsetting the effects of personalization, but it’s not very realistic. Generally, people use the internet sparingly, to catch up on the things most important to them. Most people will not spend their time reading or searching for a topic they’re only sort of interested in, even if they find it important. The method only works if you make a conscious choice to be critical, inquisitive, and unafraid of feeling uncomfortable about what you read or see. That is not only good for deflecting the negative consequences of personalization; it’s also a good way of becoming a more well-rounded person.

One of the weaker ideas presented by Pariser, I felt, was the “fully algorithmic solutions,” which take into account everyone’s opinion of what they believe is important and should be seen. Even though I like the idea of bringing personalization into the public eye and putting it in the user’s hands, I can’t help thinking you’re just creating your own filter bubble. It doesn’t leave much room for you to be exposed to different things. I feel that most people don’t know what they want, only what they should want. I also feel that, given the way our media works and how we consume it, the general populace is not equipped with the right set of skills to discern important news from news that isn’t. I can only see problems with an “important” button, as Pariser mentions on page 235. It reminds me of the Kony 2012 campaign, which was essentially a viral video that oversensationalized the severity and importance of a relatively old war criminal.

The most realistic idea presented by Pariser was that “the engineers of the filter bubble…can solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience.” (235) I was very fond of this idea because it integrates exposure into your everyday life with very little effort from the user. There’s a service called StumbleUpon, which in its early stages did exactly this: you would click a button and a random web page would appear. Recently, though, they have adopted the algorithmic method of determining which websites you are exposed to, probably based on clicks, your own predefined interests, how long each user stays on a given page, and user ratings of websites; that is to be expected, since it is a business.
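To make “solving for serendipity” concrete, here is a toy recommender sketch. The `serendipity` ratio is a made-up knob of my own, not anything StumbleUpon or Pariser actually specifies; the point is only that a filter can pad a user’s top-scored topics with random picks from outside their profile:

```python
# Toy "serendipitous" recommender: mostly top-ranked items from the
# user's profile, plus a few random items the profile knows nothing
# about, so the user still sees things outside their bubble.

import random

def recommend(scores, catalog, n=5, serendipity=0.4):
    """Return n items: top-scored picks padded with random outsiders."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n_random = int(n * serendipity)          # slots reserved for surprises
    picks = ranked[: n - n_random]           # best-known interests first
    outsiders = [item for item in catalog if item not in scores]
    picks += random.sample(outsiders, min(n_random, len(outsiders)))
    return picks

scores = {"politics": 9, "tech": 7, "sports": 3}
catalog = ["politics", "tech", "sports", "poetry", "chemistry", "jazz"]
print(recommend(scores, catalog))
```

As the chapter notes, the catch is commercial: the random slots will, by definition, get fewer clicks than the personalized ones.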

If you want things to change you should look towards yourself first and ask whether or not you’re living that change. As Mahatma Gandhi put it “Be the change you want to see in the world.” Ultimately the extent of your filter bubble is decided by you, online or off.

Chapter 8 Ideas. HW

HW #3

Strongest / Weakest Ideas

In this modern information age, relying on technology has become second nature to most of us. Leaving our houses with a sense of security in having Google Maps on our phones, constantly having phone cameras ready to capture fascinating sights, Facebook and Twitter apps always prepared for a new status update, and many similar actions are now part of our everyday life. As time goes on and reliance on the internet grows more important, we can’t help but neglect its main problem: filtered search results falsely identifying our personalities and creating inaccurate snapshots of who we are and what we want. In the final chapter of The Filter Bubble, Eli Pariser summarizes the problems with privacy on the internet and discusses several solutions and actions people can take to avoid becoming “bubbled in.”

The strongest point Pariser makes about fighting the filtering process is when he urges corporations and engineers to “solve for serendipity by designing filtering systems.” I completely agree with this stance, because users of the internet are simply consumers: they purchase the product and take it as it is, reaping the benefits of what it offers. Users can’t necessarily identify the hidden issues within the product, and consequently they certainly cannot combat them. In the case of search engines specifically, the masterminds behind the algorithms have the power to alter the way we search by blurring the lines of this “bubble” we enter when viewing personalized results. Pariser admits that less personalization might decrease the popularity of search engines, because a “personalization system with an element of randomness will (by definition) get fewer clicks.” While this may be true, times are constantly changing, and what we rely on today may be replaced by the newest trend in research and technology. Currently, the problems with filtered results are less well known and don’t yet concern many users, but as Pariser explained with his example of the increased attractiveness of newspapers, the way people search is bound to change. In my opinion, this idea is particularly strong because it is human nature to follow trends, and if corporations and engineers make alterations to the search process, the public is bound to follow and to enjoy search results and advertisements with much more diversity.

The weakest point Pariser makes has to do with personally breaking out of our habits. As previously mentioned, I believe it is human nature to follow trends. Top engineers and IT leaders at various corporations hold the key to less personalization and decreased bias in what we view on the web. Consumers alone have minimal power to broaden their interests when they are constantly pressured with what the web believes they want to see. Pariser states that “just by stretching your interests in new directions, you give the personalizing code more breadth to work with.” I agree with this statement; however, it needs to be taken into consideration that habits aren’t easily broken and we can’t change our behavior so effortlessly. The search results we get and the ads we view aren’t there by choice, and searching for multiple topics just to “confuse” the algorithm is almost impossible. For my part, I have experienced personalized ads based on a shoe shopping website I visited a very long time ago. Although I have visited many other websites since then, ads based on that one search continue to show up on the sides of my browser, and there is no way for me to control that aspect of my browsing. Pariser does suggest regularly deleting cookies, but I believe there needs to be a much more concrete solution than such a simple action, which only takes care of a small part of a huge problem.

HW#3 – Outsmarting The Personalization Algorithms

In chapter 8 of The Filter Bubble by Eli Pariser, the author offers suggestions about ways we can address the problems of the filter bubble that is now so widespread in our daily lives. Pariser explains that there are three primary categories of response to the effects of the filter bubble: actions from companies, actions from the government, and actions from individuals.

Actions from companies are, in theory, the best solution for addressing the concerns surrounding personalization and privacy. If companies were transparent in their motives and admitted their use of personalization, then users would be knowledgeable about the effects of using search engine X (Google) vs. search engine Y (DuckDuckGo). In the latter, the provider clearly and explicitly states that it does not track your behavior or filter your searches, whereas in the former no such details are clearly visible to the user (and by clearly I mean not having to move heaven and earth to find the disclaimer notice). “A visitor to a personalized news site could be given the option of seeing how many other visitors were seeing which articles…of course, this requires admitting to the user that personalization is happening in the first place, and there are strong reasons in some cases for businesses not to do so. But they’re mostly commercial reasons, not ethical ones” (Pariser 232). So in a perfect world, if companies were transparent in their behavior, users of their services would at least know what other users are engaging with and what sources others have visited. However, the world is not perfect, and companies act in their own best interests and in the interests of their stakeholders, thereby forgoing the interests of their customers (the users).

Actions from government are the weakest approach to addressing the issues of personalization and privacy. I have been registered on the Do Not Call list for the past four years or so; nevertheless, in the past three years I have been bombarded with countless calls from telemarketers and surveyors. One might ask whether the government is really monitoring these lists, or whether it has given up and moved on to more important matters. This is why I believe the Do Not Track list is the weakest method of addressing the problems of privacy and personalization. “But Do Not Track would probably offer a binary choice – either you’re in or you’re out – and services that make money on tracking might simply disable themselves for Do Not Track list members…And as a result, the process could backfire – “proving” that people don’t care about tracking, when in fact what most of us want is more nuanced ways of asserting control” (Pariser 238). If you use Facebook daily, fall into the category of being a mouse (visiting the same site daily, X number of times), and have opted in to the Do Not Track list, the worst of the worst may occur: you may be blocked from using Facebook. Who are we to say that they cannot ban, lock, or disable your account for not “complying” with their user policy agreement? As Pariser says, “most companies reserve the right to change the rules of the game at any time” (Pariser 239). Facebook could have the right to disable your account for noncompliance with its policies. Let’s also not forget that if users exhibit the mouse-like behavior Pariser outlines in the chapter, then the users in our Facebook example will most probably continue to use Facebook rather than opt in to the Do Not Track list. If users need their daily digest of the Facebook news feed, they will most likely keep using it even if it means having Facebook track their every move.
Think of smoking for a moment: people do it even though they know it is bad for their health, yet why do they continue? Because it is ADDICTING. This is exactly why, as much as we love using Facebook and Google, we will continue to use their services even though they are tracking us; without them, we would essentially go through “withdrawal.” Just as with smoking, we are in essence sacrificing our health (our privacy) for a moment of satisfaction and pleasure: using Facebook, Google, etc.

That leaves only one solution for defending ourselves against the effects of the filter bubble: actions from individuals. “In courts around the world, information brokers are pushing this view – everyone’s better off if your online life is owned by us” (Pariser 239). This may be the most frightening comment posed to users of the internet. It is as though they are saying that the cyber world should be governed by a dictator or an oligarchy. The internet is supposed to be a level playing field for all users, a world where users are granted certain rights (including the right to privacy). But again, this is not a perfect world, and users keep making the same mistake of not knowing what information is being collected from them until it is too late. We have to ask ourselves: who knows me better than I do? No one. This is why, in order to defend ourselves against the filter bubble, we need to take action into our own hands and not leave it to the government and companies; if users truly want control over their activity on the internet, then users should be responsible for that control.

It is my strong belief that being educated and knowledgeable about the filter bubble and these privacy and personalization concerns is one of the most important factors in addressing the issues raised. This is why I believe the strongest idea for addressing the problems of the filter bubble is to develop algorithmic literacy. “It doesn’t take long to become literate enough to understand what most basic bits of code are doing” (Pariser 229). Even if Facebook tells us transparently that it is collecting our information for reasons X and Y, that still doesn’t tell us how, or what type of data is being collected. The most effective way for users to understand how Google filters our search results, or how Facebook filters our news feed, is to understand the fundamentals of how such algorithms work. The personalization algorithms companies use are all similar in the way the server (the host) logs information about users into a database (where your data is stored) based on set patterns of behavior. We can begin to understand how such an algorithm is formulated by first learning the basics of code.
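To make that general pattern concrete, here is a minimal sketch in Python. Everything in it is hypothetical and invented for illustration (it is not any real company's code): a host records each click into a per-user store, then ranks what the user sees next by those stored counts.

```python
from collections import Counter

# A toy "database" of one user's behavior: topic -> number of clicks.
click_history = Counter()

def record_click(topic):
    """The host logs every click against the user's profile."""
    click_history[topic] += 1

def rank_results(candidates):
    """Topics the user clicked before float to the top of the results."""
    return sorted(candidates, key=lambda topic: -click_history[topic])

# A mouse-like user who clicks the same topic over and over...
for _ in range(5):
    record_click("celebrity news")
record_click("politics")

# ...now sees that topic ranked first, whatever the query was.
print(rank_results(["politics", "science", "celebrity news"]))
# -> ['celebrity news', 'politics', 'science']
```

Even this ten-line sketch shows the two pieces Pariser asks readers to recognize: a store of past behavior, and a ranking rule that feeds that behavior back to the user.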

Understanding how code works, and what a given line of code does, ultimately gives you a behind-the-scenes, all-access pass to how a company’s personalization algorithm operates. “We need to recognize that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different” (Pariser 227). If everyone knew how to code, we could in reality alter our behavior to counteract the code that is already in place. To put it simply, if Google’s personalization algorithm does X, then we the users should do Y to keep from falling into the personalization trap. For example, when we enter a query into the Google search bar, instead of clicking on the top results, or even the first page for that matter, we could start clicking on links from the second page; that way we can avoid being trapped by the personalization algorithm, since we can safely assume the top results will be heavily tailored to each individual based on previous behavior. By understanding code we can understand more of our behavior on the internet; we can begin to understand how a site works and how data is gathered, retrieved, stored, and assimilated. If users don’t want to sit in the passenger seat anymore, letting companies personalize their preferences, then they should start sitting in the driver’s seat. Take control of your behavior, and acknowledge that tracking exists and is unavoidable to a certain extent: even if you select the second page of Google results every time, Google may start personalizing this newly altered behavior of yours.
However, by recognizing that such algorithms exist to intrude on your privacy and to create this filter bubble of yours, we can at least avoid it for a period of time. Understanding the code behind how an algorithm executes is crucial because code is simply lines of text that do either X or Y. Code is definitive and will not do anything that isn’t defined within those lines. So if we know the basics of how a generic personalization algorithm works, then we (the users) will have a chance to take advantage of that knowledge and alter our behavior to avoid being filtered: we can outsmart the personalization algorithms!
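The claim above, that definitive code only amplifies the behavior it is fed, can be illustrated with a toy measurement. This is a hypothetical sketch with invented names, not any real system: a mouse-like click stream concentrates a profile on one interest, while a diversified stream flattens it and leaves the algorithm much less to work with.

```python
from collections import Counter

def profile_skew(clicks):
    """Share of a user's attention captured by their single top interest.

    1.0 means every click went to one topic (trivial to pigeonhole);
    values near 1/len(set(clicks)) mean the clicks are spread evenly.
    """
    counts = Counter(clicks)
    return max(counts.values()) / len(clicks)

mouse_like = ["sports"] * 9 + ["news"]                       # the same site, over and over
diversified = ["sports", "news", "opera", "comics", "politics"] * 2

print(profile_skew(mouse_like))    # 0.9 -> easy to pigeonhole
print(profile_skew(diversified))   # 0.2 -> far harder to pigeonhole
```

Since the filtering code sees only the clicks, broadening what you click on directly flattens the profile it builds, which is exactly Pariser's "stretch your interests" advice expressed as arithmetic.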

Discussing the Reading for Tomorrow (April 25)

Just a reminder that for class tomorrow you should have read the encyclopedia entry on “Information.” If you need a copy of it again, you can find it in the Gale Virtual Reference Library.

Schement, Jorge Reina. “Information.” Encyclopedia of Communication and Information. Ed. Jorge Reina Schement. Vol. 2. New York: Macmillan Reference USA, 2002. 421-426. Gale Virtual Reference Library. Web. 24 Apr. 2013.

HW # 3: Combating the Filter Bubble by Camille Hart

Chapter 8 of The Filter Bubble delineates a host of possible actions individuals, the government, and companies can take to combat the filter bubble and the downsides of personalization. One of the first ideas Pariser puts forth is for individuals to stop being a mouse: that is, to drop the routine of checking the same three or four websites a day and start “stretching your interests in new directions,” which gives the “personalizing code more breadth to work with” (Pariser 223). I thought this recommendation was the least likely to stop the filter bubble because it is impractical. Pariser urges people to basically try new things they otherwise would not be interested in, so that the filtering code won’t over-personalize and ascribe overly specific information to them.

I don’t agree with Pariser because he is asking people to change their interests. He says, “Someone who shows interest in opera and comic books and South African politics and Tom Cruise is harder to pigeonhole than someone who just shows interest in one of those things” (Pariser 223-224). And while this may be true, the obvious truth is that people may not have such diverse interests, and therefore they would not search for such diverse things. Pariser’s suggestion to stop being a mouse is impractical and would prove least effective in combating the filter bubble.

Another idea Eli Pariser puts forth is for companies and new filterers to “start making their filtering systems more transparent to the public, so that it’s possible to have a discussion about how they’re exercising their responsibilities in the first place” (Pariser 229). I think this is a good proposition, but not quite the best. “There’s plenty that the companies that power the filter bubble can do to mitigate the negative consequences of personalization, but ultimately, some of these problems are too important to leave in the hands of private actors with profit-seeking motives. That’s where governments come in” (Pariser 237).

The best idea Pariser presents to combat the filter bubble is for the government to require “companies to give us real control over our personal information” (Pariser 238). I think this is the best solution because only we truly know our interests, aspirations, dreams, and beliefs, whether we decide to share them with the world or not. We know what’s best for us; therefore we should be the sole determinants of our personalization. No algorithm will ever understand who we really are.

HW, Filter Bubble Chapter 8

For the first seven chapters of The Filter Bubble, we learn about all of the horrible consequences that occur when we are trapped in the bubble. Over-personalization of the Internet puts us on isolated cyber-islands, where we are denied the information we need to hear. Rather than a place of serendipitous learning, the web has become a scene that is painted for each of us, and usually incorrectly. In chapter 8, Pariser lays out his plan of action for popping this bubble of isolation. Dividing the responsibilities among government, corporations, and individuals, he shows us how we can return to a more balanced cyber experience. Two of these ideas stand out: one an extremely clever and easy way to solve for the Filter Bubble, and one that is weak and impossible to enforce.

Why do we get trapped in the Filter Bubble in the first place? “Most of us are pretty mouselike in our information habits” (223). What this means is that we go to the same three or four websites multiple times a day. Every time we open the computer, we follow the identical routine of visiting the same string of pages, and we rarely add a new page to it. This makes it very easy for web services and data aggregators to zero in on our habits and deliver information they think we’ll consume. Driven by advertising revenue, they form an image of who they think we are and subsequently bombard us with information they think we want. Pariser’s solution to this problem is extremely simple.

As individuals, the main way we can lessen the power of the Filter Bubble is to stop being mice. By breaking our little routines, we become impossible to pigeonhole, or to trap in an image of what Internet companies think we are. Visiting a large variety of websites about wildly different topics expands our Filter Bubbles until they are so large that we are hardly in an isolation bubble at all. The reason this is such an effective solution is how easy it is to accomplish. As we’ll see, some other solutions require complicated government regulation or the voluntary cooperation of multi-billion-dollar corporations. However, it is extremely easy to quit being a mouse. All we have to do is read about a basketball game, a mathematical discovery, elections in the Philippines, and who Taylor Swift is dating this week.

Unfortunately, not all of Pariser’s solutions are as strong as “Stop being a mouse” (223). Although it is relatively simple for individuals to work on breaking out of the Filter Bubble, it is extremely difficult to get big Internet companies to change their policies. Driven by enormous advertising revenues, companies such as Google and Facebook won’t change just because they should. This is especially true of one of the solutions Pariser promotes: that Internet companies should become more transparent, using the history of newspapers as a model.

Pariser writes, “Google and Facebook and other new media giants could draw inspiration from the history of newspaper ombudsmen” (230). At a newspaper, an ombudsman answers public challenges to the paper’s reporting. Ombudsmen were put in place in the 1970s to boost the credibility of newspapers in the eyes of the public and create an atmosphere of transparency. There are several reasons why this is not a workable model for Internet companies. First, Internet companies are not facing any loss of business because of transparency issues; in general, Google News is relatively trusted, and Google’s profits are strong and consistent. Second is the manner in which the public interacts with digital media. In the past, newspapers told everyone the news; there was little interaction, and any challenge could not be addressed until days later. Today on the web, we interact with our news. We can comment on a questionable Yahoo News story, or choose not to share a friend’s story on Facebook.

In general, it is up to us to solve for the Filter Bubble. Although Pariser calls for corporate and government action, it is our personal responsibility not to get caught up in a Filter Bubble. No corporation is going to change its policies voluntarily until the revenues begin to dry up. Instead, we should force them not to pigeonhole us, all while consuming diverse news at the same time.

Work Cited

Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. New York: Penguin Group, 2011. Print.

A Little Something to Brighten Up Our Day

I am not a user of Tumblr nor am I a cat lady, but when I stumbled upon this article I thought to myself, why not share something pleasant with everyone.

Once again our Internet-crazed society has found something to obsess over. First it was a catchy song like “Gangnam Style,” then a popular dance like “the Harlem Shake”; now there is a new Tumblr trend in which cats are compared as lookalikes to male models: Des Hommes et des Chatons, http://deshommesetdeschatons.tumblr.com/.

Check it out.