Category Archives: Check This Out

Who Google Thinks I Am

In last Thursday’s class, I was unable to share with you what Google thinks I am interested in. As we discussed, Google places a tracking cookie in your computer’s browser so that, as you use various Google services, data about you gets aggregated. If you use a different computer, a different tracking cookie gets generated for that machine, and the aggregated data will look different (at least, I think this is true; Google doesn’t say whether it combines the tracking-cookie data from the various machines you use). I’m also not sure whether the tracking cookie is unique to each browser on your computer. So if you sometimes use Firefox on your laptop and switch to Chrome or Safari at other times, there may be separate tracking cookies collecting separate sets of data about you in each browser.
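
To make the cookie mechanics a bit more concrete, here’s a toy sketch in Python (purely illustrative; it is not how Google actually implements anything). The point is simply that each browser keeps its own cookie store, so the same person can look like two unrelated visitors unless the profiles get tied together some other way, such as by logging in:

```python
# Toy illustration: each "browser" has its own cookie jar, so the same person
# shows up as two unrelated visitors. Not Google's actual mechanism.
import uuid

class ToyAdServer:
    """Hands out a new tracking ID the first time it sees a cookie jar without one."""
    def serve_ad(self, cookie_jar):
        if "tracking_id" not in cookie_jar:
            cookie_jar["tracking_id"] = str(uuid.uuid4())  # brand-new anonymous profile
        return cookie_jar["tracking_id"]

server = ToyAdServer()

chrome_on_work_pc = {}   # each browser (on each machine) keeps its own cookie store
firefox_on_laptop = {}

print(server.serve_ad(chrome_on_work_pc))   # interests logged against this ID...
print(server.serve_ad(firefox_on_laptop))   # ...are invisible to this separate ID
print(server.serve_ad(chrome_on_work_pc) == server.serve_ad(firefox_on_laptop))  # False
```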

At any rate, here’s how Google has me pegged according to the activities on my work computer using the Chrome browser:

Who Google Thinks I Am

 

If you want to see your profile amassed by your activity within Google, here’s how to get there:

  1. Go to the “Ads Preferences” page in Google (if you’re not already logged into your Google account, you’ll be asked to log in as you try to reach this page).
  2. On the left side of the “Ads Preferences” page, look for the link for the “Ads on the Web” page and click it. This should show you Google’s demographic profile of you that it uses for serving up ads.

Pediatric Cancer Walk: Team Checkmate, Cancer!

The announcement from class:

 

http://fundly.com/checkmate-cancer

http://www.pcfwalk.org/

The walk is on Sunday, April 28. I will add details (including registration info) soon.

This is a great cause that can help so many children. I’m letting all my friends know well in advance, so don’t make other plans!! Come out and support PCF and Checkmate, Cancer!

The first link I’ve posted is the story of our inspirational team leader, Elona Karafin. Take five minutes out of your busy day to watch her story and, if you can, donate to the cause. If not, that’s okay: share the link, or just click on the Fundly page and become a supporter! 🙂 And come do the walk!!!


Is the Facebook News Feed Increasingly Less Useful?

An article in yesterday’s New York Times looks at some of the criticisms that Facebook has been facing as the novelty of the site has worn off for some and the pressures as a public company to make money have increased:

  • the feed should offer more relevant content
  • feed content should engage users more deeply (so they stick around and notice ads more)
  • the sponsored stories in the News Feed have turned some users off
  • more people are taking breaks from Facebook for various reasons
  • teens are no longer flocking to the site, turning instead to Instagram, which Facebook acquired last year
  • the promoted posts that companies and individuals pay for have raised questions about whether the algorithm is being quietly adjusted in ways that suppress posts from users’ friends
  • some feel there’s too much junk in their News Feed, and the content they really care about gets hidden or drowned out by content that isn’t relevant or interesting to them

Source

Sengupta, Somini. “Face-Lift at Facebook, to Keep Its Users Engaged.” New York Times. New York Times, 6 Mar. 2013. Web. 7 Mar. 2013.

Anonymity Online

I was watching the Today show, and there was a debate about whether or not people should be allowed to leave anonymous comments on various websites and forums. A couple of people thought it might be a good idea to require signing into websites (such as news and polling sites) with your Facebook or LinkedIn account, so that commenters can be held accountable for what they say instead of leaving hateful and offensive comments anonymously. A few sites already do this, either as a mandatory step for commenting on certain posts or as an option, including YouTube and the New York Times website, which has a verified-commenter status given by invitation only to polite, thoughtful frequent commenters. These verified commenters can leave as many comments as they want without moderation.

If a lot of websites started asking for verification of identity before you could leave a comment, would that really stop hateful and idiotic comments from plaguing the internet, or would it simply stop a lot of people, including polite and intelligent ones, from giving their opinions for fear of being found out or judged by friends and employers who might come across their comments?

Would the information we give be used against us or shared?

Do you think it would be a good idea for more sites to use a verification process in their comments sections to create a friendlier and more productive web where people can’t hide behind their anonymity, or would it lead to more privacy issues and fewer comments altogether?

 

 

Results of the Survey from Class Today

You can view the results of the survey we did in class today. I’d be interested in hearing your analysis of it. As a point of comparison, you may also want to take a look at the study from the Pew Internet & American Life Project that I took the set of questions from:

Miller, Carolyn, et al. “How People Get Local News and Information in Different Communities.” Pew Internet. Pew Internet & American Life Project, 26 Sep. 2012. Web. 26 Feb. 2013.

When Your Likes on Facebook Spiral Beyond Your Control

The well-known media theorist Douglas Rushkoff wrote a post on his blog at CNN in which he explains why he’s giving up on Facebook after years of mounting frustrations with it:

Through a new variation of the Sponsored Stories feature called Related Posts, users who “like” something can be unwittingly associated with pretty much anything an advertiser pays for. Like e-mail spam with a spoofed identity, the Related Post shows up in a newsfeed right under the user’s name and picture. If you like me, you can be shown implicitly recommending me or something I like — something you’ve never heard of — to others without your consent.

For now, as long as I don’t like anything myself, I have some measure of control over what those who follow me receive in my name or, worse, are made to appear to be endorsing, themselves. But I feel that control slipping away, and cannot remain part of a system where liking me or my work can be used against you.

Chapter 2 in The Filter Bubble is entitled “The User Is the Content.” What do you think Pariser would make of Rushkoff’s concerns?

Sources

Rushkoff, Douglas. “About.” Rushkoff. N.d. Web. 26 Feb. 2013.

Rushkoff, Douglas. “Why I’m Quitting Facebook.” CNN. CNN, 25 Feb. 2013. Web. 26 Feb. 2013.

Your Trail of Personal Data

A study a few years ago by computer scientists at Stanford University shows just how often personal data can be harvested on sites that you visit. An article from the New York Times about the study notes that “[y]our online travel — your clickstream, as it’s poetically known — is not always anonymous. It can often be traced right back to rather precise parts of you, including your name and e-mail address.”
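
Here is a sketch of one mechanism behind that kind of tracing, with made-up URLs; this illustrates the general referrer-leak problem rather than reproducing any code or specific finding from the study. If a page’s own URL contains something identifying, the browser typically passes that URL along in the Referer header to any third-party ad or analytics server the page loads:

```python
# Illustrative sketch: an email address embedded in a page URL leaks to a
# third-party tracker via the HTTP Referer header. All URLs are hypothetical.
from urllib.parse import urlparse, parse_qs

page_url = "https://shopping.example.com/account?email=jane.doe%40example.com"

# When this page loads an ad or analytics script from another company, the
# browser usually sends the page's URL along as the Referer header.
request_to_tracker = {
    "url": "https://tracker.example.net/pixel.gif",
    "headers": {"Referer": page_url},
}

# The tracker can pull the identifying bit back out of the Referer:
referer = request_to_tracker["headers"]["Referer"]
leaked = parse_qs(urlparse(referer).query).get("email", [])
print(leaked)  # ['jane.doe@example.com'] -- the "anonymous" click is no longer anonymous
```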

The article discusses other studies showing similar issues relating to the way that your trail of personal data can be gathered and repurposed in all sorts of ways without your ever being aware of it.

How worried should we be?

Sources

Sengupta, Somini. “Stanford Researcher Finds Lots of Leaky Web Sites.” New York Times. New York Times, 11 Oct. 2011. Web. 25 Feb. 2013.

Firefox’s New Browser Rejects Third-Party Cookies

This week, Firefox is in the news after developers of the browser announced that the next version will automatically block third-party cookies. Advertisers have fired back, saying that the cookies are harmless and that ad revenues are essential to the growth and development of the web.
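
To clarify the terms: a “third-party” cookie is one set by a domain other than the site in your address bar, typically an embedded ad network or tracker. Here is a rough sketch of the policy being described, illustrative only (not Mozilla’s actual implementation), with made-up domains:

```python
# A rough sketch of what "blocking third-party cookies" means in practice.
# Illustrative only; domains are invented.

def is_third_party(page_domain: str, cookie_domain: str) -> bool:
    """A cookie is third-party if it isn't set for the site shown in the address bar."""
    return not (cookie_domain == page_domain
                or cookie_domain.endswith("." + page_domain))

def accept_cookie(page_domain: str, cookie_domain: str) -> bool:
    # The proposed default: keep first-party cookies, drop the rest.
    return not is_third_party(page_domain, cookie_domain)

print(accept_cookie("news.example.org", "news.example.org"))      # True: the site you're on
print(accept_cookie("news.example.org", "ads.news.example.org"))  # True: still first party
print(accept_cookie("news.example.org", "tracker.example.net"))   # False: an embedded ad network
```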

Are the advertisers right? Do they have a point?

Sources

Sengupta, Somini. “In the Tracking Wars, It’s Browser Makers vs. Advertisers.” New York Times. New York Times, 25 Feb. 2013. Web. 25 Feb. 2013.

 

OLED: Revolutionizing Technology

Hey all, I saw this video on YouTube: the 2013 Samsung CES keynote speech. OLED technology has been out for a while, but the fact that they have working prototypes is pretty awesome, and the thought that it could be out within the next year or so is exciting. I thought this applies to The Filter Bubble: with technology growing so fast, will this just increase the bubble we already live in?

 

Thoughts?

 

http://www.youtube.com/watch?v=7LlH6ZjEhKk

Google Glass and Overload

It seems to me that if you’re going to strap on a set of Google Glasses (or is it now just called Glass?) and deal with info scrolling in front of you as you move around your world, you’re going to be a bit distracted. Check out some of these videos from Google about this not-yet-released product and let me know how useful/dangerous/inspiring/crazy you think the Glasses are and how they connect with some of the themes in Pariser’s book, The Filter Bubble.

Love in the Time of Algorithms

There was a very interesting interview with Dan Slater in last week’s Wall Street Journal. He has written a book called Love in the Time of Algorithms, a complete analysis of the online dating industry (eHarmony, OkCupid, etc.). I thought this topic was relevant to what we’ve been reading for two reasons. The first reason is timing. The interview touches on how the online dating industry is about to be rocked by Facebook’s Graph Search. Industry experts anticipate that people looking for love interests will use the powerful Graph Search for free rather than paying for expensive monthly subscriptions to dating services. The second reason Slater’s book is relevant is a little more complicated; it has to do with the way information is delivered to customers of online dating services.

I always wondered how eHarmony’s business model could be successful. If they perform their job well one time, they are rewarded by losing two customers. How can they make money if doing their job loses them business? Slater explains that they have to deliver both efficient and inefficient information to their customers. They have to deliver efficient information (legitimate dating prospects) in order to satisfy and retain the customer. However, if all they delivered was efficient information, the customer would find a match and quit before paying for more than a few months of subscription fees. The answer is inefficient information. These inefficiencies are calculated by the services’ software and presented to the customer as profiles of members who no longer use the site, or of people who have only created a free profile without in-depth information. These fake dating prospects keep the customer distracted and engaged with the service, all while paying monthly subscription fees. By presenting customers with these dead ends, the programming behind online dating services keeps the business profitable.
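
Here is a purely hypothetical sketch of the kind of blending Slater describes; the field names and the mix ratio are invented, not taken from any real dating service:

```python
# Hypothetical sketch: a results page that mixes genuine prospects with
# "inefficient" ones (inactive or bare-bones profiles) so the customer
# stays subscribed longer. All data and field names are invented.
import random

profiles = [
    {"name": "A", "active": True,  "complete": True},
    {"name": "B", "active": True,  "complete": True},
    {"name": "C", "active": False, "complete": True},   # hasn't logged in for months
    {"name": "D", "active": True,  "complete": False},  # free account, almost no info
]

def build_results_page(profiles, n_efficient=2, n_inefficient=2):
    efficient = [p for p in profiles if p["active"] and p["complete"]]
    inefficient = [p for p in profiles if not (p["active"] and p["complete"])]
    page = (random.sample(efficient, min(n_efficient, len(efficient)))
            + random.sample(inefficient, min(n_inefficient, len(inefficient))))
    random.shuffle(page)  # dead ends are indistinguishable from real prospects
    return page

print([p["name"] for p in build_results_page(profiles)])
```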

I have never used an online dating service, but I was wondering whether anyone in the class has. If so, were you presented with inefficient information?

Here’s a link to the interview.  If it requires a subscription login, I can pull it up in class for anyone who wants to check it out.

Love in the Time of Algorithms 

How We Physically Interact with Technology

A group of design students at an art school created a fascinating ebook that catalogs many of the odd little gestures and behaviors we perform when interacting with our technology. Take, for example, the “baboon’s face” that some people adopt when having a phone conversation they want to keep private:

Curious Gestures--The Baboon Face

Check out an overview of all the gestures on this post on the Co.DESIGN blog (connected to the Fast Company magazine’s website). You can also download a PDF of the entire book, Curious Rituals: Gestural Interaction in the Digital Everyday.

Some questions to respond to:

  • Which of these gestures do you engage in?
  • Are there other gestures that didn’t get mentioned here?
  • Can you think of any gestures from our analog everyday that might be worth noting (how about the pen/pencil spinner, that dexterous move people make on their knuckles with their writing instrument when they’re reading or engaged in thought)?

Sources

Wilson, Mark. “15 Weird Postures Forced Upon Us By Technology.” Co.DESIGN, 13 Feb. 2013. Web. 14 Feb. 2013.

Nova, Nicholas, Katherine Miyake, Walton Chiu, and Nancy Kwon. Curious Rituals: Gestural Interaction in the Digital Everyday. Curious Rituals, 2012. Web. 14 Feb. 2013.

 

Not Everyone Wants the Latest Technology

A front page story in the New York Times today about the enduring popularity of fax machines in Japan raises some interesting questions:

  • What is the relationship between technological change and social attitudes?
  • The reasons some people opt out of moving on to the next technological wonder are usually defensible. Can you think of other examples where a seemingly outdated technology has endured, and why that might be the case?

Sources

Fackler, Martin. “In Japan, the Fax Machine Rolls On.” New York Times. New York Times, 14 Feb. 2013. Web. 14 Feb. 2013.

 

Privacy Information on the Web

Author and biotechnology expert Lori Andrews was a guest on one of my favorite radio shows last week. Her book, I Know Who You Are and I Saw What You Did: Social Networks and the Death of Privacy, discusses many of the same things Eli Pariser does, but she looks at them through the lens of privacy rights. Although her philosophies are a little sensational and extreme, she touched on two things I found very interesting. First, she discusses how the web is moving work from the public sphere into the private sphere. Her example was Blueservo, a web service where ordinary citizens can become virtual deputies and watch webcams along the Texas-Mexico border. If they see illegal immigrants crossing the border, they can report it to the authorities. Andrews says that this kind of public work (policing) being done by private citizens will expand into many more areas, such as neighborhood watches.

The second thing Andrews discusses is how we will eventually lose so much trust in the internet that we will stop using the great things the web does. Everything from credit card companies and employers to schools and the government now has the ability to make decisions about our lives based on what we do on the internet. If this trend continues to get more invasive, a point will eventually come when the scale tips and we’ll stop using the web for great things such as crowdfunding and medical diagnoses and support. Where do you think this point is?

Lori Andrews’s Web Page

Blueservo

Using Technology to Get a Leg Up

Interesting story in the New York Times today about a handful of students at Baruch College who created a computer script to repeatedly check availability for a much-in-demand course they wanted to register for (a rough sketch of what such a script might look like follows the questions below). Here are some questions to respond to:

  • Is it fair to punish the students if the college makes it so hard to get into certain courses that some will use technological tricks to work around the system?
  • If you think they should be punished, what is a reasonable punishment? How would it differ from the punishment for a student who hacked into the email system and accessed people’s email accounts?
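
For anyone curious about the mechanics, the technique the article describes is essentially a polling loop. Here is a generic sketch of what such a script might look like; the URL and the page text it checks for are hypothetical placeholders, and this is not the students’ actual code:

```python
# Generic sketch of an availability-checking (polling) script.
# The URL and the "closed" marker are hypothetical.
import time
import urllib.request

COURSE_PAGE = "https://registrar.example.edu/schedule?course=ENG2100"  # made-up URL
CLOSED_MARKER = "Closed"  # text the page shows when no seats are open

def seats_open() -> bool:
    with urllib.request.urlopen(COURSE_PAGE) as response:
        html = response.read().decode("utf-8", errors="replace")
    return CLOSED_MARKER not in html

while True:
    if seats_open():
        print("A seat opened up -- go register!")
        break
    time.sleep(300)  # wait five minutes before checking again
```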

Kaminer, Ariel. “Tech-Savvy Baruch College Students Seek an Edge in Registration, and Find Trouble.” New York Times 5 Feb. 2013: A19(L). Academic OneFile. Web. 5 Feb. 2013.

Is Racial Stereotyping Part of Google AdSense?

An interesting story appeared on the BBC News website yesterday summarizing a report from Latanya Sweeney, a professor of government and technology at Harvard University and the director and founder of the Data Privacy Lab there. Professor Sweeney’s research suggests that when names commonly used by African Americans are part of a Google search query, the results are more likely to be accompanied, in the ad column on the right, by ads for companies that will help you locate arrest records in public records. Google’s search system has long been paired with its AdSense program, which offers ads on the side that are related in some way to your search words.

Take a look at the BBC News story, “Google Searches Expose Racial Bias, Says Study of Names” and at Professor Sweeney’s published report, “Discrimination in Online Ad Delivery” (pdf). What do you think of her findings? What could be making Google’s algorithms work this way? Can you think of any other places where seemingly “neutral” search tools might be encoded in such a way that reveals less-than-neutral assumptions about people based on race, class, gender, sexual preference, etc.?