Surveillance is privacy. Algorithms know best. Facebook is Friend.
It’s hard to tell which dystopian novel we’ve wandered into recently. We’re somewhere between Orwell’s 1984 (Big Brother could be the NSA, or Facebook, or Google?), Atwood’s The Handmaid’s Tale (so long, reproductive rights!), a dash of Veronica Roth’s Divergent (a society divided into factions sounds familiar), and hopefully not quite at Hunger Games levels.
Storytelling is as old as hominins. Constructing our own narratives is a biological drive, according to Darwin’s theory of sexual selection (as opposed to his idea of natural selection). Facebook gives us a perfect platform to shout our stories from the proverbial mountaintop. “Here I am! Pay attention to me! What I’m saying is unique and important!” Then we sit back and count the likes, the loves, the smiley faces, the validation that someone cares.
Facebook is designed with this need in mind. Every week a new update rolls out that knows us better, predicts what we want, shows us certain friends and not others, and caters ads to us specifically. This is what “algorithmic gatekeeping” means. Facebook is the lens through which we view our social media lives; we are subject to Facebook’s bias, and we are blind to Facebook’s control. We may be the head, but Facebook is the neck.
The author explains the concept of gatekeeping with an analogy: “In this analogy, your phone would algorithmically manipulate who you heard from, which sentences you heard, and in what order you heard them—keeping you on the phone longer, and thus successfully serving you more ads”. Facebook’s algorithms show you whatever they want you to see, for whatever purpose they choose, and you have no way of knowing the degree of the manipulation.
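To make the idea concrete, here is a toy sketch of how a ranking algorithm can act as a gatekeeper. Everything in it — the post names, the signals, and the weights — is invented for illustration and has nothing to do with Facebook’s actual, proprietary system. The point is only that whoever sets the weights decides what rises to the top of your feed, and they can change those weights without telling you.

```python
# Toy illustration of algorithmic gatekeeping. All post names, signals,
# and weights below are hypothetical; Facebook's real ranking system is
# proprietary and far more complex.

def rank_feed(posts, weights):
    """Score each post and return them in the order a user would see them."""
    def score(post):
        return sum(weights.get(signal, 0) * value
                   for signal, value in post["signals"].items())
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "protest-coverage", "signals": {"likes": 5,  "comments": 2, "ad_value": 0}},
    {"id": "ice-bucket-video", "signals": {"likes": 40, "comments": 1, "ad_value": 3}},
    {"id": "friend-wedding",   "signals": {"likes": 12, "comments": 8, "ad_value": 1}},
]

# The platform, not the user, chooses these weights -- and can quietly
# change them at any time.
engagement_weights = {"likes": 1.0, "comments": 2.0, "ad_value": 10.0}

for post in rank_feed(posts, engagement_weights):
    print(post["id"])
```

With these (made-up) weights, the feel-good viral video outranks everything and the protest coverage lands last — the user never sees a list of what was pushed down, only the feed that results.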
The internet is the last frontier, the modern Wild, Wild West. Laws, ethics, and oversight are playing catch-up to the climate of social media. Events like “Gamergate,” in which “trolls” threatened self-identified female gamers, along with incidents of revenge porn, general bullying, hate speech, and culturally insensitive remarks, have drawn public backlash, with critics calling for social media companies to better monitor the content their users share. The question remains: to what degree should a social media company censor content or users? Public backlash has led to debates about how much social media, and the consequences of a person’s actions on those sites, should bleed over into a person’s “real life”. Social media lifts up idols just to watch them crash. Last election, Ken Bone, a bespectacled man in a red sweater, was hailed as a hero, only to have his internet history dug up; the public tide turned on the discovery that he was not as innocent as his visage suggested.
Facebook seems to follow the “don’t ask for permission, ask for forgiveness later” mantra. According to the article, Facebook revealed after the 2010 U.S. election that it had promoted voting by “nudging” people with the suggestion that their friends had also voted. Encouraging civic engagement is fairly innocuous. Hiding events like the Ferguson protests could be harmful, and suggests that Facebook’s algorithms suffer from the same racial biases as their creators.
For any group creating campaign pieces for social justice issues, especially issues intersecting with structural racism and classism, it will be important to remember the example of Ferguson. Facebook’s algorithm will likely filter out your content, and at this time it is hard to determine how many people will see it. Put a cheery spin on a social justice issue and you might just sneak past the algorithm.
Until Facebook is held to some measure of accountability and transparency by an oversight organization (possibly the FCC), we will be ruled by the almighty algorithm.
There is still hope. The algorithm isn’t perfect. An artist’s rendition of a robin on a Christmas card was recently flagged as “indecent”. The controversy has now given her more coverage than her post would have originally received. You can decide for yourself here:
For additional information on our evolutionary history with storytelling, go here:
For more information on how social media and smartphones are turning you into a validation zombie:
For more about racial bias in algorithms: