Around this time last week, I heard about the Cambridge Analytica scandal, which now seems almost hilariously ironic given this reading about algorithmic harms. Before I discuss how the scandal showcases both positive and negative manipulations, I think it's important to understand algorithmic gatekeeping and agency.
When Tufekci talks about gatekeeping, I think of it in a literal sense. A gatekeeper is someone who controls whether a person or thing comes in or goes out. This gatekeeping could be part of a familiar routine with someone who typically comes and goes, or the visitor could be unexpected, in which case the gatekeeper has to decide whether to allow them in. Algorithmic gatekeeping, unlike a human gatekeeper, controls the flow of information in and out, and its decisions are made by computational processes. These algorithms are usually hidden from the public, which makes the flow of information much more complex. Algorithmic gatekeeping can't necessarily decide what's right and what's wrong; it only directs where information goes, revises what information is released, or determines who can see what, in order to shape data and people's thoughts. In the generic sense, algorithmic gatekeeping controls what content people see, but it is generally computed in a set way for large groups of people. Instagram is a great example: rather than displaying posts chronologically, the platform uses a specific algorithm to surface information a person might otherwise miss.
Computational agency also derives from algorithmic gatekeeping, but it specifically focuses on how these algorithms act on their own, that is, on their decision-making. When I think of agency, the Cambridge Analytica scandal comes to mind. Although algorithms accidentally miscategorizing information can still be considered part of agency, another way to look at it is through the consulting firm itself. Facebook gave Cambridge Analytica access to user data through a quiz that ran as a Facebook app. Cambridge Analytica then used the research gathered through this app to help the Trump campaign. Through data manipulation, it may have been able to influence how people voted. I found this especially relevant when, in an undercover video, C.A. bragged that it created the “Crooked Hillary” campaign that it spread throughout Facebook.
Tufekci brings up some interesting points about algorithmic editing in public writing. More people are reading things online, so information could reach more people. However, with so many online platforms, social media may actually narrow an audience, since what people see depends on what they are interested in, or on what algorithms decide to show them first. Depending on the message being spread, public writing posted online versus printed in a newspaper or aired on television can be received differently. For example, I feel that flat-earth believers are received better online than in print. As Tufekci points out, pages printed the same way for mass audiences are easier to challenge. If you only know one thing, and that is that the earth is round, one reading is not going to change your opinion. Based on first-hand experience, I've seen many people online come to believe that the earth is flat because of the way information can be manipulated through algorithms.
My first campaign piece, a still image depicting the mental health problems college students face, is something someone might immediately love or immediately hate. My second campaign piece, however, deals with audio, and it could be received in many different ways. I feel that if I posted my audio piece online, some of what I say could be accepted as fact. If Forbes Magazine posted my work and praised it, it would gain more legitimacy. If Pitt posted it, it would reach a large audience, since the information I am giving pertains specifically to that audience. However, if Fox News posted the same information, my message could be restricted by its large conservative audience. As a public writer, I will have to consider the content of my campaign piece and how different platforms might aid or restrict my message. Algorithmic gatekeeping changes the game for public writing, because writers must now consider how to beat, or win against, an algorithm so intelligent it can tell more about readers and writers than they know about themselves.