Close Reading Essay: “Ghost Work” by Mary L. Gray and Siddharth Suri
The article is generally about how the two authors team up to reveal the truth behind how big companies like Amazon, Apple, Google, and Microsoft run so smoothly and appear so advanced. There is real labor being done to make these companies look this great, and the workers doing it are not treated well. This economy covers the work hidden behind all the tech screens and fancy designs, real work done by individuals who can be replaced easily because of the companies’ size and accessibility. The authors also go on to say that we have all participated in ghost work at some point, which makes us part of the ghost economy, and that we have to stop believing this tech is going to replace us and take over; in reality, we are the ones making the tech more advanced, and without the ghost workers it would not be nearly as advanced.
“The human labor powering many mobile phone apps, websites, and artificial intelligence systems can be hard to see - in fact, it’s often intentionally hidden.”
A new challenge is looming deep beneath the surface of the web, lost in our wrong-headed debates about AI. Anthropologist Mary L. Gray and computer scientist Siddharth Suri are teaming up to show how services such as Amazon, Google, Microsoft, and Uber can only operate seamlessly thanks to the judgment and expertise of a massive, unseen human labor force. The individuals doing these “ghost jobs” make the internet seem intelligent. They do high-tech piecework: flagging X-rated content, proofreading, designing engine parts, and much more. An estimated 8 percent of Americans have worked at least once in this “ghost economy,” and that number is growing. These workers typically earn less than the legal minimum for conventional jobs, have no health insurance, and can be fired at any time for any reason or none at all.
Much of the contract work today props up AI when it cannot manage on its own. The dirty little secret of many services is that real live human beings clean up much of the web behind the scenes, from Facebook to the “automatic” removal of heinous videos on YouTube, and many others. Those magical bots responding to your tweets complaining about your delayed pizza delivery or the poor service on your flight back to Boston? They are the latest contract labor, concealed under a layer of AI. Retail, marketing, and customer service are being reshaped into a hybrid of people and AI. It turns out that AI, like humans, struggles to make difficult decisions about what content should and should not be included in our everyday social media diets, depending on what standards or principles we want to enforce. The real story is not that Facebook biased its trending topics by including human editors; it is that today’s AI does not work without people in the loop, whether it’s delivering the news or a complicated pizza order. Content moderation and curation require people employed by technology and media firms to decide what to leave up or take down, from news feeds and search results to adjudicating conflicts over acceptable content.

We need to think deeply about the human labor in the loop that drives AI. For being at the ready and eager to do an essential job that many would find boring or too challenging, this workforce needs preparation, support, and compensation. The imaginative effort of channeling the speed, scope, and efficiencies of AI will involve a host of future human work, going far beyond the editorial treatment of trending topics. The first step is to seek more accountability from technology businesses that have marketed AI as devoid of human labor. In ads, we should demand the truth about where people were brought in to help us, whether it’s to curate the news that educates our body politic or to field concerns about what some troll just posted to our favorite social media platform. We should understand that there is human labor in the loop, and we want to understand both the importance of their role and the preparation and support that informs it.
As customers, we have the right to know what ingredients and processes go into the AI that compiles our news and media content, just as we should know what is in the food we feed our families. As people, we need to know where our knowledge comes from. And as human beings, we should all recognize when humans are at work creating what we consume, whether physical or digital. The labor of these hardworking individuals around the world should not be made invisible or opaque by the shibboleth of AI. Just as we need businesses to be accountable for the labor practices that generate our food, clothing, and devices, we also need accountability for the digital content created and shaped by both customers and employees.