April 8, 2021

The Ghost in the Machine is Not Who You Think: Human Labor and the Paradox of Automation with Mary L Gray

Mary Gray is an anthropologist whose work explores how technology informs work, a sense of identity, and human rights. Gray applies these concepts as a Senior Principal Researcher at Microsoft Research and as a Faculty Associate at Harvard University’s Berkman Klein Center for Internet and Society. Additionally, she holds a faculty position at the Luddy School of Informatics, Computing, and Engineering. Gray has also authored books such as In Your Face: Stories from the Lives of Queer Youth and Out in the Country: Youth, Media, and Queer Visibility in Rural America, but her most recent book, coauthored with Siddharth Suri, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, focuses on how task-based work is being utilized by larger businesses and how this represents a change in the way we conceptualize work.

In this episode we focus on:

  • What is Ghost Work?
  • The gap between what a person can do and what a computer can do
  • Algorithmic cruelty
  • The future of work and what that means for contract labor
  • Tech not as devices, but as conduits for social connection
  • How to bring empathy into the workplace

Where to Find Mary Gray:

Website: https://marylgray.org/

Twitter: https://twitter.com/marylgray

Linkedin: https://www.linkedin.com/in/marylgraymsr/

Music: Epidemic Sounds

  • Dylan Sitts - Ice Cold Beverage
  • 91 Nova - Lushwork
  • Blue Steel - Up Here

Episode Art: Adam Gamwell

Production: Elizabeth Smyth, Sarah McDonough, Adam Gamwell

Transcript

Adam Gamwell: [00:00:00] Hello and welcome to This Anthro Life, a podcast about the little things we do as people that shape the course of humanity. I’m your host, Adam Gamwell, and today I’m joined by author, fellow anthropologist, and MacArthur Genius Award winner Mary Gray. Mary Gray’s book Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass uncovers the exploitation of contract workers performing mundane but critical tasks for big tech companies. This kind of work includes labeling and filtering explicit or violent images so they don’t end up on your Facebook feed, monitoring website content for violations of terms of service, and even verifying that your Uber driver is the same person who is logging into their account. Ghost work is the labor of cleaning up and managing data, done by a nearly invisible army of remote workers. This invisibility results in a collective deception that makes technology and the world seem more automated than it actually is. Now, before reading Ghost Work I hadn’t put much thought into the way human labor fills the gaps in the limitations of artificial intelligence. To be honest, I didn’t even know there was a person checking my Uber driver’s identity when they log into work, and despite the fact that many of us feel technology is taking over, Mary’s work unveils the ways technology and application programming interfaces, otherwise referred to as APIs, are entirely dependent on human labor to function. In fact, artificial intelligence lacks the innate introspection and capacity to communicate that is unique to homo sapiens, and as of today these cannot really be replicated through coding.

Mary’s research also opens up the conversation about whose work matters in a high tech world. Most ghost work labor is considered low skilled and is underpaid and undervalued. Extremely limited workers’ rights, tough working conditions, and limited growth potential are not unique to big tech, nor to this modern era. However, where, how, and all that we do for work is shifting alongside globally connected workforces and developments in artificial intelligence, machine learning, and APIs. So, one of the big questions Mary’s work helps us think about is: how can we critically examine and adapt our current archetypes of work, of how we value economic life and productivity? Spoiler: there is no simple answer. However, Mary’s book and her conversation do give us some cause for hope. So plug in, hit execute, and let’s get to it.

I've really enjoyed the book, and it's such an interesting space. I did some of my training in economic anthropology, so I did a little bit of workplace studies and thinking about work. But one of the questions I was really curious about, and I liked that toward the end of the book you pointed this out, maybe we'll start at the end and go backwards, is that the future is not this dystopian idea that robots are gonna take over and humans will have no place in it. I thought that was a really evocative way to close out. Cause also, obviously, ethnography does a really good job of showing us both the nuances and complexities, but also the human challenges in the experiences that we document as anthropologists. And so I wanted to break down that idea. We can work a little bit backwards to what ghost work actually is in this case then. So I imagine some listeners have read the book and some of them have not. So we can think about this conversation as a bit of a 101 of getting some of the baselines in there for folks, then we can dig into whatever issues come up in more detail.

But, why do we have that idea that the future is this dystopian idea that robots will just take over and that humans will have no place? Like where did that come from?

Mary Gray: [00:01:08] It's, you know, as early as Asimov, when we started dreaming about robots, we had an idea that robots would be these replacements, surrogates for humans. But importantly, we often think about technologies as replacements for some facet of what humans are doing or thinking, and it's an understandable framework. But I think precisely because we don't have a very nuanced understanding of how complicated, not just people, but how complicated our relationships to each other are, we fail to understand how technically challenging it would be to replace what it is a person is doing when we're receiving a cup of coffee or trying to understand our health.

Like those settings don't lend themselves easily to replacing what is deeply human: that capacity to think on the spot, to think on the fly, to be able to make a snap judgment and, probably more importantly, to be able to communicate in a way that conveys empathy, that can draw out more information you might need to understand a situation. That complex communication. So I think the problem is that from the very beginning, the assumption was technologies are effectively extensions of our manual labor, and therefore we can replace anything that seems perfunctory or obvious or redundant.

And in reality, how we go about doing anything, whether it's manual labor, or creative work of other kinds, it always involves this incredibly rich nuanced capacity to create without priors, without prior information and to communicate in these incredibly robust high fidelity ways. That's the, I know that's a heady way to put it, but we were thinking about replacing the wrong thing.

Adam Gamwell: [00:03:14] That's so fascinating too. It's so what I'm hearing too, is that technology in this perspective, isn't an end point, right? It's not a thing that we get it and then it's done.

Mary Gray: [00:03:24] Yeah. I think, um I'm laughing because I, you know I want to bring a feminist critique here. I think there's also an idea that we can have certain mundane tasks done by someone else without ever having to ask them. That's such a dude way of approaching how to get your socks picked up every day.

Um, you know, I do think we should be bringing a very critical lens to this: the things we build often come with the assumption, I want somebody else to do it, and I don't even want to have to ask. That as a starting point shuts down a host of other possibilities for technologies that are supportive, that facilitate, that help us attend to each other rather than tune each other out. That's what frustrates the heck out of me about, um, the course where we continue to find ourselves, which is what we call the paradox of automation's last mile.

Every time we aim to put automation in place so that we don't have to do something as humans we are reproducing this set of assumptions that there's something obvious or something unskilled about connecting with each other.

Adam Gamwell: [00:04:40] It's one of those things too, that I have this idea, a phrase I've been playing with, that anthropology is soft skills for a hard world. And I feel like technology falls into this camp that we're discussing here, right? We think that the quote unquote soft skills are useless, or they're not worthwhile, or they can be replaced, or they're not even something that can be outsourced. When we think about this stuff, getting your socks picked up is a great example. Without having to ask for it, it's almost, there needs to be sociality in that idea. But then when you add, of course, the messiness of the human, that's when it all goes out the window; you can't just get your socks picked up.

Mary Gray: [00:05:18] No, and I like your framing of soft skills for a hard world. In some ways we've divvied up the world into what's hard and what's not. And again, that gets gendered very quickly. Who thinks hard, who does hard sciences, like all of that. And I keep bringing that to the table because I think in many ways we have not had a chance to thoroughly scour the sets of assumptions we make about who is and is not skillful, and therefore who is and is not replaceable. And hopefully that's what Ghost Work does; my co-author Siddharth Suri and I came up with that turn of phrase, ghost work, to talk about work conditions, not the workers.

Adam Gamwell: [00:06:04] That's great, cause I think one of the things that also really caught my attention, even, you know, I don't need to make this so anthropological, but of course, why not? I think what good ethnography does and what your book does is that you couch this in a larger context, in this case about work and labor, changes in labor laws, right? And the quote unquote de-skilling of certain kinds of labor. And not that the work itself wasn't skilled, which you make that point, but that labor laws, and just practices by, like, William Randolph Hearst and the newspaper industry, pushed the idea that only certain kinds of labor are quote unquote skilled.


Mary Gray: [00:06:34] Yeah, that's probably one of my favorite footnotes in the history of labor law, seeing that case of the Hearst newspaper company basically wanting to say newspaper boys, and they were almost exclusively boys, young boys at the time, shouldn't be considered employees. And that's the beginning of carving out space to say, they're just selling my paper.

They don't create my paper. So there could be an argument in place that allowed us to sort who are the workers we should care about and who are the workers who are always replaceable, who are not doing something important. In U.S. labor law, that's the place where we left open a lot of room to discard, to erase the value of human labor, because we could stack rank: well, is it really the important stuff? And I, you know, I think that we are at a moment, COVID shakes us to our core, to see we're completely dependent on each other when we're doing our work. I can't do my work if I am not also getting the delivery from someone who's bringing me a better internet connection, the person who's going to come repair my wireless router that seems to be dropping all the time.

Adam Gamwell: [00:07:54] Yeah, that's a wonderful point too. And so it's. And it begins to illustrate just how complex these ideas are. I appreciated that you broke down that Ghost Work itself is not about the workers, but it's about the conditions right. In which they work and what the situation is.

And so what we're seeing, again thinking historically, is, as you said, a stack ranking of whose work is quote unquote more valuable, who is producing the actual thing, producing the newspaper versus selling it.

And then, so whose labor is more valuable in this case, which is then ironically juxtaposed to the point you made, that we are incredibly dependent upon all of us, different ones of us, to do any of this work. And the idea, when we wrap it in technology, of the paradox of automation's last mile, you know, is that automation and APIs, application programming interfaces that a lot of technologies use to help staffing firms get different workers for contract work, and we should break down what all these pieces are and how this happens, tend to obscure the human relationships that are part of all of these processes. And so it looks like something almost as simple as: we don't need to pay newspaper boys the same as folks that are printing the newspapers or working in the newspaper house itself. Yet at the same time, the newspaper wouldn't sell the same way without that kind of labor. So there is a dependence upon it, but it's interesting how they reevaluate it in a way that usually stacks in the favor of the owner in this case. Very Marxian, almost.

Mary Gray: [00:09:24] Yeah, no, I, you know, I think for at least my read of this history, and there's a specific chapter in the book that's on this history, is to see there's not necessarily a maniacal actor who's trying to figure out, how do I make things worse for that contingent worker. It's rather that at the beginning of the industrial revolution, the United States made a set of bets about who needs the most protection. So that, from our earliest days, and actually the book's history starts with slavery, it's when we have the Emancipation Proclamation in the United States that we make a statement about whose labor cannot be stolen. And we begin then having a conversation about how do we value this new kind of worker who is not a subsistence farmer, but working in a factory setting, working on factory floors that are incredibly dangerous at the time. Seeing labor laws that organize and are always compromises among the private sector, the public sector, workers' advocates, citizens, consumers; all of those pressures are shaping what we come to decide are the kinds of labor that deserve a weekend, deserve a set workday, deserve a minimum pay, deserve, later on, health care as a benefit, deserve retirement as a particular kind of benefit, deserve sick leave and parental leave as benefits. All of those decisions aren't coming from maniacal actors operating in the background; they're coming through all these social and cultural forces. And this is again what we probably both love about anthropology: it shows a world littered with previous decisions that we are then born into and pick up, and people are working with these ideas of, oh, okay, if I'm part of the professional class, then I probably don't need labor protections because I am my own agent. I can make things work for me. That is a story we have about a particular moment, and it was incredibly gendered.
If you were an engineer, that was absolutely true. But if you were a computer, quite literally a woman, often a woman of color, who's doing math calculations, you didn't count. Again, literally and figuratively. So your labor as a worker in a setting like early aeronautics labs was easily discarded, as if anybody could do mathematical equations.

And it becomes a self-fulfilling prophecy that that work isn't important when a computational process comes into play and seemingly replaces that particular kind of cognitive work. So every moment we've had technological advances, they have seemed to justify who we discard. As the low skilled workers who should be retrained, and why aren't they training themselves to skill up and do the really important work?

Now we've worked ourselves into a fascinating corner, which is a world driven by services. Most of our economic activity is really generating information, working with information, creative work, creating podcasts, doing the soundchecks on podcasts. That's absolutely real work.

It's knowledge work. Perhaps that's a phrase we can use. We call it information service work, but in so many cases, many people might be working on a project like creating particular kinds of information and sharing it with other people. And it's not clear that isn't really valuable work. It requires many hands, often, to create an end service or product that somebody can consume, and we can no longer tell ourselves the people doing that work aren't as necessary as, say, one other part of that collaborative process that puts together the end product. Unlike on a factory floor, where it's a little bit easier for us to fool ourselves into thinking you can take anybody off the factory floor and replace them with somebody else, and the car will come out the other end.

This is a moment of reckoning for saying, actually, we really are reliant on service work. We're reliant on people being in the loop for a moment, and then we need them to get out of the way, so that if the service we want to offer changes, if you no longer offer podcasts but you're offering web animations, you would have a crew of people you could assemble to be able to do that work.

So that's, I think to me, the, hopefully the broader message that people are seeing in this is it's no longer easy to dismiss who is redundant or unnecessary. We need a new paradigm for how we think about and value people's contributions to economic life and activity and productivity.

Adam Gamwell: [00:14:54] Yeah, I think that's an incredibly valuable point and message to drive home, and why I think ghost work, as a set of conditions and people who work in those conditions, is such an interesting and important ethnographic group of folks to understand. And actually, just in general, this is a form of work and labor that a lot of people don't see; that's what the name Ghost Work implies, that we don't see a lot of these workers in specific contexts. And so I wonder if we can illustrate this with Uber, which is a good example that you talk about in the book, and you talk about a bunch of different companies, of who becomes a ghost worker in the context of Uber.

Like, I have my Uber app and I want to get a vehicle to go down to Harvard Square. And then I pop in, log in, say find car, the car is quote unquote found, and then the driver's connected to me and comes to pick me up. But what you point out in this example is that there's actually potentially a number of people working in between these two steps that I, as the user, and they, as the driver, may never see, right?

Mary Gray: [00:15:53] Yeah, exactly. I think that for every online service that seems to magically match a consumer to someone who's meeting their needs, along the way there are really two streams of work that we discovered, or rather that we plumb in this research, to show just how much work, though contingent, is going on in the background.

So there's one stream that's absolutely looking at all of the number of times that an application is opened. It's looking at what kinds of keywords or, you know text messages might be sent back and forth as say consumer comments. And it's scouring that and asking a person to annotate the details of what's called training data.

And so training data, it's effectively the data that's being generated by you as a person using technology and your interactions with those technologies. And it's valuable information for being able to train algorithms, to be able to train any sort of computational process to see what can we learn from those notes, to be able to figure out what part of this could we automate?

So maybe it learns that you tend to call for a ride in a particular location, and it's looking to aggregate your practices with other people's. Is there a particular time of night that seems most valuable for us to be paying attention to, and what kind of service might be needed for those details? Sometimes it's really self-evident, it tells you what to do. But most of the time, particularly in domains, say health, where there really isn't a lot of clarity, a lot of clear answers to what to do next, you want people looking at that training data, looking at the mess, the pile of information we've created, and making sure it's labeled correctly.

It's that any of the typos that might give you a completely different idea of what to take away from a text in a consumer report might mean that there are people who are trained to just look at that data that's already been created. That's one stream of work. There's the second stream that we describe in the book: people like the person we described, who's looking at, say, you calling up a driver, and they're spot checking security records to make sure the driver you've contacted matches the driver who's being routed to pick you up. And that's a feature that Uber had introduced, a security feature to help consumers feel confident that the driver they thought was on the way is the same person, that they're not misrepresenting themselves. For consumers it's a quality control, quality assurance feature, but it requires a person pulling up the driver's image as documented in the records.

And then a snapshot the driver might take in the moment, to be able to see, is this the same person? So that stream of work is the part that we will never see as end consumers; it happens in a moment. And it's a person who can literally be anywhere. The folks we met who were on the other side of the planet, who were doing that security verification, that face ID security verification, are this linchpin in a moment of what we experience as just the magic of artificial intelligence: that a driver shows up and they're who they said they were.

Adam Gamwell: [00:19:30] It's funny too, cause even thinking about this example, it's funny how we think, this is the phrase you used, which I like, the magic of artificial intelligence. We think that algorithms or AI are quote unquote smart enough to do that exact thing.

But then I'm actually just curious, I'm thinking, for listeners or folks just thinking about this: now that you know this, if you didn't know before that there's actually someone checking the ID of the driver, do you feel more or less safe when you're using Uber, contemplating that it isn't AI that's doing it, right?

Um they are part of it is of course, but then recognizing that actually like the smoothness of technology really is a team effort in a very deep sense that we don't think about.

Mary Gray: [00:20:11] Oh, I love that you put it that way, Adam, cause it is a team effort, and I think that's the thing I don't want to take away from: it is a profound, magical experience that there is someone on the other side of this earth who's making sure you have the driver that you think you do. And it's a spot check, and there are definitely different applications of that kind of spot check, but it is this incredible accomplishment that what's called human computation can effectively smooth out something that otherwise could be a pretty bumpy ride. Yeah, I just love that you put it that way, cause it is a team effort, and it shouldn't take away from how amazing it is that technologies can connect us and create this experience. What's troubling is that most of us don't realize how much of a team sport technology's accomplishments might be. And even the technologists are often not really fully taking in how dependent they are on that moment of contribution from someone, because they're mostly thinking about the engineers, and the economists in Uber's case, who are onsite; they're the valuable employees.

Adam Gamwell: [00:21:29] Yeah. And I mean, even this concept of human computation you bring up, I think, is a fundamental crux of how we contemplate what's happening here. Cause this ties into the phrasing of the paradox of automation's last mile, that we always think about what it is that we could automate.

And the two streams of work that you mentioned before, the idea of annotating training data, and then the idea of someone, in this case, spot checking and making sure that the person is who they claim to be. And these both require this level of, again, human computation, where, I think in the book you talk about it, it's like closing the gap between what the technology can do and what people can do, right? As much as we may think artificial intelligence is wildly sci-fi, like it can do some incredible things, at the same time an AI is simply something that responds to external stimuli in a certain way. But it's not at this generalized artificial intelligence level that can think for itself and

Mary Gray: [00:22:28] Can I pick on that a second too? Cause there was just a headline in the news, New York Times today, about drones thinking for themselves. To your point, AI can absolutely pick up where humans might've left off in some ways. So it is built to be able to replicate decisions.

What are the prior decisions a person has made in this setting, generally speaking? So it tries to abstract out context, which for an anthropologist is awful, because we know that you can generalize only some things. And that's why it's so powerful: it can only model prior decisions, and then it's giving you a prediction on what you might do next.

But it's what somebody like you might do next. And the problem is, if it's missing really important elements of your decision making as an individual in the moment, that are driven by myriad matters around you, like you have a sick child so you can't go the same route you typically go, or you need to stop off at the drug store first, then all of our dreams of automating so that we can predict what we want to do next, reasonably speaking, they're insanity, because we're thinking we're going to be able to predict what we haven't done yet. And that to me should sound like a profound paradox: the idea that we would know what we're going to do next before we've done it.

The paradox there is that we are not that predictable as humans. The things we do as people, what makes us human, is, yeah, sure, we do most things most of the time the same way. But when we need to make a change in what we do next, AI is the last thing you want in place, because it cannot predict, it cannot think like a human, and it can't apologize like a human, for sure, when it gets it wrong.

And those are really the two features that make up so much of how we interact with each other in service economies: we make a good guess based on having often never met a person before, like you and I hadn't met before today. And when we get things wrong in what we're trying to offer each other, we are really good at reading human signals that tell us, oh, I think I misstepped there, and course correcting. AI cannot do any of that, because it doesn't have the training data to be able to evaluate, what should I do next in a novel situation? It'll certainly act, but it will act without really good directions.

That's why in the book we belabor the example of AlphaGo Zero, or any AI that can beat a game: realize that it had an edge in that it had every single match ever played. It had all of the rules, and there are actual rules to Go or Chess or any Atari game from our past. And so AI fails, it falls short, if it doesn't have all of the rules and a lot of examples of what people do with those rules. So think about your average day: how much of it is rule bound and will not change because things really remain the same?

Adam Gamwell: [00:26:22] It's basically nothing.

Mary Gray: [00:26:23] The things that count, that's the hardest thing. That's actually why I think there's so much that artificial intelligence, or probably more basically just application programming interfaces, just software and the internet, could do; a ton of amazing things for us. But it has its limits, and it has technical limits.

And unless we're all much more informed about those technical limits, we start looking for technologies and AI to do more than they can, and asking engineers, who are just wonderful people who believe that things can happen, to design technologies that will make things happen without having to ask, right? Back to that earlier example of picking up socks: their dream is, I'm going to build something that anticipates your needs, and you don't have to communicate what they are. That's got to change.

Adam Gamwell: [00:27:24] I agree with you there too, because it's interesting that a lot of and again obviously we're not trying to make this the anti-technology podcast.

Mary Gray: [00:27:30] No, no. I'm pretty pro

Adam Gamwell: [00:27:32] Yeah. Yeah. Pretty pro tech. Um, but even the interesting idea that humans are one of the quote unquote problems to be solved with technology is a pervasive idea that I think your research with Ghost Work helps us dismantle, and this conversation too is pointing to it. And even this both relatively simple but complex idea that it's just fundamental for us to understand and know the technical limits of what certain things can do.

And then, cause otherwise, as I think you rightly point out, we're looking for tech to do more than it actually can, and that can lead to frustration. But it also has deep ties to how people are perceived in their work lives, and what they may be asked to do for work based on, we're giving you quote unquote flexible schedules, but you need to be on call 24/7. It's, wait a minute, those don't line up very well.

Mary Gray: [00:28:16] Yeah. Yeah. And I think the approach that we take to applying technologies, that's such a great example, Adam. When we apply technologies to something like efficient algorithmic management of people's work contributions, you know, a shorthand for that is what we call algorithmic cruelty.

If you have something that's making a set of assumptions about how you should work and how efficiently you should work in relation to others, the shoulds hit a wall when you can't follow through on something, or you don't want to because you have something else you want to prioritize. And that's what we found endlessly in this world.

People who were using this human computation, this on demand approach to work, being able to pick up a task or a project, were often trying to have some semblance of control over their time or their schedules. And depending on their experience with any of these platforms, they ended up being what we call hypervigilant.

They were constantly looking for work opportunities, because if they weren't looking for them, no engineer had thought, gosh, I should batch these, or I should recognize there's somebody who's logging in who's said these are the hours they could accept some particular tasks, and I'll just route them those tasks when they're available. Nobody's thinking from that perspective. They're, again, abstracting out and making a lot of assumptions about what's efficient, often for a consumer or a business, and that doesn't have to be inherently bad. But it's unsustainable and inhumane when there's no attention to the person doing the work in the mix, when you create so much efficiency that you completely ignore the needs of this critical element. It will break that person, and to rely on that approach means you're just waiting for another person to step in. It's an ugly way of looking at our relationships to people contributing value through the work they do: that if you can't do it, somebody else can. Goodbye.

Adam Gamwell: [00:30:33] Right, I was going to say, it sounds exactly like the Fordist model, where you can replace the factory worker. It's interesting how it echoes again in this different capacity. That, to me, is one of the starkest things to think about.

As a person who works and has worked as a freelancer for years, this book also hit home, and that's part of what's so powerful about it. I've certainly thought about that personally: I don't think of myself as a replaceable factory worker, but I see the parallel. Maybe it's part anthropological training, but I can see how I would look like that on Upwork, for example.

I may have a name and a picture rather than a code number from a staffing firm, but nonetheless there could be the same pressure to drive prices down on these platforms as well. It's quite interesting in that regard: how do we avoid that?

I think you talk about a bunch of different platforms in the book, like Amazon's MTurk, and Microsoft has the, can you remind me? The Universal...

Mary Gray: [00:31:32] Universal Human Relevance System.

Adam Gamwell: [00:31:35] Yeah. Great sounding title.

Mary Gray: [00:31:37] Great name. It was really built to help make sure your search results were accurate. Google has a similar engine in the background. Every tech company has, behind its firewall, a version of this kind of platform.

Adam Gamwell: [00:31:53] So for listeners who don't know what these are, this was actually new to me too when I read the book. Tell us a bit about the four major platforms you talk about in the book. They function on one level to help make sure we have relevant information when we're searching for something, but what else do they do? What is this world they create?

Mary Gray: [00:32:09] So we studied four different companies, effectively four different kinds of platforms, that stand in for different business models providing this on-the-spot, on-demand workforce. One example, Amazon Mechanical Turk, is one of the oldest. It was an approach Amazon used internally; it actually created that platform to be able to correct typos and clean up the comments of its earliest consumers on amazon.com, which came with lots of book listings. If you remember, it was the world's largest bookstore...

Adam Gamwell: [00:32:49] Yeah, that's right.

Mary Gray: [00:32:50] ...before it became what it is today. And so it built that platform to make sure consumers had accurate information, and to do that meant: let me hire some people to look at images of books and titles and make sure they correspond, or images of purple ties and descriptions of purple ties, and make sure the description of that good is accurate. Not unfamiliar office work, but done by putting that task out and up for grabs to anyone who had signed in with an Amazon Mechanical Turk account. Then there's the internal version of that, whether it's at Microsoft, like the platform we studied, the Universal Human Relevance System, or the one inside Google called EWOQ. Those were set up so that companies could do testing on their products and services, whether it's a web search, or making sure the software they were developing was beta tested properly, or that it was being translated correctly, so that the user manual that comes with any software you buy is accurate in thirty-plus languages. They use their internal platform to source out that work, and outsourcing is an appropriate frame here, but think of it as: I need linguistic expertise, or I want someone who understands this set of images and can make sure, if I'm doing a search for socks, that it's obvious the possibilities there are both things I put on my feet and my favorite baseball team.

So that's what relevance means when you're looking at search strings: am I giving somebody who's doing a search the kind of information that we might expect when we're doing a search online? And that explodes in the mid-2000s into the other kinds of tasks that we studied.

One, LeadGenius, is a startup that handles sales leads, the information that companies need to be able to make the right call, so they've got a list of people they can call. You could scrape the web and find out who might be the right person to contact if I want to sell, say, air conditioners to your company.

But instead of just taking a guess and only accidentally calling the right person, you can have your sales team work with a vetted list of who to contact, by having people curate that list. Because it's actually very hard; if you think about your own resume, maybe you're the right person to contact.

Maybe there's somebody else I should contact to reach you. So LeadGenius built an entire business organized around: let me help sales teams at different companies be able to do their job. And it's a mix of automation and human intelligence, because you can start with a scrape of the web and have an idea of who to call.

And then you can give this list to people and say, find the best sales leads for these teams that want to generate new sales. The fourth platform we studied, Amara, is this fascinating world of people who volunteer to do video captioning and translation. When companies started coming to this volunteer community, the organization behind Amara decided to organize a service called Amara On Demand, so that it could put those jobs, translations of documentary films and training videos, up for bid, and people could participate in captioning and translating things they chose to caption and translate.

And if they wanted to keep volunteering, they could do that too. So those four businesses are just stand-ins for this world of work that's organized not around a nine-to-five job, but literally around different projects and tasks. And those projects and tasks can range from finding a typo to translating a song into a different language.

And we chose those because they are precisely the kinds of tasks that most people assume will be automated away any day. There's no evidence they're going to be automated away, though there's lots of evidence that automation of those particular kinds of tasks has gotten much better. So take any translation service you might choose.

It's pretty darn good. You can use a translator like Google Translate or Cortana and get a fairly accurate translation of a word, or of a sentence you say in one language. But would you rely on that if you needed to have a really important conversation and there was a language barrier?

Adam Gamwell: [00:38:02] Or a legal document, right?

Mary Gray: [00:38:03] Or a legal document, or any place where it counts. So that's the thing we are describing: how people can be put to the task of a lot of different projects. The approach of this labor market is that you don't need full-time employees, so different platforms have spun up to offer different workforces. What we found were people working on multiple platforms: you can be driving for Lyft or Uber, or both, and be on Amazon Mechanical Turk or usertesting.com or Accenture.

And that's the thing. There are large companies that are organizing these contingent workforces. Sometimes they're signed up to do one task. Sometimes they're signed up and authorized to do a set number of hours, but that's not a full-time job. Or if they're doing it for the number of hours that might make a full-time job, it's not how we tend to think about employment.

They'll be leaving that job as soon as the contract is up. And that's the part that, I hope, resonates with freelancers and contractors everywhere: you're tasked to do a project, and when you're done, you hand in that work. What's radically changing beneath our feet is that algorithmic management is being brought to the table to schedule, and at least in part manage, ship, bill, and process all of the exchange that describes a lot of our work. And there are no particular regulations that help us orient to that new way of working.

Adam Gamwell: [00:39:55] Yeah, that's what I was thinking about following up with. Again, it was really surprising to me, but it also makes perfect sense if you pause and reflect on how work is aligned today, that this is not a quote-unquote regulated space, especially because so much of it is international too.

You might need arbitrage in order to schedule, set up, and make things culturally appropriate, pay scales and so on, per country or per place, even if one of these staffing firms works with multiple countries or with folks in different parts of the world.

So there's no international protection, but on top of that, there's not necessarily local protection either. And you make a good argument about US labor laws and how they don't line up to protect these kinds of workers, which again goes back to historical precedent.

From the Emancipation Proclamation to newsboys being contract laborers, to seeing what kinds of work we can both devalue and treat as unskilled, meaning it doesn't need protections or count as training, for example. So it's interesting to bring this all to bear now, because that's really what we're seeing, despite the fact that there may not be all these protections.

One of the things I liked about the book, too, is that you say this is a fundamentally hopeful account of folks approaching work, making the best of a situation, but also picking the pieces of work that they can. And whether their strategy is to gain skills for their resume or to be quote-unquote between jobs, you make the good point that that's a distinction that's going away. There's not so much being in between jobs in this world.

And that's fundamentally, I think, a different way of thinking about work than most people have. In terms of this being a hopeful account, can we catch up? As a global society, that's a very big ask. But how do we help people catch up to this idea that the nature of work has changed, that we're thinking more about being in between projects versus in between jobs, for example?

Mary Gray: [00:41:25] Yeah, and I hate to say it, but I feel like COVID-19 has accelerated, or at the very least amplified, what's not working about our current orientation to employment. In the United States particularly, we had a set of assumptions about, say, unemployment insurance. It turned out you had to be working in an employment relationship to be able to tap unemployment insurance.

So what do you do if you're a contractor? We had to scurry to figure out, oh gosh, those folks also work. They are effectively self-employed, but we don't have an approach to self-employment that also values that work as a contribution to several different employers. So we didn't have a way of figuring out how we hold each other accountable for the needs we have right now, or just simply healthcare.

We literally treat healthcare as a benefit of employment. We don't treat it as a public good, a public necessity, and that is painfully obvious and working against our healing right now. There are some basic things here, and I would say it's mostly recognizing that we have to have a new paradigm for how we think about the social contract that should be attached to every working adult. Because often we're looking for things like, how will the market fix this? How will the market make sure we don't have downward pressure on wages, for example, so that somebody can't underbid me because they might be more desperate or in a place that isn't as expensive to live?

And I would argue that what we need is a new paradigm that asks: what are the responsibilities to people contributing to the output that we now all consume, which is often services and information? So if I take that as a starting point, not what do employers owe their workers, but what do society and entities like companies owe in order to draw on the labor of working adults? If we start with that question, then we can move forward with: oh, we need something that assumes people will need to take a break at some point, and there's not a single employer of record who will be responsible for that.

What's an approach to that? One of the approaches we offer in the conclusion is to think about portable benefits, where for every minute, quite literally, that you contribute, you could imagine a set of funds set aside, so that minute is matched. So it's not so much that you can take a break tomorrow, but that somebody like you can take a break tomorrow, right? It's not attached to you as a worker; it's more broadly attached to how we support working people.

Adam Gamwell: [00:44:58] Yeah.

Mary Gray: [00:44:58] And again, I'm a relentless optimist, but what I hope, what's theorized through looking at the case of people doing this work, is that we could reorient to work and approach the question of how we support working people anew, because that's what we did at the Industrial Revolution. It's just that we ignored sets of people who were working people. We ignored contingent laborers. We ignored immigrant laborers, who were always on the outs of many of those factories until there was enough advocacy to get them in. We ignored women, because they were not visibly in the workplace, even though they were shoring up other people's ability to go to work. So it's literally looking at all the people who were left out, not by intention, although arguably there was intention along the way by some actors, but by the cultural force of the assumptions we made about who deserves a good working environment.

How do we create a good working environment? All of that. It's reorienting to: okay, what do you do to make sure somebody can work in an environment where there are many employers of record, not one, where there's no single work site, and where in many cases you will never meet your collaborators in person, but you will need ways to connect and have a conversation about what it is you're producing? Where you have a right as an individual to know where your work is going, because it is still your output, your creative output? So if we start asking those questions, and that to me is the hardest part, right, it's to say our old answers don't work anymore because we have to ask new questions. Anthropology was built for that.

Adam Gamwell: [00:46:52] Yeah, I share your optimism in that, in the nice mantra that another world is possible. And part of that comes with asking new questions and being willing to do that. So what is your hope as an anthropologist specifically? I know you're more than just an anthropologist, but having an ethnographic mindset, thinking about questions of culture, and taking a more holistic approach, how does that help us put those new questions out there? I imagine a lot of people listening are anthropologists, I know they are, but others are social scientists in general. What does this perspective do to let us ask new questions, I guess, is the way to think about that?

Mary Gray: [00:47:25] Yeah, my message is that we have a lot to contribute. We know that we're gathering a sense of the detail of how people live their everyday lives, and I think we're often tasked with examining the problems in their lives. I know I wasn't taught as an anthropologist to necessarily offer a possible intervention, and if anything, our history as a discipline is riddled with places where that was either poorly carried out or deeply fraught. Intervention in warfare is a terrible application of anthropology, I would argue. But a place to learn how to intervene is to say: I'm looking at the way people are living their lives, and I can see where things are made harder precisely because there are a set of assumptions about what's easy or who matters. That's where we as anthropologists are trained to look at how people make sense of the world, and it gives us this refraction into what the world assumes and takes for granted about where people fit in it.

And I can make that concrete for technologists. For building out technologies to date, and I think we're about to change this, we've been working with a set of assumptions that people are individuals, choiceful agents, with behaviors that are entirely psychologically bound. And we are just beginning to enter a world in which computer science and engineering, the core disciplines for building technologies, are coming to grips with how much it matters to look at people's social relationships. Because in fact, when they build technologies, particularly networked technologies, they're not building them for humans as individuals. They're often designed for social connection, for sociality itself.

So they're building relationships, and they've not been trained to orient to relationships and those unwritten, tacit rules that bind us. Well, anthropology is really well trained to bring those to our attention.

It's not that we document them so much as we can get a handle on the ways people are operating within all of the invisible sets of rules in place, right? So we're in a really good position to collaborate. For me, the collaboration with technologists is precisely to move us all to a place that does not essentialize people as individual actors.

But instead opens up a new set of questions that recognizes it's a pretty small set of moves where a person is operating from core instinct; most of what we're doing, I'd argue 98% of what we're doing, is operating within cultural logics. And it's always an indefinite move.

I don't control my environment, but I move through it, and showing how people move through their environment is the beginning of being able to build out systems that address their collective social needs. Until we as technologists, to throw myself into that group, see people as part of broader groups and circumstances, we will fail the needs they have. And labor and employment is such a pressing place for us to get it right.

Adam Gamwell: [00:51:30] I think that's incredibly powerful. And to your point, I think what adds such importance and weight to your work is precisely the reality that you are working as a technologist in addition to your training as an anthropologist. So this question sounds a little canned, but I'm really interested in it. As a public intellectual, you're known to the general public through your writings and your talks, and you have a lot of views on your work, and obviously getting a MacArthur award doesn't hurt that either, which is amazing.

So congratulations there. The funny thing is, this is rare among anthropologists. Oftentimes a lot of our expertise, like the culture concept, gets popularized by non-anthropologists, and I'm curious about this with you and technologists. So what has led you to make the public impact that you're making? I'm hearing that it's around the idea of technology and anthropology, but how do you think about that?

Mary Gray: [00:52:23] I came up at a time in anthropology, and particularly came through what was at the time the Society of Lesbian and Gay Anthropologists, which became the Association for Queer Anthropology. And it was being surrounded by other people who, like me, were first and foremost political activists. Most of those folks were working on HIV and AIDS and the politics of AIDS internationally.

And so I think from my earliest days, what drew me to anthropology was both what the discipline does, and I love the four-plus fields approach to anthropology, the breadth at which we can learn how complicated humanity is, and a way into thinking as a political organizer, which at the time I was, as a queer youth organizer. It leads us into having a rigorous, methodical way of interrogating how the world is working and what the possibilities are for changing it. So I found my way into anthropology and was always surrounded, from my earliest days, by other anthropologists who felt the immediate need for anthropology to serve. And it's funny, I think you might've mentioned this a little bit earlier.

I can't remember if it was on record or not, but you think about something like design, or other fields that are more critically applied. It's the politics and the economies of higher education that have kept anthropology from being what I think it's best built to be, which is an intervention.

Adam Gamwell: [00:54:05] Yeah. Yeah.

Mary Gray: [00:54:06] So that's how sociology got in there as a bit more engaged with the public. But anytime we make that distinction between applied anthropology and the unmarked category of anthropology, we should be suspicious of what's happened there. So that's what motivates me: I actually believe that anthropology is best designed as an intervention, but to an earlier point, it's an intervention built on more than just methodical, rigorous standards for interrogating a problem.

It's there to theorize. It's there to generate new questions. It's a theory-building exercise, and I resist anybody saying there's some necessary division between theory and application, because the best applications are theoretically rich and insightful, and they leave room, are supposed to leave room, for us to say: it's not done. What else can we think about the world?

Adam Gamwell: [00:55:07] Yeah, no, I think that's beautiful and right on. And it's funny because we don't make that distinction with biology, for example. 

Mary Gray: [00:55:13] Yeah, I know. Oh my gosh. You're so right. I hadn't thought of that. Yeah, yeah.

Adam Gamwell: [00:55:17] Wait a minute, yeah. So why don't we do that with social thought? It's like, wait, human nature, quote unquote, you can't experiment with that.

Mary Gray: [00:55:23] Oh my gosh. Yeah, that's interesting.

Adam Gamwell: [00:55:26] Chapter two! Mary, I want to thank you so much. This has been an incredible conversation, and I really appreciate you taking the time. I know you're incredibly busy, and this has been super enlightening and enriching, so many thanks for sharing with us.

Mary Gray: [00:55:36] Oh, Adam, thanks for the invitation. I'm so glad to be on this podcast, and I'm so glad that you have it. I hope you bring anthropology to the world, because it needs it.

Adam Gamwell: [00:55:46] All right. Let's do it together. Yeah. It's a team. It's a team effort, isn't it?

Mary Gray: [00:55:50] That's right. It's a team effort.

 


Mary L. Gray

Anthropologist, Senior Researcher, Author
