Safe Space - Talking Trust & Safety: Patricia Cartes | EP 3

Safer by Thorn Season 1 Episode 3

From aspiring United Nations interpreter to Trust & Safety leader, Patricia Cartes, Head of Trust & Safety at Cantina AI, takes us on her global journey into online safety. Thorn's VP of Strategic Impact, John Starr, explores Cartes' journey from translating spam websites to shaping content policies at tech giants.

Patricia traces T&S from its early days under "online sales and operations" to its current status as a flourishing field critical to business success. She offers a robust perspective on balancing free speech with user protection, drawing on her extensive experience in public policy and her global perspective. Her vision for the future of the field is one of determined hope, where growing professionalization and dynamic cross-industry collaboration pave the way to diligently address complex online safety issues such as child sexual exploitation.

Learn more about Thorn’s technology for good at safer.io

- I'm here with my friend and head of trust and safety at Cantina AI, Patricia Cartes. Patricia, thank you for joining us. It's TrustCon Week. Are you excited?
- So excited. This is the best week of the year.
- [John] Why?
- Because we all come together, all of our friends. Please don't tell my kids or my husband that this is better than their birthdays. But yeah, it's like we come together and I feel so energized. I think, all year, I live for this conference, and it reminds me why we work in this field, that there's a path forward, that I'm not alone.
- You are not alone.
- Which I forget. And so, yeah, I couldn't be more excited for the week.
- Well, I appreciate you taking some time out of your week to join us and chat with me. So we're gonna be talking trust and safety, of course, but more at a human level. We're not gonna, you know, debate the merits of any sort of regulation or talk about any specific outcomes. I'm really curious to talk journeys and pathways to trust and safety. So many people we've worked with over the last decade plus have had just very interesting journeys into the space. I think it's one of the most interesting parts about trust and safety and what makes this week so exciting and interesting, as you noted. And so I would love to spend some time talking about your journey and lessons along the way. You have had an incredible career, and I think you are gonna have interesting perspectives for us. Because the goal, whether you're a seasoned trust and safety pro or maybe you're just trying to get into the space, is that you find this interesting and helpful. So what I'd love to do is, maybe, can you tell us a little bit about what your role is now? Give like a three-minute commercial on what a day in the life looks like? And then we're gonna go backwards a little bit, because you have a really interesting trust and safety journey.
It's got public policy, it's got consulting, and I think it's gonna be really interesting. So tell us a little bit about what you do now.
- Yeah, I do everything under the umbrella of trust and safety at Cantina AI, which is a social AI platform, and that means that humans can interact with each other and they can also interact with bots. And so, as you can imagine, you have the challenges of human interactions, which, you know, by nature are difficult whether online or offline. But then you also have these bots that are hyper-realistic and that are becoming a part of our lives. And my role is to think through potential risks and to try to mitigate them. It's impossible to prevent, you know, every potential risk, but you can try to mitigate. I think that's our job. And so, you know, on a normal day I might be drafting policy in the morning. I might be engaging with a regulator in the afternoon about what we do with, you know, training data on AI models or bot prompts that might be harmful or illegal. And maybe in the evening, if I'm unlucky, I'm dealing with an escalation, because they come at all hours, and we are still invite-only, so it's really incredible. I think it speaks volumes about, you know, the complexity of trust and safety that even before you launch a product to the general public, you can still be as busy as I promise you I am.
- I believe it. I believe it. I believe it. And obviously that sounds like a very interesting spot and, you know, position that you're in right now. And so I want to get a little bit into how you got there. It's a really big job and I'm sure you've learned a lot along the way. And so one of the things that I want to anchor us with is we have a former colleague, Sinead McSweeney, shout out to Sinead, and she did a TED talk. And in her TED talk, she really, like, among other things, captures this idea of, we were all asked as kids, you know, what do you want to be when you grow up, right?
And she kind of turns that on its head a little bit, because she's like, well, the job she has didn't exist when she was a young lady. And so, to kind of take us all the way back a little bit: what did Patricia want to be when she was growing up?
- A translator. A translator and, more specifically, an interpreter, which is a simultaneous translator, in the UN. And that was--
- I'm not surprised by this. Go on. Tell us more.
- That was my dream. At age 10, I told my parents, I wanna be a translator and I wanna work in the UN with people from different countries, different cultures, facing different, you know, political issues, and I wanna be the conduit to communication. I'm sure I wasn't as articulate at that age, but to my surprise, my parents took me seriously. They were both academics. Maybe that's why they took me seriously. And they said, "Great, then let's get you to, you know, go to England and Ireland and Canada and France, and you're gonna learn English and French, and then we'll get you into college so that you can be on that path." But unbeknownst to them, one of the countries where they sent me was Ireland, and Ireland was leading a technological, you know, transformation at the time. The Irish government was very smart in investing in tech companies in the early 2000s, and a company called Google, in 2006, when I was not even out of college, found my resume and contacted me about a web spam job. And the main reason for that was that I spoke three languages and they needed people that could translate the websites and really understand the spam. And as with anything in trust and safety, there are cultural nuances. So I got started.
- So yeah, let's maybe kind of dig in a little here, because did you see that opportunity to achieve that UN perspective through this? Or, like, how were you viewing this?
Were you viewing this as a pathway to that ultimate place where you wanted to get?
- Yes, because, not to bore you with the world of translation, but to become a UN interpreter, I mean, I'm sure now this has changed, but for the most part you had to do a master's in either Geneva or Paris. And it was really hard to get into those master's programs. It required many years of experience in the translation field, and, me having fallen in love with Ireland, I thought, well, if I work in Ireland for a few years, I'll achieve, you know, the level of proficiency that's required to take the tests to go into those universities. And so it was a path. And I was really passionate about translation. I was also thinking about academia as a potential path for me. I was very into James Joyce, "Ulysses." And when I started working at Google in web spam, which again was like, I'm gonna translate some websites, I'm gonna assess them for spam, I still didn't know what trust and safety entailed. Trust and safety was not a term that was used. You know, we were under online sales and operations. There was no--
- Sales.
- Yes, online sales and operations was the org. There was not such a thing as a trust and safety org. We were also called search quality, and we would see attempts to manipulate the search results pages. So, you know, I'm gonna hide some keywords and I'm gonna rank higher. Now, that's what got me into safety. It was the breaking of the rules to achieve a result through means that were not legitimate, and I have this distinct memory of realizing: spam, like, fighting spam, fighting harmful uses of technology.
This is what I wanna do, and I can't picture a life where I wouldn't do this.
- And this was at Google?
- [Patricia] Yes.
- And so at Google, your goals and your view of what was for you in the future evolved?
- Yes, that's right.
- Okay, and so at what point, 'cause I know after Google, I think you went to Facebook.
- [Patricia] Yes.
- At what point did the vocabulary, maybe, or a version of the vocabulary that we use today become clearer to you, and you're like, oh, this is something, trust and safety? Like, walk me through that.
- I think it was 2009. I've been very lucky to be at the right place at the right time. And Facebook contacted me in 2008. They were about to open up the headquarters in Dublin and they wanted somebody to work in user operations, so still not trust and safety, but user operations. And user operations was looking at abuse reports filed by users. They were also doing other things like security, right? Like, I lost my password, how do I reset it? And that was the first, I think, notion of trust and safety, because of the community guidelines, which are, you know, the rules governing the site beyond the terms of use, which were very legalistic. We were starting to assess content against the community guidelines, and those community guidelines were evolving almost on a daily basis. I don't even think they were public when I joined in February 2009. And so it was a time of deep philosophical conversations. But also, most importantly, what I was hoping to do was bring the European perspective. It was very US-centric work, and Europe was in a very different place. I mean, a lot of content is illegal in Europe that is not illegal in the US. And so my role was to, as a, you know, I think I was 24 years old, but I would be like, "Hey, so I assessed this report. Based on the community guidelines, I shouldn't take action; however, I think we should maybe talk to the lawyers 'cause we might need to do something for my jurisdiction." And that was safety.
Like, we were calling it safety and community standards. And I think that's the first time that I kind of became aware that this is a field, it's rapidly evolving, and the challenges that I'm facing at Facebook, YouTube is facing as well. Twitter was starting to, you know, pop its head up, and I'm sure they were starting to face similar challenges. But nobody knew who was who unless you had worked at multiple companies.
- Got it. You started at Google, you explained the kind of evolution at Facebook and your UN hat, if you will, chiming in on the global perspective. Is this the time when you leaned into that more and continued to do, or started to do, a little bit of public policy work? And how did that get introduced to you? 'Cause hearing you talk about your childhood dreams or ambitions as a UN interpreter, like, I've seen you in those types of conversations before, so it's not surprising for me to hear that, but tell me a little bit about how that took shape.
- Yeah, I love that question, because it's such an unexpected path. I think in that year of 2009, I'm deep in the trenches of content moderation. I also become a manager for the first time. I'm bringing in content moderators to help me with France, Italy, Spain, those markets. And one thing happens. Facebook starts to get a lot of questions publicly about why did you take that piece of content down? Why didn't you take that one down? And the first person representing communications is hired. The first person representing public policy is hired. Lord Allan, he's the Baron of Hallam. He's still a lord.
He was not a lord at the time, but Richard Allan joins, and as soon as those two hires are made and there's a channel for those incoming queries, there's a need for somebody on the trust and safety side, which is still not called trust and safety, but it is trust and safety, to explain what has happened with any piece of content, but also how we should explain the nuance of why we made a certain call. Because at times a call may appear to be wrong, but if you take into account the context around it, it's not wrong in the moment. It's maybe that the policy was falling short there. And so that became my role: give me those escalations and I'm gonna look under the hood, and I'm gonna tell you, here's the action that we took and how I would speak about this publicly. And civil society got very interested, because the European Commission was funding two programs, Insafe, the safer internet centers, and INHOPE, the internet hotlines. Insafe focused mostly on objectionable content that targeted minors; INHOPE was child abuse material. And so these two big networks that have centers in each member state of the EU, you know, it's like, you know, player number two has entered the room. They entered the room and they're like, okay, so why did you take that down? Can we work with you? Can we give you more context so that you might not make...
- You were a translator.
- Yes, exactly.
- You were an interpreter.
- Yes, yes.
- Yeah. It's full circle.
- Yeah.
- That's super interesting. And it makes total sense. And so when did you leave Facebook and go to Twitter?
- I left Facebook in 2013, and, you know, you mentioned our colleague Sinead. She spoke at a parliamentary hearing here in Ireland on behalf of Twitter. I spoke on behalf of Facebook. And I remember watching her. She spoke before me and she got a lot of heat from the members of Parliament. And I thought, well, she's got a big challenge on her hands. How fascinating. That would be so difficult.
And I remember getting home that day and saying, I could never work for them. That's so much work. And then I joined them.
- You started in a few weeks?
- Yes, a few weeks. I got an email that same night and--
- Stop it.
- Yes, I did, from the human resources team. And again, I found the challenge fascinating, because Twitter was a very open platform. There was a lot more open communication, fewer direct messages. I think at that time you couldn't even share images in direct messages. So, because it was open communication, it was challenging. It also had been used as a tool during the Arab Spring, so it had already shown so much potential for human rights defenders and advancing democratic causes. And I thought, wow, with that great power comes great responsibility. I couldn't possibly be in the driving seat of that in any capacity, but precisely 'cause it was so exciting, I couldn't really, you know, turn my head away.
- Yeah, is there a major takeaway that you have? Obviously we worked together when I was at Twitter, that's when I met you. Is there a takeaway from that point in your career that you have as, like, a big learning or a big moment?
- Yes, there's a few. You know, coming from Facebook, there had been a lot of internal discussions about what Facebook should be, or do we want it to be a family-friendly site. And that showed in the way that we moderated content and set the rules. Twitter was very pro free speech. And that in itself brought its own challenges. But what I take from that time is: you can do free speech, with all of the challenges that it brings, if you have a very dedicated team. Every person that I met during my time at Twitter was incredibly dedicated, really wanted to push the boundary on free expression while at the same time preventing the worst of harms. And I hope you let me borrow your phrase, but it really has inspired the rest of my career: they didn't count on us.
You know, when I think of the bad actors that might think, because of free speech, you know, I'm gonna get away with this. You know, like, free speech taken to an absolutist perspective, you know, there's no rule of law. And that team that was so dedicated, I think, in my view, regardless of whether we made the right calls in terms of, you know, moderating content or building systems, there was a dedication and a passion to prevent harm that stays with me today, and that's why I keep doing this job. I just think of every person that I worked with during that time, and I think we really hit a sweet spot and we could stand really proudly behind the decisions that we were making.
- Well said. I'm curious, is there a piece of advice if there's maybe a young woman or young man who wants to be a translator for the UN, perhaps, or maybe wants to be, you know, head of trust and safety at an AI company, and they're just beginning their journey? What piece of advice do you have for them?
- I would tell them to persevere. I started from the bottom. You know, when I first joined Google, they would tell you, you need to review X hundreds of websites a day manually, and I would do double that. That was my goal for the day: I'm gonna do double and I'm gonna learn all of the technicalities of spam. If you're dedicated, if you don't mind, you know, rolling up your sleeves, you're gonna learn a lot on the front lines, and you need to persevere because it's tough. You see content that is very challenging. You see issues that seem like you can't solve them. You're never gonna solve them. It's like, how are we ever, as a society, going to solve this? But you are part of a collective, and there's a lot of power in that collective. And that's why this week is so exciting to me. I mentioned earlier you sometimes feel like you're working alone, but I turn around here today at lunchtime and I talk to somebody and they're facing very similar challenges.
And so I would say, you know, roll up your sleeves, persevere, make connections. At every company I've, you know, worked for, I've made a point of trying to connect with as many people as I could, whether it's in data science or product or engineering. I would be this crazy person that at lunchtime would sit at random tables. And that's hard, by the way. I know I might seem like an extrovert, but I have some introvert tendencies, and putting myself out there at a lunch table when I don't know people is really hard. But getting to know people, making those connections. And even outside of trust and safety, I think people are very passionate about the challenges that we face, and you can, you know, harness that interest in the connections that you're making. But don't give up, and even though you'll see problems that seem unsolvable, work with others, rely on others, and persevere, 'cause this can be your community, and I hope that, you know, you find a home in this work.
- I love that. So over your career you have worked a lot externally, engaging with civil society, regulators, government officials. What do you think is the biggest misconception? Generally, humans outside of this space, but I think it's appropriate to ask you: what's the biggest misconception policy makers have of trust and safety?
- I think it really boils down to the scale of it and the complexities and the nuances of any piece of content or account in a specific setting. Like, if I give you an example of a content moderation call I have to make, I present some facts to you and you might say, yeah, I would take that down, or I would suspend that user. But sometimes, when you take a step back and you look at the scale, and you look at the fact that, well, that content moderator maybe has just reviewed 500 pieces of content before that and they have a couple of seconds to make a very complex call. Or there is, you know, a geopolitical factor that is coming into play.
All of those nuances are really hard to capture. And, you know, I have conversations not just with policy makers and regulators, but even with my own family, where they'll say, like, oh no, you should allow that graphic artwork on the site. Because why wouldn't you? It's a drawing, right? Take L'Origine du monde, which is a beautiful painting: well, so what, you should allow it. And then I have to provide that nuance of: but I have 13- and 14-year-olds that might encounter that piece of content even though they didn't ask to see it in any way. Somebody might share it and it pops up on their feed. And so that nuance, I think, is sometimes lost in translation. The good news is that over the last, you know, 10 years, when I think about the public policy roles I've had at Facebook and Twitter, regulators and policy makers have become very sophisticated in their understanding. And also we have, you know, new pieces of legislation that give them more access to risk assessments and audits. So I think we're on a good path, but really, you know, I would say that it really boils down to the context in which our work takes place. It's sometimes impossible to articulate that well.
- Yeah, so much of trust and safety is being able to understand the perspectives of the other humans engaged, culturally, globally, contextually. I totally agree. So I've never met a trust and safety pro that is, like, home run, you know, perfect. I feel like the concept of continuous improvement is something that has really been core to trust and safety. So first, with trust and safety specifically, what's one way you'd like to see the space get 5% better?
- I mean, here I'm gonna talk about Thorn a little bit, because, you know, last year at TrustCon, Charlotte Willner, the executive director, who I've worked very closely with in the past, brought up, you know, child sexual abuse material.
We would encounter an image and, okay, we can handle it for our platform, where we're a walled garden, but how do we talk to the other platforms? And it was very discouraging that there felt like insurmountable legal challenges to sharing, you know, information and best practices with people in the industry. When I look back now, I am in awe at the amount of collaboration that exists. You know, classifiers like Safer have completely transformed the industry, and it's not an exaggeration. I wish somebody had told me back in, you know, 2010, don't worry, there's going to be a solution, a technical solution, that is going to be put together by a third party that we trust, that has a lot of expertise that you lack, you know, from the inside. And I would say, for the future of our industry, we need a lot more of that collaboration. And that's one example, child sexual exploitation, but what about non-consensual nudity of adults, or hate speech? You know, those are more difficult, because I think we can all agree about the legal status of content of that first nature, and it's much trickier and much more sticky when it comes to other speech issues. Like, what is considered hate speech in Spain, where I grew up? You know, I grew up in the eighties and nineties with a lot of terrorism, and we had some pretty strict laws, we still do, around what you could say publicly. That's very different from what I can say in the US. And so, while it is difficult, I think sharing intelligence signals and working together, having not just TrustCon but the Integrity Institute working with civil society, Insafe, INHOPE, that I mentioned earlier, are really good examples of that. They gave us the chance to engage as an industry. I met people from industry sitting at conferences that were put together by the European Commission. And I think that's how we're gonna get that 5% better: to continue to think through how we can collaborate, because we can't do it alone.
And also, even if we tackle a challenge on our own platforms, we're never going to eradicate that abuse at the internet level.
- True. We talked about some of your aha moments for your career, or for your path, or your journey. In trust and safety, we all have our oh-no moments. Was there an oh-no moment for you over the years? And you don't have to get into specifics if you don't want to, but was there an oh-no moment for you where you're like, oh, the game has changed, in just the kind of adversarial nature of things or the evolution of products? I remember being at Twitter and learning that the company had just purchased a livestream app, and that was an oh-no moment, you know, for me, just having to get up to speed really quickly on what that would mean. Was there one of those for you?
- Yeah, I've had a few. I've had a few. Yeah, how much time do you have? But one that I distinctly remember is, I think you were part of meetings with the Trust and Safety Council we had at Twitter. And we decided we wanna bring in civil society. At the end of the day, they are the experts on the different fields of abuse. And it would be great to keep them informed on any product feature we're developing, policy, et cetera, but also learn from them on a really regular basis about what they're seeing in their markets for that type of abuse. And I remember hearing about how one of our features, which was block, was actually being misused by abusers in a domestic violence setting, and that misuse we had not anticipated. And unless you were in the weeds, working with law enforcement, with victims of domestic violence, you probably wouldn't have been able to anticipate it. We didn't. And that was a real oh no. I mean, of course, before you put out a feature, you think through how can this be misused? But there are going to be some edge cases that are too far out, like the mirrors in the car: you can't see them.
And I thought, if we had engaged the Trust and Safety Council earlier, perhaps we could have prevented that type of misuse, and there were victims already. And we were able to fix it really quickly, but there were already victims, and, you know, one victim is one too many. So that was a real moment of, okay, we need to do better. This is not just Twitter or Facebook or Google. This impacts anybody who's on those platforms, and it's our responsibility to really think about that adversarial, you know, red teaming, and where is the misuse of the platform gonna come from.
- Yeah. What's the future of trust and safety?
- I'm hoping, you know, TSPA. I was thinking about it this morning. I remember when the TSPA became an idea, because Adelin Cai, who used to work with us at Twitter, visited me at Postmates, a company where I was working on offline harms. And she brought it up: "I have this idea." I mean, it was just her and a group of very impressive people, and I was like, "I'm behind it. We're gonna join you. Absolutely." And now, four years on, I cannot believe it was just four years. It feels like the TSPA has been around for a long time. The Integrity Institute has been around for a long time. They haven't. And so I think, for me, the future of trust and safety is a lot more of this. Like, our awareness that we are a field of professionals. I think we knew, but we now feel entitled to feel pride, you know, for the work that we do. We tend to be the people that are in the background. You know, when I meet parents of the kids that go to school with my kids and they ask me, like, what do you do? I'm like, I work in online safety. And it feels weird to explain what you do. And I'm hoping that through the TSPA, through even... Like, universities are developing incredible curricula for trust and safety too.
- [John] And you can go to college for trust and safety now.
- Yes, so going back to what Sinead had said, right?
- Yes. Exactly.
It's evolved.
- Maybe my kids could see my work and say, "I wanna be a trust and safety professional and go to Stanford." And, you know, I mean...
- It's huge.
- Yeah. Who knows?
- It's huge.
- Yeah, so I'm hoping that we continue going down this, like, professionalization path. I don't know if that's a word. I'll make it up.
- I think it is.
- And just, you know, feeling that pride, because I think we deserve it. We see, you know, difficult content, we try to protect users. I don't know anybody in this space that does it with any misguided motivation. You know, like, nobody's here 'cause they wanna make millions of dollars. You're in this field because you want to prevent harm and you want the internet to achieve its potential, you know, its promise to society, and advance society. And so I hope that we get to do more of that as a group of professionals.
- That is a great way to wrap up the conversation. I found it to be very compelling, and I learned a lot about you today, which is so cool. Are you excited? Are you speaking at TrustCon?
- I'm speaking at a Birds of a Feather event about the DSA.
- Okay. (both laugh) On that note, thank you so much for taking time out of your week to join us. I really appreciate it.
- Thank you for having me. As you know, I'm a huge fan. I just said it. You've inspired so much of my career, and it's such an honor to, you know, get to talk to you and also be within the Thorn universe, because you are, don't tell anybody, but you are my favorite nonprofit and experts in the field of child safety. So thank you. It's really an honor for me to get to talk to you.
- I really appreciate it. Thank you.
