
Safe Space - Talking Trust and Safety
What does it take to make the Internet a safer place? More importantly, who does it take? Safe Space explores the human side of trust & safety—featuring the leaders, thinkers, and builders working to protect online communities. Hosted by John Starr, VP of Operations and Strategic Impact at Thorn, this podcast dives into the personal journeys of those shaping the future of the digital world—one policy, one decision, and one conversation at a time.
Presented by Safer by Thorn.
Safe Space - Talking Trust & Safety: Yoel Roth | EP 1
In this candid conversation, Yoel Roth, VP at Match Group and Twitter alum, shares his personal and professional evolution in Trust & Safety. Host John Starr, Thorn’s VP of Strategic Impact, asks Roth to reflect on his interests in the early internet days and lessons learned while shaping the safety landscape of today's biggest platforms. Discover how a curious grad student became a pivotal figure in online protection, and learn why Roth believes we need to shift from reactive moderation to proactive safety by design.
Yoel brings a fresh perspective to hard questions about what we can do better as a field and gives his take on why the perception of T&S as the "internet janitors" just cleaning up the bad stuff needs a serious update.
Learn more about Thorn's technology for good at safer.io
- I am super excited to be joined by my friend and the current VP of Trust and Safety at Match Group, Yoel Roth. We are here for TrustCon. Are you excited?
- Yes. It is the event of the year, every year, three years running.
- It's true. How many sessions are you speaking at this year?
- Just two. But I feel like at TrustCon, the most important thing is always the hallway track. There's an hour or two where you're on a panel or whatever, and then many, many hours of reconnecting with colleagues and friends and catching up after a year.
- For sure. So, as we mentioned, we're here at TrustCon, and we're actually experimenting with some video content that we're bringing out to all of you. We're here, of course, to talk trust and safety, but at a slightly different angle. I'm not intending to go into the weeds on any sort of trust and safety outcomes, or go back and forth on the merits of new regulation. I'm genuinely interested in talking to the people who make up the space, who helped form the space: their stories and their journeys into trust and safety. I think that's one of the most interesting facets of the space. And we have maybe one of the most interesting, if not the most interesting, people in trust and safety right now with us today. So we're gonna talk to Yoel about his journey into the space and what he's learned along the way. And look, whether you're a seasoned pro, you're new to the space, or you're interested in trust and safety and trying to get into it, I hope you'll find this interesting. So, Yoel, let's start with where you are now at Match Group. What does a VP of Trust and Safety do? What does your job look like there? Those listening may not fully understand the complexity, especially of your organization. Give us a little sense of your day to day and your role.
- Yeah. Match Group builds dating apps. Our mission is to help people in the world connect with each other safely and authentically, and find love and relationships and whatever else it is that people are looking for. We are the parent company of some of the biggest apps in the dating space, including Tinder and Hinge and dozens of other services worldwide. And that's the first thing folks should know about my role at Match Group: it's not one company or one app, it's 40 different apps. Each of those apps works a little bit differently, and each of those brands approaches trust and safety questions in slightly different ways. Sitting where I do at the central level, our job is really to help coordinate all of the work happening across Match Group on safety and integrity issues and make sure that we're all building in a positive direction. That we're taking the lessons from each of our brands and socializing them portfolio-wide. And that we're continually pushing forward the state of the art of trust and safety across every one of our 40 apps and services. Easy job. You know, I've been in the role about six months now, and I'll say every day I am learning something new.
- Yeah, yeah. That actually is a really good segue, from a very well-known trust and safety pro to maybe one of the most well-known Twitter alums. Sinéad McSweeney, shout out to Sinéad if she's watching, did a TED Talk where she talked about her life growing up and how people always would ask her, what do you want to do when you grow up?
And she really posited this idea that Twitter didn't exist when she was growing up. The role that she had didn't exist. And that's very true for a lot of people in the trust and safety field. So take us way back. What did Yoel want to be when he was growing up?
- I wanted to be a paleontologist, because I just loved Jurassic Park. But you know, I grew up when the internet was first becoming a really mainstream phenomenon. I remember when we got our first dial-up modem at home. I remember when we got our first broadband access at home. And I remember, as a middle schooler and a teenager, the way the internet helped me connect with people in the world who shared my interests, whether it was arguing about politics or video games, or, I played the cello, I found communities of folks who shared those interests. And it felt magical from the very beginning, from the weird moments of AOL chat rooms and ICQ and message boards and LiveJournal. There was something about that ability to connect with people that was transformative for me, and that I felt was going to be one of the defining features of my life. I didn't know exactly what form that would take. But especially when I was in high school, I was starting to figure out my sexuality. It was something that was so important for me and for how I developed that I realized I wanted to spend my life trying to make sure as many other people as possible could have the kind of positive, transformative experiences of technology that I did. And that took a lot of different forms over the course of my career. But I feel like the unifying thread is: the internet is magical. How can I preserve as much of that magic and push back on as much of the dark side as possible?
- I think that's really interesting. So let's maybe take a beat here. You went from that sort of revelation, or that acknowledgement, to getting your start in this space in academia. Can you talk about what that was like? And also, was the field of trust and safety formed then? Or was it something we were still struggling to find the vocabulary for?
- Certainly I was struggling to find the vocabulary for it. You know, I went into academia because, you could say, it's the family business. Nearly everybody in my family is a Dr. Roth. And so when I graduated from college, it was sort of a question of, well, are you getting a PhD or are you going to law school? And I made the foolish decision to get a PhD and studied what I was interested in, which was the internet. When I was in college, something crazy happened, which was that the iPhone came out, and I was an early adopter. I got one on day one. To me the iPhone again felt like one of these sea-change moments for technology, and I got really interested in what the rules were for the iPhone. Specifically, I kept seeing these moments where some things were or weren't allowed on the App Store, and I kept asking the question, well, why is this the case? I read Walter Isaacson's biography of Steve Jobs, in which he recounts some conversations he had with Steve about his ideas for the App Store. And he said, this being Steve Jobs: you know, I don't like porn, and so I wanna give people on the iPhone freedom from porn, and therefore the App Store doesn't allow porn. And I was just struck by the absurdity of that, right?
Feel however you wanna feel about porn, but this device is in my pocket, in your pocket; billions of people have smartphones. Every one of those billions of people is affected by a decision that was made because Steve Jobs didn't like porn. And the legitimacy of that choice, and the system of governance underlying it, struck me as something worth studying. And so I worked on a PhD about that.
- Very cool. So you worked on your PhD, you defended your dissertation, and then you went to Twitter.
- My first time at Twitter was actually while I was still working on my dissertation. As folks who have come from an academic background will know, everybody who's on the journey of writing a dissertation at some point hates it. And I reached that point of absolutely hating my dissertation. So I thought, what can I do for a summer that isn't writing my dissertation? And, I don't recommend this as a career move, but I applied for one internship, at Twitter, which was a platform I really liked. And for whatever crazy reason, the people at Twitter decided to hire me.
- Now, was this the time I missed your interview?
- Yes. One of my interviewers was John, who totally missed my interview.
- Not my proudest moment.
- You know, it happens. But I had the opportunity to move to San Francisco and spend three months at Twitter just researching safety and working with the team that I learned was called Trust and Safety. I didn't really know what this work looked like from within a company before, but I got to meet some of the people doing it in the earliest days of the company. I got to participate in their discussions and their debates. I got to do content moderation hands-on and feel, emotionally, what it's like to have to make these decisions. And at the end of my summer, I realized there was nothing in the world I wanted to do more.
- Wow. So can you give a little bit of color? Because I totally remember that time, and putting aside the fact that I totally whiffed, I was doing something really important.
- I'm sure, I'm sure. And you were very gracious. Single-handedly dealing with ISIS or something.
- Single-handedly, yeah. And so I'm literally DMing you, apologizing and hoping to get on another call with you. But putting that aside, I think you did a really great job of describing, and I'm paraphrasing a bit, how you essentially saw what your future could be, with the humans and the problems and the intersections. Was there a moment during that internship where you thought, oh yeah, I found it? Can you give us a little bit of texture?
- So every Friday at Twitter we had something called Tea Time.
- Yes. Familiar with this.
- It typically did not involve tea. It usually involved wine and beer. But the idea was that the whole company would get together, executives would talk about strategy and what we were working on, and we'd all hang out and have a good time. And I remember that at one point while I was an intern at Twitter, there was a user protest. I forget what exactly our users were mad at us about; it definitely related to our handling of safety issues. And I remember thinking: what if I download all of these tweets and classify them, and just start saying, it's not a user protest, it is a protest about this specific thing.
And so I developed a content classification taxonomy, broke down the thousands of tweets that were part of this protest, and did what grad school had trained me to do, which is produce data analysis. And I said, here are the top 20 themes in this protest conversation. What was really surprising was that it wasn't people who were just nebulously upset. It wasn't even mostly about our policies; it was about the product. It was: we want the block feature to do this, and today it doesn't do that. Or: we want search to work this way and not that way. I didn't know if anybody would care about this analysis. I sent it to my boss and was just like, hey, here's the spreadsheet, maybe this is interesting to you. And then I'm sitting at Tea Time that Friday, and Dick Costolo, who was the CEO of the company at the time, says: our trust and safety team just did this incredible analysis of user discussion about what people want our safety features to be, and we're gonna do that. And I was just flabbergasted. There was this analysis that I had done in a couple of days that had made its way to the CEO of the company, and there was the CEO saying, we're going to do what our users are telling us, because one random dude in trust and safety listened to users and wrote it down.
- Very cool.
- And I went back to graduate school, and then over the next few months I watched feature after feature after feature come out at Twitter that were the things on my spreadsheet.
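For readers curious what this kind of exercise looks like mechanically, here is a minimal sketch of counting themes over hand-labeled tweets. Everything in it, the theme names and the sample data, is hypothetical, not Twitter's actual taxonomy or data.

    # Minimal sketch: rank the most common themes in a set of
    # manually classified tweets. All labels and data are hypothetical.
    from collections import Counter

    # (tweet_text, theme) pairs produced by hand-labeling against a taxonomy
    labeled_tweets = [
        ("blocking should hide my tweets from the blocked account", "block behavior"),
        ("why does search still surface abusive replies?", "search filtering"),
        ("reporting needs fewer steps", "reporting flow"),
        # ...thousands more in the real exercise
    ]

    # Count how often each theme appears
    theme_counts = Counter(theme for _, theme in labeled_tweets)

    # The "top 20 themes" view that went up the chain
    for theme, count in theme_counts.most_common(20):
        print(f"{theme}: {count}")

The hard part of the real exercise is the hand-classification that produces the labels; the ranking itself is trivial, which is how one grad student could turn the analysis around in a couple of days.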
- So that's really great, and really insightful, I think, about the culture that was there then, and the experience that a lot of people had there. So you go back to school, and then you come back to Twitter. We could spend a day in here talking about that. Full disclosure: Yoel and I worked together at Twitter.
- I learned everything I know about trust and safety from John.
- Yeah. And one of the things that I think is true about Twitter is you have kind of different chapters there, as you did as well. Try to give us a little bit of a flavor of your time at Twitter. What did you learn, and what were some big takeaways that stay with you from there?
- Yeah. You know, one of the defining features of Twitter culturally was that it was a company very willing to give smart people the space to explore what they thought was impactful or interesting. And I was really fortunate to have leaders and mentors at Twitter, like you and like Del Harvey, who would listen when I came to you with something that I thought was a problem and then say, okay, go explore that. For example, when I started at Twitter, I got really interested in data licensing and the Twitter APIs, which was this total niche area that nobody was really paying attention to. I thought it was kind of cool and kooky, and so I figured, what if I paid a little bit of attention to this in, like, 20% of my time? So I started to build out some policies and some tooling. And then about a year later, Cambridge Analytica happened, and all of a sudden everybody in the tech industry was thinking about APIs and data privacy issues. And Twitter was ahead of that, because the company gave me a little bit of space to just follow an issue I happened to be interested in. It was such a defining thing at the company that at every juncture in my journey, and I think the journey of lots of other people, we had limitless opportunity for what we could work on and the freedom and the flexibility to explore those issues. And so one of the takeaways for me was: don't just do the things that you already believe are going to be the most important or the most significant, or the things everybody wants to work on. If you're interested in some niche issue, go work on it, explore it, see if there's something to it, 'cause you don't know what's going to be the issue a year or two from now. And it might well be that niche issue that you worked on today. The other big defining moment for me at Twitter was realizing just how eclectic and diverse the service is. In my academic life, the fancy word for this was "polysemy": services are what people make of them. And Twitter was really defined by that. So many of the things that make Twitter what it is, from the hashtag to at-mentions and replies, were user-derived innovations. They were things that the community on Twitter built, and then the company helped implement them in the product in certain ways. And I always thought that was really magical, right? It was not that this product was our top-down vision. It was more like we were stewards of this thing on behalf of the community, and we were building something that would respect what the voices on Twitter wanted, and the diversity of those voices, the different ways people were using our product, from music to sports to TV to politics to anime discussions. All of that really taught me to focus not just on what most people are doing, but also on what marginalized folks are doing, what the non-dominant voices are doing, and how we can build a product that serves them as well.
- Very cool. And just to maybe tie a bow on your journey: when I heard you went to Match Group, it wasn't super surprising to me, especially given what you did your dissertation on and what interests you, which we didn't really talk about. But it wasn't shocking to me. Can you maybe give some color on why you said yes to them?
- Yeah. I've found dating apps fascinating for my entire life. I can count on one hand the number of dates I've been on in my life with someone I didn't meet online. I met my husband on an app. And I think dating and love and romance and relationships are a foundational and universal part of the human experience, and today a lot of that experience plays out online. For LGBTQ folks, more than half of relationships form online, and most people have at least tried online dating. So I think there's a ton of opportunity there to build products that not only are ubiquitous and quite widely used, but that also help make people happy. You know, I really loved the work that we did at Twitter to give people a space to debate politics and to have civically important conversations. And now I have the opportunity to work on products that help people find love. I think both of those are important, but they're different facets of the human experience. And so I like getting to work on both of them.
- Is there a piece of advice that you would give a young academic looking to explore or break into the trust and safety field?
- Work faster and work publicly.
- Ooh, tell me more about this.
- Academia is, for lots of very good reasons, all about the slow and methodical production of knowledge. It's about producing work that will stand the test of time, so that you write an article and 50 years from now a class of students reads it and says, damn, that's really smart. And that's important, but it's not necessarily what's going to drive impact in the real world. If you care about shaping what technology looks like, if you care about influencing policy discussions, if you want to make the products you use better: think faster and think scrappier. Publish your work now. Publish it open access. Seek opportunities to publish in journals with rapid turnaround times. And then really think about how you can share your work publicly as you're doing it. Be willing to put your rough drafts out there in public and get feedback on them. Feedback is hard, but approach it with humility and openness and a growth mindset. You can have way more impact by doing your work out in public, in the world, than by saying, I can only put my work out there after it's gone through three years of peer review.
- Really powerful: work faster and scrappier. Let's zoom out a little bit and talk about the space. From your perspective, what is the most misunderstood part of trust and safety, or of what it's like to be a trust and safety pro? What do you wish more people knew about us?
- I think the biggest misconception about trust and safety is that it's just censorship, or that it's just reactively cleaning up bad stuff after it happens. One of the most interesting developments in the trust and safety field, from when I first started studying it 15 years ago to now, is that we're not just the janitors of the internet. We're thinking about how to build products that are resilient to abuse up front. Julie Inman Grant, who's the eSafety Commissioner in Australia and a former colleague of ours at Twitter, has for years now been talking about safety by design as a core thing trust and safety needs to be doing. It's about making your products more resilient to misuse, and safer, through the very fundamentals of how the product is built. And I think that's incredibly important. It's not just about dealing with harmful behavior once it happens; it's about making products that encourage civility and respect and kindness and authenticity. And I think there are ways you can build a product that does that. That's such a powerful shift in the way we think about trust and safety. It's something we were on the leading edge of at Twitter, where we built one of the first teams that did this, shout out to Product Trust, and it's really where impact in the space is going to be going forward.
- Awesome. So if you're in trust and safety, you know that it is not a perfect science. It's a continuous improvement engine, always striving to be better. From your perspective, what's a part of the space where we could be 5%, 8% better?
- Yeah. One of the most important things we can do as a field is be more willing to share with each other what we know, what we've learned, and what hasn't worked. A lot of the expertise in the trust and safety field stays locked behind the walls of NDAs and within big companies. The amount of expertise that exists at Meta, at Google, YouTube, Reddit, Pinterest, Snapchat is immense.
We could solve all of the world's problems if we could figure out a way to pool our knowledge and our experience and our tooling and what we've built. Spaces like the TSPA and TrustCon are starting to build more of that community. But I think the direction of travel here is not going to be one company just getting incrementally better and better. It's going to be all of the professionals working in this field bringing their 5% improvement together with everybody else's 5% improvement. And then we can really start to chip away.
- What does the future of trust and safety look like?
- I'll give you two answers to that. The first one is safety by design. I can't stress enough that we have to start thinking about trust and safety work not as moderation after the fact, but as design before the fact. We've gotta be building a culture of trust and safety within product teams. We've gotta have every engineer working on product development thinking about safety and misuse. We've gotta be red-teaming products before we launch them, so that we can think about adversarial use ahead of time. I think that's one of the most important things we're seeing companies build out today, and I really see it as the future of the field. The second piece is something I've been doing as a side project since I left Twitter, which is helping to build out a hub for open source trust and safety tooling. One of the intuitions that I've had, and that a few other folks in the field have had, is that the technology we build, from hashing and matching systems to rules engines, is locked away within specific companies, and each of us who move between companies end up building the same things over and over and over again. What would it look like if we broke that cycle? What would it look like if we built a hasher-matcher and open sourced it, which is what our colleagues at Meta did? What would it look like if we built a rules engine like Smyte or BotMaker, tools that we had at Twitter, and published the code on GitHub so anybody could use it? It's that motivation that's leading myself, Camille François, who's a professor at Columbia, Dave Willner, Juliet Shen, and a group of us to start to build out a hub for accelerating this type of open source development. We think this can accelerate work at the big companies, and then, critically, we think it can accelerate work at small companies just entering the space, so that the entry costs aren't so high. If you wanna build an amazing new social product, you should be able to use free and open source tools that give you a safe experience from day one, without needing to spend years and years building trust and safety infrastructure first. And so I'm really excited about that as a future direction for the field.
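For a concrete picture of the hash-and-match pattern Yoel describes, here is a minimal sketch using exact SHA-256 matching. Production systems, such as the hasher-matcher tooling Meta open sourced, typically use perceptual hashes (e.g., PDQ for images) so that near-duplicates still match; this sketch only catches exact copies, and all data in it is hypothetical.

    # Minimal sketch of hash-and-match: fingerprint known-violating
    # content and check new uploads against the hash list.
    import hashlib

    def sha256_hex(data: bytes) -> str:
        # Exact cryptographic hash; real systems often use perceptual hashes
        return hashlib.sha256(data).hexdigest()

    # Hypothetical shared list of hashes of known-violating content
    known_bad_hashes = {
        sha256_hex(b"example violating payload"),
    }

    def matches_known_bad(upload: bytes) -> bool:
        """Return True if the upload exactly matches a known-bad hash."""
        return sha256_hex(upload) in known_bad_hashes

    print(matches_known_bad(b"example violating payload"))  # True
    print(matches_known_bad(b"benign content"))             # False

Because only hashes are exchanged, platforms can cooperate on known-violating content without sharing the underlying material, which is part of what makes this pattern a natural candidate for open source tooling.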
- Very cool. Just to double-click a little bit on the safety by design component: obviously Thorn is a nonprofit, and we're focused very clearly on a small but important part of the trust and safety landscape of harms. How does a company like Match Group or Twitter think about staffing, or having the kind of expertise, to approach a concept like safety by design with a child's perspective in mind, or a teenager's, or a young person's? Or in general, how do you think about capturing these very unique subject matter experts and tying them into your general process?
- Yeah. The most important thing you can do is talk early and talk often to a wide range of experts and voices in the field. Even a company 10 times the size of Match Group, 10 times the size of Twitter, or 10 times the size of Meta is never going to have every perspective represented internally. You can do your best to hire diverse teams, and you should, but you're never going to have a comprehensive perspective. So partnership with outside groups, including Thorn, who bring that wide range of intersectional expertise, is really, really critical. At Match Group, we have a trust and safety advisory council, which Thorn is a part of, where we talk about the policies we're developing and the products we're building, and we bring some of the big sticky questions about what we're building to the group. Not to just check a box and tell you, here's what we're doing, but to actually seek feedback and perspectives and questions. And I would really encourage every trust and safety practitioner to seek out those opportunities to find partners in spaces where you have blind spots. Again, don't treat it as a PR exercise, don't treat it as box-checking; treat it as a chance to approach these questions with humility and with a growth mindset.
- It's been such a pleasure talking to you, man. It's been a long time. I can't thank you enough for joining us, and I hope you have a really good rest of the week at TrustCon.
- Thanks for the invite. This is awesome.
- Hey guys, I'm John Starr, VP of Strategic Impact at Thorn. Thorn builds technology to defend children against sexual abuse. If you're interested in learning more about our organization or the solutions that we build, check out safer.io.