March 16, 2023
EP. 309 — How Smart Is AI? (Part Two) with Meredith Broussard
We’re back with the second half of our special two-part episode about algorithms and artificial intelligence, featuring returning guest Meredith Broussard. Yesterday, we got the T on ChatGPT, and learned why we can’t trust algorithms to be fair or objective. Today, Jonathan and Meredith get personal about their encounters with algorithms, and consider what an equitable tech industry could look like. Plus, deepfakes, TikTok, the Supreme Court, and Jonathan’s take on M3GAN!
Meredith Broussard is Associate Professor at the Arthur L. Carter Journalism Institute of New York University and Research Director at the NYU Alliance for Public Interest Technology. She is the author of Artificial Unintelligence: How Computers Misunderstand the World (MIT Press). Her work has been featured in the New Yorker, the New York Times, the Atlantic, BBC, Wired, the Economist, and more. She appears in the 2020 documentary Coded Bias and serves on the advisory board for the Center for Critical Race & Digital Studies.
Make sure to check out Meredith’s new book More Than A Glitch, out now from MIT Press.
You can follow Meredith on Instagram and Twitter @merbroussard, and at meredithbroussard.com. MIT Press is on Instagram and Twitter @mitpress.
Curious for more? Here’s a list of people, projects, and other resources mentioned in this episode:
Algorithmic Justice League (Joy Buolamwini)
Algorithms of Oppression (Safiya Noble)
Blueprint for an AI Bill of Rights
DAIR Institute (Timnit Gebru)
Deb Raji’s work
Design Justice (Sasha Costanza-Chock)
Melissa Murray’s work
Mimi Onuoha’s work
Race After Technology + Viral Justice (Ruha Benjamin)
Rumman Chowdhury’s work
Take My Hand (Dolen Perkins-Valdez)
The Markup (Julia Angwin)
Under the Skin (Linda Villarosa)
Weapons of Math Destruction + ORCAA (Cathy O’Neil)
Follow us on Instagram and Twitter @CuriousWithJVN to join the conversation.
Jonathan is on Instagram and Twitter @JVN and @Jonathan.Vanness on Facebook.
Transcripts for each episode are available at JonathanVanNess.com.
Our executive producer is Erica Getto. Our editor is Andrew Carson.
Our theme music is “Freak” by QUIÑ; for more, head to TheQuinCat.com.
Transcript
Getting Curious with Jonathan Van Ness & Meredith Broussard
PART II
JVN [00:00:00] Welcome to Getting Curious. I’m Jonathan Van Ness and every week I sit down for a gorgeous conversation with a brilliant expert to learn all about something that makes me curious. On today’s episode, I’m so freakin’ excited, because we are joined again by Meredith Broussard for part two of our conversation about artificial intelligence. If you have not had a chance to listen to part one, I’d recommend starting there—it’s literally the one right before this. We talk AI basics, ChatGPT, and that fucker Francis Galton. Today we’re going more in-depth on algorithmic bias, and what the future holds for artificial intelligence. Let’s get into that part of the story. Without further ado, here’s part two of our conversation, where we’re asking: How smart is AI? Part two! So in our gender episode of Getting Curious on Netflix, we did a sketch that riffed on a computer not being able to break free of the gender binary. I already know that this is true, but just in case you need a literal NYU professor to explain it: is our tech really hardwired for a gender binary and cisgender norms?
MEREDITH BROUSSARD [00:01:07] Hundred percent. Yes. The reason for this is that our tech reflects basically 1950s ideas about gender, because that’s when computers were made. Our systems, like, NYU core university systems, like our student management systems, were developed in the sixties, right. And so they have these really retro ideas about gender. So NYU has done a really good job of updating its computer systems so that now our students can have a range of gender options. And also our students get identified to professors, librarians, and what have you using their preferred name, not necessarily their legal name. And this was an extremely expensive and time-consuming change that only happened in the past couple of years, because people recognized, “Oh, wait, like, we should not be deadnaming our students in these systems, we should not be forcing students to identify as one of only two genders, because gender is a spectrum. People use different pronouns. People change their names. And it’s really important for our computational systems to reflect the reality, as opposed to forcing our students to fit into some narrow 1950s concept of gender.” I would love to ask you, if that’s okay, like, how do you experience tech systems that don’t have a representation of you, like, in the dropdown? Would that be okay?
JVN [00:02:47] Sure. But before we do that: the thing that blows my mind is, like, these bills in Arkansas, Oklahoma, even Texas, these, like, anti-trans laws. And the one in Tennessee that’s been proposed, it’s, like, being called the “anti-drag law,” but it’s really an anti-trans law, because it, like, criminalizes anyone who dresses in clothes that are not of their assigned sex and can be seen in public. Like, it’s written so vaguely that, like, anyone can be criminalized, much less, like, a performer.
MEREDITH BROUSSARD [00:03:14] Yeah, it’s terrible.
JVN [00:03:15] But that’s the point, it’s that these legislators are trying to legislate, like, against reality. Like, the reality is that trans people have existed forever. The reality is that young trans people do exist. Like, they’re not being groomed, they do exist. Like, young trans people have been existing, like, you know, kids, adults, everything in between, like, we exist. And you can legislate against it all you want, but you’re just legislating against a reality. The reality is that we do exist at all ages. People learn that they are trans at different ages. That is true. Obviously, like, society has, you know, a say in how someone has their gender expression, because obviously we all exist within our society, and how people treat you is going to have an effect on—more or less of an effect on—individuals. So, you know, that goes without saying. Now, to answer your earlier question: because I’m non-binary, I feel like some of these affect me less than our trans siblings, because when I think about the cost, like, the financial and emotional toll of changing your, you know, identification cards, dealing with schools, dealing with higher education, dealing with health care, all of those things are made infinitely harder by being trans versus non-binary, I would say. I do think it is harder. That being said, these are some of the ways that I have experienced, like, algorithmic bias from my gender identity. Because I have long hair, because I wear tights, because I wear a lot of, like, long t-shirts that cover me, like, when I go through an airport, like, without fail, if I ever go through a scanner, like, I’m always stopped for, ahem, a pat-down. Those things cannot tell, like, if there is something in my bun, they can’t tell my privates. They think that I am a woman, because I have long hair, because I’m wearing, like, dresses. So I’m always stopped. I’m always—and frequently I will be taken into, like, a separate room, like, away, because I have been flagged as, like, either being suspicious or, like, maybe hiding something in my hair or in my tights or in my ankle or in my fucking private parts. That has happened to me.
MEREDITH BROUSSARD [00:05:21] It’s dehumanizing.
JVN [00:05:22] Yeah, tons of times. Because I don’t fit into, like, the outline of what the thing reads is going to be a man or whatever. So there’s that. Another one, this is more of a funny one, but there was that TikTok filter recently of, like, talking to your younger self, and because I had, like, long hair and a beard, it didn’t know who I was. So it was—it gave me, like, lipstick and kept the beard underneath my, like, neck. But then it obviously did not look like me as a teenager. Like, as a teenager, I had short hair. I looked like a cis teenager, you know. So just being able to speak to your younger self when you’re trans or non-binary is a different thing. Like, I just looked at it and I was, like, “What’s it going to say I looked like as a teenager?” And I couldn’t help but laugh because it was just so funny. And then another one that more pisses me off, and frequently pisses me off, but again, champagne problems, is, like, if I go into a store, like, a high fashion store, and I want to buy a bag at Chanel, or I want to buy a baguette wherever I want to buy one of my, like, pretty bags from, without fail—I have never been to an Hermès, a Chanel, a Fendi, any of those high fashion places—not one that has, like, non-binary as an identification marker. You can’t put it, like, you cannot put that in your client profile. So every time I go to impulse buy a fucking expensive purse that makes my inner child happy, I’m always left a little bit, like, having to, like, confront my capitalistic bitch ass, because I’m, like, “These people won’t even fucking honor your gender expression.” And here I am, and, you know, I got to grapple with that shit because these fuckers can’t even be bothered to put, like, a non-binary option on their fucking form! So those are just three quick ways. Not that I’m upset about any of them. I—just kidding.
MEREDITH BROUSSARD [00:07:02] Yeah, you should be upset, like, I’m upset for you. I started thinking about this originally because when I was a kid and I started having to fill out forms, you know, the form would ask for race, and, well, I’m a Black woman, but my background is multi-racial. And so I would get upset that, like, there wasn’t a box for me to tick. And one of the things that’s really important as a, as a writer is to, you know, not only think about your own experience, I mean, I write a lot about race and technology, but also to think about the experience of people who are not like you. And so I started thinking about, “Alright, well, what is it like to not have a box for your gender identity?” And I was like, “Oh, that feels just as shitty as it does to not have a box for your racial identity.” And my God, like, the number of times that you get confronted with, like, your identity being erased inside technological systems, like, it’s, it’s maddening. So I write a little bit in the book about the airport scanners. Sasha Costanza-Chock, whose book Design Justice is one I would definitely recommend, writes about the airport scanners. So those things have a frickin’ button, okay? Like, a lot of them have a button, and the TSA agent will press the pink or blue button, like, when you walk through. And that’s how they determine what they scan for, which is, obviously, incredibly hostile to trans people.
JVN [00:08:32] That explains why I get stopped all the time. Yeah. So it’s, like, I should really just be wearing, like, jeans and a sweatshirt to the airport. Like, don’t ever wear what I would actually wear, because, like, you can guarantee you’re going to get stopped. So in our previous conversations, we’ve talked about biases against darker skin in AI. In your new book, you discuss this again in reference to medical racism. So how is medical racism influenced or reinforced by tech?
MEREDITH BROUSSARD [00:08:56] So all of the things we’ve talked about already, in terms of all the bad stuff about humanity being embedded in our AI systems, this comes up again when we’re talking about the technology of medicine. There is a really common misconception out there that Black people have greater muscle mass. This is a racist notion. It comes from a fetishization of Black male strength in the slavery era. And interestingly, this racist notion gets embedded in something like a measurement of kidney function, right. So bear with me for a second, I’m gonna chat about formulas. When your kidney function declines, it gets measured by something called an eGFR, an estimated glomerular filtration rate. And when your eGFR gets down to 20, that’s when you are eligible for the kidney transplant list, and you kind of hang out on the list and you wait for a new kidney. Well, the way that your eGFR got calculated, up until very, very recently, up until the last couple of years, was based on race, right. So because Black people were thought to have greater muscle mass, they had an extra multiplier thrown into the eGFR formula, so that Black people with kidney disease qualified for the transplant list later than white people. Right. And so when you develop an algorithm that evaluates somebody’s eligibility for the kidney transplant list, you have this racist notion embedded in the actual technology that is calculating whether somebody gets a lifesaving organ. I mean, this is medical racism.
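To make the multiplier concrete, here is a minimal sketch in Python. It is an illustration, not actual clinical software: the coefficients follow the widely used (and since-retired) MDRD study equation, which scaled the estimate by 1.212 for patients recorded as Black, and the patient values below are invented so the effect lands right at the transplant-list threshold Meredith mentions.

```python
# A minimal sketch, not clinical code: coefficients follow the since-retired
# MDRD study equation; the patient values are invented for illustration.

def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """Estimated glomerular filtration rate (mL/min/1.73 m^2)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212  # the race "correction": inflates the estimate for Black patients
    return egfr

THRESHOLD = 20.0  # at or below this, a patient qualifies for the transplant list

# Identical labs, identical patient: only the race flag differs.
for black in (False, True):
    egfr = egfr_mdrd(scr_mg_dl=2.6, age_years=55, female=True, black=black)
    print(f"black={black}: eGFR={egfr:.1f}, eligible={egfr <= THRESHOLD}")
# black=False: eGFR=19.1, eligible=True
# black=True:  eGFR=23.2, eligible=False  <- same kidneys, kept off the list longer
```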
JVN [00:10:54] And just to drive the point home, because it’s already pretty fucking clear: like, as we said before with, like, ChatGPT, you’re putting in parameters to feed this computer, to determine the statistics and the probable outcomes of the question that you’re asking. So if the algorithm is, “Does this person need a kidney transplant?” but it’s using 60-plus or 70-plus years of information from medical textbooks, and from medical things that have been, you know, literally just taken from old texts and put into this algorithm, like, no one is programming it to say, like, “Leave out the 50 years of the multiplier that said that Black people had a higher muscle mass.” And unless it is audited and explicitly programmed out, which my understanding is that it is not, it is a continued problem. Even if they don’t explicitly run it through, like, a non-race-based thing now, like, in the last few years, it’s, like, that part has been air quote “corrected,” but it still has information that wasn’t corrected or, like, fixed. Is that correct?
MEREDITH BROUSSARD [00:11:53] Yeah. Yeah. And you see this in all kinds of calculations. There was a quote unquote “race correction” in calculating which football players were eligible for the NFL payout for concussions, right. So Dorothy Roberts has written about this and Black players were considered to have lower mental capacity and so they were calculated to get lower payouts from the NFL’s concussion fund than the white players. Or we have things like pulse oximeters, which came to prominence during the pandemic because people realized that, “Oh wait, pulse oximeters work well on lighter skin and they don’t work well on darker skin.” Another thing that I write about in the book is an AI-based tool that Google put out within the past couple of years that claimed to identify skin conditions. And, you know, the idea was, “Oh, yeah, we could eventually, like, use a tool like this to diagnose skin cancer.” Well, guess what? It only worked really well on light skin. It didn’t work well on dark skin. So all of the horrible stuff in human history is embedded in our technological systems, which is why we just shouldn’t be technochauvinist. We shouldn’t have total faith in these systems.
JVN [00:13:14] And also, like, with the dermatology one, I was thinking, like, if you would have had a derm and an engineer there to be, like, “Hey, have we run this on other skin tones? Like, how well does it work on other skin tones?” It’s, like, there weren’t enough cooks in the kitchen in the making of the thing, you know. Like, if this technology is there to work for some people, then there is technology that would make it work for everyone. And if there’s not, then fucking put a lid on it until it does work for everyone.
MEREDITH BROUSSARD [00:13:39] Yeah, until and unless it works for everybody.
JVN [00:13:43] Yeah!
MEREDITH BROUSSARD [00:13:44] But then we can also look at medical textbooks, right? Like, look at the illustrations in medical textbooks: these are mostly people with light skin. Like, what is the training data for AI in medicine? Well, it’s mostly, you know, it’s mostly white people. Well, who has better access to health care? Like, we know that access to health care is race- and class-based.
JVN [00:14:07] So, Dorothy Roberts, you mentioned her earlier.
MEREDITH BROUSSARD [00:14:11] She’s so brilliant.
JVN [00:14:12] She’s brilliant. And she did, she did the work on, like, the NFL thing? Was she able to, like, take them to fucking task? Like, was she able to, like, tell them, “What the fuck!” and they fixed it? Or no?
MEREDITH BROUSSARD [00:14:22] It is my understanding that the NFL thing has been resolved. I should say one more thing about the kidney thing.
JVN [00:14:29] Please.
MEREDITH BROUSSARD [00:14:30] So this eGFR correction was the standard in medicine for a very long time. And so patients’ advocates spoke up about the inequality embedded in this calculation. And they did manage to get it changed in the past couple of years, which is great, and is absolutely a cause for celebration and, like, a triumph for algorithmic justice. What we do need to do now, though, is go in and audit all of the lab systems. Like, every single lab, everywhere in the world, that used this calculation needs to have its software and processes updated. And we also need to go into every kidney transplant list and, like, rearrange people, because, you know, Black and brown people have been pushed to the end of the list. And so, you know, they probably need to be higher up on the list.
JVN [00:15:25] I have another question on that, too, because, like, I mean, another place that jumped into my mind when you were telling me about racism in the kidney thing is that, like, aren’t Black women, like, three times more likely to die in childbirth in the US? And even with, like, pain, like, just how, like, the medical field treats, like, Black women’s pain, like, in childbirth, but also, like, just pain in general, versus their white counterparts. Like, is that just, like, good old-fashioned interpersonal racism, or is that also reinforced by, like, algorithmic bias, too? Like, does that kidney bias, like, work its way into, like—it must, doesn’t it?
MEREDITH BROUSSARD [00:16:03] I will say that researching medical racism was one of the hardest things that I did in writing the book, because the history goes so deep and because it’s just so pervasive. A couple books were really good resources for me as I was researching. There’s a new book by Linda Villarosa called Under the Skin, which is about the history of medical racism. And there’s a book by my friend Dolen Perkins-Valdez called Take My Hand, which is fiction. It’s historical fiction, and it’s about some girls who were forcibly sterilized in the South back in the day, because this was another thing that happened to Black girls. You know, there were these eugenicist ideas that said, “Oh, well, we should just sterilize them so that they don’t have babies.” Which, I mean, is horrific. Like, can you imagine being, like, 12 or 15 and, like, having your reproductive organs, like, forcibly removed from you, and, like, you have no idea what’s going on? And the girls who were the subject of the book, like, they were developmentally disabled, like, they really didn’t know what was going on. It’s just, it’s a heartbreaking, heartbreaking story.
JVN [00:17:20] Eugh!
MEREDITH BROUSSARD [00:17:21] Not only was that material really hard to research and to reckon with just personally. But then, like, the big idea is, “Okay, all of this bad stuff gets embedded in these algorithmic systems,” because people are just uncritically making these medical technology systems built on top of the racist medical systems. And so we really need to unpack the racism underneath in order to build better technological systems. And until we do that, like, our technological systems are not going to be fantastic. They’re not going to be inclusive, like, they’re not going to be as effective as people claim.
JVN [00:18:02] There needs to be, like, a racial social justice awakening in tech. Like, not a moment, but, like, a whole awakening that is persistent.
MEREDITH BROUSSARD [00:18:10] And I hope it starts with More Than A Glitch.
JVN [00:18:14] Well, I mean, I think you have been, you have been and continue to be—the amount of times that someone has talked to me about your work, like, not on my podcast, like, I hear about you in circles doing things that, like, they didn’t even know that, like—no. Your work is, like, you are. You are doing this.
MEREDITH BROUSSARD [00:18:29] My gosh, that’s amazing. Thank you.
JVN [00:18:30] You are doing the fucking goddamn thing. So where else does AI fail us when it comes to our bodies? Like, say, with, like, cancer detection. And if you’re comfortable, like, sharing with us, like, what have your experiences been?
MEREDITH BROUSSARD [00:18:44] Well, I can tell you my cancer story because at the beginning of the pandemic, I was diagnosed with breast cancer. So I had a total mastectomy and then the entire world shut down. So not only was I dealing with the pandemic, but, like, I was recovering from, like, frickin’ cancer surgery. It was terrible. And, you know, people, people often react to cancer diagnoses in unexpected ways. I got even more myself, which is to say, I started making spreadsheets and I started researching obsessively. And during the pandemic, instead of just chilling out and recovering and, you know, figuring out life after cancer, I decided it was going to be a really good idea to research the frontiers in AI-based cancer diagnosis. So what I did—and I should say I’m fine now.
JVN [00:19:50] Yes!
MEREDITH BROUSSARD [00:19:51] Go get your mammogram. Early detection is key. I am so grateful to all of my doctors. They were completely amazing. I got great medical care, and I’m just so, so grateful. But I also did this weird thing where I took my own mammograms and ran them through an open-source cancer-detection AI, in order to write in the book about the state of the art in AI-based cancer detection. I did this because I was reading my chart one day. You know how you can, like, get into your electronic medical record and, like, read all the, you know, all the little bits. So I saw this note that said, “Your scan has been read by an AI.” And I was, like, “Oh, wait, what? What’s that all about? What did the AI think about my scans?” Like, I was really fascinated. And so it sent me down this rabbit hole of, “Well, what’s going on with that?” So I have a whole chapter in the book about, “What is the state of the art in AI-based cancer detection? What did the AI see when it looked at the insides of my boobs? And what is the future of AI-based cancer detection?” It works really well in certain situations, was my finding. And then it doesn’t work in other situations. And so it’s not really ready for primetime. People like to talk about it as if, “Oh yeah, radiology, like, that’s a profession that’s going to be replaced, like, you know, next year.” But it’s kind of like self-driving cars: the promise is more exciting than the actual reality of it.
JVN [00:21:24] So did it, like, misdiagnose yours? Or did it not quite get it right or, like—
MEREDITH BROUSSARD [00:21:30] So one of the things I realized was that I was completely wrong about my concept of what the AI would do. Like, I thought that I was going to take my entire electronic medical record and all of my scans, and I was going to feed it in. And then the AI would, like, look at all of the data and all of the pictures and, like, I don’t know, give me, like, an animated doctor that would, like, tell me, you know, “You have cancer.” Like, I don’t know what I thought, but, like, I definitely expected a lot. The way it actually works is totally different. You take a single picture, like, a single view, and you feed it in. The computer has been fed with, you know, thousands, millions of other views of breasts. And those breasts have been labeled. Like, there’s been a red box drawn around the cancerous part in that single view. And so what the AI will output is: it will draw a red box around an area of concern on your scan. That’s it.
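For a sense of how narrow that output really is, here is a minimal sketch of the workflow described above: one view in, one red box out. The detect_lesion function and the file names are hypothetical stand-ins, not the actual open-source model Meredith used; the box-drawing part uses the real Pillow imaging library.

```python
# A minimal sketch of the "red box" workflow; detect_lesion and the file
# names are hypothetical stand-ins for the real model and data.

from PIL import Image, ImageDraw

def detect_lesion(image):
    """Hypothetical detector: returns one (left, top, right, bottom) box and a
    confidence score. A real model is trained on many views labeled this way."""
    return (120, 80, 210, 170), 0.87

scan = Image.open("mammogram_view.png").convert("RGB")  # a single view, nothing else
box, confidence = detect_lesion(scan)

draw = ImageDraw.Draw(scan)
draw.rectangle(box, outline="red", width=3)  # the entire output: a box, not a diagnosis
scan.save("mammogram_flagged.png")
print(f"area of concern flagged, confidence {confidence:.2f}")
```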
JVN [00:22:37] So did it predict your area of concern right?
MEREDITH BROUSSARD [00:22:40] It did! So impressive, right?
JVN [00:22:43] But it’s, like, a little bit, like, yeah, like, not as comprehensive. Like, it wasn’t gonna be, like, “Your treatment is this, and do that, and, like, it’s more here, and it’s probably this type, and it’s going to grow slower, faster.” Like, it was pretty much just, like, “This is the spot that the doctor should check.” So, like, a little bit more basic at the moment.
MEREDITH BROUSSARD [00:22:59] Yeah. And it also was the case that when you are confronted with something like cancer, you want a doctor, you want nurses, like, you need people. The idea that you would just, like, walk up to a machine and, like, feed in your mammogram, and it would put a red box around something and then just, like, not tell you, “Okay, this is what we think it is. Like, this is your prognosis.” Like, when you get diagnosed with cancer, you need somebody to tell you, “Listen, you are going to die. But not today and not from this.” Like, that’s, that’s a helpful thing to say. I mean, if you’re me, that’s a helpful thing. So maybe it’s, you know, maybe it’s hard for somebody else. But you need medical professionals to help you through this incredibly traumatic experience. And, you know, not just with cancer, with any kind of major diagnosis. I don’t think we really want just a computer telling you, “Oh, yeah, you probably have a problem.”
JVN [00:23:57] Yeah, I mean, of course. I think I also wanted to just quickly flag that, like, when we were talking about audits: it’s, like, even when people within the tech industry do flag an issue, like, in an internal audit, or, like, do raise, like, concerns, especially when it is, like, Black women engineers or, like, people of color within the tech industry, they are often met with, like, scorn and, like, disbelief. And we saw that in that one—that nice lady from Google, and then, what was her name?
MEREDITH BROUSSARD [00:24:30] Timnit Gebru?
JVN [00:24:31] Yes, yes, yes. So it’s, like, even when there are whistleblowers, sometimes the powers that be, like, really seek to silence them, because there’s just a lot at play here with, like, stock prices, like, when things get out, like, it’s just—everyone’s always worried about, like, the worst thing. And then when someone’s, like, “Hey, wait, we need to, like, fix this one thing,” it’s, like, it’s not even fixed. And then it’s actually made worse, because no one was dealing with what the whistleblower said. So how have recent layoffs in the tech sector—we’ve been hearing about that a lot—like, how has that set back DEI efforts?
MEREDITH BROUSSARD [00:25:03] I think that when you are laying off a lot of people, you are generally undermining your DEI efforts. The tech industry has, has not done a fantastic job of making itself more diverse over time. They have talked about it a lot. They have made a lot of promises. But we haven’t seen the needle move. We have seen the needle move in academia. So, like, in data science programs, for example, we have much better gender balance than we used to. You know, we have, like, women in engineering programs. In the past couple of years, we have seen some major funding efforts going into data science and computer science at HBCUs. I think the tech industry needs to do better. Like, I’m not, I’m not really willing to say, “Yay! Tech industry!” yet because they just haven’t done enough. Like, when they do, I will be the first one to say, “Bravo!” But, like, we’re not there yet.
JVN [00:26:15] Until then, we are not getting a “You passed this course” from Professor Meredith Broussard, honey. So you’re going to work a little bit harder. So I would say this: I had leased a Tesla, because Bobby used to have one and I was, like, “Oh my God, these are cute. I feel like I’m stuck in an iPhone.” But then Elon Musk started saying all that crazy shit about trans people and pronouns, and he just started talking a bunch of, like, weird conservative shit. So I fucking revoked that lease, turned that fucker in so fast, got, like, a different goddamn electric car. So fuck that guy. He’s a dick! And that’s the only way I knew how to hurt him, honey. I was, like, “I’mma hurt that bottom line more than I’m going to hurt him on Twitter. So I’m just going to, bye!” And I took it away from there. But I did think, in one weird sense, he kind of did us a favor, because we could see when he took over Twitter how much the owner of a company can change the parameters, the goals, the objectives of said place. And I really can’t think of a time when a company of that size has changed hands in such a public way, and we’ve all been able to kind of see that, like, it really is, like, a before and after. And I think that’s kind of interesting. So what do you make of that? I mean, like, just the whole, the fuck?
MEREDITH BROUSSARD [00:27:30] I think from an algorithmic bias perspective, one of the greatest losses at Twitter was Twitter’s META team. Which, you know, sounds confusing because Meta is Facebook, and then there was a META team at Twitter. But META was the algorithmic accountability and algorithmic auditing team. It was run by Rumman Chowdhury. It had some really amazing thinkers. They had people who were really thoughtful about issues of algorithmic bias. They were doing things like running this bug bounty, this algorithmic bug bounty, where the idea was they would give you one of the many, many Twitter algorithms, and teams would look at it and try and find algorithmic bias problems, and then they would get a bounty, a reward, for finding the problems. I mean, what a great idea, right? The Algorithmic Justice League, Dr. Joy Buolamwini’s organization, has published a big report about using algorithmic bug bounties. I think this is a really promising option for finding problems inside algorithms.
JVN [00:28:35] Okay. Actually, I had another question about that, and you maybe just answered it. But when you are auditing, like, do you use the platform and then take the data from using the platform? Or to do a true audit, do you have to go into the code of the platform itself? Like, is it from the data of, like, interacting with the outside of the platform, or do you actually have to go, like, inside? Would you have to have, like, access into the admin of Twitter, access into the admin of IG, to do a true audit? Or do you just interact with the app from, like, a gajillion different accounts?
MEREDITH BROUSSARD [00:29:08] Alright, so I’m going to try and restrain myself from, like, getting all nerdy and excited. I’m going to, like, take a deep breath and not, like, start talking about code or start talking too much about code, because this is a really good question. And, like, I can talk for like an hour about this. So you can do an internal or an external audit. Journalists generally do external audits because as journalists we don’t usually have access to the internal systems, but people inside a company can do internal audits. And so both are valid methods. One of the things that I worked with Cathy O’Neil’s firm ORCAA on is developing an algorithm auditing platform, where the idea is you’d be able to hook it up to any algorithm and kind of figure out what’s going on inside it. Right now, algorithmic auditing is kind of a bespoke process. It’s not an automated process. And people, people often do want to audit Facebook or Instagram or Twitter or Google, which I totally get, those are the platforms that are most familiar to us, but those are the most sophisticated algorithms, and it’s actually really, really hard. So, like, Google, for example, every time you do a Google search, there’s something, like, 250 different machine learning models that get activated. So you’re actually auditing, like, 250 things simultaneously. Like, I don’t know about you. My brain can’t hold that in.
JVN [00:30:40] I couldn’t even do one!
MEREDITH BROUSSARD [00:30:41] Yeah. So it’s, it’s a lot easier to audit something that’s smaller. Like, say, an algorithm that is used to grade student essays. Like, remember when there was a writing section of the SAT? There was a move to try and have computers read the essays that students wrote and grade them, which, PS, I think is a terrible idea. So if you are starting out with auditing, you want to start with something small and easy to comprehend, as opposed to starting with something incredibly complicated like Instagram.
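To make the internal-versus-external distinction concrete, here is a minimal sketch of an external audit in the spirit described above: you never see the code, you only send inputs and compare the decisions that come back across groups. The query_model function and the applicant data are invented stand-ins, and the four-fifths ratio shown is one common screening heuristic, not the whole of auditing.

```python
# A minimal sketch of an external (black-box) audit; query_model and the
# applicant data are invented stand-ins.

def query_model(applicant):
    """Hypothetical stand-in for the system under audit (imagine an API call)."""
    return applicant["score"] > 0.5

applicants = [
    {"group": "A", "score": 0.7}, {"group": "A", "score": 0.6}, {"group": "A", "score": 0.4},
    {"group": "B", "score": 0.55}, {"group": "B", "score": 0.3}, {"group": "B", "score": 0.2},
]

# Selection rate per group, measured purely from the outside.
rates = {}
for group in ("A", "B"):
    members = [a for a in applicants if a["group"] == group]
    rates[group] = sum(query_model(a) for a in members) / len(members)

ratio = rates["B"] / rates["A"]
print(rates)                         # roughly {'A': 0.67, 'B': 0.33}
print(f"impact ratio: {ratio:.2f}")  # below ~0.80 flags possible disparate impact
```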
JVN [00:31:23] That makes sense. There are two Supreme Court cases right now that involve big tech. Can you tell us about them? And, like, are they really going to upend the entire Internet? And what do you think needs to happen with them?
MEREDITH BROUSSARD [00:31:34] It’s so interesting to me that people are talking about these two cases like it is the end of the Internet. Right. Like, there’s all this drama around them. They mostly center on CDA 230.
JVN [00:31:48] It’s, like, suing tech companies, isn’t it?
MEREDITH BROUSSARD [00:31:50] Yeah. So CDA 230, it’s Section 230 of the Communications Decency Act, which was something that was put in place at the very, very beginning of the Internet era. It was put into place when our model for technology was the telephone. Back in the day, we didn’t hold somebody like Verizon or AT&T responsible for the content that was, you know, that was delivered over phone lines. Tech companies, back in the nineties or whatever, said, “Okay, well, we don’t want to be liable for the content that is delivered via computer lines.” And so that’s kind of at the heart of 230. Well, we’re in a different era now. We’re in an era where, you know, platforms control almost all, you know, speech on the Internet. These specific cases are looking at, “Did platforms enable bad stuff?” In one case, it’s a question of terrorism. And what they’re trying to decide is, “Okay, do we need some kind of modification to CDA 230?” you know, in response to these cases. But overall, do we need to modify 230? Yes, we do. How? I don’t know.
JVN [00:33:12] Because if the Supreme Court says that it does need a modification, do they say that it’s unconstitutional as written? And then, can the Supreme Court send the law back to Congress? That might be a question for someone else, but it’s, like, I wonder what the possible outcomes of the case are.
MEREDITH BROUSSARD [00:33:26] If you want to watch tech lawyers absolutely lose their minds, get them into a room and then start talking about CDA 230 reform. Like, their heads will, like, literally explode and they will get so worked up. And, you know, sometimes when I’m talking to a tech lawyer, like, I’ll just kind of, you know, get their take on it, because it’s really great to hear somebody rant.
JVN [00:33:51] We should do that for a separate episode.
MEREDITH BROUSSARD [00:33:53] You know, have you ever talked to Melissa Murray?
JVN [00:33:56] No.
MEREDITH BROUSSARD [00:33:57] Alright, she’s a law professor at NYU Law. She has a podcast. She live tweets Supreme Court cases, she live tweets The Supremes. And she’s completely amazing.
JVN [00:34:11] Ohmigod, we need to have her, we’re writing it down as we speak. We’re rounding third base. We are doing it. This is going to be two episodes because there’s no world in hell that we can get this into one.
MEREDITH BROUSSARD [00:34:18] I love it!
JVN [00:34:19] There’s just too much stuff.
MEREDITH BROUSSARD [00:34:21] That’s what the experience of the book is like, by the way. Like, there’s just so much packed into it that it’s hard to say what it’s about in, like, one sentence. Like, if I did have to say, I would say the book is about the way that it’s necessary to think about human frailty inside algorithmic systems, to think about the bias of the human world inside the algorithmic world. But it’s really about, like, a ton of things simultaneously.
JVN [00:34:48] Which we love, because, I mean, I think one of my biggest takeaways from, like, just Getting Curious, period, is how interconnected, like, our world is and how much our history influences everything, like, our past informs our present on quite literally everything. You know, it’s devastating, fascinating, interesting, exciting. Sometimes healing, sometimes, you know, three steps forward, eight steps back. It’s all of it. It’s, like, it’s just such a big, large mixed bag.
MEREDITH BROUSSARD [00:35:21] Yeah, Yeah. 100 percent.
JVN [00:35:23] We’ve had, like, M3GAN, that one movie with, like, that robot who kills all the kids because she’s that other girl’s best friend. I have to say, I loved it. I took an edible. I lived my best life. I thought it was incredible. I was so thoroughly into it, like, I mean, I felt bad for that little boy when he got run over. But I was also, like, “Well, bitch, you shouldn’t have fuckin’ made fun of her. Okay? If you would’ve kept your fucking mouth shut, you wouldn’t have gotten taken out by that semi.” But one thing that I didn’t love, and was kind of scared about: deepfakes. How scared should we be?
MEREDITH BROUSSARD [00:35:55] I’m pretty freaked out by deepfakes.
JVN [00:35:57] Because they’re so believable.
MEREDITH BROUSSARD [00:35:58] They’re so believable. I am not a person who’s, like, “Oh, my God, it’s the end of the world, because we are going to, you know, only have simulated reality anymore.” Like, I don’t think it’s, like, some kind of singularity event. But I am concerned about the disappearance of truth online. I’m concerned about people not being able to trust what they see online. I am also concerned for creators. One of the things that happens with AI systems is people start talking about, “Oh, well, we’re going to use this AI in order to replace writers, or replace artists, or replace, you know, workers of whatever stripe.”
JVN [00:36:42] Models! Like—
MEREDITH BROUSSARD [00:36:45] Yeah. It’s a bad idea because people need jobs.
JVN [00:36:48] ’Cause you could do, like, deepfake movies, literally, if virtual technology gets good enough. Like, you could just, like, not even have actors, not have makeup artists, not have any of it. It could just be, like, 100% all-the-way generated from AI and, like, not even look animated. Like, it would look real.
MEREDITH BROUSSARD [00:37:02] Yeah. And I mean, do we, like, do we need that? Like, I don’t know that we need that. Like, I think that people need jobs. Like, I love creating things. I love writing. I love that writers and actors and makeup artists have jobs.
JVN [00:37:16] Yeah, I don’t want 3D printed fucking chocolates. I want homemade Meredith Broussard fucking chocolates. Keep your 3D printed chocolates somewhere else!
MEREDITH BROUSSARD [00:37:24] I know, I want to make my truffles. I want to send them to my friends. I do not want to be replaced by a computer.
JVN [00:37:30] So this is kind of going to be, like, end of game rapid fire. Are you ready?
MEREDITH BROUSSARD [00:37:35] I’m ready.
JVN [00:37:37] Okay, so let’s say we have a friend. Let’s say that that friend downloaded TikTok on their phone. Let’s say that friend is me. I have TikTok on my phone. I use it when I’m… all the time. Like, should I be scared? Do I need to actually get, like, a different iPhone, put it on that iPhone, and then get, like, one of those, like, pocket WiFis and only, like, attach it to that? Like, can the Chinese government literally see everything I do on my phone? Is that bad? Should I stop being so lazy about doing TikTok on my main phone? ’Cause I read that in an article. It’s not really rapid fire anymore. Is it bad? Am I okay? Are you shitting your pants, or am I shitting my pants?
MEREDITH BROUSSARD [00:38:07] Well, okay. So we should all be shitting our pants about various things—
JVN [00:38:12] Shit.
MEREDITH BROUSSARD [00:38:12] But we do not need to do it about TikTok right now. I will admit that I do often think about, like, setting up some kind of air-gapped, like, secure system, but that might just be because, you know, I think about this stuff.
JVN [00:38:26] I think about it, too. But then I’m lazy and I just can’t be bothered to do a different phone. And that’s how—I don’t even know how to do that! But, wait! Back to the question. So: TikTok, Congress wanting to ban it. Is this really just, like, the US being scared of, like, gay people, communists, like, just always having a new thing to be afraid of? Or are we actually afraid of TikTok?
MEREDITH BROUSSARD [00:38:49] I’m not personally afraid of TikTok. I think that there is a lot of US-China rivalry in Washington, and I think that this plays out in the tech realm. My read on the “Let’s ban TikTok from government phones” situation is that it’s about US-China tensions, and it’s some manifestation of that. I think that what people can do in general is kind of be more conscious about how algorithms work, and also how algorithmic targeting works and how targeted advertising works. And so once you understand more about that—
JVN [00:39:29] Okay, because these bitches are listening. They’re listening. It is not, like, there’s no—’cause, Mer, wasn’t it you that said that they’re, like, or maybe it wasn’t, I’m not trying to throw fingers in here, but someone had told me that it’s, like, you go to somebody’s house or whatever, and their IP address was looking for toothpaste, and then your phone connects to their, like, IP whatever. So, like, you might get a toothpaste ad for what someone living there was doing. But then the other day, I was, like, somewhere where, like, there is no fucking world in which they could have been talking or looking at what I was—I can’t remember, like, what I was looking for. It was something really random, some shit that had nothing to do with anything—it was, like, growing grape vines or something. I was, like, trying to figure out if I could grow something, and I wasn’t even anywhere. And then my phone just fucking pulled it up, and I didn’t even Google anything, and neither did Mark. Like, we literally were talking about it. And then I looked at my phone, and it was, like, the thing that was, like, on my fucking phone. So I feel like those bitches are listening. Like, what do you think?
MEREDITH BROUSSARD [00:40:27] I think it would be really convenient if they were listening, but I really don’t think that they are.
JVN [00:40:35] Hmm.
MEREDITH BROUSSARD [00:40:37] That probably was me about the toothpaste, because that totally sounds like something I would say because I definitely talk about IP addresses—
JVN [00:40:44] But how come they know what I’m thinking at all times?
MEREDITH BROUSSARD [00:40:47] Well, we are—
JVN [00:40:50] Basic. And predictable.
MEREDITH BROUSSARD [00:40:53] We kind of are. Yeah. Yeah. I was trying to find some nicer way to say it, but, like, no, we’re pretty predictable. And we are interested in the things that our friends are interested in—
JVN [00:41:05] Oh! And all we do is talk about gardening.
MEREDITH BROUSSARD [00:41:09] Yeah. So, like, if you talk about gardening with your friends, you’re, like, in the geographic proximity of your friends, and they can also do location tracking. So, like, if you’ve been to the garden store, your phone is tracking your location. Your location data is being sold. It gets tagged as somebody who’s been to a garden store. And, like, yes, people go to garden stores; like, yes, we think about things like rain barrels, and we think about things, like, you know, the fancier little things to, like, tie up your vines. Like, there’s not that many things out there in the world of gardening.
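Here is a toy sketch of that pipeline (location ping, broker lookup, ad match). Every venue name, tag, and campaign in it is invented for illustration, and notice there is no microphone anywhere in it.

```python
# A toy sketch of location-based ad targeting; every name here is invented.

location_pings = ["Joe's Garden Center", "Corner Cafe", "Joe's Garden Center"]

venue_categories = {  # a data broker's venue-to-interest lookup
    "Joe's Garden Center": "gardening",
    "Corner Cafe": "coffee",
}

# Tag the profile with every interest the phone's location history implies.
profile_tags = {venue_categories[venue] for venue in location_pings}

# An ad exchange matches those tags against running campaigns.
campaigns = {"gardening": "grape vine trellis kit", "golf": "driver sale"}
ads = [ad for tag, ad in campaigns.items() if tag in profile_tags]
print(ads)  # ['grape vine trellis kit'] -- hence the "they're listening" feeling
```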
JVN [00:41:49] Okay. Touché. Maybe you did get me to cross back from the conspiracy line. So I appreciate you for that. You’re always a good friend for that. I can’t believe that I’m saving my hardest-hitting journalistic question for, like, third to the end, or, like, second to the end. But my TikTok algorithm has recently become 80% people fighting and 20% Britney Spears conspiracy theories, also with a trace allowance of, like, Idaho murders, Murdaugh murders, with an even more trace amount of, like, beauty. But it’s mostly fighting, Britney Spears, murders. I don’t know what happened. It’s just what it is.
MEREDITH BROUSSARD [00:42:25] I think that one of the ways that we can kind of seize power back from the algorithms—
JVN [00:42:31] Yes…
MEREDITH BROUSSARD [00:42:32] —in our social feeds is we can use these apps, we can use these social platforms, like computer programs, as opposed to just, you know, using them as, like, things we do when we’re bored and we just, like, scroll mindlessly.
JVN [00:42:45] So I lost three hours of my life yesterday to this Angry Karens thing, because now the algorithm knows that I love seeing racist people, like, get either physically assaulted or, like, cussed out, like, ’cause I just, I love watching a Karen get her comeuppance. Like, it’s, like, finally reaching an itch under a cast that you’ve had for, like, a year. And then you can get the hanger down there, and it’s, like, you just get bathed in this, like, cold, nice, refreshing water. And then next thing you know, there’s another dumb bitch getting her fuckin’ comeuppance.
MEREDITH BROUSSARD [00:43:24] Yeah. And then the next thing, there are, like, scratches. And then, like, you get gangrene under your cast.
JVN [00:43:28] No, mine’s thriving. I’m not even infected. I mean, my husband would disagree, because, like, he was trying to, like, get me to, like, come help with dinner and, like, feed the animals, and I, like, was lost to space and time, like, zoning out. So we do need to wrestle back our humanity from these algorithms. So I appreciate you telling us how to do so. And then I interrupted, talking about my loss of life to algorithms. But yeah: what do we need to do to enjoy the apps without losing whole hours to them? How do we do that?
MEREDITH BROUSSARD [00:44:01] Yeah, we need to train the apps. We need to consider them as just computer programs that are more under our control as opposed to being, like, mysterious things that feed us suggestions.
JVN [00:44:14] So I need to not watch the Karen videos to the end so that it’ll stop thinking that I’m obsessed with it. Even though I am.
MEREDITH BROUSSARD [00:44:22] Right. So I mean, you need to, like, type in, you know, “Dutch poodles” and, like, watch, you know, videos of, like, poodles who live in Amsterdam.
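Mechanically, “training the app” looks something like this toy sketch: completion and likes are strong signals, so whatever you watch to the end is what you get more of. The weights and topics here are invented, not any platform’s real ranking code.

```python
# A toy sketch of engagement-based feed training; weights are invented.

from collections import defaultdict

interest = defaultdict(float)

def record_view(topic, watched_fraction, liked=False):
    # Completion is a strong signal; an explicit like adds more.
    interest[topic] += watched_fraction + (0.5 if liked else 0.0)

record_view("angry karens", watched_fraction=1.0)   # watched to the very end
record_view("angry karens", watched_fraction=1.0)
record_view("dutch poodles", watched_fraction=0.2)  # scrolled past quickly

# The feed ranks candidate topics by accumulated interest.
print(sorted(interest.items(), key=lambda kv: -kv[1]))
# [('angry karens', 2.0), ('dutch poodles', 0.2)]
# Deliberately watching poodle videos to the end is how you flip this ranking.
```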
JVN [00:44:30] So am I a nightmare, and am I basic? Is that just what it is?
MEREDITH BROUSSARD [00:44:34] I mean, we want to watch what we want to watch, right? I mean, that is one of the reasons that the Internet is so great: because, like, you can find this extremely, extremely specific stuff. Like, if you want to watch polar bear attack videos, like, they’re out there!
JVN [00:44:48] Yeah, how specific? Meredith, I’m obsessed with your brain. You have the most beautiful brain around. You really do.
MEREDITH BROUSSARD [00:44:55] Thank you so much. Thank you so much. Well, okay, so if you, for example, like, watched videos of, like, that guy who grows the giant pumpkins, like, that will cleanse the timeline.
JVN [00:45:07] He’s all over mine! I love him. Big Bear Swipe, actually, like, you know, it’s, like, fucking crazy. Mark literally got me two of Bear Swipe’s seeds?
MEREDITH BROUSSARD [00:45:16] No.
JVN [00:45:17] I have two of those seeds, and we are going to plant them. I am literally planting, like, Bear Swipe’s genetic babies.
MEREDITH BROUSSARD [00:45:27] Oh, my God.
JVN [00:45:29] I know!
MEREDITH BROUSSARD [00:45:30] Bear Swipe, by the way, listeners, Bear Swipe is, like, over 1,000 pounds. Like, this is a giant pumpkin, and it is an amazing thing. And, you know, like, the fact that I have seen a giant pumpkin? Total triumph of the algorithms.
JVN [00:45:43] Yeah, he is, like, one of my favorite followers. Like, he is so interesting, and I’ve learned, I’m actually going to, when I grow pumpkins this year, I’m going to do that gauze thing, like, where you wrap the flowers to swell the ovules and, like, make the sperm, like—the stamen, I always call it sperm, but stamen—like, he does this thing with whatever. We need to get him on the podcast. But you’re going to see me doing some of his stuff.
MEREDITH BROUSSARD [00:46:08] Yeah, you’re gonna do, like, you know, assisted reproductive technology, like, with your giant pumpkins, aren’t you?
JVN [00:46:15] Well, I’ve done manual pollination. I’ve just never done the thing where you put gauze on the outside to make them get, like, more giant. Like, the night that they’re about to open, when I would normally pollinate them, you put this gauze around it so that it stays closed, so that, like, the ovule, like, pumpkin thing, like, gets bigger on the bottom of it or whatever. It’s, like, so interesting. Okay, so! Final question, we made it: I don’t know if you can believe it. Is there a world in which tech can bring us closer together as humans?
MEREDITH BROUSSARD [00:46:44] I certainly hope there is. I mean, I feel like, you know, talking about giant pumpkins brings us closer as humans. I feel like kids who don’t feel like they fit in in their communities, like, find friends and mentors and role models online. I feel like tech does make the world more accessible to people with disabilities a lot of the time. But I feel like tech also has drawbacks. You know, it also reproduces inequality. And so I think that our conversation needs to change. We need to just not have these conversations about, “Oh my God, bright technological future!” And we need to dial it back. We need to be more realistic and say, “Okay, yeah, bright future, but also let’s not get so carried away that we forget all the problems of the past.” And I think we should also celebrate the work of people who are doing really amazing thinking about algorithmic justice issues.
So I mentioned Dr. Joy Buolamwini and the Algorithmic Justice League. Other people that I write about in the book are Safiya Noble, with her book Algorithms of Oppression, along with Cathy O’Neil’s book Weapons of Math Destruction. These were really, like, the books that kicked off mass public understanding of algorithmic justice issues. Ruha Benjamin’s book, Race After Technology, was really inspirational for me. Dr. Benjamin has a new book out called Viral Justice that is just incredible and will upend your understanding of, “What can a just world look like in the technological sphere?” Mimi Onuoha is doing incredible work. Timnit Gebru and Rumman Chowdhury I mentioned. Lots of amazing thinkers out there. One of the things in the back of the book is kind of a list of resources for, “Okay, if you didn’t get enough with, like, the 5,000 billion citations in the book, here are other things that you might want to read.” So I really hope that people will just kind of understand, like, how fascinating this is and, like, how this is a civil rights issue, and can get activated.
JVN [00:49:07] And then I lied when I said that that was my last question, because I do have one more last one. And I can’t remember whether I asked you this in part one. But when you were, like, a young baby Meredith, when you were, like, 16, 17, like, deciding on, like, you know, what you wanted to do when you grew up. Obviously, you know, sometimes I think about kids that are that age now, and I think, like, it’s just such an intense time. This feels like a confusing time. Feels like a pivotal time. I guess every time feels like a confusing and pivotal time, but it just feels like a lot right now. And one thing I think about hairdressing, for me, is that hairdressing and, like, that artistry and the community of hairdressers, like, the salon world, gave me, like, a North Star to keep following and to keep focused on when my world became unstable and shaky, and I was going through, like, all my personal stuff and, like, hardships. And I do think that a lot of times that’s why it’s so good to, like, love what you do, because it can be a stabilizing force even when other things, you know, are changing, and it gives you something to come back to that you love. So if there is a young person listening to this, or even someone who’s not young, and they have listened to our episodes and they are inspired by this: maybe they feel powerless to stop it, maybe they feel powerless that they can’t be a part of these conversations. But what if someone wants to make a career change? What if someone wants to, like, really make this part of their, like, life’s work, or they have a loved one that wants to make this their life’s work? What do they do? Like, if someone wants to—is realizing that they want to—grow up to become their version of Meredith Broussard, where do they start? What do they major in? Where do they study?
MEREDITH BROUSSARD [00:50:48] Oh, great question. There is now a field to go into. It’s called public interest technology. And it’s exactly what it sounds like: it’s making technology in the public interest. So sometimes that means doing things like what Julia Angwin does, interrogating algorithms and finding out how they’re racist or sexist or ableist or whatever. Other times it means something like working on government websites, making them stronger so that they don’t go down when there is a pandemic and, you know, millions of people are filing for unemployment simultaneously, right. So there are lots of ways that you can build technology in the public interest and get involved in the field, either from a building-technology perspective, or from being involved on the policy side, or, you know, getting involved in legal efforts around fighting algorithmic injustice. There is something at the university level called PIT-UN, the Public Interest Technology University Network, that is a really powerful network. So at NYU, I work with something called the NYU Alliance for Public Interest Technology, and we have run career fairs. We are part of PIT-UN. We do training for students and early- and mid-career scholars who are interested in accelerating their efforts around public interest technology. So it’s a really vibrant field. And so “public interest technology,” that’s your keyword. That’s going to get you in touch with the good stuff.
JVN [00:52:25] Meredith Broussard. Thank you. “Thank you” isn’t big enough, but, like, thank you for being so generous with your time and with your work. And also, sidebar, thank you for, like, making this your work. I think that you are doing, like, such revolutionary and important, like, mind-opening work for so many people. And I do feel more comforted, in the face of all of the weird stuff that is happening, that at least there are people like you teaching the children, the young people now. So not literally the children, but, you know, the young, like, literal academics who are going to be, you know, major in their own right.
MEREDITH BROUSSARD [00:53:03] Jonathan, thank you so much. It’s so great being here with you, just such an amazing conversation, it means so much.
JVN [00:53:09] We literally love you so much. We can’t take it. But you guys, if you don’t understand by now that you must read Meredith’s new book, More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech, I don’t know what to tell you. Because even though this was a special two-part episode, we’re still just, like, Titanic-style scratching the surface of the iceberg. Like, that iceberg didn’t even go. That is the tip of it, honey. Like, there’s so much juicy info, like, this is the tip. So, like, get into the whole book. Support Meredith’s work. We love you so much, Meredith, and thank you for coming on Getting Curious.
MEREDITH BROUSSARD [00:53:39] Thank you.
JVN [00:53:42] You’ve been listening to Getting Curious with me, Jonathan Van Ness. Our guest this week was Meredith Broussard. You’ll find links to her work in the episode description of whatever you’re listening to the show on. Our theme music is “Freak” by Quiñ – thanks to her for letting us use it. If you enjoyed our show, introduce a friend, honey, and please show them how to subscribe. Follow us on Instagram & Twitter @CuriousWithJVN. Our editor is Andrew Carson. Getting Curious is produced by me and Erica Getto.