Cathy O'Neil earned a Ph.D. in math from Harvard, was a postdoc in the MIT math department, and was a professor at Barnard College, where she published a number of research papers in arithmetic algebraic geometry. She then switched to the private sector, working as a quant for the hedge fund D.E. Shaw in the middle of the credit crisis, and then for RiskMetrics, a software company that assesses risk for the holdings of hedge funds and banks. She left finance in 2011 and started working as a data scientist in the New York start-up scene, building models that predicted people's purchases and clicks. She wrote Doing Data Science in 2013 and launched the Lede Program in Data Journalism at Columbia in 2014. She is a regular contributor to Bloomberg View and wrote the book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. She recently founded ORCAA, an algorithmic auditing company.
Emily Bell is Founding Director of the Tow Center for Digital Journalism at Columbia Journalism School, and a leading thinker, commentator, and strategist on digital journalism. The majority of Emily's career was spent at Guardian News and Media in London, working as an award-winning writer and editor both in print and online.
About the talk
After the 2016 US Presidential election, technology companies described how AI and algorithms would be deployed to 'fight misinformation' and safeguard democracy. Yet in 2020 those promised solutions spectacularly failed. In this 25-minute discussion, digital media expert Emily Bell talks with algorithm expert Cathy O'Neil about the problems of, and proposed solutions to, the online organization of extremist groups. In particular, they discuss what hasn't worked and cannot work, and the types of new thinking required to deal with this issue. #AIWontWork
SXSW dedicates itself to helping creative people achieve their goals. Founded in 1987 in Austin, Texas, SXSW is best known for its conference and festivals that celebrate the convergence of the interactive, film, and music industries. An essential destination for global professionals, this year’s online event features sessions, showcases, screenings, exhibitions, professional development and a variety of networking opportunities. For more information, please visit sxsw.com.
Cathy O'Neil: Hey, everyone, welcome. Thank you. And now I get to introduce Emily, which is really exciting. Emily is a journalist; she invented podcasting at the Guardian, and she's been running the Tow Center for Digital Journalism for a decade. I met Emily because I worked, for a very short amount of time, at the Columbia journalism school, starting up a data journalism program, and she recruited me. She obviously got me, and ever since then we've basically been either listening to bluegrass, drinking bourbon, or singing karaoke, or some combination thereof. And what's ironic is that even though she knows all sorts of things I really want to know, I've never actually had the opportunity to ask her these questions. So I'm really happy to be here. I have [inaudible] coming out next Monday, so if I started talking about it now, I don't think I'd stop. Let me preface my first question by saying that I feel like I might know a little bit more about misinformation and fringe groups than most, because of the research for my shame book. By the way, the new book coming out next year is called The Shame Machine, and for it I've tried to understand how people get into these groups at all. So I know more than the average person, but I don't know nearly as much as somebody who actually studies this stuff. Which means I have really dumb questions, and it's delightful that I get to pretend I'm just asking them for the sake of the audience. So here it is:
is this, literally, a new thing? I mean, how new is it? What about it is new? Is it just the scale that's new? I'll just start with that.
Emily Bell: [largely inaudible] ... it's not new; the purpose was to train influencers ... the mechanisms of spreading messages change every time you have a change of technology ... in certain geographies ... internationally as well ... the way media flows has changed dramatically ... misinformation can spread between populations ... [inaudible]
Cathy O'Neil: Yeah. I mean, I was going to talk about this a little later, but I'll talk about it now. This is the thing that
fascinates and scares me the most about this topic, which is that the techniques I use (I audit algorithms), the basic framework I employ, would be completely useless for this problem. To be clear, what I do happens in a given, narrow context; that's the job. Say an insurance company wants to decide how much to charge for health insurance, that type of thing. Then the stakeholders are clear. The stakes are high, but it's very narrow as decisions go. Who cares? What do they do? How do they define success, and how do they define failure? To what extent does this algorithm, in this context, pose an existential threat; that is, will it be really bad if it performs poorly in some respect? That matrix, the matrix of stakeholders against definitions of success or failure, is large, but it's a finite, two-dimensional set of intersecting concerns and stakeholders. So you can go through every single one of those boxes and say: this probably won't happen; or this might happen, we should keep an eye on it; or this is definitely happening, we should mitigate that problem, let's go do that; and if we can't do that, let's just turn the thing off, because it won't work. That's the kind of calculus we employ, and it simply does not apply here, you know what I mean? The stakeholders for Facebook are literally everyone. People who never signed on to Facebook are affected by the algorithm and by this misinformation. So the set is infinite. And for that reason I'm just like, oh my God. I'm going to dive right into my
second question, which is: way more people are influencers now, and they're not necessarily government-sponsored, or they might be sponsored by Russia or another country, which is troublesome for the platforms. What did the platforms say they were going to do? Because from my perspective this is an obvious problem; they keep saying they're going to do something, and nothing happens. Is that just the story, or is there more to it?
Emily Bell: [largely inaudible] ... these problems have been around ... in a way that enables you to scale ... expectations of what Facebook would do ... consensus around things ... social media as separate from mainstream media ... what they promised they were going to do ... [inaudible]
Cathy O'Neil: Yeah. Let me just add to what you've said from the algorithmic perspective, my very dry, nerdy version of it, which is that the algorithm profiled us and gave us more of what it thought we would eat. It was optimizing on the stupidest possible thing: how long do we spend on Facebook, how many clicks do we give them? Which is their profit, a straight-up proxy of profit that was easy to measure, and nothing else. So very dumb, very, very dumb, but with lots of information, lots of data to profile us and decide what to serve us that would keep us there. So, so dumb, but also very informed. And then 2018 rolls around, Zuckerberg goes before Congress, and this is when I got really interested in this misinformation question, because he says, AI is going to fix this. You know, we're working on AI, it's going to solve this problem. And that's when I'm like, oh
no, it's not. It's not going to work, and let me really quickly throw out why. A computer can play Go better than a human; computers play checkers and chess better than we do. And I was like, oh, that's really not that impressive, because those are very well-defined, tiny little universes where the definition of success is obvious and everyone agrees on it. Whereas here, the definition of success is the problem. The definition of success is keeping us on Facebook, and there is a very, very clear indirect result of that definition: serve us things we disagree about, serve us things we fight about, so that we are either so outraged that we stay on Facebook, or so obsessed and conspiracy-minded that we stay on Facebook to learn more. So the definition of success is, in fact, the problem here. But the real thing is that the Facebook algorithm (and I'm just going to focus on Facebook because it's an obvious focal point) never really thought about content in a complicated way, never really understood it in a complicated way, and in particular will never be able to infer truth from misinformation. There was no model for truth.
Emily Bell: [largely inaudible] ... when you fact-check things ... how many times do you want to prove something ... [inaudible]
Cathy O'Neil: This comes back to the question of media and policies. Their newest thing is, oh, actually, we're going to have a moderation board, and we're going to comb through it with people, because AI doesn't work.
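The "definition of success" Cathy describes, time on site and clicks as a straight-up proxy for profit, can be made concrete with a toy sketch. Everything below (the function name, the posts, the engagement numbers) is invented for illustration and is not Facebook's actual system:

```python
# Toy sketch of an engagement-optimized feed ranker (all data invented).
# The only objective is predicted time-on-site; truthfulness never enters.

def rank_feed(posts):
    """Order posts purely by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_minutes"], reverse=True)

feed = [
    {"title": "Local weather report",     "predicted_minutes": 0.5, "divisive": False},
    {"title": "Outrage-bait conspiracy",  "predicted_minutes": 4.2, "divisive": True},
    {"title": "Friend's vacation photos", "predicted_minutes": 1.1, "divisive": False},
]

ranked = rank_feed(feed)
# Nothing in the objective penalizes misinformation, so the divisive
# post rises to the top as long as it holds attention.
print(ranked[0]["title"])
```

The point of the sketch is that no term in the objective distinguishes true from false; content wins the ranking purely by holding attention.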
And as I understand it (and tell me if you understand it the same way), it's actually a twofold system; I think of it as two axes. The first axis is: did someone flag it as problematic content? That could be an AI or it could be a human, and different media platforms have different flagging systems. Which basically means: did you use the keyword "Nazi," or whatever it is? That's really dumb, because, again, AI doesn't have a model for truth. All it can do is check things like whether you are actually mentioning this term: keyword searches, or something almost as obvious as keyword searches. We won't have time to go deeper than that, but the point is, it's very easy to bypass that AI flagging system. And of course, as you point out, the paid influencers who are building propaganda are very, very good at gaming that system, right? That's what they spend all day doing. Sorry, just to finish my two axes: not only does AI not pick up everything that's actually bad, but if you deliver content to the people who want to hear it, those people
don't flag things as bad, even when they are bad. So that's one of the filters. The other filter is the policy itself. If content does get flagged and sent to this moderation committee, it's just a group of people, maybe a very fancy group of people if it's the Facebook Oversight Board, but then they have to compare what they're looking at to the policies, which again might have flaws as well. We know that they do; we know TikTok has been getting in a lot of trouble for taking down Black women talking about lingerie but not white women talking about lingerie in exactly the same contexts. So there's a bad-policy problem too. All I'm saying is that there is failure stacked on top of all of it. A piece of content has to be flagged and found to be bad before it's removed, so there are entire quadrants of bad stuff that's not even touched, the false negatives, but there's also a false-positive problem. My point is that it's a total mess, and we've come full circle, by which I mean: we didn't want to say what's bad; AI was going to solve that problem and cover for the lack of moderation; and now we're having moderation cover for the lack of working AI.
Emily Bell: [largely inaudible] ... is it actually illegal ... Donald Trump's Facebook page ... [inaudible]
Cathy O'Neil: Having said that, I'm willing to go there. You and I both have three sons, right? And what I find really amazing about my three sons is that all three of them are much more likely to identify fake content than my mother, and my mother, by the way, is a computer scientist who helped build the internet. She's very savvy
technologically, but she's not savvy in that particular sense. So it's kind of like herd immunity to misinformation, and the US approach is basically: let's just let everyone bathe in misinformation to the point where we finally figure out how to avoid it, except that obviously only the young people would survive. And then there are a lot of approaches that try to insist on accountability by the platforms, and all of them, as far as I'm concerned, either are completely fake, because they just gloss over the exact things we just talked about (AI not working, moderation not being sufficient), or they would immediately put these platforms out of business. Like, tell me what you think of the Australia experiment.
Emily Bell: [largely inaudible] ... expecting countries to make determinations about what they want ... Europe is really where we should be focusing ... where do you come down?
Cathy O'Neil: I meant to say earlier, when I said that my framework wouldn't work for Facebook, what I really meant is that I wouldn't work for Facebook. If Facebook said, could you audit our algorithms and mitigate some of their harm, I'd be like: I just don't believe that's on their list, you know what I mean? But I would be happy to work for the Attorney General, if they could subpoena some data to display evidence of harm, because that is actually what I do. And I do think the critically important thing that Facebook hasn't done yet is be more generous with its data, and with the harms it has created.
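The audit work mentioned here, displaying evidence of harm from subpoenaed data, often reduces to simple disparity measurements. A minimal, hypothetical sketch (the records, group names, and comparison are all invented for illustration, not taken from the talk) might look like this:

```python
# Minimal disparate-exposure audit sketch (invented data).
# Given per-user records, measure how often each group was shown
# content flagged as harmful, then compare the rates across groups.

from collections import defaultdict

records = [
    # (group, was_shown_harmful_content)
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def exposure_rates(rows):
    """Return the fraction of records per group that saw harmful content."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for group, harmful in rows:
        total[group] += 1
        shown[group] += int(harmful)
    return {g: shown[g] / total[g] for g in total}

rates = exposure_rates(records)
# A ratio far from 1.0 is not proof of harm by itself, but it is the
# kind of quantitative evidence an audit would assemble for regulators.
disparity = rates["group_a"] / rates["group_b"]
print(rates, disparity)
```

In practice an audit would control for confounders and test significance; this sketch only shows the basic shape of turning platform data into evidence.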