About the talk
What if technology could understand people in the same way that we understand one another? In this inspiring session in conjunction with the paperback launch of her popular memoir, Girl Decoded, Affectiva co-founder and CEO Rana el Kaliouby, Ph.D. discusses her groundbreaking mission to humanize technology with Emotion AI — technology that analyzes human expressions and reactions in context. She describes how its applications in automotive, advertising, mental health and autism research feed her mission to transform our relationship with technology, and by extension, with one another. Rana also explores how these applications will facilitate the shift towards an empathy economy: the ability for companies to truly connect on an emotional level with their clients and embody empathetic leadership to create an unparalleled sense of purpose.
SXSW dedicates itself to helping creative people achieve their goals. Founded in 1987 in Austin, Texas, SXSW is best known for its conference and festivals that celebrate the convergence of the interactive, film, and music industries. An essential destination for global professionals, this year’s online event features sessions, showcases, screenings, exhibitions, professional development and a variety of networking opportunities. For more information, please visit sxsw.com.
Egyptian-American computer scientist and entrepreneur in the field of expression recognition research and technology development, a subset of facial recognition designed to identify the emotions expressed by the face. El Kaliouby's research moved beyond the field's dependence on exaggerated or caricatured expressions modeled by laboratory actors to focus on the subtle glances found in real situations. She is the co-founder, with Rosalind Picard, and CEO of Affectiva.
Hi everyone. I'm Rana el Kaliouby, co-founder and CEO of Affectiva. We are an MIT spin-out on a mission to humanize technology. I'm also the author of the book Girl Decoded: A Scientist's Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology. Over the past 20 years, I've been on this mission to build artificial emotional intelligence: essentially, marrying IQ and EQ in technology. AI is becoming mainstream. It's taking on roles that were traditionally done by humans, everything from being your personal assistant, to maybe hiring your next co-worker, making decisions on your behalf, assisting with your healthcare, or driving your car. All of these roles that used to be done by humans are now being taken on by AI, and we're finding that there needs to be almost a revisit of the social contract that we have between humans and machines, between us and technology and how we interact with it. We're already seeing some of the repercussions of that social contract not being in place, or not being well thought out. I start my book by talking about an empathy crisis: how
I believe that because of technology, or not only because of technology, but because of how technology is designed and deployed at scale, it is unfortunately very polarizing, and we have obviously seen that manifest in all sorts of ways over the past number of years. But there's also a trust crisis. There's really very little trust in how organizations are building AI and deploying AI at scale, and there's a question of whether this technology is being developed in a way that is just and fair and accountable.
So all of these questions come into play as we think about humanizing technology. In particular, over the last year, as this pandemic hit, it's become really clear that technology is, of course, front and center in how we connect and communicate with one another, whether it's with your friends or with your colleagues. As an organization, how do we keep people motivated and loyal and engaged while we're all working virtually? Some form of this is going to stay post-pandemic. But also, how do you connect and communicate with your broader stakeholders, be it your customers or society at large? All of this is giving rise to what I'm calling the empathy economy. It's essentially an economy that puts value on organizations that value empathy, that practice empathy, that implement empathy. In a way, empathy becomes a superpower when organizations leverage technology in a way that fosters it. And this is kind of new, right? Because when we think about technology and AI, it's all about automation, productivity, and
efficiency. Nobody's really thinking about how to bring in AI in a way that fosters, or even enhances, human connection, and this is where my work comes in. So I started on this journey over 20 years ago. I am originally from Egypt, in the Middle East, and I grew up in that part of the world. Then I had an opportunity to move to Cambridge University to do my PhD in machine learning and computer vision. I got to Cambridge, and it was my first experience living away from home. I was very homesick, and I realized that the only portal of communication I had with my family back home was the computer. This was way back before smartphones, but it really hasn't changed: a lot of our communication is mediated through digital channels. Well, if you dissect how humans communicate, only about 10% of how we communicate is based on the choice of words we use. More than 90% is nonverbal, and it's split roughly equally between your facial expressions, gestures, and body language on one hand, and your vocal intonation on the other: how fast you are speaking, how much energy is in your voice. So that 90%, the nonverbal communication, is essentially lost
in cyberspace, because when we're digitally communicating with one another, unless we have our videos turned on, you miss all of that. And even when the videos are on, so much of it is missed. Case in point: I'm presenting to all of you right now. If we were together at South by Southwest in person, I would be able to gauge the level of engagement of all of you, and I would be able to adapt my message in real time. That's what great communicators do; that's what we all do, actually, subconsciously. But in a digital, virtual format it's really hard to tap into the audience and the energy of the audience, and we're starting to see a lot of interesting technology incorporated, like the ones we've built with Emotion AI: artificial intelligence that can pick up on interest. Are they smiling? Are they laughing when I crack a joke? Did everybody find it funny or not? It's so hard to tell in a virtual environment, and that has repercussions on how we connect one-on-one, but also on how we connect and engage as organizations. So again, this concept of an empathy economy ties into this idea of
leveraging AI, and human-perception AI in particular, to understand and take a very data-driven approach to quantifying emotion, and quantifying empathy, in ways that have not been done before. It is not only transforming human-machine interfaces, where there are obviously a lot of applications in conversational agents, but the power is really in reimagining what human-to-human connection can look like. And so I go back to my days at Cambridge when, you know, I was chatting away with my family back home and feeling really homesick and lonely, but this machine was completely oblivious to my emotional state. Worse yet, it really didn't capture the emotional responses I had when I was communicating with my family back home. And so that set me on a journey to capture these nonverbal signals, to build emotional intelligence into our machines. I focused on the face. As you can tell, I'm very expressive, and I wanted to capture all of these expressions. And so I had to combine a lot of my study of computer science and machine learning with emotion science and the science of facial expression. As it
turns out, our face is driven by about 45 different facial muscles that contract and move to create thousands of different expressions, social signals, and cognitive and mental states. And there is a system called the Facial Action Coding System, where you can go through a hundred hours or so of training to be able to quantify, or objectively score, these different facial movements. So what I did is I took that Facial Action Coding System and essentially automated it using machine learning, deep learning, and gobs and gobs of data. This was twenty years ago, so webcams were big and blurry and there wasn't enough compute power, but the proof of concept was there: I was able to demonstrate that we can essentially use camera sensors to capture this type of data. From then on, I found myself at MIT, at the Media Lab, exploring some of the early applications of this technology. One of the primary use cases of this technology was in autism. It's almost like an extreme example of how this technology can
really help with human-to-human connection. Individuals on the autism spectrum really struggle with understanding other people's nonverbal cues. It's super overwhelming, so they almost avoid that sensory overload altogether: they don't even look at a person's face or eyes, and so they miss out on this channel entirely. At MIT we proposed a project to the National Science Foundation, initially turned down, but we persevered, where we essentially pitched the idea of building a Google Glass-like device with a little tiny camera sensor that, in real time, reads the expressions of the people you're interacting with. It gives you real-time feedback about the person you're interacting with, in the form of a heads-up display, again like Google Glass, or auditory feedback. Think of it as a real-time coach for these kids on the spectrum. So we deployed this at a school for autistic kids in Providence, Rhode Island, and we were starting to see very powerful results. These kids were starting to engage and actually look at the person they were interacting with. We gamified it, so every time they were able to make eye contact, they would accrue points, and the kids loved that. We brought gaming elements into that interface and interaction. At the same time, being at the MIT Media Lab, twice a year we would host a lot of our industry partners and companies. It was a show-and-tell we actually called "demo or die," because it was an opportunity to showcase what we were working on, and the Fortune 500 companies always expressed interest in
deploying the technology in various ways. When the list got to about 20 or so companies, that provided the impetus for spinning Affectiva out of MIT, and it set us on this mission to humanize technology and bring this technology to transform a lot of industries. So I want to cluster the applications into a number of buckets under this broader umbrella of the empathy economy. The first is using this technology to better engage and understand your consumers. Some of the early use cases of the technology were in the media analytics and market research space, where we're currently deployed in 90 countries around the world and have measured responses from over 10 million people. The idea is, when you're on your phone watching a Netflix show or a video ad, we ask for your permission to turn the camera on. If you say yes, with your permission, we are able to capture your moment-by-moment responses. We don't care about who you are; we care about your response, so it's anonymous. We aggregate the responses of everybody who has given us permission, and then we're able to show a moment-by-moment aggregated curve of how people responded. Did they find it funny? Was it sentimental? Did it hit the right emotional chord? And then we tie this to actual consumer behavior and advertising KPIs: did you recommend this to a friend, did you share it with your social network? We find that the emotional journey of a piece of content correlates very highly with virality, or even purchase intent, or sometimes even actual purchase behavior, but also things like brand perception and brand loyalty. So it does drive these really important consumer behaviors that companies are interested in tracking. This was especially key during the pandemic, when companies really wanted to get it right in how they approached their consumers and what types of messages they used to engage them. We've done a lot of work with some of the top advertisers in the world to test their ad campaigns over the last year, and we found things like: being empathetic in your message is, of course, important, but it has to be authentic and true to your brand and the purpose of your brand. We saw a lot of superficial "we're all in this together," "we're here with you," "we care for you" kinds of messages, but they weren't really connected to the brand or what the brand or product stood for. The ads that tied that empathy to action, or tied it back to their brand elements, did a lot better. They just resonated a lot better with the consumer.
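The moment-by-moment aggregation described here can be sketched in a few lines. This is a minimal illustration, not Affectiva's actual pipeline: the viewer names and per-second smile scores are made up, and a real system would first extract those scores from video with a trained model.

```python
from statistics import fmean

def aggregate_curve(viewer_curves):
    """Average per-second expression scores (e.g. smile probability in [0, 1])
    across consenting viewers into one anonymous moment-by-moment curve."""
    if not viewer_curves:
        raise ValueError("need at least one viewer")
    length = len(viewer_curves[0])
    if any(len(c) != length for c in viewer_curves):
        raise ValueError("all curves must cover the same timeline")
    # Only the aggregate leaves this function; individual viewers stay anonymous.
    return [fmean(scores) for scores in zip(*viewer_curves)]

# Three hypothetical viewers watching a 5-second ad; scores spike at second 3,
# suggesting that is where the emotional payoff landed.
viewers = [
    [0.1, 0.2, 0.3, 0.9, 0.5],
    [0.0, 0.1, 0.4, 0.8, 0.6],
    [0.2, 0.3, 0.2, 1.0, 0.7],
]
curve = aggregate_curve(viewers)
peak_second = curve.index(max(curve))
```

An analyst could then line such a curve up against the ad's timeline, or correlate it with KPIs like share rate, which is the kind of analysis the talk describes at scale.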
So again, it's really hard to get at this by asking people questions, but by tapping into an audience's visceral, subconscious response, we're able to do it at scale. You don't have to go into a lab or a focus group; we can just leverage the fact that we're all sitting in front of our machines, for better or worse, all the time, engaging with content and with each other, and we're able to capture that data. That's kind of one bucket of applications, but you can expand it across the customer lifecycle. Obviously, customer experience is really important, and you can imagine how this is an area that is exploding, even if it's not an area where we've had a lot of traction yet. Imagine if, when you're experiencing a website or shopping online, we were able to capture moments of frustration, moments of confusion, and moments of interest. That is obviously very important data as you perfect that end-user experience. User testing is another area of really similar interest. There are other companies integrating this type of technology into call centers; Cogito is one example, helping call center agents manage burnout. It's obviously a very stressful job, with callers usually calling with high levels of frustration and anger. They integrate this kind of prosodic feature analysis of the voice into routing the call to the right call center agent, hopefully improving customer experience, but also the employee experience as well. As the CEO of a company, I really resonate with this idea of employee experience. Our team is globally dispersed, all around the world. We are all virtual for the most part, and it's really hard to get a sense of who is motivated, who is stressed, who is burning out. I want to make sure that everybody is staying mentally healthy, but it's so hard to do that when we're all remote. And again, you could imagine how this type of technology could help, with the right privacy considerations in place. Because I hear you: privacy is a big deal here, and I'll
cover that, because it's something we care deeply about. You can imagine how we could use this technology to capture things like mental health: levels of anxiety, depression, stress. We do know that there are facial and vocal biomarkers of these mental health conditions. If we're able to leverage the fact that we're spending so much time online to capture a person's baseline, then, when they start to deviate from it, you could make that data available to them, maybe to a small number of confidants, maybe to a clinician or doctor. Again, how and with whom you share the data is a key consideration, but the data itself can be extremely powerful. So mental health is a key use case for this technology. You can also bring empathy into the actual technology, into the interface. We do a lot of work in the automotive industry, where it's all about capturing, in the short term, driver states: is the driver tired, is the driver distracted, texting while driving, is the driver drowsy? We have four or five levels of drowsiness. We can actually detect microsleep, which is essentially being asleep at the wheel. That would be very dangerous, but it does happen on the road, and we're able to detect it in real time, all the way down to the early stages of drowsiness, so we, or the car, can intervene earlier on. So we're spending a lot of our time exploring the use cases around driver monitoring. But beyond that, in the world of semi-autonomous and fully autonomous vehicles, there's a lot of innovation and re-imagination of what a mobility experience can look like. You can think about personalizing content and personalizing the environment in the car depending on who's in it and what they're doing, and there's a lot of innovation going into that. I'm very excited; I kind of think of a car as just a robot on wheels. So you could expand or replicate everything we're doing for the car in your office, or your living room, or your kitchen, where you have social robots, or maybe a conversational interface like an Alexa or a Google Home, that could capture, again, all of this information and build it into the conversational
interface, so that it stops being just a transactional interface, which is where we're kind of stuck with a lot of these interfaces, and moves to the next level, where it's truly conversational and can persuade you to change your behavior. One reference here: you may have all seen the movie Her, where this conversational interface, or operating system I guess, Samantha, gets to know the owner really well. Because she knows him really well, she's able to persuade him and motivate him to change his behavior. He's depressed, and she gets him out of the house and kind of re-engaged with the world. So I really do think there is huge potential for these interfaces, be it Siri on your phone, or embodied agents, or maybe something in your vehicle. All of these interfaces, if they have emotional intelligence, will be able to persuade us and motivate us to change behavior: hopefully to be more productive, healthier, or more connected; you can pick the parameter that you really care about. But there's also potential in how we connect and communicate virtually with each other. I talked about the example of live-streaming and virtual events, where I find it really, really painful to be presenting and not seeing my audience. I just wish there were a way to capture people's responses. I don't have to see everybody's face; I recognize, again, there are privacy considerations. But could I visualize it in a real-time curve that just gives presenters a sense of how the message is landing with the audience? I feel like we're all craving that sense of a shared experience, and it's so hard to do when we're all in our own spaces, kind of distributed virtually. This has applications in virtual learning environments as well. You could imagine how this could be super powerful for educators and learners, to determine the level of engagement of the learners and their readiness to learn. Again, I imagine a lot of this hybrid world is going to stay post-pandemic. And then telehealth is another
area that's been accelerated because of the pandemic, and it has a lot of power and potential. Can we use the fact that you're interfacing with your doctor through a camera to capture all this data and quantify, for the first time ever, empathy? What does empathy really look like? Can we do A/B tests? For example, if the doctor is taking notes and looking down, and never actually looks up into the camera at the patient, we know that this correlates with a perception of less empathy, and we know that doctors who are perceived as less empathetic are more likely to get sued. So we now have an opportunity to quantify these things, the soft skills that are really key in our workplaces, in our personal and professional lives, and we haven't been able to do that before. And then, finally, I'll just end on the note that I truly believe this is going to become ubiquitous. It's going to be the de facto human-machine interface, and it will transform many industries. But I also fundamentally believe that, as an inventor in this field, in this category of artificial intelligence, I have a responsibility to ensure that it's done right. There are a lot of ethical and moral implications of this technology, in terms of how we develop it in a way that's not biased, but also in terms of how and where we deploy it. For example, we as a company do not do any work in surveillance or lie detection or security, for a number of reasons, but primarily because we do not believe it respects people's privacy. We have turned down millions of dollars of funding and potential revenue in that space because it does not align with our core
values. So I'll just wrap up. I believe there is huge potential in this technology for reimagining not only what the human-machine interface will look like in the future, and actually today, but also, more importantly, for reimagining what human-to-human connection and communication look like, and for upgrading the current ways we're interacting digitally in a way that brings us closer together as opposed to polarizing us. I really do think there's a way to do that, but we have to think about the ethics of all of this and the unintended consequences. So we're part of an organization called the Partnership on AI, which was started by the tech giants Google, Microsoft, Amazon, Facebook, and IBM, and they've partnered with the ACLU and other civil liberties organizations, but also with startups like Affectiva. We've just wrapped up a project where we went through all the applications of Emotion AI we could think of and tried to articulate the unintended consequences: what can go wrong here, and how can we guard against it? So I really do think that, I mean, regulation has a role, but as inventors and innovators and thought leaders and business professionals, we shouldn't just wait for legislation and regulation. We should be at the forefront of designing this in the right way and be good stewards of it. So thank you again, and I look forward to hearing from you. I always love hearing what people think, and maybe about some of the applications that you're interested in.
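To make the Facial Action Coding System idea from the talk concrete, here is a toy sketch. The action unit (AU) numbers follow the published FACS numbering, but the AU-to-expression table is a simplified, EMFACS-style assumption; in a real system like the one described, a trained model would infer AU activations from video rather than take them as input.

```python
# Map sets of detected facial action units (AUs) to prototypical expressions.
# A production pipeline would score AU intensities per video frame first;
# here we classify an already-detected set of active AUs.

PROTOTYPES = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
}

def classify(active_aus):
    """Return the expression whose AU prototype is fully present, else None."""
    active = set(active_aus)
    for prototype, label in PROTOTYPES.items():
        if prototype <= active:  # every AU in the prototype was detected
            return label
    return None
```

For example, a frame with AU6 and AU12 active would be labeled "happiness," while AU12 alone (a smile without the cheek raiser) would not match any prototype, echoing the talk's point about moving beyond exaggerated, easy-to-read expressions.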