SIGCOMM 2020
August 11, 2020, Online, New York, NY, USA
Video
A Conversation On Privacy

About the talk

A Conversation On Privacy

Speakers: Dave Choffnes (Northeastern University), Dave Levin (University of Maryland), Franzi Roesner (University of Washington), Stefan Savage (UC San Diego)

01:35 Dave Levin

08:56 What Is the Game Theory for Cyber Defenses

11:02 Characterizing Robocalls through Audio and Metadata Analysis

12:59 Pre-Installed Android Malware

15:41 How Do We Get Technical Audiences to Appreciate This Work

18:25 Intimate Partner Violence

25:07 IoT Lab

29:56 Location Privacy

32:08 Build Security into the Foundation

About speakers

David Choffnes
Associate Professor and Executive Director of the Cybersecurity and Privacy Institute at Northeastern University
Dave Levin
Assistant Professor at University of Maryland
Franziska Roesner
Associate Professor at University of Washington
Stefan Savage
Professor at UCSD

David Choffnes is the Executive Director of the Cybersecurity and Privacy Institute at Northeastern University. Since joining Northeastern in 2013, he has served as an associate professor, teaching Fundamentals of Computer Networks and Networks and Distributed Systems.


Dave received his PhD from UMD. Before returning to UMD as a professor, he was a researcher at Hewlett-Packard Labs in Palo Alto, CA. Dave's research interests include securing the web's public key infrastructure and defending against nation-state censorship.


Franziska (Franzi) Roesner is an associate professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, where she co-directs the Security and Privacy Research Lab. Her research focuses on computer security and privacy for end users of existing and emerging technologies. She is the recipient of an MIT Technology Review "Innovators Under 35" Award, an Emerging Leader Alumni Award from the University of Texas at Austin, a Google Security and Privacy Research Award, and an NSF CAREER Award. She received her PhD from the University of Washington in 2014 and her BS from UT Austin in 2008.


It looks like we're all here. Excellent. Welcome, everybody, to the security and privacy panel. Great to see everyone, and I'm excited for the discussion. Let me first take a moment to introduce our panelists, and thank you all for joining us today. I'll start with Franzi Roesner. Franzi is an associate professor in the school of computer science and engineering at the University of Washington, and also adjunct in the Information School. She works on many different exciting topics, including those at the intersection

of security and augmented reality, as well as some interesting user-focused topics, which I think our attendees are very excited to hear about. So thanks for joining us. Next we have Dave Choffnes, who is an associate professor in the Khoury College of Computer Sciences at Northeastern and a member of the Cybersecurity and Privacy Institute. Dave works primarily in distributed systems and networking, and he's done a lot of really cool work lately on privacy and the Internet of Things.

And so thanks for joining us. Next, I'll introduce Dave Levin, who is a professor at the University of Maryland, College Park, where he works on a number of topics, including, I think most recently, a lot of work on internet censorship and designing various tools to measure and circumvent internet censorship. And finally, I'll turn to Stefan Savage, who is a professor in the computer science department at the University of California, San Diego,

and he works on just about anything having to do with security; if it relates to security, he has probably worked on it. So we're excited to hear from everybody today, and thanks for taking the time, everyone. Okay, great. Without further ado, I think we can kick right into the questions. For those of you following along at home, which I think is all of you, there is a Slack channel for the SIGCOMM 2020 privacy panel

where I will monitor questions if you'd like to post things there and upvote them. I understand there's also a Zoom Q&A, and I see there's an invitation to the industry-sponsored session; don't go to that, this is way cooler. But I'll follow the Q&A there too. So let's kick it off in the Slack channel with a couple of starter questions, which I shared with our panelists earlier. Why don't we go around and take the first one first. I asked two questions: what is the most important problem

or area in security and privacy that you're not working on, but others are? And the second question I posed was: what's your favorite security or privacy paper of the last n years? I'd like to hear n equals one, because it's easy to go back and pull seminal papers, but I think it'd be interesting and fun to hear what people are reading lately. And feel free to chime in and go back and forth with each other. So, Franzi, do you want to kick it off?

Sure. I'm super curious to hear everybody else's answers. On the first one, which areas of security and privacy, which topics and problems, are important but I'm not working on them? The two that immediately came to mind for me: first, topics at the intersection of security and AI and machine learning. So things related to adversarial machine learning, like tricking machine learning models with adversarial inputs, but also, looking more broadly, thinking about

fairness in machine learning and explainability. I think some of those things are a little bit further from what you might traditionally think of as security, but as a security person, everything looks like a security question to me: basically anything where there's the opportunity for things to work in unexpected ways, differently from how you designed them. So that's one huge topic, but that's one that came to mind. The other is around side channels, especially microarchitectural side channels, things like Spectre. So,

places where our layers of abstraction break down and there are unexpected information leaks or other things like that. I think there's a lot of really interesting work going on in that space right now. I will cheat with one sentence and mention two areas that I am working in, because I think they are also important. One is around online misinformation and disinformation, and the other is around human factors in security more generally, but specifically thinking about vulnerable populations and marginalized communities and how to understand and support

them. So I hope other people find those exciting too. I want to hear more about the ones you're working on as well; the things you mentioned at the end are two areas that I am not working on but am super excited about, so thank you for your work on that. I do want to talk about something that's maybe a little bit less technical than what we tend to think of as our focus. We can move the needle on privacy or security, but ultimately the weak link is carbon-based: it's the people in the loop here. So I'm really interested in that

aspect of things. It's an area that I haven't worked in, particularly how you do outreach and education, and not just education for adults: the kind of education we need about privacy and security, and how to think about those topics, needs to happen much earlier. So I'm interested in the education side of things, but I just don't understand how we take all of our technical insights and then work on closing the loop on that side. That's super exciting. Dave?

Hey, yeah, thanks again for having me as part of this panel. In regards to AI, I'm less interested in empowering AI and more in putting guardrails on it. There are a lot of things coming out that concern me quite a bit; it's a very powerful set of techniques, but a lot of these things I see as being particularly powerful and empowering for oppressive regimes. I think we really have to understand what some of the fundamental limitations are, and what we can do to put guardrails on this, to keep that

from happening. What are the threats we see coming down the pike, and what do we do about them? Right now it's just hard to understand what the capabilities are. Thanks. Stefan? So first off, I think all of the topics everyone's mentioned are fascinating. I'm going to take a slightly different tack, because I think one of the great things about our community right now is that all of the things that have been discussed are being worked on; it's kind of like a

renaissance in security work: there's just so much great work going on, so it's a struggle to come up with things that are not being worked on. But I think one place where our community does have a little bit of a blind spot is in looking at big-picture questions. Two examples of that I want to call out. One is: what is the game theory for cyber defenses? We are always coming up with this fix and that fix, and then the attacker gets around it, and they get around it by making their attack more sophisticated, and so it's kind of like the

antibiotic problem in the large, where what we end up doing is training a better and better attacker through a kind of Darwinian process, and we never step back and ask: where does this lead? Does this get us to a better place or not? And I think the other place that comes up is privacy, where we spend a lot of time on technology and measurement around the margins, but not a lot of time looking at the fundamental drivers, and that's the lack of an alternative economic model to advertising. And absent that, you know,

there's a consent loophole that is so big that everything drives through it. But no one is asking whether there is a way we can make the incentives not all be focused on extracting our information from us, and that's because it's hard. But this is a community that I think is well suited to look at it. Fantastic. Lots of good fodder for follow-up questions and discussion here; this is great. Before we dive in on those, and I see there

is already a question in the Q&A, which we can take eventually as well, let's hear from each of you one cool paper from the last year. Franzi, you're up; you're the security PC chair. I have multiple examples, but I'll try to stick with one. Okay, so one from USENIX Security that's being presented this week, which I think is really cool, is called, let me find the actual title of it, "Who's Calling? Characterizing Robocalls through Audio and Metadata Analysis."

This is basically a paper looking at robocalls through measurement studies: they set up a honeypot of 50,000 phone numbers over a year and analyzed the robocalls that they got. I think it's a really nice example of measurement of a real-life phenomenon that many people are interested in and affected by. So that's one from our program this week that I'd recommend. Excellent, and we can mix up the order just for fun, right? Let's see, let's go to Stefan. Sure. One of the

more fun papers I've read in a while is the 2020 CHI paper by Ben Zhao and company on wearable microphone jamming, because it had all these cute little gadgets, there was an actual artifact, and it provides a way for individuals to assert independence from all of these sensors around them. But I want to plug one other paper really quickly, because people almost never plug SoKs: there is a really nice SoK paper at Security and Privacy on cyber

insurance, an area where very few people have done the deep dive to understand how it came about and how it works. It's a really nice treatment of all of the previous work, so you can read this paper and actually feel like you understand what the open questions in cyber insurance are, and I think we don't have enough papers like that. Thanks, Stefan. I'll just call out "Dave" and see which one comes next. I thought that meant you were taking it. So,

it's been a long year since March, but I'm going to shout out one of my colleagues' papers on pre-installed Android malware, because it just hurts to see some of the things that make their way through the supply chain, all the crazy things that get installed. It's bad enough, all the stuff we find in stock apps you can download, but then there's what people are stuck with in the software that comes with their phone. I thought that was really interesting. And also, just to call out the other Dave's Geneva work, which is near and dear to my heart,

because it's squarely in the topic of net neutrality and censorship, something that my group has been working on for a while, but it takes it to this really nice, elegant solution that continues to deliver; even this week, there was that announcement about China moving to block ESNI, and I thought that stuff is really cool. So my favorite paper of last year was from a buddy of mine, Aaron Schulman, at USENIX Security last year, 2019, "Please Pay

Inside," where they showed that just by monitoring Bluetooth, and with a bunch of extra clever tricks, you can detect credit card skimmers in gas station pumps. And what's really important about this, aside from some of the stories afterward, like the ride-alongs with the Secret Service, which is really good stuff, is that I think one of the most concerning threats is attackers physically altering our critical infrastructure. The defenses that we used in the past just don't apply when they can physically alter your hardware. And so I think this

paper is really important: it shows a path forward for doing something like that. Cool, well, thanks, everyone; you've definitely helped me populate my research group's reading list for the next month or so. Excellent. We have a few questions, both in the Slack and in the Zoom Q&A. Let's see. I think Ellen Zegura has a question here that has been upvoted a couple of times, and it actually relates a little bit to Stefan's mention

of bigger-picture issues, like misinformation, privacy, outreach, etc. She asked, and then refined, the question: how do we get our community to welcome, appreciate, and evaluate socio-technical work, such as work on privacy in underserved communities? Privacy is an interesting area because it sits inherently at the boundary of technology and people: policy, users, etc. And Dave, you in particular talked about outreach to various communities. So anyone who wants to chime in, please pick that question

up. I'll pick up the hand-off: one of the best ways to appreciate underserved communities is to understand what they're going through. I think it's really easy to assume that our security challenges are the same as anybody else's security challenges, and, Franzi, some of your work has totally opened my eyes to the totally different security challenges that other people face. So I think doing that kind of work is super important, to see that not everyone is experiencing the same thing. Yeah, I would say that's been really eye-opening for me too, just doing the work.

So my research background: I started off much more on the technical end than where I am now; my undergrad research was in computer architecture, and over the last few years I have started more and more talking to people, because I've had this realization that it's not enough, that we can't just sit here and think about what the important problems are and say, oh, you know, journalists probably have security needs, let's build some things for journalists. You can't do that in isolation. And so I've had a

variety of work looking at actually understanding different communities, and, you know, it's challenging, especially as a technical researcher, to say: okay, we talked to these people, and now we understand their threat models and their needs, and so now we know what to build. That last step is not usually so straightforward. Sometimes what you learn is that there's nothing you can build to solve these problems, that there are some really fundamental tensions in the needs and constraints people have. So we've done work talking to journalists, for

instance, or survivors of human trafficking. And this connects to the question of what's one of my favorite papers in the last two years: something I also had on my list but didn't mention is some really great work coming out of Cornell Tech on intimate partner violence, covering both the technical side and the human side of that. I think that's another really good example of this type of work, which at this point is about understanding the implications of the technologies we're building and the actual needs and constraints of these

populations; without understanding the people, we can guess, but we might not be building great things. I just want to highlight something that Franzi said, because I think it's really spot on, which is that the challenge these problems have with a general technical audience is that they're messy: they don't have crisp abstractions, they don't have crisp answers, and they require one to appreciate a very different set of more qualitative aspects that are not part

and parcel of the way we do most computer science research. And so I think the community of people who appreciate that has grown tremendously over the last decade, but I think it's deeply related to the same reason why we have, for example, more attack papers than anything else: they're the easiest ones to do. They're really crisp; you did it, it's done. Everything else, you kind of have to convince people, and this is at the extreme side of that point, where it's like, okay, trust me, I have the problem right, you've just got to

believe me. Ellen's question also asked about evaluation, given, of course, the messiness of the domain; and of course there are qualitative evaluation methods of all kinds. Franzi, I think you might have a good perspective on this as a recent program committee chair who has seen some of these papers appear at a security conference. So how does that type of work typically get evaluated? I assume by experts on the program committee who know these methods, but

can you speak a little bit more to the evaluation of that kind of work? What makes a good security and privacy paper that touches on those kinds of topics? Yeah, I think it's a good question, and there are multiple parts there. One, maybe to directly answer Ellen's question about how to get technical audiences to appreciate and value this work: I think part of it is recognizing that these are fundamentally interdisciplinary questions, and so we have to use methods from systems and security but also from human-computer

interaction and even sociology, and so we need to bring those experts onto the program committee. I think that's a shift that, at least, I've seen over the last decade or so. When I started grad school, I think there were more human-facing papers that you wouldn't necessarily submit to some of the main computer security conferences, whereas they are now much more common and accepted as a core part of this whole problem and solution space. And I think part of that has been a shift in the makeup of the program committees, toward people

who know how to evaluate this work and do this type of work themselves. I think there's also a perception that, because it's so fuzzy and messy, doing qualitative work or work with users is not rigorous. That's definitely something I thought before I had done some of that work myself and really understood the different qualitative methodologies; like, how could you talk to 15 people and know anything from that? It doesn't seem very scientific. But there are well-established qualitative methods and

other types of strategies for doing that work and evaluating it, and there are many, many people who are experts at evaluating it and doing it. So to get into that type of work, there is also a learning curve around the language and the evaluation of this type of work. Thanks, and I'll also point out that there are quite a few nice HCI and qualitative methods resources. I think all of our panelists could probably chime in on this topic some more, but

I think we have some other questions. There were a couple in the Slack channel that I think I can lump together, related to IoT and smart devices and their security and privacy, so let me take a couple of these and sort of tie them into Stefan's big-vision question as well. Sagar asks: how are you going to address the IoT security versus privacy debate? And I read this to mean: on the one hand, we want the devices that we attach to the network to be end-to-end

encrypted; on the other hand, as Stefan points out, there is a substantial incentive for the manufacturers, the service providers, and so on to collect data about us when we release these devices into our homes. So there is a tension there that I think the question is getting at. And then I'm going to lump this with Rahul Sharma's question, which says there has been a lot of surveying looking at how various devices spy on consumers;

which ones, in your view, need more attention? I think a couple of our panelists could certainly talk about that, but let me wrap it all together: how do we resolve it? You could even tie back to Stefan's big-picture questions if we wanted to. I can start. This is my IoT lab behind me, as you might expect. So yes, that is the lab we've had access to for a while now, and everything you see in the picture is internet-connected, which is crazy, right? Because it doesn't need to be. And

increasingly, I think we're going to see that every device that didn't previously have a connection, but has power, is going to be able to connect one way or another. And the reason, as I think these questions are alluding to, is that there are opportunities to monetize the data that comes out of it; it's not necessarily for functionality. And it's a really hard space, because increasingly we are seeing devices adopt things like strong encryption that make it very difficult for us to man-in-the-middle the network traffic, which brings this tension between security

and privacy, at least in one interpretation of it. As security people, we of course want these devices to prevent others from seeing the traffic; on the other hand, these companies may be using encryption as a way to hide what they're doing from observers. And so this is something that we struggle with all the time when we're analyzing the network traffic coming out of our lab. So one approach is that there's a lot you can infer from encrypted network traffic.

For example, we knew that the Ring doorbell was recording video without lighting up or alerting anyone that this was happening, because you can see in the network traffic that it's obvious when something is recording video; there are things you can tell without even decrypting. Another thing that we think about is whether there might be ways to get access to devices that would allow researchers to do security analysis and privacy analysis

of the kind where you can, say, man-in-the-middle them, in a way that is not given to the general public. These would be very controlled, sort of like how Apple has a program for security researchers where they vet you and then send you a device that you have root on, so you can do this kind of analysis. That ability would never be made accessible on any consumer's device, but I think striking a balance, taking examples like this, along with ways to test devices that were not meant for testing purposes, to make sure they're behaving the same way, can be another way that we

can move forward in this direction. But ultimately, I think this is going to be a problem that keeps coming up over and over again, and, for better or worse, I think we have to combine cases where you have some ground truth, and as much information as you can get, with controlled experiments, which is a lot of how we do our analysis in our labs. So that's just my take on that, and I'd love to hear from the others.
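To make the traffic-volume inference Dave describes concrete, here is a minimal, hypothetical Python sketch, not something presented in the panel: it flags periods when an encrypted device's upstream byte counts far exceed its idle baseline, the kind of signal that revealed the doorbell recording without any decryption. The function name, the threshold heuristic, and the per-second byte-count input are illustrative assumptions, not a published method.

```python
# Illustrative sketch (not from the panel): inferring when an encrypted IoT
# device is likely streaming or recording by looking only at traffic volume.
# Assumes per-second upstream byte counts for the device, e.g. exported from
# a packet capture; `detect_activity` and its parameters are hypothetical.

from statistics import median

def detect_activity(bytes_per_second, idle_multiplier=10, min_duration=3):
    """Return (start, end) second-index ranges where upstream volume is far
    above the device's idle baseline, suggesting audio/video upload."""
    baseline = median(bytes_per_second) or 1   # typical idle chatter
    threshold = baseline * idle_multiplier     # "much busier than idle"
    active, ranges, start = False, [], 0
    for t, b in enumerate(bytes_per_second):
        if b > threshold and not active:
            active, start = True, t
        elif b <= threshold and active:
            active = False
            if t - start >= min_duration:      # ignore short blips
                ranges.append((start, t))
    if active and len(bytes_per_second) - start >= min_duration:
        ranges.append((start, len(bytes_per_second)))
    return ranges

# Example: mostly idle traffic with a ~6-second upload burst starting at t=10.
counts = [200] * 10 + [90000] * 6 + [210] * 8
print(detect_activity(counts))   # -> [(10, 16)]
```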

Thanks, Dave; that was very thorough. So I'll chime in; I'm going to be the pessimist. I think we are screwed for the foreseeable future, because the cost structure of IoT is too low to support significant investments in security and privacy, except for the largest brands, for whom it represents reputational damage. Until it becomes commoditized, and people are suffering as a result of violations of, you know, the integrity of their homes, it's going to be very difficult to move the needle. We'll find wins

here and there, but what people are going to produce is going to be something with low time-to-market and low cost, and that's been true of consumer electronics forever. And it's the classic problem of how you add security, either before market or aftermarket, for devices where the bill of materials cost is forty bucks; it's almost impossible from an economic standpoint. The one thing that I do think, and this is a little bit of a tangent from strictly IoT, but one place where I think there is

not enough attention being focused is on location privacy around carried devices, because I think there is not enough appreciation of just how much truly fine-grained location information is being distributed. I would say, and I suspect I'm not overstating this, that for pretty much everyone in the audience, their location is being reported multiple times each day to people they have no idea are getting it, and that, I think, is a little scary and something that we should spend more time looking at. Those were great points. One thing you just

reminded me of is that it's important not to think only in technical terms, like how do we improve the security, how do we block, you know, a privacy problem; there's a really important policy angle here, and that's something we're starting to do. I've started working with Woody Hartzog, who is in the law school and focuses on privacy. And I think one thing we can do as researchers is take what we find on the technical side and talk to lawyers, talk to policy people, and understand that this all has to exist

in the marketplace, and there need to be financial incentives if anything is going to change. One way you get those incentives is penalties when there are violations of existing laws; you can look at the Illinois biometrics law, which has been really effective of late. And then there's an opportunity, I think, to take our findings on security and privacy problems and convince lawmakers to pass new laws that try to foreclose these things from a policy perspective. I'm not trying to say that you suddenly solve security or privacy problems if

there's a law against it, but at least then, maybe combined with enforcement, you have a way to push the industry toward making changes. I think those are really great points across the board, and that would be the right long-term goal. In the meantime, there's this common refrain in security, which is that you have to build security into the foundation, as a day-one concern, and we know it's kind of a lie: to assume that you have all of the possible security considerations, that you know all

the potential threat models, deployment considerations, and potential attacks ahead of time is unreasonable. We're always adding on security after the fact, one way or the other; firewalls are a great example of it. So I think it is actually important to try to get security in early, but when it isn't there, I think we also need some kind of principled, systematic way of adding security after the fact. It would be really great if people building new features treated security as a day-one concern, but imagine all they had to do was

put in some kind of measures along the way; that, I think, is really important for where we can start. Thank you, a lot of really good insights there. So I think we are pretty much over time, and we did start over time, so I'm not going to take full blame for that, but I am respecting everyone's time. I think we'll just close it out with one final question, and maybe tie a few things together from this last discussion on IoT security and privacy. Stefan was talking about location privacy, and one of the things that

he said struck a chord with me: a lot of people have no idea who is getting this kind of data. I agree. I work on this stuff, and then, in talking to social scientists across campus, they say things like, oh, we have this location data set we're using to do contact tracing and track mobility through parks. And I'm like, huh, you have that? But I guess my question is to turn things around a little bit, which is to say that some of the recipients of this data are us, in the end.

It's the researchers. We typically think of privacy in terms of advertising and surveillance and things like that, but we are also consumers of this data. So how should we be thinking about that? I mean, this is not a hypothetical: I think we all, in our collaborations, have the ability to walk across the campus or across the hallway and get one of these data sets, and in some cases we gathered them ourselves. So as researchers, how do we use these data sets

responsibly? Did I stump everyone? As a result of your work, did you make things worse for people or better for people? I think, certainly, for the mere act of using one of these data sets, we are such a small fraction, a pimple on a fraction of a percent of the use of the data, that I don't think our use or non-use is what is driving the privacy risks that people have. And then the question is the individual harms that you cause, and a lot of that is about,

you know, what are you going to look at, and what are you going to say about what you learned afterwards? And I think there are reasonable controls that one can put in place in that space. Now, it depends also on how the data was gathered; there are some things that just should be verboten. I think, Nick, your question is a really good one, and I think there's also a broader question in there, which is that there is a big tension here: this data is useful for things, and not

just for researchers but for companies; there are also features and things that we want, and that we need data to be able to have. So there's kind of a fundamental tension. I think one of the important things about living in the world and making progress in all sorts of directions, even if your research area is security and privacy and that's where some of your core values are, is recognizing that it is not going to be effective to just say no, no, no, we can't do anything. I don't have a silver bullet answer for every situation, but I think

that there's some value in at least recognizing that there's a tension there, and that collecting data and using data are different things that you might need to think about differently. And maybe, coming back to Dave's earlier point about the role of policy, and tying together some of the themes of the panel, there are lots of pieces here, like the human part: what do people understand and expect? We need to think at a high level about all of those things and how they fit together. Thanks. Well,

that, I think, ties everything together nicely. Great to see everyone virtually, and thank you so much for your time. This was a really fun discussion. Thanks for the recommendations and pointers for reading; Stefan's big-picture discussion was great too. So we'll let you get back to wherever you may be. I'm in my office today for the first time in a very long time because our power went out, so I'm socially distanced. But yeah, thanks for joining us today. I think Dave Levin is the session chair for the next panel, so we'll just get to it. The rest of us

are bailing out, but, Dave, I think you're not done with me yet.
