Jennifer is a software engineer on the Chrome team, focusing on media experience. She earned a bachelor's degree in computer science from the University of Washington.
Zoe is a software engineer on Google's Chrome WebM team and has been a key contributor to the open-source, royalty-free video codec AV1 from the Alliance for Open Media. Zoe received her PhD from Purdue University and her ME and BE from Tsinghua University, Beijing.
About the talk
This talk will show you how to build stunning media experiences on the web, offering practical advice on using Shaka Player to build best-in-class experiences; taking media offline; and working with autoplay, Media Capabilities, Picture-in-Picture, and Media Sessions. It'll also dive into AV1, a state-of-the-art video codec launched this year by the Alliance for Open Media that saves 30% bandwidth over VP9 and is open source and royalty free for anyone to use.
Good morning. Wow, interactive crowd today — great. Thank you. You don't even need a mic. That's incredible. Welcome to the talk on web media. My name is John Pallett. I'm a product manager on the Chrome media team, and we're going to cover several topics today. For my part, I'm going to talk about a few updates in the ecosystem as well as API updates and changes in Chrome. After that, Jennifer will go into a really important topic for web media consumption, which is multitasking: as more and more viewers use the computer not just to consume media but also to do other things at the same time, this becomes a really important topic. And finally, Zoe will give an update on AV1 video compression. I don't want to steal her thunder, but it's pretty amazing stuff, and we'll cover more details later.

If you attended I/O last year, you heard us speak about how this is the decade that has really transformed consumption of media, particularly in the video space, and the web has been a big part of that. Every day, 30,000 hours of video is consumed in Chrome — and that's just video; it doesn't even include audio. There's a massive amount of media consumption across a very wide range of sites, and the ways that users consume that media keep changing, which drives new APIs that we on the Chrome team need to add and support.

For example, Progressive Web Apps have been an interesting development for media companies, because they provide a way to reach users who have limited storage, without requiring an app install. Instagram doubled the retention of its web users when it launched its Progressive Web App last year. Another important case that's become relevant over the last few years on the web is offline media consumption. As more and more people get access to mobile devices, they want to watch or listen to media even in places where internet access isn't available. This is really important in countries like India, and sites like Voot.com are adding support for offline playback using some of the new web APIs. In this case, a user can select a video to download, choose the quality level they want to pull down to their device, and then watch it offline. This is all starting to come together, partly because of Progressive Web Apps and partly because of the underlying media APIs that make it possible.

We covered a lot of this last year at I/O, and for those of you who couldn't make it, the core message was that the mobile web is really ready for media — you can do great things with media playback on mobile. There's a representative sample of what's available at the link above, and I'd encourage you to check it out.
But for this talk, we're going to cover some different things — let's talk about desktop. I want to start with a recent change that addresses something users have been complaining about for years, and that is autoplay. When a user first goes to a site and it unexpectedly plays audio, that is not good. In fact, we can track this, and we know that a significant percentage of media that starts playing when a user first visits a site, without them interacting with the site, gets shut down within six seconds — shut down by pausing, by closing the tab, by muting, or something like that.

We recently launched a new policy in Chrome to address autoplay. The policy applies when a user first arrives at a site. It does not take effect after they've interacted with it: once they've clicked on the site or tapped on the site, the autoplay policy is no longer applied. When a user first arrives, muted autoplay is always allowed. So if you're wondering, "OK, what can I safely do?", the answer is that you can do things that don't play audio — if you have a video element, you can play it muted. There are a few cases where you can also play audio when a user first arrives. Of course, after a click or a tap it's fine — you can play audio. But even when a user first arrives, there are a couple of situations where you can play audio. The first is if your app is a Progressive Web App on mobile and the user launches it from the home screen; in that case, when they launch the Progressive Web App, autoplay with sound is allowed. The second case, on desktop, is that Chrome is learning which sites users go to, and it looks and says: wait a minute — if a user goes to this site and plays media with sound most of the time they visit, after a click or a tap, they probably want media with sound on that site. So over time, as a user browses, Chrome identifies those sites and allows autoplay on a site where the media is usually played with sound. This is an interesting case where Chrome has the ability to personalize the experience for the user.

In fact, we launched this a few weeks ago, and we get a really strong signal about autoplays that are not wanted — we know that the ones that get shut down quickly are the ones users don't want — and we looked at the results after the launch. What we saw was that about half of those unwanted autoplays get blocked now. That's not a hundred percent, partly because Chrome is learning and it doesn't always get things right. But it's a massive user benefit that we just cut out half of unwanted autoplays when users first go to sites.
For you, the developer, what this means is that you cannot always expect that audio will just work if you start playing it when the user arrives at a site. When a user lands on your page, you will not know whether autoplay with sound is going to work until after a user gesture — after a click or a tap, you're fine, but when the user first arrives, you won't know. If you have a video or an audio element, the way to detect this is to look at the result of the play promise. You can tell from that result whether or not autoplay started, and if it didn't — if you tried to autoplay something with sound and it was blocked — you know that you can then try to play without sound.

If you're hosting embedded videos — videos in cross-origin iframes — and you want them to have the ability to autoplay in situations where they're allowed to, you'll want to add a new attribute to the iframe: the allow="autoplay" attribute. This does not mean that embedded videos can always autoplay. What it means is that if your site gets the ability to autoplay, then the iframe will get the ability to autoplay too. If you do not include this attribute, any embedded videos in that cross-origin iframe will not be allowed to autoplay — even if they request it, the policy will prevent it. So if you're embedding videos and you want them to be able to autoplay with sound when Chrome allows it, make sure to use the allow="autoplay" attribute.
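Putting the play-promise guidance together, a minimal sketch of the detect-and-fall-back pattern might look like this. The element and helper names are illustrative; the rejected promise on blocked autoplay is the behavior the policy exposes:

```javascript
// Sketch of the play-promise pattern described above. If autoplay with
// sound is blocked, fall back to muted playback and surface an unmute
// control tied to a later click or tap.
function tryAutoplay(video) {
  const attempt = video.play();
  if (attempt === undefined) return; // very old browsers return no promise
  return attempt.catch(() => {
    // Autoplay with sound was blocked: retry muted.
    video.muted = true;
    return video.play();
  });
}
```

On a real page you would call `tryAutoplay(document.querySelector('video'))` when the user lands, then show an unmute button once muted playback starts.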
Finally, I want to talk about Web Audio. What we're looking at here is an example from a Web Audio site, and it's reasonably typical: it creates the AudioContext on load — that's the top half of the code. The assumption here is that audio will always work when the user first arrives at the site, and as I said, that's no longer a valid assumption. The good news is that what happens under the hood is something you can react to with only a small change. If you start an AudioContext and you don't know whether or not it's running, all you need to do is wait for a user interaction on the site — a click or a tap — and then call resume() on the AudioContext. At that point, audio will continue to work. One other thing you can do — I don't have the code sample up here — is check the state of the AudioContext to determine whether or not it was blocked by the autoplay policy. Check the state: if it's "running", then audio is playing; if it's "suspended", it is not. That gives you an opportunity to do things like update your UI controls to reflect whether or not audio is actually playing.

I want to pause for a second to say that our team is working very hard to improve things for both users and developers, and as part of that, I want to acknowledge that we could have done a better job of communicating this policy change to developers using the Web Audio API.
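The resume-on-gesture pattern just described can be sketched as follows. `unlockAudio` is an illustrative helper name, and the gesture target is passed in so the sketch stays self-contained:

```javascript
// Sketch: resume a suspended AudioContext on the first user gesture.
// `context` would be your existing AudioContext; `target` is normally
// `document` (the default here), injectable for testing.
function unlockAudio(context, target = document) {
  if (context.state !== 'suspended') return; // already running, nothing to do
  const resume = () => {
    context.resume();
    // One-shot: stop listening once audio is unlocked.
    target.removeEventListener('click', resume);
    target.removeEventListener('touchend', resume);
  };
  target.addEventListener('click', resume);
  target.addEventListener('touchend', resume);
}
```

You can also read `context.state` at any time to keep your UI controls in sync, as described above.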
If you have any questions on this particular change, we are available at the Web Audio office hours at 12:30, and in addition we have media office hours today at 6:30 and tomorrow at 1:30. If you're not here at I/O — if you're watching at home, or if you want more information about the policy changes — the link at the bottom of this slide has everything I just talked about. It also has more information that you will want as you're debugging your website, including instructions to completely disable or completely enable the autoplay policy, so you can try out the different user interactions and experiences. If you don't have this link or you didn't get a picture before I move to the next slide, you can also go to developers.google.com and search for autoplay.

Let's talk about what we recommend the user experience should look like. The ideal situation, from our perspective, is that when users arrive and you're not allowed to autoplay with sound, you mute the media and play it to drive user engagement. Then, when you get a user gesture, ideally give the user a control where they can unmute the media and turn on the sound. Tapping or clicking on that control counts as a gesture under the autoplay policy, and it will allow you to start playing with sound. Another alternative is to move to a click-to-play model: capture a user gesture and then start playing with sound. This is common — we're seeing it on a number of video sites — and it will also work under the current policy.

That's it for autoplay, so now let's switch to Jennifer, who is going to talk about something completely different, which is multitasking.

Thanks, John. Hi, my name is Jennifer, and I'm a software engineer on the Chrome team. We've always been multitasking while consuming media.
This could be anything from watching a movie while responding to emails, to listening to live sports commentary while cooking. In fact, some of you on the livestream may be tuning in while doing something else. As we've started to spend more time on our devices, multitasking has evolved: the screen that we're using may also be the media screen, or at least the one used to control it. Last year, we introduced the Media Session API, which allows you to show metadata and images on lock screens and on wearable devices. This experience is great for users because they'll know what's playing and they'll also be able to control it.

Let's take a quick refresher on how we can use it. First, we'll check to see if the device or browser supports Media Session, and if so, we'll use it to make things appear. We simply need to provide the metadata — the title, artist, album, and artwork. That's all you need to customize the media notifications for your web app. We can also set up action handlers to respond to the controls: seek backward, seek forward, play, pause, next track, and previous track. And we can set the playback state: if you're serving custom controls, you can also reflect the media session state outside of the lock screen or the notification.
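A minimal sketch of the Media Session setup just described follows. The metadata values are illustrative, and `session` and `MetadataCtor` stand in for `navigator.mediaSession` and the `MediaMetadata` constructor so the helper degrades gracefully where the API is missing:

```javascript
// Sketch of the Media Session recap above: feature-detect, set metadata,
// then register action handlers for the notification / lock-screen controls.
function setupMediaSession(session, MetadataCtor, handlers) {
  if (!session || !MetadataCtor) return false; // Media Session not supported
  session.metadata = new MetadataCtor({
    title: 'Episode 12',
    artist: 'Example Artist',
    album: 'Example Album',
    artwork: [{ src: '/art-512.png', sizes: '512x512', type: 'image/png' }],
  });
  for (const [action, handler] of Object.entries(handlers)) {
    session.setActionHandler(action, handler);
  }
  return true;
}
```

On a page this would be called as `setupMediaSession(navigator.mediaSession, MediaMetadata, { play: onPlay, pause: onPause, seekbackward: onSeekBack, /* ... */ })`.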
Now, what about the desktop experience? Personally, I spend the majority of my time on the computer, and while consuming media I like to keep busy — maybe messaging friends or reading the news. I find that whenever I want to watch a video on the computer, I'll resize a window to be really small and then move it to a corner of the screen, so it doesn't get in the way of everything else I want to do. At the same time, I struggle with the number of times I end up covering that window with one of the many other windows I have open. Sound familiar?
I am excited to share that we're bringing Picture-in-Picture to Chrome. The floating window will appear on top of other windows, allowing you to consume media while continuing to interact with the other content on your device. You'll also be able to interact with the window itself, with play/pause controls and by resizing and moving the window to your liking. You may remember that last year we launched Picture-in-Picture on mobile with Android Oreo, using a native API. Today, we're building a web API that will allow developers to create and control the experience around Picture-in-Picture. This will be available for Chrome on desktop and on Android.

Let's take a look at how we can use this. First, we'll need two things: the video, and something for the user to interact with — today we'll use a button. To start us off, let's enter Picture-in-Picture. We'll begin by adding an event listener to our button, and when the button is clicked, our video will request to enter Picture-in-Picture mode. But why stop there? Let's make this a full toggle for entering and exiting Picture-in-Picture. We'll first check whether the document's pictureInPictureElement is our video, and if it isn't, we'll send a request to enter Picture-in-Picture mode. Otherwise, we'll exit, which means that the video will appear back in the original tab.

The requestPictureInPicture() function returns a promise, and if our video fails to enter Picture-in-Picture mode for any reason, we should let the user know — timely user feedback is a great way of letting them know what's going on. We can also listen for two events, enterpictureinpicture and leavepictureinpicture. By setting event handlers, we can define the experience for users, such as showing other content on the page more prominently. This could be anything from showing a catalog of videos that the user could watch later, to surfacing a live stream chat. Finally, in cases where Picture-in-Picture is not supported, we can hide the button so the user doesn't try to interact with it.

Picture-in-Picture is still under development. You can check it out today in Chrome Canary with these flags enabled: the first one enables access to the web API, while the next two enable the feature itself.
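The toggle just described can be sketched as below. `doc` and `video` stand in for `document` and your video element, and since the feature is still under development, the API shape follows the early Chrome implementation and may change:

```javascript
// Sketch of the Picture-in-Picture toggle described above.
async function togglePictureInPicture(doc, video, onError) {
  try {
    if (doc.pictureInPictureElement !== video) {
      await video.requestPictureInPicture(); // open the floating window
    } else {
      await doc.exitPictureInPicture(); // video returns to the original tab
    }
  } catch (err) {
    onError(err); // surface feedback so the user knows what happened
  }
}
```

On a real page you would wire this to a button's click listener, and hide the button entirely when the document does not expose the API, as recommended above.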
You can also find out more about the API and its implementation status at these links.

So far we've been talking about a single screen, but say we have multiple screens. Imagine you were presenting in a meeting and you wanted speaker notes, or you wanted to share photos and videos with a large group of people, but you only had a small laptop. The Presentation API makes displays such as monitors and TVs available to the web. Previously, this was only available for remote devices such as Chromecast, but today we also account for wired displays such as HDMI monitors. With the Presentation API, we can build experiences where users see and control local content while presenting something else on the other screen — in other words, we won't just be mirroring our screen to the other monitor.

Let me walk you through how to use the Presentation API to present a web page on a secondary screen. There are two components needed to get this to work: the controller and the receiver. The controller is the page that we'll use to interact with on our local, primary screen, while the receiver content will be displayed on the other screen. Today we'll call these components controller.html and receiver.html.

We'll start by putting together the controller page. First, we'll create a PresentationRequest object, which will contain the URL that we want to show on our secondary screen. This could also be an absolute URL, or we could pass in a list of them, but today we'll just use receiver.html. With our PresentationRequest, we can get the availability of the displays — the second screens — and handle them however we want. We should also continue to handle display availability changes by adding an event listener for the change event.
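The controller setup just described can be sketched as follows. `PresentationRequestCtor` stands in for the browser's `PresentationRequest` constructor, and `receiver.html` is the illustrative receiver page from the walkthrough:

```javascript
// Sketch: create a PresentationRequest for the receiver page, read the
// initial availability of secondary displays, and keep listening for
// availability changes (displays connect and disconnect over time).
function watchDisplayAvailability(PresentationRequestCtor, onChange) {
  const request = new PresentationRequestCtor(['receiver.html']);
  return request.getAvailability().then((availability) => {
    onChange(availability.value); // initial availability
    availability.onchange = () => onChange(availability.value);
    return request;
  });
}
```

In a page this would be `watchDisplayAvailability(PresentationRequest, updatePresentButton)`, using the availability signal to enable or disable the present button.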
This might happen when monitors are disconnected, or when we move to a network without devices. To show the presentation display prompt, we require a user gesture, such as a click on a button. With that in mind, let's start a presentation request when our user clicks on our button and wait for the promise to resolve. Once the user selects a display, this promise will return a PresentationConnection, and when that happens, the page we passed in earlier — receiver.html — will be presented on the other display.

Once we have this active connection, we can do two things: we can either close it, or we can terminate it. Closing the connection allows us to reconnect later, while terminating the connection means that it's gone for good. So how do we reconnect to a presentation? We do this by simply calling reconnect and passing in the presentation ID of the previous PresentationConnection object. There are two options for how we might want to handle this: we can either automatically reconnect when the controller navigates, or we can reconnect when the user clicks on a button. We can also register event handlers for the presentation connection events: connect, message, close, and terminate.

With that, we have the components ready for our controller to work. Next, we want the controller and the receiver to interact with each other. On the receiver side, let's first retrieve all the existing connections and listen for any incoming ones; we'll keep track of these as they come in. Every time we add a connection, we'll add event listeners, such as for the message and close events.

So what does this look like put together? Our team built a sample web app called Photowall, which showcases the Presentation API.
We use the controller to select photos that we want to add to a slideshow, and we use the receiver to present it. To get us started, the controller uses a PresentationRequest and retrieves the list of receivers to create a presentation connection. Once we're connected, the controller prompts us to add images to the slideshow — and what's more interesting than searching for photos of dogs? Here, picking a photo adds it to the queue of images to share on the slideshow.

And how do we get that information to the receiver? Calling connection.send() will send a message between the controller and the receiver, and vice versa. This message could be a string, a Blob, an ArrayBuffer, or an ArrayBuffer view. In our example, the controller sends the description — the name of the dog — as well as the image source, and the receiver, upon receiving this message, handles it by updating the slideshow to include the image. So what happens when we send Peanut, Nougat, and Elsa to the receiver? We'll get something like this.

That's all you really need to get started with the Presentation API. It's already available in Chrome, so be sure to check it out.
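The messaging pattern from the Photowall example can be sketched like this. The payload shape (a JSON string carrying a name and image source) is illustrative; a Blob or ArrayBuffer would also work, as noted above:

```javascript
// Sketch: controller sends a slide description over the connection, and
// the receiver reacts in its message handler.
function sendSlide(connection, name, src) {
  connection.send(JSON.stringify({ name, src })); // controller -> receiver
}

function listenForSlides(connection, updateSlideshow) {
  connection.onmessage = (event) => {
    const { name, src } = JSON.parse(event.data);
    updateSlideshow(name, src); // e.g. append the image to the slideshow
  };
}
```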
Between Media Session, the Presentation API, and Picture-in-Picture, we believe these are all great additions to the web media story. You can check out all of the APIs at these links, and we always love feedback — let us know what you think. Now, back to John.

I think I heard somebody whoop for Picture-in-Picture. I didn't know you were putting my face up there — thanks for that. Anyway, the Presentation API example showed how Chrome can work across multiple displays: if you have multiple displays, Chrome can use the Presentation API with them. There's another capability in Chrome for doing things with other displays, and that's casting. Chrome supports mirroring the media you're playing — or your tab, or your desktop — to Cast devices such as Chromecast. Traditionally, you're rendering the tab, then you're capturing it, then you're re-encoding it, and then you're sending it over to the remote device. If you're playing media, that's an awful lot of work: you capture the media, re-encode it, and send it over to a device which then has to decode it again.

There has been an important change here, which just landed in Chrome's tab mirroring: if the user goes full screen with a video, Chrome will send the compressed video directly to the device, and that eliminates most of the steps I just mentioned. There's no need to decode, no need to capture, no need to encode. Chrome is still in the middle — it's sending the video frames directly to the device without modifying them in any way. From the user's perspective, they'll still have playback control in Chrome like before, but instead of the video playing on their local display, they'll see it on the remote one.
This is good in most cases, but there are some cases where you won't want Chrome to remote a full-screen video. For example, if you're doing 360-degree spherical video, what you're likely doing is taking a canvas and rendering with WebGL on top of it — a piece of geometry with the video mapped onto it — and the video element is hidden behind the canvas. Some of you might have a full-screen video element that contains the entire video, which is effectively a texture map, and you probably don't want that video to be sent to the television. If you see this on your site, it's very easy to prevent: there's an attribute you can add to any video element, called disableRemotePlayback, and Chrome will respect it. So if you don't want media remoting to work on a video element, this attribute will turn it off.

Why are we doing this? Well, because it's a significant optimization, and it happens automatically. If you look at the middle metric, you can see that CPU consumption goes down dramatically, because we're skipping a decode and a compression step. If you look at the top two metrics, especially the top one, you can see that playback smoothness increases significantly, and freezing gets better as well.
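The opt-out just described is a one-line change per video element. A sketch, for a hidden video that feeds a WebGL canvas (`video` stands in for your media element):

```javascript
// Sketch: opt a video element out of Chrome's media remoting.
function optOutOfRemoting(video) {
  // Script form of the attribute; Chrome will skip remoting for this element.
  video.disableRemotePlayback = true;
  // Equivalent markup form: <video disableremoteplayback src="pano.mp4">
  video.setAttribute('disableremoteplayback', '');
  return video;
}
```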
The bottom two metrics show that the video quality itself gets better, because we're not decoding and re-encoding the video. The less Chrome touches the video, the better — it lets you send the original compressed video directly to the device. Of course, that original compressed video is only going to be good if your settings are good and the algorithm you're using for compression is good, which is a really good segue for me to pass it over to Zoe, who is going to give updates on AV1.

Thank you, John. So today I'm going to talk about the new video codec, AV1.
Video has been one of the most significant experiences on the web, and that has been manifested by the extensive bandwidth usage of its users. Compression is the technology behind the scenes that powers this growth. So why do codecs matter? Here's a concrete example: a five-minute high-definition video with a size of about 300 megabytes. That is what we typically observe, and it is a compressed version; the uncompressed version is as large as 25 gigabytes — almost a hundred times larger. For 4K video, it can be even ten times larger still. Compression is therefore a critical technology: without it, video services over the internet would hardly be possible at all.

Back in 2013, Google rolled out its previous video codec, named VP9. Compared to the state-of-the-art codec of the time, H.264, VP9 achieved a 30% compression gain. The chart here shows YouTube watch time with VP9 compared to that of H.264 at the time, as you can see from this slide.
In this chart, the darker the color, the longer the watch time of VP9 as opposed to H.264. The reason for the longer watch time that VP9 enables is twofold: the first is faster loading time, so playback starts more promptly, and the second is higher-quality delivery. So that's a sizable benefit VP9 has brought to users: they watch for longer.

The pace of video codec development has not slowed down. The most recent video codec has been developed by the so-called Alliance for Open Media, AOM for short, and the name of its next-generation codec is AV1. AOM, as you can see from this slide, includes top media and technology companies from almost every aspect of the industry — web browser vendors, content providers, and hardware makers. For AV1, we wanted to team up to build a codec for the web, even though there is considerable overlap with the TV industry, security video, and other industries pursuing the same kinds of targets.

AOM's first goal is to have an open, royalty-free codec. This is significant for an industry that has been troubled by patents. Another goal for AOM is real deployment: the target is to be supported by all browsers, on all platforms, and in the practical encoder space — the codec licensing landscape can be pretty complicated, and that is one of the major areas we have had to address. To deliver the value of AV1, we have developed over a hundred compression technologies for it. Today, I'm only going to talk about two of them.
Let's have a little bit of a deeper dive into the following two technologies. The first one we call wedge prediction. Here's an example: a frame from a video clip, and here's the next one. What's true of these two frames? You can see there's a lot of similarity between them, and the major task of a video codec is to remove that similarity between frames. Probably one of the most commonly used tools is called block copy: the codec copies a block from a reference frame to the current one to achieve a compression gain. However, if you look closely at the shape of the block, its boundary is not well aligned with the boundary of the object. If we introduce a wedge shape, which can be better aligned with the boundary of the object in the frame, there is a lot of potential to attain a further compression gain. Therefore, here comes the prediction tool in AV1 that we refer to as wedge-based prediction. For the first time, a video codec — AV1 — goes beyond square-based and rectangular-based block shapes and allows wedge-shaped partitions for prediction. Because of the better alignment with the boundary of the object, as we just showed, an additional compression gain is attained.

The next tool I'm going to talk about is warped motion. Let's look at an example.
Here is a frame from a video clip, and here is its location in the next frame. So how do we describe the motion in order to obtain compression? The traditional tool is to describe the motion individually for all of the blocks that constitute the object. That means a lot of information needs to be signaled. Is there another way? Somehow, we could signal only one motion — say, for the block located at the top-left boundary of the object — and then, if there were a way to derive the motion model for all the remaining blocks of this object from the surrounding blocks within a causal window, there would be a lot of potential for us to attain better compression.

Here comes another tool introduced in AV1, named local warped motion. AV1 uses a mathematical model here; as an example, we warp first horizontally and then vertically. Most of the time, the motion of a block can be derived from the motion of its neighboring blocks within a causal window.
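As a hedged sketch of the kind of model being described (the exact AV1 parameterization lives in the specification), a block-level affine warp maps each pixel $(x, y)$ of the current block to a reference location $(x', y')$:

```latex
\begin{pmatrix} x' \\ y' \end{pmatrix}
=
\begin{pmatrix} a_1 & a_2 \\ a_3 & a_4 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
+
\begin{pmatrix} b_1 \\ b_2 \end{pmatrix}
```

The warp can be applied as a horizontal shear followed by a vertical shear, matching the "first horizontally and then vertically" ordering mentioned above, with the parameters estimated from the motion of neighboring blocks in the causal window rather than signaled per block.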
In this way, a more effective and efficient compression technique is attained, so a better coding gain can be achieved. Summing up all of the technology we have introduced in AV1, here is the result. Comparing AV1 with its predecessor VP9, there is generally another 30% performance improvement from AV1, for both video-on-demand content and live streaming. This is one presentation of those measurements, comparing AV1 with VP9, as obtained by Google.

An independent evaluation has been conducted by Facebook, mainly on top of the user-generated content uploaded to their website; the experiments and results are shown on this slide. Furthermore, you can observe that with increasing resolution, a larger advantage for AV1 can be seen, and it is naturally expected that AV1 will have a larger gain at larger resolutions. Another well-known evaluation team is at Moscow State University, which indeed rolls out an annual comparison report covering almost all of the top video codecs they can collect. They are known for very extensive comparisons, considering different categories such as encoder speed, quality metrics, some of the rate-control strategies, and the trade-offs in the performance of the codecs. This year, they included AV1 in the comparison, and they reported results showing the advantage of AV1.
Next, I'm going to show a demo comparing AV1 with VP9, encoding the same video clip to achieve the same quality level. The left half is encoded by VP9 and the right is from AV1, and the resulting compression is shown on the slide. As you can see, AV1 delivers a lower bitrate than the one that results from VP9.

I'm also glad to say that, as of today, we have already selected all of the coding tools, we have frozen our bitstream, and we have released our spec. The next steps can be expected: Chrome will support AV1 later this year, and hardware support is expected in about two years, around 2020. If you are interested in AV1, you're welcome to visit our website, shown on this slide. From there you can download the source code as well as access the detailed spec. Because we are an open-source community, you're welcome to evaluate as well as to contribute.

Now, back to John.
open source Community. You're welcome to evaluate as well as to contribute. Next time you headed back to drive. Thanks Joy. Everyone is pretty fantastic. If we're really excited about the bitrate savings and the storage savings that is going to offer it where else would really excited for the start rolling out. But at the beginning it won't be available everywhere. And in fact, if you look at even vp9 and h.264 there varying levels of support for kodak's where some devices to Hardware decoding other ones to software decoding and it could affect how
smoothly and power-efficient the video decoding can be a selfish at the playback will be fortunately there's a new API available that will let you detect this ahead of time and then it wouldn't let you do is ask the browser not just with the decoding is supported but how performant it will be. Let's see how it works. So in this case, what we're going to do is create a media configuration and we're going to test Opus audio at stereo and vp9 video at 1080p configuration is will pass it in to the media capabilities API and ask for information about the decode.
You can see that you get three values back. The first is supported, which answers "can I decode this?" It's similar to canPlayType, but it gives you a much stronger signal about whether the codec is actually supported. The second and third, smooth and powerEfficient, are brand new, and they're really exciting. They tell you, for example, whether playback is going to be smooth: can the GPU or the CPU keep up with the resolution and the frame rate that you're going to be providing? This API is really new, and it can make a big difference. How much of a difference? Such a big difference that YouTube ran an experiment with the API, and what they found was that by surgically adjusting the experience for just a few users, the improvement was so big that it affected the overall average across all users. Let me explain what I mean. The experiment improved mean time between rebuffers by 7% and improved the rebuffer rate by 10%. Those are big aggregate average changes, but it only lowered the average video height by 4 pixels. What that means is that it wasn't affecting a whole lot of users; it was only the folks who could actually use the change, and for them it improved the experience a great deal.
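One hypothetical way a player could act on those three booleans, in the spirit of that experiment, is to probe a ladder of candidate resolutions and pick the highest one the browser says it can handle smoothly. This sketch is not code from the talk; the ladder values and helper names are assumptions, and in a browser you would pass `(c) => navigator.mediaCapabilities.decodingInfo(c)` as the query function:

```javascript
// Candidate VP9 video configurations, highest quality first (values assumed).
const ladder = [
  { width: 3840, height: 2160, bitrate: 12000000, framerate: 30 },
  { width: 1920, height: 1080, bitrate: 2500000, framerate: 30 },
  { width: 1280, height: 720, bitrate: 1200000, framerate: 30 },
];

// Return the first (highest) rung whose decode is supported, smooth, and
// power-efficient; fall back to the lowest rung if none qualifies.
async function pickResolution(queryDecodingInfo) {
  for (const rung of ladder) {
    const info = await queryDecodingInfo({
      type: 'media-source',
      video: { contentType: 'video/webm; codecs="vp09.00.10.08"', ...rung },
    });
    if (info.supported && info.smooth && info.powerEfficient) {
      return rung;
    }
  }
  return ladder[ladder.length - 1];
}
```

The point of capping like this is exactly the YouTube result above: only the users whose devices can't keep up get a lower rung, so the average quality barely moves while rebuffering drops.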
This is worth looking at right now, and it's going to become more and more important as AV1 becomes available. So, we've covered a lot of things, and some of you may be wondering: that's a lot, how can I get access to all of it easily? There are a lot of answers to that; I want to cover two. The first is the open-source Shaka Player and Shaka Packager, which offer an easy way for developers to adopt these best practices. We are constantly updating Shaka, and if you're using it, you're going to get a lot of these new features automatically, for free, over time. A few recent improvements to Shaka Player in version 2.3 include support for HLS live video and on-the-fly transmuxing from transport streams to fragmented MP4. Before this summer, the team is planning to launch a player user-interface layer, which will make it a lot easier to deploy the player. On the Shaka Packager side, version 2.0 added support for WebVTT subtitles, and marking content as ready for ad insertion is also coming in version 2.2.
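As a rough sketch of what adopting Shaka Player looks like (the buffering numbers are placeholder assumptions, and the manifest URL would be whatever DASH or HLS manifest your packager produces; check the Shaka documentation for current options):

```javascript
// Illustrative player settings; the numbers are assumptions, not recommendations.
const playerConfig = {
  streaming: {
    bufferingGoal: 30,  // seconds of content to buffer ahead of playback
    rebufferingGoal: 2, // seconds required before resuming after a stall
  },
};

// Attach Shaka Player to a <video> element and load a DASH/HLS manifest.
async function initPlayer(videoElement, manifestUri) {
  shaka.polyfill.installAll(); // patch over small browser differences
  if (!shaka.Player.isBrowserSupported()) {
    throw new Error('This browser is not supported by Shaka Player');
  }
  const player = new shaka.Player(videoElement);
  player.configure(playerConfig);
  await player.load(manifestUri);
  return player;
}
```

Because the player handles manifest parsing, adaptive bitrate switching, and buffering internally, upgrading the Shaka dependency is how you pick up new features and API support over time.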
Both Shaka Player and Shaka Packager give you a way to deploy easily and to keep up with changes as they roll out and new APIs as they arrive. Similarly, the AMP team has been working to implement many of these best practices and is looking at some of these APIs as well, and these are taking effect in amp-video as well as other related video tags in Accelerated Mobile Pages. On the user-interface front, rotate-to-fullscreen is already done, and they're working on in-page docking, which will be coming soon. So again, this is another way that you can get a really great video playback experience without having to write
everything yourself. For both Shaka and AMP, the goal is to make it easy, so they're both worth looking at. More generally, as I mentioned before, this really is the decade of video, and as consumption patterns change and evolve, it does mean there are API changes and new things to learn, but it also means that you're going to have more meaningful experiences for users, and we'll see watch time continue to increase, whether it's picture-in-picture or offline playback or cross-device playback. And it's even more exciting when you layer video compression advances on top of that, where you don't need as much bitrate to get great quality, or you can send higher-quality video at the same bitrate, and you also don't need as much storage on the device if you're doing offline video. We are really excited about what's coming. It's a very exciting time to be in media, not just as things become higher quality, but as some of these new use cases evolve and people start consuming media in new and exciting ways. So with that, thank you very much for your time. As I mentioned before, we have Web Audio office hours tomorrow at 12:30, and media office hours today at 6:30 and tomorrow at