Duration 1:46:32

Google Keynote

Sundar Pichai
Chief Executive Officer at Google
2018 Google I/O
May 8, 2018, Mountain View, USA

About the speaker

Sundar Pichai
Chief Executive Officer at Google

As Google's chief executive officer, Sundar Pichai is responsible for Google's product development and technology strategy, as well as the company's day-to-day operations. Sundar joined Google in 2004 and helped lead the development of Google Toolbar and Google Chrome, key consumer products which are now used by over a billion people. In 2014 he took over product, engineering, and research efforts for all of Google's products and platforms, including Search, Maps, Communications, Google Play, Commerce…


About the talk

Learn about the latest product and platform innovations at Google in a Keynote led by Sundar Pichai.

Transcript

Good morning. Welcome to Google I/O. It's a beautiful day, I think a bit warmer than last year. Hope you're all enjoying it. Thank you for joining us. I think we have over 7,000 people here today, and many, many people are watching on the livestream in many locations around the world. So thank you all for joining us today. We have a lot to cover. But before we get started, I had one important piece of business which I wanted to get out of the way. Toward the end of last year, it came to my attention that we had

a major bug in one of our core products. It turns out we got the cheese wrong in our burger emoji. Anyway, we went hard to work; I never knew so many people cared about where the cheese is. We fixed it in a hurry. The irony of the whole thing is that I'm a vegetarian in the first place. So we fixed it, and hopefully we got the cheese right. But as we were working on this, this other one came to my attention. I don't even want to tell you the explanation the team gave me as to why the foam is floating above the beer, but we restored the natural laws of

physics. So all is well and we can get back to business; we can talk about all the progress since last year's I/O. I'm sure all of you would agree it's been an extraordinary year on many fronts. We're at an important inflection point in computing, and it's exciting to be driving technology forward. It's made us even more reflective about our responsibilities. Expectations for technology vary greatly depending on where you are in the world,

or what opportunities are available to you. For someone like me who grew up without a phone, I can distinctly remember how gaining access to technology can make a difference in your life. We see this in the work we do around the world. You see it when someone gets access to a smartphone for the first time, and you can feel it in the huge demand for digital skills we see. That's why we've been so focused on bringing digital skills to communities around the world. So far we have trained over 25 million people, and we expect that

number to rise to over 60 million in the next five years. It's clear technology can be a positive force, but it's equally clear that we just can't be wide-eyed about the innovations technology creates. There are very real and important questions being raised about the impact of these advances and the role they will play in our lives. So we know the path ahead needs to be navigated carefully and deliberately, and we feel a deep sense of responsibility to get this right. That's the spirit with which we

are approaching our core mission: to make information more useful, accessible and beneficial to society. I've always felt we were fortunate as a company to have a timeless mission that feels as relevant today as when we started. We're very excited about how we can approach that mission with renewed vigor, thanks to the progress we're seeing in AI. AI is enabling us to do this in new ways, solving problems for our users around the world. Last year at Google I/O we announced Google AI, our teams and efforts to bring the

benefits of AI to everyone. We want this to work globally, so we are opening AI centers around the world. AI is going to impact many, many fields, and I want to give you a couple of examples today. Healthcare is one of the most important fields AI is going to transform. Last year we announced our work on diabetic retinopathy, a leading cause of blindness, and we used deep learning to help doctors diagnose it earlier. We've been running field trials since then at hospitals in India, and the field trials are

going really well. We are bringing expert diagnosis to places where trained doctors are scarce. It turned out, using the same retinal scans, there were things which humans didn't quite know to look for, but our AI systems offered more insight. The same eye scan, it turns out, holds information with which we can predict the five-year risk of you having an adverse cardiovascular event, a heart attack or stroke. To me, the interesting thing is that this is more than what doctors could find in these eye scans; the

machine learning systems offer newer insight. This could be the basis for a new, non-invasive way to assess cardiovascular risk. We just published the research, and we are going to be working to bring this to field trials with our partners. Another area where AI can help is in helping doctors predict medical events. Getting advance notice 24 to 48 hours before a patient is likely to get very sick makes a tremendous difference in the outcome.

So we've put our machine learning systems to work. They've been working with our partners using de-identified medical records, and it turns out that if you analyze 400,000 data points per patient, more than any single doctor could analyze, we can quantitatively predict the chance of readmission 24 to 48 hours earlier than traditional methods. It gives doctors more time to act. We are publishing a paper on this later today, and we are looking forward to partnering with hospitals and medical institutions. Another area where AI

can help is accessibility. We can make day-to-day use cases much easier for people. Let's take a common use case: you come back home at night and you turn your TV on, and it's not uncommon to see two or more people passionately talking over each other. Imagine if you are hearing impaired and you're relying on closed captioning to understand what's going on. This is how it looks to you. As you can see,

it's tough to make sense of what's going on. So we have a machine learning technology called Looking to Listen. It not only looks at audio cues, but combines them with visual cues to clearly disambiguate the two voices. Let's see how that can work, maybe in YouTube. [Demo plays.] You can see how we can put technology to work to make an important day-to-day use case profoundly better. The great thing about technology is that it's constantly evolving. In fact, we can even apply machine

learning to a 200-year-old technology, Morse code, and make an impact on someone's quality of life. Let's take a look. [Video] Hi, I am Tania. This is my voice. I use Morse code by inputting dots and dashes with switches mounted near my head. As a very young child, I used a communication word board. I used a head stick to point to the words. It was very unattractive, to say the least. Once Morse code was incorporated into my life, it was a feeling of pure liberation and freedom.

I think that is why I like skydiving so much; it is the same kind of feeling. Through skydiving I met Ken, the love of my life and partner in crime. It has always been very, very difficult to find Morse code devices to try, which is why Ken had to create my own. I have a voice and more independence in my daily life, but most people don't have a Ken. It is our hope that we can collaborate with the Gboard team to help people who want to have the freedom of using Morse code.

[Gboard team member] Working on Gboard, we discovered there are many pockets of population in the world who have never had access to a keyboard that works in their language. With Tania, we've built support in Gboard for Morse code. It's an input modality that allows you to type in Morse code and get text out, with predictions and suggestions. I think it's a beautiful example of how machine learning can assist someone in a way that a normal keyboard without artificial intelligence wouldn't be able to.

[Tania] I am very excited to continue on this journey. Many, many people will benefit from this, and that thrills me to no end. [Sundar] It's a very inspiring story. We're very, very excited to have Tania and Ken join us today. Tania and Ken are actually developers; they worked with our team to harness the power of predictive suggestions in Gboard in the context of Morse code. I'm really excited that Gboard with Morse code is available in beta later today. We're also improving our products with

AI every single day, and the keyboard is actually a great example of it: we serve our users over 8 billion auto-corrections each and every day. Another example of one of our core products which we are redesigning with AI is Gmail. We just launched a fresh new look for Gmail with our recent redesign; hope you're all enjoying using it. We are bringing another feature to Gmail; we call it Smart Compose. As the name suggests, we use machine learning to start suggesting phrases for you as you

type. All you need to do is hit tab and keep auto-completing. In this case it understands the subject is Taco Tuesday, so it suggests chips, salsa, guacamole. It takes care of mundane things like addresses so that you don't need to worry about them; you can actually focus on what you want to type. I've been loving using it. I've been sending a lot more emails to the company; not sure what the company thinks of it, but it's been great. We are rolling out Smart Compose to all our users this

month, and hope you enjoy using it as well. Another product which we built from the ground up using AI is Google Photos. It works amazingly well and at scale. One of the common things people do is what we call the photo viewer experience, where you're looking at one photo at a time. So that you understand the scale: there are over five billion photos viewed by our users each and every day. To help in those moments, we're bringing a new feature called Suggested Actions, essentially suggesting smart actions right in context for you to act on. Say, for example,

you went to a wedding and you're looking through those pictures. We understand your friend Lisa is in the picture, and we offer to share the three photos with Lisa; with one click, those photos can be sent to her. We've all been in the situation where everyone is trying to get the pictures from the person with the phone; I think we can make that better. Or, for example, if a photo from the same wedding is underexposed, our AI systems offer a suggestion to fix the brightness right there. One tap, and we can fix the brightness for you. Or if you took a picture of a document that you want to save for later,

we can recognize it, convert the document to PDF, and make it much easier for you to use later. We want to make all these simple cases delightful. By the way, AI can also deliver unexpected moments. For example, if you have this cute picture of your kid, we can make it better: we can drop the background color, pop the color, and make the kid even cuter. Or if you happen to have a very special memory, something in black and white, maybe of your mother and grandmother, we can recreate that moment in color and make it even more special.

All these features are going to be rolling out to Google Photos users in the next couple of months. The reason we are able to do this is that for a while we've been investing in the scale of our computational architecture. This is why last year we talked about our Tensor Processing Units, TPUs; these are special-purpose machine learning chips. These are driving all the product improvements you're seeing today, and we've made them available to our Cloud customers. Since last year we've been hard at work, and today I'm excited to announce our next generation: TPU 3.0.

These chips are so powerful that, for the first time, we've had to introduce liquid cooling in our data centers. We put these chips together in the form of giant pods. Each of these pods is now 8x more powerful than last year's, well over a hundred petaflops, and this is what allows us to develop better models, larger models, more accurate models, and helps us tackle even bigger problems. One of the biggest problems we are tackling with AI is the Google Assistant. Our vision for the perfect assistant is that it's naturally

conversational, there when you need it, so that you can get things done in the real world. And we are working to make it even better. We want the assistant to be something that's natural and comfortable to talk to, and to do that we need to start with the foundation of the Google Assistant: the voice. Today, that's how most users interact with the assistant. Our current voice is codenamed Holly. She was a real person; she spent months in our studio, and then we stitched those recordings together to create the voice. 18 months ago we announced a breakthrough

from our DeepMind team called WaveNet. Unlike current systems, WaveNet actually models the underlying raw audio to create a more natural voice. It's closer to how humans speak: the pitch, the pace, even all the pauses that convey meaning. We want to get all of that right. So we've been hard at work, and as of today we are adding six new voices to the Google Assistant. Let's have them say hello. [Assistant voices] Good morning, everyone. I'm your Google Assistant. Welcome to Shoreline Amphitheatre. We hope you'll enjoy Google I/O. Back

to you, Sundar. [Sundar] You know how hard it is to get the right accents, languages and dialects right globally; it's easier with this technology. We even started wondering who else we could get into the studio with an amazing voice. Take a look. [Video] Couscous: a type of North African semolina in granules made from crushed durum wheat. I want a puppy with sweet eyes and a fluffy tail who likes my haikus. Don't we all. Happy birthday to the person whose birthday it is. Happy birthday

to you. Yes, that's John Legend. He would probably tell you he doesn't want to brag, but he'll be the best assistant you ever had. Can you tell me where you live? You can find me on all kinds of devices: phones, Google Homes, and if I'm lucky, in your heart. [Sundar] So yes, John Legend's voice is coming to the assistant. Clearly, he didn't spend all his time in the studio answering every possible question you could ask, but WaveNet allowed us to shorten the studio time, and the model can actually capture the richness of his voice. His voice will be coming

later this year in certain contexts, so that you can get responses like this: [John Legend voice] Right now in Mountain View it's 65 with clear skies. Today it's predicted to be 75 degrees and sunny. At 10 a.m. you have an event called Google I/O keynote, then at 1 p.m. you have margaritas. Have a wonderful day. [Sundar] I'm looking forward to 1 p.m. So John's voice is coming later this year. We're really excited we can drive advances like this with AI. We are doing a lot more with the Google Assistant, and to talk to you a little bit more about it, let me invite Scott

onto the stage. [Video montage] Call Maddie. Okay, dialing now. Sounds good. Call my brother. Hey Google, call my brother. Yo Google... That was great, but we haven't made "Yo Google" work yet, so you have to say hey. Hey Google. Hey Google, play the next episode of The Crown on Netflix. OK Google, tell me... You have to say hey Google. Hey Google, find my phone. No, Google, lock the front door. I'm sure the engineers would love to update everything. What can I do for you?

[Scott Huffman] Two years ago we announced the Google Assistant right here at I/O. Today the assistant is available on over 500 million devices, including phones, speakers, headphones, TVs, watches and more. It's available in cars from more than 40 auto brands, and it works with over 5,000 connected home devices, from dishwashers to doorbells. People around the world are using it every single day. For example, we launched the assistant in India last year, and the response has been incredible: daily

usage there has tripled since the beginning of the year. By the end of this year the assistant will support 30 languages and be available in 80 countries. So we've made great progress, but we're just getting started. Today we're going to share with you some important ways the assistant is becoming more naturally conversational and visually assistive, in order to help you do more and get time back. As you heard from Sundar, new voices that you can choose from to make the Google Assistant your own are an important aspect of making a conversation with your assistant more

natural. But to be a great conversation partner, the assistant needs to deeply understand the social dynamics of conversation. For example, let's be honest, it gets a little annoying to say "Hey Google" every time I want to get my assistant's attention. This grandma, who you might have seen on YouTube, was definitely feeling that way. [Video] OK Google... OK Google, I want to know, what is the weather tomorrow? In Flagler Beach tomorrow there'll be showers, with a high of 65 and a low of 56. [Scott] Well, the assistant eventually worked for her, but it

shouldn't be so hard. Now you won't have to say "Hey Google" every time. Check this out. Hey Google, did the Warriors win? [Assistant] Yes, the Warriors won 118 to 92 last Sunday against the Pelicans. The Warriors' next game is today at 7:30 p.m., when they will be playing the Pelicans. [Assistant] Sure, I'll remind you when you get home. [Scott] Notice that I was able to have a natural back-and-forth conversation with my assistant without having to repeat "Hey Google" for every follow-up request. Even more helpful, my assistant could understand when I was talking to it versus when I was talking to all of you.

We call this Continued Conversation, and it's been a top feature request. You'll be able to turn it on in the coming weeks. All right, so let's say it's almost game time. Hey Google, turn on the Warriors game and start the popcorn maker. Can you dim the lights in the family room and in the kitchen? [Assistant] Sure, dimming the lights in the family room and in the kitchen. [Scott] Asking for several things at once is natural for people but hard for computers to understand; it's something we call Multiple Actions, and we're rolling out this capability in the assistant now. You might think, come on, it's easy:

just look for the "and." But it's not always so straightforward. In this example, even though it has the word "and," it's not two requests; it's actually just one. But here, even though it looks very similar, the assistant has to break it apart to form the two requests; in linguistics this is called coordination reduction. Now, multiple actions can get way more complicated, so just for fun: Hey Google, who was the governor of California when Kevin Durant was drafted, and what team drafted him? [Assistant] Arnold Schwarzenegger

was the governor in 2007. Kevin Durant was drafted by the Seattle SuperSonics. [Scott] All right. Next, we've also been working on improving the conversation with the Google Assistant for families. Last fall we launched our family experience for the Google Assistant, which provides family-friendly games, activities and stories. We continue to grow our library, and families have listened to over 130,000 hours of children's stories in the last two months alone. Now, as we continue to improve the

experience for families, a concern we heard from many parents, including people on the team with children, is: are kids learning to be bossy and demanding when they can just say "Hey Google" to ask for anything they need? It's not a simple area, but one thing we've been working on is something we call Pretty Please. Some of the parents on the team have been testing it out with their families. Take a look. [Video] Hey Google... OK Google, please tell me a story.

[Assistant] What a nice way to ask me. Tell me a story, please. Thanks for asking so nicely. Once upon a time there was a wacky walrus... Please help me with my homework. You're very polite. I know. [Scott] So the assistant understands and responds to positive conversation with polite reinforcement. We've been consulting with families and child development experts, and we plan to offer Pretty Please as an option for families later this year. So with new voices for your assistant, Continued Conversation, Multiple Actions and Pretty Please, AI

is helping us make big strides, so everyone can have a more natural conversation with their assistant. Next, to talk about bringing voice and visual assistance together, here's Lilian. [Lilian] Thanks, Scott, and good morning everyone. Over the last couple of years the assistant has been focused on the verbal conversation you can have with Google. Today we're going to unveil a new visual canvas for the Google Assistant across screens. This will bring the simplicity of voice together with a rich visual experience. I'm going to invite Maggie to come out, because we're going to be switching to a lot of live demos. We gave you an early look

at our new smart displays at CES in January. We're working with some of the best consumer electronics brands, and today I'm excited to announce that the first smart displays will go on sale in July. Today I'll show you some of the ways this new device can make your day easier by bringing the simplicity of voice together with the glanceability of a touchscreen. Let's switch over to the live demos now. This is one of the Lenovo smart displays. The ambient screen integrates with Google Photos and greets me with pictures of my kids; those are really my kids, and it's the best way to start

my day every morning. Because the device is controlled by voice, I can watch videos or live TV with just a simple command. This makes it so easy to enjoy my favorite shows while multitasking around the house. Hey Google, let's watch Jimmy Kimmel Live. [Assistant] Okay, playing Jimmy Kimmel Live on YouTube TV. [Clip plays.] With YouTube TV you will be able to watch all of these amazing shows, from local news to live sports and much more, and they will be available on smart displays. Now, of course, you can also

enjoy all the normal content from YouTube, including how-to videos, music and original shows like the brand-new series Cobra Kai, which we started binge-watching this week because it's so good. Cooking is another instance where the blend of voice and visuals is incredibly useful. Nick and I are always looking for simple, family-friendly recipes. Hey Google, show me recipes for pizza bombs. [Assistant] Sure, here are some recipes. So we can choose the first one to start cooking.

Seeing a video demonstration along with the spoken instructions is a total game-changer for cooking, especially when you have your hands full. Thanks, Maggie. So we showed you a couple of ways that smart displays can make life at home easier, but there are so many more: from staying in touch with family with broadcast and Duo video calling, to keeping an eye on your home with all of our smart home partners, to seeing in advance what the morning commute looks like with Google Maps. We're thoughtfully integrating the best of

Google, and working with developers and partners all around the world, to bring voice and visuals together in a completely new way for the home. Now, inspired by the smart display experiences, we've also been working to reimagine the assistant experience on the screen that's with us all the time: our mobile phones. So I'm going to give you a sneak peek into how the assistant on the phone is becoming more immersive, interactive and proactive. We're going to switch to another live demo. Hey Google, tell me about Camila Cabello. [Assistant] According to

Wikipedia, Karla Camila Cabello Estrabao is an American singer and songwriter. [Presenter] Turn down the heat. [Assistant] Sure, cooling the living room down. [Presenter] For smart home requests, what you can see here is that we're bringing the controls right to your fingertips. And here's one of my favorites. Hey Google, order my usual from Starbucks. [Starbucks] Hello, welcome back to Starbucks. That's one tall nonfat latte with caramel drizzle. Anything else? Pick it up at the usual place? Yes. Okay, your order's in. See you soon.

I'm excited to share that we've been working with Starbucks, Dunkin' Donuts, DoorDash, Domino's and many other partners on a new food pickup and delivery experience for the Google Assistant. We have already started rolling some of these out, with many more partners coming soon. Now, rich and interactive responses to my requests are really helpful, but my ideal assistant should also be able to help in a proactive way. So when I'm in the assistant now and swipe up, I get a visual snapshot of my day. I see helpful suggestions based on the time, my

location and even my recent interactions with the assistant. I also have my reminders, packages and even notes and lists organized and accessible right here. I love the convenience of having all these details helpfully curated and so easy to get to. This new visual experience for the phone is thoughtfully designed with AI at the core. It will launch on Android this summer and on iOS later this year. Now, sometimes the assistant can actually be more helpful by having a lower visual profile, like when you're in the car, when you should stay

focused on driving. So let's say I'm heading home from work. I have Google Maps showing me the fastest route during rush-hour traffic. Hey Google, send Nick my ETA and play some hip hop. [Assistant] Okay, letting Nick know you're 20 minutes away, and check out this hip-hop music station on YouTube. [Presenter] It's so convenient to share my ETA with my husband with just a simple voice command. I'm excited to share that the assistant will come to navigation in Google Maps this summer. So across smart displays,

phones and Maps, this gives you a sense of how we're making the Google Assistant more visually assistive, sensing when to respond with voice and when to show a more immersive, interactive experience. And with that, I'll turn it back to Sundar. Thank you. [Sundar] Thanks, Lilian. It's great to see the progress with the assistant. As I said earlier, our vision for the assistant is to help you get things done. It turns out a big part of getting things done is making a phone call. You may want to get an oil change scheduled, maybe call a plumber in the

middle of the week, or even schedule a haircut appointment. We are working hard to help users through those moments. We want to connect users to businesses in a good way. Businesses actually rely a lot on this, but even in the US, 60% of small businesses don't have an online booking system set up. We think AI can help with this problem. So let's go back to this example. Let's say you want to ask Google to make you a haircut appointment on Tuesday between 10 and noon. What happens is the Google Assistant makes the

call seamlessly in the background for you. So what you're going to hear is the Google Assistant actually calling a real salon to schedule an appointment for you. Let's listen. [Recorded call] Hi, I'm calling to book a women's haircut for a client. I'm looking for something on May 3rd. Sure, give me one second. Sure, what time are you looking for around? At 12 p.m. We do not have a 12 p.m. available; the closest we have to that is a 1:15. Do you have anything between 10 a.m. and 12 p.m.? Depending on what service she would like, what service is she looking for? Just a women's haircut for now. We have a 10 a.m. 10 a.m. is fine.

Okay, what's her first name? The first name is Lisa. Okay, so I will see Lisa at 10 on May 3rd. Thanks, have a great day. Bye. [Sundar] That was a real call you just heard. The amazing thing is that the assistant can actually understand the nuances of conversation. We've been working on this technology for many years; it's called Google Duplex, and it brings together all our investments over the years in natural language understanding, deep learning and text-to-speech. By the way, the assistant can give you a confirmation notification saying your appointment has been taken care of.

Let me give you another example. Let's say you want to call a restaurant, but maybe it's a small restaurant which is not easily available to book online. The call actually goes a bit differently than expected, so take a listen. [Recorded call] Hi, I'd like to reserve a table for Wednesday the 7th. For seven people? It's for four people. Wednesday at 6 p.m. ... How long is the wait usually to be seated? ... For next Wednesday at 7? Oh, I gotcha. Thanks.

Again, that was a real call. A lot of times these calls don't go quite as expected, but the assistant understands the context and the nuance. It knew to ask about wait times in this case and handled the interaction gracefully. We are still developing this technology, and we want to work hard to get this right, to get the user experience and the expectations right for both businesses and users. But done correctly, it will save time for people and generate a lot of value for businesses. We really want it to work in cases where, say, you're busy in the morning and your kid is

sick and you want to call for a doctor's appointment. So we're going to work hard to get this right. There is a more straightforward case where we can roll this out sooner. For example, every single day we get a lot of queries into Google where people are wondering about the opening and closing hours of businesses, but it gets tricky during holidays, and businesses get a lot of calls. So we as Google can make just the one phone call and then update the information for millions of users. It will save a small business countless numbers of calls

and make the experience better for users. This is going to be rolling out as an experiment in the coming weeks, so stay tuned. A common theme across all of this is that we are working hard to give users back time. We've always been obsessed about that at Google: Search is obsessed with getting users their answers quickly and giving them what they want. Which brings me to another area: digital wellbeing. Based on our research, we know that people feel tethered to their devices; I'm sure it resonates with all of you. There is increasing social pressure to respond to anything you get right away.

People are anxious to stay up to date with all the information out there; they have FOMO, the fear of missing out. We think there's a chance for us to do better. We've been talking to people, and some people introduced us to the concept of JOMO, the actual joy of missing out. So we think we can really help users with digital wellbeing. This is going to be a deep, ongoing effort across all our products and platforms, and we need all your help. We think we can help users with their digital wellbeing in four ways:

we want to help you understand your habits, focus on what matters, switch off when you need to, and above all, find balance with your family. Let me give you a couple of examples. You're going to hear about this from the Android team a bit later with their upcoming release, but one of my favorite features is Dashboard in Android. We're actually going to give you full visibility into how you're spending your time: the apps where you are spending your time, the number of times you unlock your phone on a given day, the number of notifications you got. We're going to really help you

deal with this better. Apps can also help. YouTube is going to take the lead and, if you choose to do so, will remind you to take a break. So for example, if you've been watching YouTube for a while, a prompt will show up and say, hey, it's time to take a break. YouTube is also going to work to combine, if you choose, all your notifications in the form of a daily digest, so that they come to you just once during the day. YouTube is going to roll out all these features this week.

We've been doing a lot of work in this area. Family Link is a great example, where we provide tools for parents to help manage their kids' screen time. That's an important part of it, and we want to do more here. We want to equip kids to make smart decisions, so we have a Google-designed approach called Be Internet Awesome to help kids become safe explorers of the digital world. We want kids to be secure, kind and mindful online, and we are pledging to train an additional five million kids this coming year. All these tools are launching with our

digital wellbeing site later today. Another area where we feel a tremendous responsibility is news. News is core to our mission, and in times like this it's more important than ever to support quality journalism; it's foundational to how democracies work. I've always been fond of news. Growing up in India, I have a distinct memory of waiting for the physical newspaper. My grandfather, who used to stay right next to us, was at the top of a clear hierarchy: he got his hands on the newspaper first,

then my dad, and then my brother and I would get to it. I was mainly interested in the sports section at the time, but over time I developed a fondness for news, and it has stayed with me even till today. It is a challenging time for the news industry. We recently launched the Google News Initiative, and we committed $300 million over the next three years. We want to work with organizations and journalists to help develop innovative products and programs that help the industry. We've also had a product here for a long time: Google News, which was actually

built right after 9/11 as a 20% project by one of our engineers, who wanted to see news from a variety of sources to better understand what had happened. Since then, if anything, the volume and diversity of content has only grown. I think there's more great journalism being produced today than ever before. It's also true that people turn to Google in times of need, and we have a responsibility to provide that information. This is why we have reimagined our news product. We are using AI to bring forward the best of what journalism

has to offer. We want to give users quality sources that they trust. We want to build a product that works for publishers. Above all, we want to make sure we're giving users deeper insight and a fuller perspective about any topic they are interested in. I'm really excited to announce the new Google News, and here's Trystan to tell you more. [Trystan] Thank you, Sundar. With the new Google News, we set out to help you do three things. First, keep up with the news you care about. Second,

understand the full story. And finally, enjoy and support the sources you love. After all, without news publishers and the quality journalism they produce, we would have nothing to show you here today. So let's start with how we make it easier for you to keep up with the news you care about. As soon as I open Google News, right at the top I get a briefing with the top five stories I need to know right now. As I move past my briefing, there are more stories selected just for me. Our AI constantly reads the firehose of the web for you, the millions of articles,

videos, podcasts and comments being published every minute, and assembles the key things you need to know. Google News also pulls in local voices and news about events in my area. It's this kind of information that makes me feel connected to my community. This article from The Chronicle makes me wonder how long it would take to ride across this new Bay Bridge. What's cool is I didn't have to tell the app that I follow politics, love to bike, or want information about the Bay Area. It works right out of the box, and because we've applied techniques like reinforcement learning throughout the app, the

more I use it, the better it gets. At any point I can jump in and say whether I want to see less or more of a given publisher or topic. And whenever I want to see what the rest of the world is reading, I can switch over to Headlines to see the top stories generating the most coverage right now around the world. As we keep going, you can see there are lots of big, gorgeous images that make this app super engaging, and a truly great video experience that brings you all the latest videos from YouTube and around the web. All of our design choices focus on keeping the app light, easy,

fast and fun. Our guiding principle is to let the stories speak for themselves. That's pretty cool, right? What we're seeing here throughout the app is the new Google Material Theme. The entire app is built using Material Design, our adaptable, unified design system, here being uniquely tailored by Google. Later today you'll hear more about this and how you can use Material Theming in your own products. We're also excited to introduce a new visual format we call newscasts. You are not going to see these in any other news app. Newscasts are a bit like a

preview of the story, and they make it easy to get a feel for what's going on. Check out this one on the new Star Wars movie. Here we're using the latest developments in natural language understanding to bring together everything from the Solo movie trailer, to news articles, to quotes from the cast and more, in a fresh presentation that looks absolutely great on your phone. Newscasts give me an easy way to get the basics and decide where I want to dive in more deeply. And sometimes I even discover things I never would have found out otherwise. For the stories I

care about most, or the ones that are really complex, I want to be able to jump in and see many different perspectives. That brings us to our second goal for Google News: understanding the full story. Today it takes a lot of work to broaden your point of view and understand a news story in depth. With Google News, we set out to make that effortless. Full Coverage is an invitation to learn more. It gives a complete picture of a story in terms of how it's being reported, from a variety of sources and in a variety of formats. We assemble Full Coverage using

a technique we call temporal co-locality. This technique enables us to map relationships between entities and understand the people, places and things in a story right as it evolves. We apply this to the deluge of information published to the web at any given moment and then organize it around storylines, all in real time. This is by far the most powerful feature of the app, and it provides a whole new way to dig into the news. Take a look at how Full Coverage works with the recent power outages in Puerto Rico. There are so many questions I had about this story, like how did

we get here, could it have been prevented, and are things actually getting better? We built Full Coverage to help make sense of it all, all in one place. We start out with a set of top headlines that tell me what happened, and then things are organized around the key story aspects using real-time event understanding. For news events that have played out, like this one, over weeks and months, you can understand the origin and development of the story by looking at our timeline of the key moments. And while the recovery has begun, we can clearly see there's still a long way to go. There are also

certain questions we're all asking about a story, and we pull those out so you don't have to hunt for the answers. We know context and perspective come from many places, so we show you tweets from relevant voices, opinion and analysis, and fact-checks to help you understand the story that one level deeper. In each case our AI is highlighting why this is an important piece of information and what unique value it brings. Now, when I use Full Coverage, I find that I can build a huge amount of knowledge on the topics I care about. It's a true 360-degree view that goes well

beyond what I get from just scanning a few headlines. On top of this, our research shows that having a productive conversation or debate requires everyone to have access to the same information, which is why everyone sees the same content in Full Coverage for a topic. It's an unfiltered view of events from a range of trusted news sources. So I've got to say, I love these new features, and these are just a few of the things we think make the new Google News so exciting. But as we mentioned

before, none of this would exist without the great journalism newsrooms produce every day, which brings us to our final goal: helping you enjoy and support the news sources you love. We put publishers front and center throughout the app, and here in the newsstand section it's easy to find and follow the sources I already love, and to browse and discover new ones, including over 1,000 magazine titles like Wired, National Geographic and People, which all look great on my phone. I can follow publications like USA Today by directly tapping the star icon. And if there's a

publication I want to subscribe to, say The Washington Post, we make it dead simple: no more forms, credit card numbers or new passwords. Because you're signed in with your Google account, you're set. When you subscribe to a publisher, we think you should have easy access to your content everywhere, and this is why we developed Subscribe with Google. Subscribe with Google enables you to use your Google account to access your paid content everywhere, across all platforms and devices, on Google Search, Google News and publishers' own sites. We built this in collaboration

with over 60 publishers around the world, and it will be rolling out in the coming weeks. This is one of the many steps we're taking to make it easier to access dependable, high-quality information when and where it matters most. So that's the new Google News: it helps you keep up with the news you care about with your briefing and newscasts, understand the full story using Full Coverage, and enjoy and support the news sources you love by reading, following and subscribing. And now for the best

news of all: we're rolling out on Android, iOS and the web in 127 countries starting today. I think so too, pretty cool. It will be available to everyone next week. At Google we know that getting accurate and timely information into people's hands, and building and supporting high-quality journalism, is more important than it ever has been right now, and we are totally committed to doing our part. We can't wait to continue on this journey with you. And now I'm excited to introduce Dave to tell you more

about what's going on in Android. [Video] Android started with a simple goal of bringing open standards to the mobile industry. Today it is the most popular mobile operating system in the world. If you believe in openness, if you believe in choice, if you believe in innovation from everyone, then welcome to Android. [Dave Burke] Hi everyone, it's great to be here at Google I/O 2018. Ten years ago, when we launched the first Android phone, the T-Mobile G1, it was with a simple but bold idea: to build a mobile platform that was free and open to everyone,

and today that idea is thriving. Our partners have launched tens of thousands of smartphones used by billions of people all around the world. Through this journey, we've seen Android become more than just a smartphone operating system, powering new categories of computing including wearables, TV, auto, AR, VR and IoT. The growth of Android over the last 10 years has helped fuel the shift in computing from desktop to mobile, and as Sundar mentioned, the world is now at the precipice of another shift: AI is going to profoundly change industries like healthcare

and transportation, and it's already starting to change ours. Which brings me to the new version of Android we're working on: Android P. Android P is an important first step toward our vision of AI at the core of the operating system, and it underpins the first of three themes in this release, which are intelligence, simplicity and digital wellbeing. We believe smartphones should be smarter: they should learn from you and they should adapt to you. Technologies such as on-device machine learning can learn your usage patterns and

automatically anticipate your next actions, saving you time, and because they run on-device, the data is kept private to your phone. So let's take a look at some examples of how we're applying these technologies to Android to build a smarter operating system. In pretty much every survey of smartphone users, you'll see battery life as a top concern; it's in my version of Maslow's hierarchy of needs. You've probably been there: your battery has been doing okay, but then you have one of those outlier days where it's draining faster than normal, leaving you to

run for the charger. So we worked on a new feature; we call it Adaptive Battery. It's designed to give you a more consistent battery experience. Adaptive Battery uses on-device machine learning to figure out which apps you'll use in the next few hours and which you won't use until later, if at all, today. With this understanding, the operating system adapts to your usage patterns so it spends battery only on the apps and services that you care about. The results are really promising: we're seeing a 30% reduction in CPU wake-ups for apps in

general, and this, combined with other performance improvements, including running background processes on the smallest CPU cores, is pretty cool.
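
For developers wondering what this means for their apps: the visible side of this system is App Standby Buckets, which Android P exposes through UsageStatsManager. The sketch below is a hypothetical helper, not something from the keynote; it simply reads which bucket the system has assigned the calling app, while the ML-driven prioritization itself is not a public API.

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context

// Minimal sketch (API 28+): read the standby bucket that App Standby / Adaptive Battery
// has assigned to this app. The bucket influences how aggressively the system defers
// the app's jobs, alarms and network access.
fun logStandbyBucket(context: Context) {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val bucket = when (usm.appStandbyBucket) {
        UsageStatsManager.STANDBY_BUCKET_ACTIVE -> "ACTIVE"
        UsageStatsManager.STANDBY_BUCKET_WORKING_SET -> "WORKING_SET"
        UsageStatsManager.STANDBY_BUCKET_FREQUENT -> "FREQUENT"
        UsageStatsManager.STANDBY_BUCKET_RARE -> "RARE"
        else -> "OTHER"
    }
    println("App standby bucket: $bucket")
}
```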

Another example of how the OS is adapting to the user is auto-brightness. Most modern smartphones will automatically adjust the brightness given the current lighting conditions, but it's one-size-fits-all: it doesn't take into account your personal preferences and environment. So often what happens is you adjust the brightness slider, only for the screen to later become too bright or too dim. So we've created a new on-device machine learning feature we call Adaptive Brightness. It learns how you like to set the brightness slider given the ambient lighting, and then does it for you in a power-efficient way. You'll literally see the brightness slider move as the phone adapts to your preferences, and it's extremely effective: we're seeing almost half of our test users now make fewer manual brightness adjustments compared to any previous version of Android. We're also making the UI more intelligent. Last year we introduced the

concept of predicted apps, a feature that places the next app the OS anticipates you'll need along the path you normally follow to launch that app, and it's very effective, with an almost 60% prediction rate. With Android P, we're going beyond simply predicting the next app to launch, to predicting the next action you want to take. We call this feature App Actions. Let's take a look at how it works. At the top of the launcher you can see two actions: one to call my sister Fiona, and another to start a workout on Strava for my evening run. What's happening here is that the actions are being predicted

based on my usage patterns. The phone is adapting to me and trying to help me get to my next task more quickly. Another example: if I connect my headphones, Android will surface an action to resume the album I was listening to. To support App Actions, developers just need to add an actions.xml file to their app, and then actions can surface not just in the launcher but in Smart Text Selection, the Play Store, Google Search and the Assistant. Take Google Search: we're experimenting with different ways to surface actions for apps you've installed and use a lot. For example, I'm a big Fandango

user, so when I search for the new Avengers movie, Infinity War, in addition to the regular suggestion I get an action into the Fandango app to buy tickets. Pretty cool. Actions are a simple but powerful idea for providing deep links into your app given the user's context. But even more powerful is bringing part of the app UI to the user right there, and we call this feature Slices. Slices are a new API for developers to define interactive snippets of their app's UI, and they can be surfaced in different places in the OS. In

Android P, Slices will appear first in search. Let me show you how this works. If I have the Lyft app installed on my phone, Lyft can use the Slices API's rich array of UI templates to render a slice of their app in the context of search, and here Lyft is able to give me the price for my trip to work. The slice is interactive, so I can order the ride directly from it. Pretty nice. The slice templates are quite flexible, so developers can offer everything from playing a video to, say, checking into a hotel. As another example, if I search for "Hawaii," I get a slice of Google Photos with my vacation pictures. We're

working with some amazing partners on App Actions and Slices, and we'll be opening an Early Access Program to developers more broadly next month. We're excited to see how Actions and, in particular, Slices will enable a dynamic two-way experience, where the app's UI can intelligently show up in context. So that's some of the ways we're making Android more intelligent by teaching the operating system to adapt to the user.
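
As a rough illustration of the developer side, here is a minimal Slice provider sketch using the AndroidX Slice builders. The provider class, slice URI and row text are hypothetical, and the builder APIs changed across the early releases, so treat the exact calls as approximate; a real slice also needs a primary action and a provider entry in the app manifest.

```kotlin
import android.net.Uri
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.list
import androidx.slice.builders.row

// Hypothetical provider that serves a single-row slice, e.g. for a ride-hailing app.
class RideSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    // Called when a surface such as search asks for the slice behind this URI.
    override fun onBindSlice(sliceUri: Uri): Slice? {
        val context = context ?: return null
        return list(context, sliceUri, ListBuilder.INFINITY) {
            row {
                setTitle("Ride to work")
                setSubtitle("Estimated fare and pickup time")
            }
        }
    }
}
```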

Machine learning is a powerful tool, but it can also be intimidating and costly for developers to learn and apply, and we want to make these capabilities accessible and easy to use for those who have little or no expertise in machine learning. So today I'm really excited to announce ML Kit, a new set of APIs available through Firebase for text recognition, face detection, image labeling and a lot more. ML Kit also supports the ability to tap into Google's cloud-based ML technologies. Architecturally, you can think of ML Kit as providing ready-to-use models built on TensorFlow Lite and optimized for mobile, and best of all, ML Kit is cross-platform, supporting both Android and

iOS. We've been working with an early set of partners on ML Kit, and they've seen really great results. For example, the popular calorie tracker Lose It! is using ML Kit to automatically classify over 200 different foods through the camera. You can learn more about ML Kit at the developer keynote later today.
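
To make the shape of these APIs concrete, here is a minimal sketch of on-device text recognition with ML Kit as exposed through Firebase at the time. The helper name and logging are illustrative, and the exact dependency and version details are assumptions rather than anything stated in the keynote.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Run ML Kit's on-device text recognizer over a bitmap (e.g. a camera frame)
// and print whatever text it finds. Requires a Firebase-enabled Android project.
fun recognizeText(bitmap: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer
    recognizer.processImage(image)
        .addOnSuccessListener { result ->
            // result.text is the full recognized string; result.textBlocks gives structure.
            println(result.text)
        }
        .addOnFailureListener { e ->
            println("Text recognition failed: $e")
        }
}
```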

So we're excited about making your smartphone more intelligent, but it's also important to us that the technology fades into the background. One of our key goals over the last few years has been to evolve Android's UI to be simpler and more approachable, both for the current set of users and for the next billion Android users. We put a special emphasis on simplicity in this release by addressing many pain points where you told us the experience was more complicated than it ought to be, and you'll find these improvements on any device that adopts Google's version of the Android UI, such as Google Pixel and Android One devices. Let me walk you through a few live demos on my phone; what could possibly go wrong? Let's start with the new

system navigation that we've been working on for more than a year now. It's easier to understand, with a single, clean home button, and the design recognizes the trend towards smaller screen bezels by placing an emphasis on gestures over multiple buttons at the edge of the screen. So when I swipe up, I'm immediately brought to the Overview, where I can resume apps I've recently used. I also get five predicted apps at the bottom of the screen to save me time. Now, if I continue to swipe up a second time, I get All Apps. So what we've done is

effectively combined the All Apps and Overview spaces into one. The swipe-up gesture works from anywhere, no matter what I'm doing, so I can quickly get to All Apps and Overview without losing the context of the app I'm in. And if you prefer, you can also use the quick scrub gesture by sliding the home button sideways to scroll through your recent apps, like so. Now, one of the nice things about the larger, horizontal Overview is that the app content is now visible, so you can easily refer back to

information in the previous app. Even better, we've extended Smart Text Selection to work in Overview. For example, if I tap anywhere on the phrase "The Killers," the whole phrase will be selected for me, and then I get an action to listen to it on Spotify, like so. We've extended Smart Text Selection's neural networks to recognize more entities, like sports teams, music artists, flight codes and more. I've been using this new navigation system for the last month and I absolutely love it; it's a much faster, more powerful way to multitask on the go.
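
For a sense of the underlying machinery, the entity recognition behind Smart Text Selection is surfaced to apps through the platform TextClassifier service. The sketch below uses illustrative names and should run off the main thread; it asks the classifier what a piece of text is and with what confidence.

```kotlin
import android.content.Context
import android.os.LocaleList
import android.view.textclassifier.TextClassificationManager

// Classify a piece of text with the on-device TextClassifier (API 26+).
// Note: classifyText does blocking work, so call this from a background thread.
fun classifyEntity(context: Context, text: String) {
    val manager = context.getSystemService(TextClassificationManager::class.java)
    val result = manager.textClassifier.classifyText(
        text, 0, text.length, LocaleList.getDefault()
    )
    for (i in 0 until result.entityCount) {
        val entity = result.getEntity(i)   // e.g. "address", "email", "flight"
        println("$entity -> ${result.getConfidenceScore(entity)}")
    }
}
```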

When navigation works well it's a pretty big deal, but sometimes small changes make a big difference too. Take volume controls: you want to turn the volume down before a video starts, but instead you turn down the ringer volume, and then the video blasts everyone around you. So here's how we're fixing it. The simplified volume control is here, located beside the hardware buttons, so it's there when you reach for them. The key difference is that the slider now adjusts the media volume by default, because that's the thing you want to change most often, and for the ringer volume, all you really care about is on, silent and off, like so. Okay,

another example: rotation. Right now I'm in locked rotation mode. Let me launch an app. You'll notice that when I rotate the device, a new rotation button appears on the nav bar, and I can just tap on it and rotate under my own control. All right, that was a quick look at some of the ways we're simplifying the user experience in Android P. There's lots more, everything from redesigned work profiles to better screenshots to improved notifications, not least smarter notification management. We want to give you more control over demands on your attention, and this

highlights the concept that Sundar alluded to: making it easier to move between your digital life and your real life. To learn more about this important area and our third theme, let me hand over to Sameer. [Sameer] Thanks, Dave. On a recent family vacation, my partner asked if she could see my phone right after we got to our hotel room. She took it from me, walked over to the hotel safe, locked it inside, then turned and looked me right in the eye and said, "You get this back in seven

days, when we leave." Well, I was shocked. I was kind of angry. But after a few hours, something pretty cool happened: without all the distractions from my phone, I was actually able to disconnect and be fully present, and I ended up having a wonderful family vacation. But it's not just me. Our team has heard so many stories from people who are trying to find the right balance with technology. As you heard from Sundar, helping people with their digital wellbeing is more important to us than ever. People tell us a lot of the time

they spend on their phone is really useful, but some of it they wish they'd spent on other things. In fact, we found over 70% of people want more help striking this balance. So we've been working hard to add key capabilities right into Android to help people find the balance with technology that they're looking for. One of the first things we focused on was helping you understand your habits. Android P will show you a dashboard of how you're spending time on your device. As you saw earlier, you can see how much time you spent in apps, how many times you unlocked your device today,

and how many notifications you've received, and you can drill down on any of these things. For example, here's my Gmail data from Saturday. When I saw this, it did make me wonder whether I should have been on my email all weekend, but that's kind of the point of the dashboard. Now, when you're engaging is one part of it; understanding what you're engaging with in apps is equally important. Catching up on your favorite shows at the end of a long day can feel pretty good; watching an infomercial might leave you wondering why you didn't do something else instead. Many

developers call this concept meaningful engagement, and we've been working closely with many of our developer partners who share the goal of helping people use technology in healthy ways. So in Android P, developers can link to a more detailed breakdown of how you're spending time in their app from this new dashboard. For example, YouTube will be adding a deep link where you can see total watch time across mobile and desktop and access many of the helpful tools that Sundar shared earlier.
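
The Dashboard itself is a system feature, but the same kind of per-app usage data is available to apps through UsageStatsManager. Here is a minimal sketch (the helper name is hypothetical) that sums foreground time per package over the last day; it requires the user to grant the app special usage-access permission (PACKAGE_USAGE_STATS) in Settings.

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context
import java.util.concurrent.TimeUnit

// Roughly how long each app was in the foreground over the last 24 hours.
fun foregroundTimeLastDay(context: Context): Map<String, Long> {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val end = System.currentTimeMillis()
    val start = end - TimeUnit.DAYS.toMillis(1)
    return usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, start, end)
        .groupBy { it.packageName }
        .mapValues { (_, stats) -> stats.sumOf { it.totalTimeInForeground } }
}
```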

Understanding is a good start, but Android P also gives you controls to help you manage how and when you spend time on your phone. Maybe you have an app that you love, but you're spending more time in it than you realized. Android P lets you set time limits on apps, and it will nudge you when you're close to your limit that it's time to do something else. And for the rest of the day, that app icon is grayed out to remind you of your goal. People have also told us they struggle to be fully present for the dinner they're at or the meeting they're attending, because the notifications they get on their device can be distracting and too tempting to resist. And come on,

we've all been there. So we're making improvements to Do Not Disturb mode to silence not just the phone calls and texts, but also the visual interruptions that pop up on your screen. To make Do Not Disturb even easier to use, we've created a new gesture that we've affectionately codenamed Shush. If you turn your phone over on the table, it automatically enters Do Not Disturb so you can focus on being present: no pings, vibrations or other distractions. Of course,

Of course, in an emergency we all want to make sure we're still reachable by the key people in our lives, like your partner or your child's school. Android P will help you set up a list of contacts that can always get through to you with a phone call, even if Do Not Disturb is turned on. Finally, we heard from people that they often check their phone right before going to bed, and before you know it, an hour or two has slipped by. Honestly, this happens to me at least once a week. Getting a good night's sleep is critical, and technology should help you with this, not prevent it from happening.
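There is no public API for this exact Android P screen, but the underlying platform capability exists: a hypothetical sketch of letting calls from starred contacts break through Do Not Disturb might look like this (again assuming notification-policy access has already been granted).

```kotlin
import android.app.NotificationManager
import android.app.NotificationManager.Policy
import android.content.Context

// Hypothetical sketch: switch to priority-only Do Not Disturb, but let calls
// and messages from starred contacts through. Requires DND access.
fun allowStarredContactsThroughDnd(context: Context) {
    val nm = context.getSystemService(NotificationManager::class.java)
    if (!nm.isNotificationPolicyAccessGranted) return
    nm.notificationPolicy = Policy(
        Policy.PRIORITY_CATEGORY_CALLS or Policy.PRIORITY_CATEGORY_MESSAGES,
        Policy.PRIORITY_SENDERS_STARRED,   // which callers may ring through
        Policy.PRIORITY_SENDERS_STARRED    // which message senders may alert
    )
    nm.setInterruptionFilter(NotificationManager.INTERRUPTION_FILTER_PRIORITY)
}
```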

So we created Wind Down mode. You can tell the Google Assistant what time you aim to go to bed, and when that time arrives, it will switch on Do Not Disturb and fade the screen to grayscale, which is far less stimulating for the brain and can help you set the phone down. It's such a simple idea, but I found it amazing how quickly I put my phone away when all my apps go back to the days before color TV. Don't worry, all the colors return in the morning when you wake up. Okay, that was a quick tour of some of the digital well-being features we're

bringing to Android P this fall, starting with Google Pixel. Digital well-being is going to be a long-term theme for us, so look for much more to come in the future. Beyond the three themes of intelligence, simplicity, and digital well-being that Dave and I talked about, there are literally hundreds of other improvements coming in Android P. I'm especially excited about the security enhancements we've added to the platform, and you can learn more about them at the Android security session on Thursday. But your big question is probably: that's all great, but how do I try some of this stuff?

Well, today we're announcing the Android P beta. And thanks to our efforts in Android Oreo to make OS upgrades easier, the Android P beta is available on Google Pixel and flagship devices from seven more manufacturers today. You can head over to this link to find out how to receive the beta on your device, and please do let us know what you think. Okay, that's a wrap on what's new with Android, and now I'd like to introduce Jen to talk about Maps. Thank you. The roads were dangerous, changing all the time, and you could actually see it

on Google Maps. It helped us respond to an emergency crisis like this hurricane, which hit and turned Houston into an island; the roads were changing constantly. We kept saying, thank God for Google Maps, what would we have done? It's really cool that it's helping people keep doing what they love doing and keep doing what they need to do. Building technology to help people in the real world every day has been core to who we are and what we focus on at Google from the very start.

Recent advancements in AI and computer vision have allowed us to dramatically improve long-standing products like Google Maps, and they've also made possible brand-new products like Google Lens. Let's start with Google Maps. Maps was built to assist everyone, wherever they are in the world. We've mapped over 220 countries and territories and put hundreds of millions of businesses and places on the map. And in doing so, we've given more than a billion people the ability to travel the world with the confidence that they won't get lost along the way. But we're far from done: we keep making the map

smarter and more detailed as advancements in AI have accelerated. We're now able to automatically add new addresses, businesses, and buildings that we extract from Street View and satellite imagery directly to the map. This is critical in rural areas, in places without formal addresses, and in fast-changing cities like Lagos; here, we've literally changed the face of the map in the last few years. Hello, Nigeria! We can also tell you if the business you're looking for is open, how busy it is, what the wait time is, and

even how long people usually spend there. We can tell you before you leave whether parking is going to be easy or difficult, and we can help you find it. And we can now give you different routes based on your mode of transportation, whether you're riding a motorbike or driving a car, and by understanding how different types of vehicles move at different speeds, we can make more accurate traffic predictions for everyone. But we've only scratched the surface of what Maps can do. We originally designed Maps to help you understand where you are and to help you get from here to there.

But over the past few years, we've seen our users demand more and more of Maps. They're bringing us harder and more complex questions about the world around them, and they're trying to get more done. Today, our users aren't just asking for the fastest route to a place; they also want to know what's happening around them, what the new places to try are, and what locals love in their neighborhood. The world is filled with amazing experiences, like cheering for your favorite team at a sports bar or a night out with friends or family at a cozy neighborhood bistro. We want to make it easy for

you to explore and experience more of what the world has to offer. We've been working hard on an updated version of Google Maps that keeps you in the know on what's new and trending in the areas you care about and helps you find the best place for you based on your context and interests. Let me give you a few examples of what this is going to look like, with some help from Sophia. First, we're adding a new tab to Maps called For You. It's designed to tell you what you need to know about the neighborhoods you care about: new places that are opening, what's trending now, and personal

recommendations. Here I'm being told about a cafe that just opened in my area. If we scroll down, I see a list of the restaurants that are trending this week. Right away, Maps is giving me ideas to get me out of my rut and inspire me to try something new. But how do I know if a place is really right for me? Have you ever had the experience of looking at lots of places, all with four-star ratings? You're pretty sure there are some you're going to like a lot and others that maybe aren't quite so great, but you're not sure how to tell which ones. We

created a score called Your Match to help you find more places that you'll love. Your Match uses machine learning to combine what Google knows about hundreds of millions of places with the information that I've added: restaurants I've rated, cuisines I've liked, and places that I've been to. If you tap into the match number, you'll see reasons explaining why it's recommended just for you. It's your personal score for places, and our early testers are telling us that they love it. You can confidently pick the places that are best for you, whether you're planning ahead or need to make a quick decision on the go.
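The real Your Match model isn't public; what follows is purely a toy sketch, with made-up place attributes and weights, of the general idea of blending what's known about a place with a person's own ratings and cuisine preferences to produce a personal score.

```kotlin
// Toy illustration only: a hand-rolled "match" score, not Google's model.
// Places carry a cuisine and an average rating; the user carries cuisine
// affinities learned from their own ratings and visits.
data class Place(val name: String, val cuisine: String, val avgRating: Double)
data class UserTaste(val cuisineAffinity: Map<String, Double>)  // 0.0 .. 1.0

fun matchScore(place: Place, taste: UserTaste): Int {
    val affinity = taste.cuisineAffinity[place.cuisine] ?: 0.3   // neutral default
    val quality = place.avgRating / 5.0                          // normalize stars
    // Weighted blend of "is this my kind of place" and "is it generally good".
    val score = 0.6 * affinity + 0.4 * quality
    return (score * 100).toInt()                                 // e.g. 87% match
}

fun main() {
    val taste = UserTaste(mapOf("ramen" to 0.9, "steakhouse" to 0.2))
    val place = Place("Noodle Theory", "ramen", 4.4)
    println("${place.name}: ${matchScore(place, taste)}% match")
}
```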

Thanks, Sophia. The For You tab and the Your Match score are great examples of how we can help you stay in the know and choose places with confidence. Another pain point we often hear from our users is that planning with others can be a real challenge, so we wanted to make it easier to pick a place together. Here's how: I long press on any place to add it to my shortlist. I'm always up for ramen, but I know my friends have lots of opinions of their own, so I can add some more

options to give them some choices. When you've collected enough places that you like, you can share the list with your friends to get their input. You can easily share with just a couple of taps on any platform that you prefer. Then my friends can add more places if they want to, or just vote with one simple click so we can quickly choose a favorite. So now, instead of copying and pasting a bunch of links and sending texts back and forth, decisions can be quick, easy, and fun.
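Maps handles the sharing itself, but to give a flavor of how "share on any platform you prefer" typically works on Android, here is a small hypothetical sketch using the standard system share sheet; the list contents are made up.

```kotlin
import android.content.Context
import android.content.Intent

// Hypothetical sketch: hand a shortlist of places to the system share sheet,
// letting the user pick whichever messaging app they prefer.
fun shareShortlist(context: Context, places: List<String>) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_SUBJECT, "Where should we eat?")
        putExtra(Intent.EXTRA_TEXT, "My shortlist:\n" + places.joinToString("\n"))
    }
    context.startActivity(Intent.createChooser(send, "Share shortlist"))
}
```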

This is just a glimpse of what's coming to Maps on both Android and iOS later this summer, and we see this as just the beginning of what Maps can do to help you make better decisions on the go and to experience the world in new ways, from your local neighborhood to the far-flung corners of the world. This discovery experience wouldn't be possible without small businesses, because when we help people discover new places, we're also helping local businesses be discovered by new customers. These are businesses like the bakery in your neighborhood or the barber shop around the corner. These businesses are the fabric of our communities, and we're deeply committed to

helping them succeed with Google. Every month, we connect users to businesses nearby more than 9 billion times, including over a billion phone calls and 3 billion direction requests to their stores. In the last few months, we've been adding more tools for local businesses to communicate and engage with their customers in meaningful ways. You can now see daily posts on events or offers from any of your favorite businesses, and soon you'll be able to get updates in the new For You stream too. And when you're ready, you can easily book an appointment or place an order with

just one click. We're always inspired to see how technology brings opportunities to everyone. The reason we've invested over the last 13 years in mapping every road, every building, and every business is because it matters: when we map the world, communities come alive and opportunities arise in places we never would have thought possible. As computing evolves, we're going to keep challenging ourselves to think about new ways that we can help you get things done in the real world. I'd like to invite Aparna to the stage to share how we're doing this in

Google Maps and beyond. The cameras in our smartphones connect us to the world around us in a very immediate way: they help us savor a moment, capture memories, and communicate. But with the advances in AI and computer vision that you heard Sundar talk about, what if the camera could do more? What if the camera could help us answer questions, questions like: where am I going, or what's that in front of me? Let me paint a familiar picture. You exit the subway,

you're already running late for an appointment, or maybe a tech company conference, and you're told to head south on Market Street. Just one problem: you have no idea which way is south. So you look down at the phone, you stare at that blue dot on the map, and you just start to walk to see if it's moving in the same direction; if not, you turn around. We've all been there. So we asked ourselves: what if the camera could help us here? Our teams have been working really hard to combine the power of the camera and computer vision with Street View and Maps to

reimagine walking navigation. So here's how it could look in Google Maps. Let's take a look. You open the camera, and instantly you know where you are. All the information from the map, the street names, the directions, are right there in front of you. Notice that you also see the map, so that way you stay oriented. You can also start to see nearby places, so you see what's around you. And just for fun, our team has been playing with an idea of adding a helpful guide, like that there,

so that it can show you the way. That's pretty cool. For these kinds of experiences, though, GPS alone doesn't cut it, so we've been working on what we call VPS, a visual positioning system that can estimate precise positioning and orientation. The key insight here is that, just like you and me, when we're in an unfamiliar place we look for visual landmarks: the storefronts, the building facades, and so on. It's the same idea: VPS uses the visual features in the environment to do the same thing, to help figure out exactly where you are and get you exactly where you need to go.
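Google hasn't published how VPS works internally; the following is a deliberately simplified toy sketch of the core idea only: describe what the camera sees as feature vectors, match them against an index of known landmark features, and take the best match's known location as a position estimate. All names and numbers here are invented for illustration.

```kotlin
import kotlin.math.sqrt

// Toy sketch of the VPS idea: match visual features seen by the camera against
// a pre-built index of landmark features, and use the best match's known
// location as a coarse position fix. Real systems use far richer descriptors,
// many simultaneous matches, and geometric verification; this is illustration only.
data class Landmark(val name: String, val lat: Double, val lng: Double, val descriptor: FloatArray)

fun cosine(a: FloatArray, b: FloatArray): Double {
    var dot = 0.0; var na = 0.0; var nb = 0.0
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return dot / (sqrt(na) * sqrt(nb) + 1e-9)
}

// Return the location of the landmark whose stored descriptor best matches
// what the camera currently sees, if the match is confident enough.
fun estimatePosition(seen: FloatArray, index: List<Landmark>): Pair<Double, Double>? {
    val best = index.maxByOrNull { cosine(seen, it.descriptor) } ?: return null
    return if (cosine(seen, best.descriptor) > 0.8) best.lat to best.lng else null
}
```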

Pretty cool. That's an example of how we're using the camera to help you in Maps, but the camera can also help you do more with what you see. That's why we started working on Google Lens. People are already using it for all sorts of answers, especially when the questions are difficult to describe in words, answers like: that cute dog in the park is a Labradoodle, or that building in Chicago is the Wrigley Building and it's 425 feet tall, which, as my

nine-year-old son says these days, is more than 60 Kevin Durants. Now, today Lens is already a capability in Google products like Photos and the Assistant, but we're very excited that starting next week, Lens will be integrated right inside the camera app on the Pixel, the new LG G7, and a lot more devices. This makes it super easy for you to use Lens on things right in front of you, already in the camera. Very excited to see this. Now, like voice, vision is a fundamental shift in computing for us, and it's a multi-year journey,

but we're already making a lot of progress. So today I thought I'd show you three new features in Google Lens that can give you more answers to more types of questions, more quickly. First, Lens can now recognize and understand words. Words are everywhere when you think about it: traffic signs, posters, restaurant menus, business cards. Now, with smart text selection, you can connect the words you see with the answers and actions you need, so you can do things like copy and paste from the real world directly into your phone.
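Lens itself isn't something you call from your own app, but ML Kit, which Google also introduced at this I/O, exposes a related on-device text recognizer. Here is a rough sketch of pulling text out of a camera frame with it, written against the 2018 Firebase ML Kit APIs as I understand them; treat it as an approximation of the idea, not how Lens actually works.

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Sketch: run ML Kit's on-device text recognizer over a camera frame and log
// the recognized blocks, the raw material a "copy text from the real world"
// feature would build on. Not the actual Lens implementation.
fun recognizeText(frame: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(frame)
    val recognizer = FirebaseVision.getInstance().onDeviceTextRecognizer
    recognizer.processImage(image)
        .addOnSuccessListener { result ->
            for (block in result.textBlocks) {
                println("Found text: ${block.text}")   // e.g. a dish name on a menu
            }
        }
        .addOnFailureListener { e -> println("Text recognition failed: $e") }
}
```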

For example, say you're looking at a restaurant menu. With Lens, you can turn a page of words into a page of answers: you can quickly tap around and figure out what every dish is, what it looks like, what all the ingredients are, and so on. Ah, by the way, this one is vegetarian, good to know, and ratatouille has tomatoes. Really cool. Now, in these examples Lens is not just understanding the shape of the characters and the letters visually; it's actually trying to get at the

meaning and the context behind these words, and that's where all the language understanding that you heard Scott talk about really comes in handy. Okay, the next feature I want to talk about is called Style Match, and the idea is this: sometimes your question is not "what's that exact thing," instead your question is "what are things like it." You're at your friend's place, you check out this trendy-looking lamp, and you want to know things that match that style, and now Lens can help you. Or if you see an outfit that catches your eye, you can simply

open the camera, tap on any item, and find specific information like reviews, of course, for that specific item. But you can also see all the things that match that style and browse around. Now, to pull this off, Lens has to search through millions and millions of items, and we kind of know how to do that kind of search, but the other part actually complicates things: the same item can appear with different textures, shapes, sizes, angles, lighting conditions, and so on.
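Again, the production system isn't public, but the standard pattern behind "find things like it" searches is to embed each image as a vector and look for nearest neighbors. Here is a toy sketch of that pattern with invented embeddings and a simple cosine similarity, not Lens's actual pipeline.

```kotlin
import kotlin.math.sqrt

// Toy sketch of a style-match lookup: embed the query item as a vector
// (in practice via a trained vision model), then rank a catalog of items by
// cosine similarity to that embedding. Embeddings here are invented.
data class CatalogItem(val title: String, val embedding: FloatArray)

fun similarity(a: FloatArray, b: FloatArray): Double {
    var dot = 0.0; var na = 0.0; var nb = 0.0
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return dot / (sqrt(na) * sqrt(nb) + 1e-9)
}

fun styleMatches(query: FloatArray, catalog: List<CatalogItem>, topK: Int = 5): List<CatalogItem> =
    catalog.sortedByDescending { similarity(query, it.embedding) }.take(topK)
```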

It's a hard technical problem, but we're making a lot of progress here, and we're really excited about it. So the last thing I want to tell you about is how we're making Lens work in real time. As you saw in the Style Match example, you open the camera and you start to see Lens surface all this information proactively and instantly, and it even anchors that information to the things that you see. This kind of thing, where Lens is sifting through billions of words, phrases, places, and things in real time to give you what you need, is not possible without machine learning, using both on-device intelligence and also tapping into the power

of Cloud TPUs, which we announced last year at I/O, to get this done. We're really excited that, over time, what we want to do is actually overlay the live results directly on top of things like storefronts, street signs, or a concert poster, so you can simply point your phone at a concert poster and a music video starts to play, just like that. This is an example of how the camera is not just answering questions, but is putting the answers right where the questions are, and it's very exciting. So smart

text selection, Style Match, and real-time results are all coming to Lens in the next few weeks. Please check them out. So, those are some examples of how Google is applying AI and the camera to help you get things done in the world around you. But when it comes to applying AI, mapping, and computer vision to solving problems in the real world, it doesn't get more real than self-driving cars. So, please join me in welcoming the CEO of Waymo, John Krafcik. Thank you. Hello, everyone. We're so delighted to join our friends at Google onstage here today. And while this is my first time at Shoreline, it actually isn't

the first time for our self-driving cars. You see, back in 2009, in the parking lot just outside this theater, some of the very first tests of self-driving technology took place. It was right here where a group of Google engineers, roboticists, and researchers set out on a crazy mission: to prove that cars could actually drive themselves. Back then, most people thought self-driving cars were nothing more than science fiction, but this dedicated team of dreamers believed that self-driving vehicles could make transportation safer, easier, and more accessible for everyone. And so

the Google self-driving car project was born. In 2018, the Google self-driving car project is now its own independent Alphabet company called Waymo, and we've moved well beyond tinkering and research. Today, Waymo is the only company in the world with a fleet of fully self-driving cars, with no one in the driver's seat, on public roads. Members of the public in Phoenix, Arizona have already started to experience some of these fully self-driving rides, so let's have a look. Hey, day one of self-driving!

Are you ready? This is what the future looks like. You'd never know that there wasn't someone driving. Pretty cool. All of these people are part of what we call the Waymo early rider program: members of the public who use our self-driving cars in their daily lives. Over the last year, I've had a chance to talk to some of these early riders, and their stories are actually pretty inspiring. One of our early riders witnessed a tragic accident when she was just a young teen, which scared her into never getting her driver's license, but now she

takes Waymo to work every day. And there's Jim and Barbara, who no longer have to worry about losing their ability to get around as they grow older. Then there's the Jackson family: Waymo helps them all navigate their jam-packed schedules, taking Kyla and Joseph to and from school, practices, and meetups with friends. So it's not about science fiction when we talk about building self-driving technology; these are the people we're building it for. In 2018, self-driving cars are already transforming the way they live and move. So Phoenix

will be the first stop for Waymo's driverless transportation service, which is launching later this year. Soon, everyone will be able to call Waymo using our app, and a fully self-driving car will pull up, with no one in the driver's seat, to whisk them away to their destination. And that's just the beginning, because at Waymo we're not just building a better car, we're building a better driver, and that driver can be used in all kinds of applications: ride-hailing, logistics, personal cars, connecting people to public transportation. And we see our technology

as an enabler for all of these different industries, and we intend to partner with lots of different companies to make this self-driving future a reality for everyone. Now, we can enable this future because of the breakthroughs and investments we made in AI. Back in those early days, Google was perhaps the only company in the world investing in both AI and self-driving technology at the same time. So when Google started making major advances in machine learning, in speech recognition, computer vision, image search, and more, Waymo was in a unique position to benefit. For example, back in

2013, we were looking for a breakthrough technology to help us with pedestrian detection. Luckily for us, Google was already deploying a new technique called deep learning, a type of machine learning that allows you to create neural networks with multiple layers to solve more complex problems. So our self-driving engineers teamed up with researchers from the Google Brain team, and within a matter of months, we reduced the error rate for detecting pedestrians by 100x. That's right, not a hundred percent, but a hundred times.
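To make "neural networks with multiple layers" slightly more concrete, here is a deliberately tiny toy: a two-layer forward pass with hand-picked weights. It is nothing like a real detector; it only illustrates the idea that each layer transforms the previous layer's output.

```kotlin
import kotlin.math.exp
import kotlin.math.max

// Toy two-layer neural network forward pass. Real pedestrian detectors are
// vastly larger and are trained on labeled sensor data; this only illustrates
// the "multiple layers" idea: each layer transforms the previous layer's output.
fun relu(x: Double) = max(0.0, x)
fun sigmoid(x: Double) = 1.0 / (1.0 + exp(-x))

fun denseLayer(input: DoubleArray, weights: Array<DoubleArray>, bias: DoubleArray): DoubleArray =
    DoubleArray(weights.size) { i ->
        relu(bias[i] + weights[i].indices.sumOf { j -> weights[i][j] * input[j] })
    }

fun main() {
    val features = doubleArrayOf(0.8, 0.1, 0.4)          // made-up input features
    val hidden = denseLayer(
        features,
        arrayOf(doubleArrayOf(0.5, -0.2, 0.1), doubleArrayOf(-0.3, 0.8, 0.4)),
        doubleArrayOf(0.0, 0.1)
    )
    // Final layer: squash a weighted sum of the hidden units into a score.
    val score = sigmoid(1.2 * hidden[0] + 0.7 * hidden[1] - 0.5)
    println("Toy 'pedestrian' score: $score")
}
```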

And today, AI plays an even greater role in our self-driving system, unlocking our ability to go truly self-driving. Now, to tell you more about how machine learning makes Waymo the safe and skilled driver that you see on the road today, I'd like to introduce you to Dmitri. Good morning, everyone. It's great to be here. Now, at Waymo, AI touches every part of our system, from perception to prediction to decision-making to mapping and so much more. To be a capable and safe driver, our cars need a deep semantic understanding of the world around them. Our vehicles

need to understand and classify objects, interpret their movements, reason about their intent, and predict what they will do in the future. They need to understand how each object interacts with everything else, and finally, our cars need to use all of that information to act in a safe and predictable manner. Needless to say, there's a lot that goes into building a self-driving car, and today I want to tell you about two areas where AI has made a huge impact: perception and prediction. First, perception. Detecting and classifying objects is a key part of driving, and

pedestrians in particular pose a unique challenge, because they come in all kinds of shapes, postures, and sizes. For example, here's a construction worker peeking out of a manhole, with most of his body obscured. Here's a pedestrian crossing the street, concealed by a plank of wood. And here we have two best friends dressed in inflatable dinosaur costumes. Now, we haven't taught our cars about the Jurassic era, but we can still classify them correctly. We can detect and classify these pedestrians because we apply deep nets to a combination of sensor data. Traditionally, computer vision

neural networks are used just on camera images and video, but our cars have a lot more than just cameras: we also use lidar to measure the distance and shapes of objects, and radar to measure their speed. And by applying machine learning to this combination of sensor data, we can accurately detect pedestrians in all forms, in real time. Another area where machine learning has been incredibly powerful for Waymo is predicting how people will behave on the road. Now, sometimes people do exactly what you expect them to, and sometimes they don't. Take this example of a

car running a red light. Unfortunately, we see these kinds of things more often than we'd like, but let me break this down from the car's point of view. Our car is about to proceed straight through the intersection. We have a clear green light, and cross traffic is stopped at the red light. But just as we enter the intersection, all the way in the right corner, we see a vehicle coming in fast. Our models understand that this is unusual behavior for a vehicle that should be decelerating, so we predict the car

will run the red light, and we preemptively slow down, which you can see here with this red fence. This gives the red-light runner room to pass in front of us while it barely avoids hitting another car. We can detect cases like this because we train our ML models using lots of examples. Today, our fleet has self-driven more than 6 million miles on public roads, which means we've captured hundreds of millions of real-world interactions.
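Waymo's actual models are learned from those millions of miles; the following is only a hand-written toy version of the reasoning described above, using basic kinematics and invented numbers: if the cross-traffic car would need an implausibly hard braking effort to stop before the line, assume it will run the light and slow down preemptively.

```kotlin
// Toy, hand-written stand-in for a learned prediction model. If the observed
// vehicle would need to brake harder than a comfortable limit to stop before
// the stop line, we predict it will run the red light and yield preemptively.
const val COMFORTABLE_BRAKING = 3.0   // m/s^2, assumed threshold for illustration

fun willRunRedLight(speedMps: Double, distanceToStopLineM: Double): Boolean {
    if (distanceToStopLineM <= 0.0) return true            // already past the line
    val requiredDecel = (speedMps * speedMps) / (2.0 * distanceToStopLineM)
    return requiredDecel > COMFORTABLE_BRAKING
}

fun main() {
    // Cross traffic approaching at ~15 m/s (about 34 mph), 20 m from the line.
    val runner = willRunRedLight(speedMps = 15.0, distanceToStopLineM = 20.0)
    if (runner) println("Predict red-light runner: slow down and leave room")
    else println("Vehicle can stop comfortably: proceed")
}
```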

Now, it takes more than good algorithms to build a self-driving car; you also need really powerful infrastructure, and at Waymo we use the TensorFlow ecosystem and Google data centers, including TPUs, to train our neural networks. With TPUs, we can now train our nets up to 15 times more efficiently. We also use this powerful infrastructure to validate our models in simulation, and in this virtual world we're driving the equivalent of 25,000 cars all day, every day. All told, we've driven more than 5 billion miles in simulation, and with this kind of scale, both in training and validation of our models, we can quickly and efficiently teach our cars new skills. One such

skill we've started to tackle is self-driving in difficult weather, like the snow you see here. And today, for the first time, I want to show you a behind-the-scenes look at what it's like for our cars to self-drive in snow. This is what our car sees before we apply any filtering. Driving in a snowstorm can be tough, because snowflakes can create a lot of noise for our sensors. But when we apply machine learning to the data, this is what our car sees: we can clearly identify each of these vehicles even through all the sensor noise. And the quicker we can unlock these

types of advanced capabilities, the quicker we can bring our self-driving cars to more cities around the world. And sitting here, we can't wait to make our self-driving cars available to more people, moving us closer to a future where roads are safer, easier, and more accessible for everyone. Thanks, everyone. Now, please join me in welcoming back Jen to close out the morning session. Thanks, Dmitri, that's a great reminder of how AI can play a role in helping people in new ways all the time. I started at Google as an engineering intern

almost 19 years ago, and what struck me from almost the very first day I walked in the door was the commitment to push the boundaries of what's possible with technology, combined with a deep focus on building products that have a real impact on people's lives. As the years have passed, I've seen time and again how technology can play a really transformative role, from the earliest days of things like Search and Maps to new experiences like the Google Assistant. And if I look at Google today, I see those same early values alive and well.

We continue to work hard, together with all of you, to build products for everyone and products that matter. We constantly aspire to raise the bar for ourselves even higher and to contribute to the world and a just society in a responsible way. We know that to really build for everyone, we need lots of perspectives in the mix. That's why we've broadened I/O this year to include an even wider range of voices. We've invited additional speakers over the next three days to talk to you all about the broader role that technology can play, in everything from promoting digital

well-being to empowering NGOs to achieve their missions, along with, of course, the hundreds of technical talks that you've come to expect from us at I/O and that we hope you can enjoy and learn from as well. Welcome to I/O 2018. Please enjoy, and I hope you all find some inspiration in the next few days to keep building good things for everyone. Thank you.
