Jason is the vice president of the Developer Product Group at Google. Before that, he worked as the chief technology officer at Shazam and as the vice president of Communications for Mail, Messenger, and PIM at Yahoo. Prior to that, Jason co-founded iAtlas, a start-up acquired by AltaVista. Jason has extensive experience in engineering, product management, and design for large-scale consumer products, and has managed large global teams.
About the talk
Learn about the latest updates to Google's developer products and platforms in a keynote led by Jason Titus.
03:20 AI for everyone
08:27 Average APK size over time
13:04 Android Jetpack
16:26 Increased engagement
21:58 Actions on Google
26:32 Tal Oppenheimer
34:50 Material Design: new approaches
40:00 Material components
46:14 TensorFlow Lite
51:06 ML Kit
55:40 Creating a 3D app
1:00:14 Wrap up
Good afternoon, everyone. I'm Jason, and it's great to be back here at I/O. This morning you got to hear all about what we're doing for billions of users. Now I'm excited to share what we're doing for all of you, the developers who are solving everyday problems in powerful ways. Whether you're joining us here at Shoreline, watching the livestream, or joining us through one of 500 I/O Extended events: welcome. You know, one of the reasons I love my job is that I get to meet with developers from all around the world and hear what you're up to and what you're finding difficult. It helps me understand what we need to do at Google to make things easier, as well as what we can do better in our own products. And one thing that becomes instantly clear, from Lagos to Warsaw to Jakarta, is how important the broader development community is in helping each other figure things out and solve problems together. It's a pay-it-forward culture, and it's one of the great things about working in this industry. And I am constantly inspired by the stories I hear out of our Google Developer Groups and from our Google Developer Experts.
I see we have a few of them here today. Every year, with their help, we're able to reach more than 1 million developers through in-person events across 140 countries. Our Women Techmakers program is able to engage more than a hundred thousand women to advance their careers. Our developer community is truly important, and all of this would not be possible without their efforts. It's also great that in the last two years we have seen the number of Google Developer Experts double. These are amazing people like Rayan Zahab. She teaches coding and founded her local Google Developer Group and Women Techmakers Lebanon. On top of that, she started a web development training program for Lebanese and Syrian refugee girls. We also have folks like Rebecca Franks, an IoT and Android expert in Johannesburg. Out of a passion for cultural preservation, she works on an open-source app that lets children read books in their native African languages. Google would not be what it is today if it weren't for developers like Rayan, Rebecca, and all of you. So, thank you.
As you heard this morning, Sundar called out that the pace of AI innovation is breathtaking. This set of capabilities is going to change the way we do things, and what's really exciting to me is that we're at an inflection point: AI used to be something that only deep experts with PhDs could use, and now it's becoming accessible to everyone. Let me show you what I mean.
I would totally describe myself as an average high schooler. I like hanging out with my friends, watching movies, getting my nails done, and lately I've been into machine learning, using convolutional neural networks. I don't know why you're laughing at me. It makes sense, because she's always wanted to know everything. Why is that? What does that do? She just wants to be always learning, moving, doing different stuff. I haven't always been into computer science, but my mom grows rose bushes in our front yard, and every season they get diseased, and then my mom and I would have to diagnose it. Do you want to do something about it? I had the idea as a potential thing I could do for my research class with Ms. Sun. Why would anyone spend their summer vacation teaching a machine to identify plants? I'm not sure, but that was what she wanted to do. I wanted to have a way people could diagnose plant diseases just by taking a photo. And so that's when I started looking into TensorFlow tutorials and reading blog posts every night. Shaza took it upon herself to do all the background research necessary and then started to ask: what can this really do? Plantae takes a picture of the plant, and it tells me what plant it is and whether it's healthy or diseased, and if it is diseased, what disease it is. Somebody has downloaded this app and is using it. That was like, wow, something that I just built. I don't think you have to be a super genius to get into coding. Really, anyone can do it, with an idea and with perseverance. I feel like open-source technologies and the wealth of information on the internet are empowering my generation. I know that I can do anything I put my mind to, and so can anybody else.
Amazing, right? And we're delighted to have Shaza and her family here with us today.
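As a rough illustration of the final classification step an app like this performs, here is a self-contained Python sketch. The label names and the raw scores are invented for illustration; in a real app, a trained TensorFlow model would produce the scores for each class.

```python
# Toy sketch of turning a model's raw per-class scores (logits) into a
# diagnosis. Labels and scores here are made up for illustration only.
import math

LABELS = ["rose: healthy", "rose: black spot", "rose: powdery mildew"]

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def diagnose(logits):
    """Return the most likely label and its confidence."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

label, confidence = diagnose([0.2, 3.1, 0.4])
print(label)  # rose: black spot
```

The real work, of course, is in training the convolutional network that produces those scores; this sketch only shows how an app reads the result out.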
So you can see why we're so excited about bringing technology like TensorFlow and ML Kit to all of you. As developers, you can drop modules into your applications, add a few lines of code, and give them intelligent new capabilities. And this is only the beginning of what is possible. As a company, we're committed to empowering developers with the latest technology to build things that matter, and today we'll show you what we're doing to make that easier. So for the next hour, I'm going to talk more about new capabilities we're releasing, as well as the improvements we're making to the platforms and tools you use every day. With that, let's get started with Steph, who is going to take us through the latest on Android.
Android is growing all over the world, with billions of devices and, every month, billions of apps downloaded. Our developer community is growing rapidly right alongside: in China, India, and Brazil, the number of Android developers almost tripled in just two years. With all this growth, we feel a responsibility to support this vast ecosystem. And if you feel like your feedback drives what we say here each year, you are right. Take Kotlin: last year we made it a fully supported language, and we've launched more and more support for taking advantage of it throughout Android. Already 35% of pro developers use Kotlin, that number grows every month, and 95% of Kotlin users say they're really happy with it. More and more, Android development is going toward Kotlin. We are committed to it for the long term, and if you haven't tried it, I'd suggest you do. Your feedback has shaped investments so far, like the Play console, Android Studio, and Kotlin, and your feedback shapes this year too.
So let's cover three things today: first, distribution, making apps radically smaller so you get more installs; second, development, helping you build faster with better APIs; and third, engagement, bringing users back more and more. Let's go straight into driving installs. So Android is growing; that's great. App size is also growing, and that is not great. Apps are targeting more people and more countries, which means APKs have more languages and more features. And the larger your app gets, the fewer installs you get. Most people think that's an emerging-markets issue, but it's true in all countries. So how can we make it easier to build smaller apps? Our best idea was a hard one for us; it meant re-architecting our entire app-serving stack, but it was the right way to do it. Today we are excited to announce the new app model for Android: using the Android App Bundle, a new publishing format, you can dramatically reduce app size. Apps need a lot of resources to work on every device. The bundle contains it all, but modularized, so that when you download, Google Play's new dynamic delivery only delivers the code and resources a specific device needs. Now, we've tested this with many of you, and we've seen huge savings: LinkedIn saved 23%, Twitter saved 35%, and some apps saved as much as 50%. Now, if you're wondering how many devices this works on, the answer is 99% of devices, because it's built on long-standing platform concepts like splits and multi-APK. We also wanted this to be almost no work for you, and so I'd love to have you see a demo from someone who's been instrumental, along with the several teams who built this together. Please welcome Tor.
Thank you, Steph. Here I am in Android Studio working on my app, and I'd like to make it smaller. So what I'll do is open up our APK Analyzer, and as you can see, most of the download size is resources; we have several large image folders with different densities. So this is an app which can really benefit from the new dynamic delivery facility. To use it, I don't have to change a single line of code. All I have to do is rebuild my app as an app bundle. So I'll invoke the generate-bundle dialog that you can see; we have a new Android App Bundle option. Next, I point it at my keystore, and I can also tell it to export my encryption key for Google Play. This is what lets the Play Store break apart my app and reassemble it into smaller versions. So then I'll trigger the build, and once that's done, we can take a look at the bundle file we just created. As you can see, this looks just like the APK files that you already know and love, but there's some extra metadata here, which is what lets the Play Store do its magic. So now, in the interest of time, let's assume that I've already uploaded my app bundle and my signing key to the Play console. So then I land here: we have a new page in the Play console called the App Bundle Explorer. Take a look at the big number on the right. This tells me that when I publish this update, I'm going to save nearly 29% on downloads for my users, and this page lets me drill into just how that's possible. So in short, creating app bundles with Studio is super easy, and I hope you'll all try it out and see similar gains for yourself and your users. App bundles and dynamic delivery are launching for production use today.
Now, we're also working to increase installs in other ways. At GDC we announced Google Play Instant, so you can try a game without having to install it, and it has increased the number of players by up to 20%. Today we're announcing that this is available for all game developers. If you want to try it right now, try launching Candy Crush Saga, because that launches today. And to make development easier, we're also releasing a Unity plugin and Cocos integration.
Now on to our second theme: making app development easier. Android's APIs could be easier; one person said, "On Android, there are six ways to do everything." Last year we launched Architecture Components. It was a test bed for new ideas, starting in top areas that you flagged, like lifecycles and data. Today, so many top apps use them in production, and more than half of you have said you already use them or plan to in the next year. Today we're announcing Android Jetpack, the next generation of Android APIs to accelerate Android development. Jetpack is a set of libraries and tools. We've taken the basic DNA, including the Support Library and Architecture Components, brought them together coherently, and added even more new libraries: WorkManager, Navigation, Paging, Slices, and Kotlin extensions across everything. All libraries are backwards compatible, which means they work on 95% of devices. So, it's been a pain to schedule background tasks; now, with WorkManager, you get a single, easy-to-use API that works nearly everywhere. And Jetpack is all about concise APIs. Those of you who've tried it say you're writing up to a third less code. Jetpack and Kotlin are intentionally designed to work together, so you write only the code you need, for a pleasant reading and writing experience. And Jetpack saves time by embodying opinions about what we've found works best for Android development, like RxJava or Material Design. Jetpack's APIs are integrated with the tools as well, and Android Studio now includes a navigation editor which works with the library, so you can visualize your app flow almost like you're sketching on a whiteboard. You can add new screens, position them in your flow, and under the covers we'll help you manage the back stack and transitions until you get it just right. Overall, we think IDE tools are great helpers to make development fast. That's why everything you've seen, from bundles to Jetpack, comes with Studio support. And the team has also worked on making everyday tasks faster. You told us to work on emulator boot time, and I'd like you to see it now. Let me show you how quickly our emulator can start now. Ready, set, go. I was not cheating.
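Snapshot boots are also scriptable from the command line. The sketch below builds (but does not run) an emulator launch command; the AVD and snapshot names are placeholders, and it assumes the emulator's documented `-avd` and `-snapshot` startup flags.

```python
# Build the command line for starting the Android emulator from a named
# snapshot instead of a cold boot. Names are placeholders for illustration.
def emulator_command(avd, snapshot=None):
    """Return the argument list for an emulator launch."""
    cmd = ["emulator", "-avd", avd]
    if snapshot:
        # Resume from this saved snapshot rather than booting from scratch.
        cmd += ["-snapshot", snapshot]
    return cmd

print(" ".join(emulator_command("Pixel_API_28", "gl-stress-test")))
# emulator -avd Pixel_API_28 -snapshot gl-stress-test
```

In practice you would hand this list to something like `subprocess.Popen` from a build or test script.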
It was not running in the background. The reason it's so fast is that we support snapshots: we store the full emulator state into a file that we can then load back quickly, and you can create these snapshots yourself. So now I'm going to bring back a snapshot I took in the middle of a complex OpenGL 3D stress test. Ready, set, go. It takes about two seconds, and it's up and running. There are more speed enhancements: we added an energy profiler, integrated and improved system tracing, and there's a C++ profiler now. We promoted the D8 compiler to stable after testing it on our own Android platform, which means smaller, faster binaries by default, and we added an ADB Connection Assistant to make fixing device connections faster and easier. We hope you try the Android Studio and Jetpack previews today, including all the alpha-stage new libraries. Jetpack is just a beginning for us; we are testing so many more new ideas, and we hope you'll watch for them in the months ahead.
Okay, so once your app is built and installed, we want to get users coming back. The Slices Dave showed are a cool new way to drive re-engagement. We want these to be easy to build, so you'll find they are rich and flexible, and you can compose them. You can start with something simple, like a set of rows or grids. Then you can add content like text and images, but not just static content; you can also add real-time data and rich controls. And once you get the pieces together in a good setup, you can add the code to make Slices interactive, like pausing a song or going to the next one. So then you end up with this cool mini snippet of your app, and because it's Jetpack, Slices work on 95% of devices, showing the power of building new features in a Jetpack world. This is also ready for everyone here to try today. Your Slices will start showing up in Search this summer, starting with the P launch, and in the Assistant later this year.
So that's a quick tour of some of the biggest new areas of investment for Android and how your feedback has been shaping the landscape. Now let's switch gears and talk about devices. At CES this year, we announced that Lenovo, Harman, LG, and iHome are all building consumer products powered by Android Things, a powerful platform for developing IoT products. This week, Android Things graduates from developer preview to 1.0, ready for everyone to build commercial devices, and everyone here at I/O will get a free Android Things developer kit. If you want to get it first, join the scavenger hunt today; otherwise, everyone here can pick one up tomorrow. As we continue to expand to more devices and more surfaces, our Android and Assistant teams have been working closely together. To hear more, please welcome Brad Abrams.
Awesome, thanks Steph. This morning you heard how the Assistant is becoming even more conversational and visual, helping users get things done, save time, and be more present, and developers like you have been a big part of that story, making the Assistant more useful across 500 million devices. DoorDash, Fitbit, Todoist, Starbucks, Disney, and many, many others are engaging with users through actions that they built, and smart home OEMs like GE, Arlo, Xiaomi, and Logitech have made more than 5,000 devices that work with the Assistant. In total, the Google Assistant is ready to help with over 1 million actions built by Google and all of you. And the platform's momentum has been growing every day. It's now available across 16 languages and multiple devices, including phones, smart speakers, TVs, cars, watches, headphones, and whole new categories of devices like smart displays, which are coming in July. And we're delighted to see that many of you are starting to test the waters with this new emerging era of conversational computing. In fact, over half a million developers are learning and building with Dialogflow, our conversation development tool.
Today I'm excited to share how we're making it even easier for app developers and web developers to get started with the Google Assistant. You can think of your Assistant action as a companion experience to your main property that users can access from a smart speaker or even in the car, wherever they're not using their phone or laptop. And if you want to personalize that action for users, account linking lets you easily share state between your app and your action. And now, with seamless digital subscriptions, your users can enjoy the content and digital goods they purchased in your Android app directly in your Assistant action. Take The Economist, for example: because I'm a premium subscriber in their app, I can now enjoy that same premium content on any Assistant-enabled device. But of course, creating engaging actions doesn't end with digital subscriptions. As you saw in the demos this morning, the Assistant is becoming a canvas that blends conversation with rich visual interactions for the phone and other devices like smart displays. And starting today, you can deeply customize the appearance of your action. You saw a glimpse of what was possible earlier today with demos from Tasty and Starbucks, but let me show you another one. Check out the game King for a Day, here on a smart display. It looks beautiful here on the phone, and on the TV.
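To give a feel for what such a customized, visual response looks like on the wire, here is a hedged Python sketch that assembles a webhook payload in the shape of the Actions on Google v2 JSON responses, pairing spoken text with a basic card. The title and image URL are placeholders, and the field set is simplified from the real API.

```python
# Simplified sketch of an Actions-on-Google-style rich response that pairs a
# spoken prompt with a visual card for phones and smart displays.
import json

def rich_response(speech, card_title, image_url):
    """Assemble a simplified rich-response payload (illustrative only)."""
    return {
        "expectUserResponse": True,
        "richResponse": {
            "items": [
                {"simpleResponse": {"textToSpeech": speech}},
                {"basicCard": {
                    "title": card_title,
                    "image": {
                        "url": image_url,
                        "accessibilityText": card_title,
                    },
                }},
            ]
        },
    }

payload = rich_response("Welcome back to King for a Day!",
                        "King for a Day",
                        "https://example.com/king-banner.png")
print(json.dumps(payload, indent=2))
```

A real action would return this JSON from its fulfillment webhook; theming options like colors and typography are layered on top in the Actions console.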
Of course, once you build an action for the Assistant, you want to get lots of people engaging with that experience, and for that I've got three things to share. First, we're making it even easier for you to promote your action with something new we call Action Links. These are hyperlinks that you can use from anywhere that point directly into your action. Let me give you an example. Headspace has built a great experience for the Google Assistant that can help people meditate. Now say they have some new content; they might share a blog post about it. This post contains an Action Link right there at the bottom, and that special link triggers the Headspace action directly in the Google Assistant. If you've already built an action and you want to spread the word, starting today you can visit the Actions console to find your shareable Action Link.
Okay, so now that you've acquired some new users, you want to engage them, and for this we've got action notifications. When users opt in, action notifications give you a way to connect with them about new features and content. These notifications work on the phone even if users don't have your Android app installed, and now, with cross-surface notifications coming to the Assistant, you'll be able to re-engage with your users on speakers, smart displays, and other Assistant-enabled devices. But to consistently re-engage with users, you need to become part of their daily habits, and for that, the Assistant supports routines.
This is the ability to execute multiple actions with a single command, for things like waking up in the morning, getting to work, or many other daily tasks. And now, with routine suggestions, after somebody engages with your action, you can prompt them to add your action to their routines with just a couple of taps. So, for example, on my way to work each morning, my Assistant can tell me how to beat traffic, and it can also help me order my Americano from Starbucks.
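Conceptually, a routine is one trigger phrase fanning out to an ordered list of actions, and a routine suggestion just appends your action to that list. This toy Python sketch (routine and action names are invented) shows the shape of the idea:

```python
# Toy model of Assistant routines: one voice command runs several actions in
# order, and accepting a "routine suggestion" adds an action to the list.
routines = {
    "commute": ["get_traffic", "play_news"],
}

def add_to_routine(name, action):
    """Accepting a routine suggestion appends the action to that routine."""
    routines.setdefault(name, []).append(action)

def run_routine(name):
    """Return the actions a single command would execute, in order."""
    return list(routines.get(name, []))

add_to_routine("commute", "order_americano")  # a couple of taps later...
print(run_routine("commute"))
# ['get_traffic', 'play_news', 'order_americano']
```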
We're excited to see how Action Links, action notifications, and routine suggestions will help you drive engagement, but the broader challenge of helping people connect with the right action is reminiscent of the early days of the web. Over the past 20 years, we've built up a lot of experience in connecting people with the right information, services, and content, and we're putting that expertise to work in the Google Assistant. For example, when somebody says, "Hey Google, let's start a math quiz," the Assistant should immediately suggest relevant games. For that to happen, we need to understand the user's basic intent, and that's hard, because there are thousands of ways that users can ask to play a game. To handle this complexity, we're beginning to map all the ways that users can ask for things into a taxonomy of built-in intents. Today we're making the first set of those intents available, and with these you'll be able to give the Assistant a deeper understanding of what your action can do. We'll be rolling out hundreds of built-in intents in the coming months. So with that, I'm excited to see how you all extend your experiences to the Google Assistant, how you build rich, immersive interactions and create consistent engagement for your users. To learn more and get started building actions today, visit actions.google.com. And with that, let me introduce Tal to tell you about the web platform.
Thanks, Brad. I don't think it's an exaggeration to say that the web is the world's most critical resource for ensuring the free flow of information.
The web is a fundamentally open and independent platform. For developers, the web makes it possible to reach users around the world on almost any device, and for users, the web provides a truly frictionless experience: you tap on a link and load a page. These properties have allowed the web to reach a massive scale, with over 5 billion devices accessing the web each month. Here at Google, from the very earliest days of PageRank to building our very own browser, we've been deeply invested in the continued growth and reach of the web, and as part of this we have two main goals: first, to make the web platform itself more powerful and more capable, and second, to build tools to help you easily take advantage of this power.
Over the past few years on Chrome, we've worked alongside other browsers to add capabilities to the platform to support new web experiences we've been calling Progressive Web Apps, or PWAs. PWAs are websites that take advantage of modern web platform APIs to build experiences that can do things like work well for users offline, send push notifications, or be added directly to a user's home screen. And universally, businesses that have built PWAs have seen incredible results. Take Instagram: Instagram launched their PWA last year to increase their reach to users with low-end devices, and they were able to double the retention of their web users. Times Internet has been launching PWAs across their products and saw an 87% increase in time spent per user for their Economic Times PWA. And when the Starbucks team rolled out their PWA, they doubled their daily and monthly active users on their website. And because the web adapts seamlessly to different devices and platforms, their mobile PWA also works well for their desktop audience; in fact, they found that the number of orders placed on the desktop version has grown to be about equal to the number placed on the mobile version. And it's not just these businesses: across sites that advertise with Google, we see an average mobile conversion-rate boost of 20% when a site switches to a PWA. We also build many of our own products here at Google as PWAs, and PWAs now work across browsers, including, recently, Edge on Windows and Safari on both desktop and mobile.
This is probably the most important leap forward for the web in the last decade. So PWAs have fundamentally changed what the web can do, but that's only part of it. WebAssembly enables websites to run high-performance, low-level code written in languages like C and C++, and it has broad support across browsers and devices. And because this code has access to all of the web's APIs, WebAssembly enables a new class of content to run on the web platform. As just one example, the AutoCAD team took a 35-year-old codebase, older than the web itself, and was able to compile it to run directly inside a browser using WebAssembly. So now all of the power of AutoCAD is just a link away.
So the web platform has been gaining all these great new capabilities, but we want to make sure it's easy for you to take advantage of them, so we've been working on the tools to help. Lighthouse is a feature of Chrome's built-in DevTools that analyzes your site and gives you clear guidance on how you can improve your users' experience. Half a million developers are running Lighthouse against their sites, some as part of their continuous-integration process to catch performance regressions, or even to keep an eye on the competition. And today we're launching Lighthouse 3.0, which makes Lighthouse's performance metrics even more precise and its guidance even more actionable.
So Lighthouse helps you understand how you can upgrade your site, but we don't want to just give you advice; we want to give you the tools to help make sure any new sites you build are high quality by default. We started the AMP project two years ago to help make building fast web pages much easier, and I'm happy to share that AMP is evolving in some big ways. We're expanding the kinds of things you can do with AMP: we've added a bunch of features that support critical e-commerce experiences, like search autocomplete, a full-featured date picker, and, soon, infinite-scrolling lists. And businesses are seeing the benefits; as just one example, Overstock saw a 36% increase in revenue on their AMP pages. We've also introduced AMP Stories, an easy-to-use format for creating immersive stories on the mobile web. Now, all AMP content benefits from a fast, free, privacy-preserving cache that optimizes page loads, but those pages have had google.com URLs. So we're fixing that with a new standard called Web Packaging. This is also the first step toward our ultimate goal: for any fast, responsive web content to be able to take advantage of all of the benefits of AMP.
And today we're announcing a new way to take advantage of all of these tools. We introduced Chrome OS almost seven years ago to showcase the best of the web and make computing accessible to all. Chromebooks grew 50% this past year in units sold and 28-day active users, and we've expanded to tablets and detachables. But it's not just about access to technology; it's also about access to creating it, and that's why we're expanding Chrome OS to support developers, with the ability to securely run Linux apps on Chrome OS. This means that many of your favorite tools, editors, and IDEs now work on Chromebooks. So, starting with the Dev channel on Pixelbooks, you can now build for the web on a platform built around the web, and soon you'll even be able to run Android Studio on Chrome OS.
So all in all, it's an incredibly exciting time to be a web developer. Businesses around the world are consistently seeing substantial returns from deeply investing in their web experiences by building Progressive Web Apps, and the reach of these PWAs is now truly everywhere, with support across every major browser. And it's easier than ever to build great web experiences, with tools like Lighthouse and AMP and still many more of our web developer products that you'll hear about throughout I/O. The web platform is light-years ahead of where it was in Google's early days, and it remains as critical as ever to our mission. We're excited about the host of new use cases made possible by today's modern web, so be sure to check out the web State of the Union talk later today to learn more. And now, please join me in welcoming Rich to talk about updates to Material Design.
Thanks, Tal. I'm going to talk about Material Design's new approach to customizing your apps and share some of the new tools and resources we've created. We introduced Material Design in 2014 as a system for creating bold, beautiful, and consistent digital products across platforms. We were responding to the desire that we heard from you and developers around the world for clear design guidance, for advice that would make the experiences you create better for your users. Material has become the design foundation for all Google products, and you've taken it and launched it into millions of apps. But we've heard two sentiments especially clearly after hundreds of design reviews with product teams and developers: first, that you did not see Material as flexible enough, and products from different brands looked too similar; and second, that our engineering support for Material needed to be stronger, so you could more easily realize your design vision. Our goal has always been to provide more than just a design blueprint. We are committed to delivering resources, tools, and engineering components to make it easier for product teams to work together seamlessly from design to development, so that you can deliver customized experiences for your users. That's why we're proud to announce Material Theming, a major update to the Material Design system. With Material Theming, we're delivering a truly unified and adaptable design system. It enables you to express your brand's unique identity using color, type, and shape, consistently across your products. Material Theming retains the consistent guidance from Material and expands on it. Let's take a look.
What if it were possible to design and build better, so your ideas could scale from a single sheet to a system? With Material, you can apply color dynamics so that every touch enhances the character of your design. Typography is adaptable, so you can express more by customizing for style and legibility. Craft an experience that's responsive across device and platform. Streamline collaboration with tools and open-source code, making it easy to express your brand's unique identity, balancing form and function down to every last detail. Material Design isn't a single style; it's an adaptable design system, inspired by paper and ink, and engineered so you can build beautiful, usable products.
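The "change one value and it cascades" idea behind Material Theming can be pictured as design tokens that components reference by name. The following is a toy Python model of that mechanism only; the token and component names are invented, not the real Material APIs.

```python
# Toy model of theme tokens cascading through components: a component refers
# to tokens by name, and swapping one theme value restyles everything that
# references it. All names here are invented for illustration.
BASE_THEME = {
    "color.primary": "#6200EE",
    "shape.corner": 4,
    "type.body": "Roboto",
}

def themed(component_spec, theme):
    """Resolve token references in a component spec against a theme."""
    return {k: theme.get(v, v) for k, v in component_spec.items()}

button = {
    "background": "color.primary",
    "corner_radius": "shape.corner",
    "font": "type.body",
}

brand = {**BASE_THEME, "color.primary": "#0B8043"}  # change one value...
print(themed(button, brand)["background"])  # ...and it cascades: #0B8043
```

The real Material Theme Editor does the equivalent across an entire Sketch file, and the open-source components consume the same decisions in code.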
What will you make? Material theming puts you in charge when you make just a few decisions about Aztecs like color or type for example will apply those throughout your design. You'll see options for customization across our design guidelines. And we've added hundreds of new examples all still material but reflecting a much wider range of products and styles when you see something you like you can use our handy Redline viewer to check the dimensions padding and even the hex color values. Today, we're also releasing two new tools
to make it faster to go from design to implementation first material theme editor this plugin for the popular application sketch helps designers create and customize a unique material theme change one thing value and a Cascades across the design. Then that work can be shared easily using material Gallery. This is the tool used by product teams at Google to review and comment on design iterations. And it's now available to everyone when it's time to build you can see automatic red lines. You can identify exactly
which component is being used and how it's been customized. But we need to make all of this easier not just to design but to realizing code so are open source material design component for Android iOS the web and flutter all with material theming are available today. These are the same components. We use a Google to build RX and just as Google creates a custom material theme that best reflects our brand and product starting with the new Gmail for the web. You can use the same components to create an experience and a theme that is right for you
more material components and material theming customizations will be coming soon. So keep keep an eye out. We'll be launching regularly and we'll keep listening. So please keep the feedback coming material theming a new approach to customize your apps powered by open-source components new tools and design guidance all available today to make material yours get started at Material. IO and now hear jaw to talk about progress in AI. Thanks, Rich. Hi, everyone at Google We Believe developers are driving the real-world impact away. I of course.
Of course, we understand how complicated machine learning can be in any environment. As developers, you often target multiple platforms and devices, and you also have to juggle issues like processing power, bandwidth, latency, and battery life. That's why we're committed to making AI easier to use, no matter what platform you're on. This begins in the cloud, which makes machine learning accessible to customers in every industry. We believe success in AI should be determined by your imagination, not your infrastructure. We invest in the best technology to power Google's products, and Google Cloud is making these resources available to you. For example, our Cloud TPUs are widely available in public beta, making machine learning more affordable than ever before. Until recently, training a model to high accuracy on the ImageNet dataset cost thousands of dollars; in our benchmark, our TPUs reach the same level for less than $50. Cloud TPUs are now available to everyone, and getting started is as simple as following this link. And this morning we announced our third-generation TPU, demonstrating our ongoing commitment to AI hardware.

The cloud also lets us share our best technology through a growing range of machine learning APIs. Our latest is Cloud Text-to-Speech, based on DeepMind's WaveNet, the same technology behind the voices in the Google Assistant. It generates natural speech with 32 voices in multiple languages and variants. With just a few lines of code, your application can realistically speak to your users across many platforms and devices. For more sophisticated interaction with your users, Text-to-Speech can be paired with Dialogflow. Dialogflow Enterprise Edition makes it easy to build a conversational agent, even with no prior experience. It understands the nuance and context of what your user says and generates accurate responses, whether they're shopping for clothes or scheduling a bike repair.

Finally, Cloud AI is always exploring new ways to put more power in the hands of developers. This effort continues with Cloud AutoML, which automates the creation of machine learning models. Our first widely available release will be AutoML Vision, which makes it possible to recognize images unique to your application without writing any code: you provide the training examples, and AutoML does the rest. Our AutoML users are already accomplishing a lot, from identifying poisonous spiders to helping the blind better understand images. Vision is just the beginning; we're looking forward to bringing AutoML to more machine learning tasks very soon.
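As a rough illustration of the Cloud Text-to-Speech API mentioned above, the sketch below builds the JSON request body for the `v1 text:synthesize` REST endpoint. The field names follow the public REST documentation; the voice name, endpoint call, and credential handling are omitted or illustrative, so treat this as a sketch rather than a complete client:

```python
import json

def build_tts_request(text, language_code="en-US", voice_name="en-US-Wavenet-A"):
    """Build the JSON body for Cloud Text-to-Speech's v1 text:synthesize
    endpoint. Field names follow the public REST docs; the voice name
    here is just an example."""
    return {
        "input": {"text": text},
        "voice": {"languageCode": language_code, "name": voice_name},
        "audioConfig": {"audioEncoding": "MP3"},
    }

# Serialize the body; in a real app this would be POSTed with an auth token.
body = json.dumps(build_tts_request("Hello from the keynote"))
```

The same body shape works for any of the available voices; only the `voice` block changes.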
To learn more about our Cloud AI products, including AutoML early adoption, please visit our site. But Cloud AI technology isn't the only thing that we're pushing in new directions. TensorFlow has become a standard in machine learning since we open-sourced it over two years ago, with 13 million downloads. Now we're focusing on bringing TensorFlow to new platforms. We recently announced TensorFlow.js, which brings machine learning to millions of web developers, and our AIY Projects vision and voice kits bring it to makers as well. And as you heard announced earlier this morning, we're releasing ML Kit in beta, an SDK that brings Google's machine learning capabilities to mobile developers through Firebase. The same technology that has powered Google's own experiences, like text recognition in Google Translate and Smart Reply in Gmail, will be available to power your own apps. Our vision for AI at Google is turning the latest technology into products that make life better for everyone, but we cannot do this without
creative developers like you. Together, we can bring AI to the world. To talk more about how Google is supporting mobile development, let me introduce Francis Ma. Thank you.

Thank you, Jia. Our mission for Firebase is to help mobile apps succeed by providing a platform that helps you solve key problems across the life cycle of your app, from building your app to improving its quality to growing your business. We've come a long way since we expanded Firebase two years ago, from a realtime database to a broad app development platform, and it's so exciting to see that there are now over 1.2 million apps actively using Firebase every month, including many of the top Android and iOS apps like Pandora, Pinterest, and Flipkart. We appreciate that so many of you are trusting us with your apps, and whether you're a solo developer or working on a large team, we're committed to helping developers at companies of all sizes succeed. One of our goals is to help you take care of the critical but sometimes less glamorous parts of app development, so you can focus more on your users and build cool stuff. As an example, we've worked hard to ensure we're ready to meet your compliance needs with many privacy and security standards, such as the upcoming GDPR.

We are also continuing to expand the platform to further simplify everyday developer challenges. A little over a year ago, the Fabric team joined forces with us to bring the best of our platforms together, and we've made a lot of progress since then. In the last couple of months, we've brought Crashlytics, Fabric's flagship crash reporter, into Firebase, and we've improved it by integrating it with Google Analytics, so you can see what users did in your app that led up to a crash, for much easier diagnosis. With the combination of Crashlytics, Performance Monitoring, and Google Analytics, Firebase is not just a platform to help you build your app's infrastructure, but also to help you better understand and improve your app.

Another major set of advancements we've been making to Firebase over the last year is introducing machine learning to the platform. A few months ago, we took our first step with the release of Firebase Predictions. Predictions applies ML to your analytics data and predicts the future behavior of your users, so you can take proactive actions to optimize your app. For example, you can lower the difficulty of your game for users who are likely to abandon it, or send special offers to users who are likely to spend, and you can also run A/B tests with this setup to see which of these performs best.

So Predictions was the first step that we took to bring the power of Google's ML to work for you, and today we're taking our second step with the release of ML Kit in public beta. ML Kit brings together machine learning technologies from across Google and makes them available to every mobile developer working on Android and iOS. ML Kit provides five out-of-the-box APIs, like image labeling and text recognition, and these APIs can run on-device for low-latency processing, or in the cloud for higher levels of accuracy. You can also bring your own custom TensorFlow Lite model if you have more specific needs.

So let's talk through an example of how ML Kit can be used in an app. Now, I'm a mom of two young kids who are always so curious about the world around them, and I thought it'd be fun to build an app where we can take pictures and identify objects in them. By using ML Kit's on-device image labeling API, it's easy for me to apply ML to identify different objects, like a dog, a tree, or a bridge. Now, as many of you are aware, one of the key challenges of building for mobile is having a low-latency response and assurance that it can work even without a network connection, and by using ML Kit's on-device API, I can be assured that image labeling will work even if I'm out on a remote hike with no network connectivity. So on-device APIs are great for low-latency processing, but there are other times when I want to optimize for the highest accuracy possible, for example, when identifying landmarks. I want my kids to know this is the Golden Gate Bridge and not just any bridge, and ML Kit's cloud-based APIs give me that: they have a much higher level of accuracy and can be easily integrated right from my app's code.

Then there are other situations where you may have more specific needs than what the out-of-the-box APIs can cover, and for those times you'd want to bring your own custom ML models. Now, as mobile developers, we know how important it is to keep your app binary small and be able to iterate on the experience rapidly. With ML Kit, you can upload your TensorFlow Lite model, and we serve it through Google's global infrastructure. Your app can dynamically retrieve these models and evaluate them on-device. This means you don't need to bundle the model with your app binary, and you can also update it without republishing your entire app. And since ML Kit is available through Firebase, it's easy for you to take advantage of the broader Firebase platform, for example, experimenting with different ML models using A/B Testing, or storing your image labels with Cloud Firestore, or measuring processing latency with Performance Monitoring, or understanding the impact on user engagement with Google Analytics. We are so excited about the possibilities that machine learning unlocks, whether you want to supercharge your growth or build amazing user experiences. We want to harness Google's advances to help you build and grow your app. And with that, I'd like to turn it over to Nathan Martz to talk about another area of exciting advancements in mobile computing: augmented reality. Thank you.
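The dynamic model-hosting flow described above can be sketched conceptually. This is not the real ML Kit API; the names (`ModelCache`, the `manifest` dict, `fake_download`) are invented for illustration, and the sketch only models the "use the cached model unless a newer hosted version exists" behavior that lets an app update models without republishing:

```python
class ModelCache:
    """Illustrative cache: keeps one (version, bytes) entry per model name."""

    def __init__(self):
        self._models = {}  # name -> (version, model_bytes)

    def get(self, name, manifest, download):
        """Return the cached model unless the hosted manifest advertises
        a newer version, in which case download and cache it first."""
        hosted_version = manifest[name]
        cached = self._models.get(name)
        if cached is None or cached[0] < hosted_version:
            self._models[name] = (hosted_version, download(name, hosted_version))
        return self._models[name]


cache = ModelCache()
manifest = {"labeler": 2}          # stand-in for the hosted version info
downloads = []                     # record which downloads actually happen

def fake_download(name, version):
    downloads.append((name, version))
    return b"tflite-bytes-v%d" % version

cache.get("labeler", manifest, fake_download)  # first call downloads v2
cache.get("labeler", manifest, fake_download)  # cached, no re-download
manifest["labeler"] = 3                        # a newer hosted version appears
cache.get("labeler", manifest, fake_download)  # triggers a fresh download
```

The real SDK handles storage, integrity checks, and Wi-Fi-only download policies; this sketch only captures the version-gated fetch.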
Everybody, I've got to say, it is incredibly exciting to be here today. At Google, we believe that augmented reality represents one of the most exciting advances in mobile computing today. By enabling our devices to see and sense the world much like we do, AR allows us to interact with digital content and information in the context of the real world, which is exactly where it's often the most useful and accessible. And as our phones start to see the world in new ways, they unlock new possibilities for the kinds of experiences that developers can create. That's why, just three months ago, we launched ARCore, our platform for building augmented reality experiences, to allow you to take advantage of this incredible new technology. And we've already seen some amazing apps. For example, you can now create a floor plan just by walking around your home, you can visualize the intricacies of the human nervous system at real-world scale, or, if you like, transform your dining room table into the home of your next virtual pet. And as you've been building, we've been learning and listening to your feedback.

Today, we're rolling out a major update to ARCore to help you create even richer, more immersive, and interactive experiences. First, we know that creating a 3D app can be challenging, especially when you have to write directly to lower-level APIs like OpenGL. That's why we've created Sceneform, a brand-new 3D framework that makes it easy for Java developers to create ARCore applications. And the thing is, Sceneform is not just for people creating apps from scratch. Its design actually makes it really easy to quickly add ARCore features to apps that you've already released. Under the hood, the Sceneform SDK includes an expressive API, powerful physically based rendering, and seamless support for 3D assets inside of Android Studio. Best of all, Sceneform is optimized for mobile, architected from the ground up with performance, memory, and binary size in mind.

We also know that sometimes, as a developer, you want to build an AR app that can react to specific physical objects in the real world. That's why today we're introducing Augmented Images, a new capability in ARCore that makes it possible to attach AR content and experiences to physical images in the real world. Augmented Images doesn't just detect the presence of a picture; it can actually compute the precise 3D orientation of that image in real time, so you can use Augmented Images to do things like bring a textbook to life, create new forms of artwork, or even see what's inside a toy box before you open it.

Finally, though, we know that if we truly want our computers to work more like we do, they need to involve not just the things in our world, but the people in our lives. Fundamentally, our time here is spent with family and friends and colleagues: we collaborate, we create, we share our experiences together. That's why we believe that everything that you can do by yourself in AR today, you should be able to do together with someone else. It makes me very proud to announce the next major step in how our phones see and understand the world, a new capability in ARCore that we call Cloud Anchors. With Cloud Anchors, we actually allow multiple devices to generate (thank you) a shared, synchronized understanding of the world, so that multiple phones can see and interact with the exact same digital content in the same place at the same time. And this allows you as a developer to create applications that are collaborative and creative, multi-user and multiplayer. To show you exactly what I mean, we actually integrated Cloud Anchor support into our experimental open-source AR drawing app, Just a Line. Let's take a look.
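Conceptually, the Cloud Anchors flow works like a shared registry of anchor poses: one device hosts an anchor and receives an id, and other devices resolve that id to place content at the same real-world position. The sketch below is illustrative only; `AnchorService`, `host`, and `resolve` are invented names standing in for the service, not the actual ARCore API:

```python
import uuid

class AnchorService:
    """Toy stand-in for the cloud service that stores hosted anchors."""

    def __init__(self):
        self._anchors = {}  # anchor_id -> pose

    def host(self, pose):
        """Device A hosts an anchor at a pose; the service returns an id
        that can be shared with other users (e.g. via your own backend)."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = pose
        return anchor_id

    def resolve(self, anchor_id):
        """Device B resolves the shared id back to the same pose, so both
        devices render content in the same place."""
        return self._anchors[anchor_id]


service = AnchorService()
anchor_id = service.host(pose=(1.0, 0.0, -2.0))  # device A hosts
shared_pose = service.resolve(anchor_id)          # device B resolves
assert shared_pose == (1.0, 0.0, -2.0)
```

In the real system the "pose" is reconstructed from visual feature data on each device rather than passed around directly; the sketch only shows the host/share-id/resolve shape of the flow.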
Yeah, thank you. We know that Cloud Anchors will enable many new kinds of AR experiences, experiences that combine the power of your device, your creativity, and the people around you. But because these experiences are so powerful, we believe that they should work regardless of the kind of phone you own. That's why I'm very excited to say that we're making ARCore's Cloud Anchors available on both Android and iOS devices. Pretty awesome. Everything that I've just talked about today, from Sceneform to Augmented Images to Cloud Anchors and more, is available for you to use right now. You can learn more and start creating your own ARCore-powered apps by going to developers.google.com/ar. And with that, I'm going to hand it back to Jason to wrap things up. Thank you all very much for your time.

Thanks, Nathan. Hopefully you can see from our talks today that we're committed to meeting you where you are, no matter what platform you're building on, and to helping you take advantage of the latest innovations, from machine learning to augmented reality. And much of what you heard today is built on top of our cloud technologies. This includes powerful APIs around machine learning, smarter data analytics with BigQuery, and a focus on low latency and high reliability. And whenever possible, we try to put things out as open source, like Kubernetes and TensorFlow, so they can be broadly adopted and openly evolved across the industry. With our cloud technology, we're taking the lessons we've learned building large-scale software and bringing them to you.

So I/O this year is bigger than ever. We've got some great content lined up for the next few days, and you'll see we're doing something new this year: inspiration talks. They take a broader look at how the technology that you build affects the world around us, so we cover topics like building for a better tech-life balance, the future of computing, and using AI to transform healthcare. We all have a responsibility for what we build, as well as to the people that we build for, so I hope you have a chance to check these out. And one of my favorite parts of having I/O so close to campus is that we're able to bring in more than 2,000 Googlers who built these products. They're here not only to help you, but more importantly, to get your feedback on what we can do better. My hope for this conference is that each of you walks away feeling that you can do something that you couldn't do before, and is able to use that knowledge to build things that matter. Thank you very much, and have a great Google I/O.