Duration 44:22

Integrating your Android apps with the Google Assistant

Krishna Kumar
Product Manager at Google
2018 Google I/O
May 9, 2018, Mountain View, USA

About speakers

Krishna Kumar
Product Manager at Google
Mariya Nagorna
Senior Technical Program Manager at Google

Krishna is a product manager on Actions on Google, responsible for expanding the Actions platform to new surfaces and platforms. Previously, Krishna was the product lead for the original Google Pixel smartphone. Prior to Google, Krishna held a variety of product and engineering leadership positions at Qualcomm, Mozilla, and Nokia. Krishna has an MBA from UCLA, an MS in computer engineering from Clemson University, and a Bachelor's in EE from Birla Institute of Technology & Science.


Mariya leads programs that enable developers to build apps for the Google Assistant. Previously, she worked on software and hardware programs in various domains, including Google's satellite division Terra Bella, Google [x]'s balloon-powered internet program Project Loon, and some of Google's core products like Google My Business and Gmail. Mariya holds a Master's degree in Electrical Engineering from Princeton University and two Bachelor's degrees from Rutgers University, in ECE and CS.


About the talk

Android devs who've implemented App Actions can already benefit from the Google Assistant's rapidly growing reach. In this session, you'll learn how to reach new users and serve existing ones better by integrating your Android app with the Google Assistant at an even deeper level: from adding conversational functionality to your projects, to defining custom grammars and participating in vertical programs.


My name is Mariya and I'm a technical program manager here at Google, and with me is Krishna, who does product management. We both work on the developer platform called Actions on Google, and in this session we'll discuss how you can create actions for your existing Android app, and then how you can leverage those actions to build deeper integrations with the Google Assistant so that you can reach more users on more devices. Great, everybody's here bright and energetic in the morning. All right, let's have a quick show of hands first: how many of you attended the App Actions session on Tuesday? All right. So today we're going to talk a little bit about how you can enable actions for your Android apps and how you can make them surface across various touch points in the Android platform. If that sounds interesting, we had a deep-dive session two days ago with a lot of code samples and technical details, so I definitely encourage you to take a look at the code samples in that video, and we have office hours immediately after this session where you can come and ask us questions. So we'll have a quick recap on App Actions and Android apps, and then

we're going to talk a little bit about how you can use the same fulfillment mechanism that you use for App Actions to easily create conversational actions for your services. Okay, so that's the agenda for today, but let's start with why. Android devices are our portal to the world. We do everything with them: we watch videos, we consume content, we navigate the world. There are two primary critical user needs for how people use devices. One is to actually consume content; that includes everything from listening to music to watching videos to getting your best recipes, etcetera. The other is using devices to get things done. The world has changed, right? You no longer run back to your desktop to find maps, or to book a ticket, or to purchase something online. The device is a single portal to the world. Now, there are something like 2 million Android apps out there, and probably several hundred million websites. That's awesome; that gives users enormous choice and possibility, but it

can also be a little bit overwhelming. Just try to immediately think of which app you'd use to find the best fish taco in San Diego; there are just a lot of apps out there, and that causes problems of discovery and re-engagement. Android Authority reported in March 2016 that 77% of users don't use an app within three days of installation. That's crazy. And that number goes up to 90% within a month. So this is a very crowded space; it's hard to get that re-engagement, and you're developing all these cool new features, but even though the user might have installed your app, they just don't know what's going on. They don't know the new capabilities of the app. So we provided a sneak peek of what we call App Actions. We mentioned this in the consumer keynote a couple of days ago. App Actions are a way for you as Android developers to surface the capabilities of your app, so that Google can surface your capabilities, your actions, and your content across many different touch points in the Android platform. The Android platform has evolved to an AI-first world: we're moving from just

predicting the next app to predicting the next action that the user might take, and surfacing the capabilities of your app at just the right moment, based either on the user's context, the user's routine, or things like the user's query. This gives you an instant increase in both reach and re-engagement. But how does this all work? How does it show up in the Android platform? Let's start with the Google Assistant. You can easily give your Android app a basic integration, a shortcut, into the Google Assistant for a given query. So in this particular scenario, the user has said something like "manage my budget," and that immediately brings up the Mint app, straight to the budget page. How is this done? It's because the Mint app in this example has registered for an intent for budget management in its actions.xml file; we'll go into a lot more detail on how exactly this works. The same applies for content-based queries: say I've asked "what is Lady Gaga's real name," and we try to understand and predict the next action the user will take. What

do if I find a searching for Lady Gaga, I think it's a natural conclusion that I might want to listen to the latest album or watch a video or buy tickets for a concert. So we try to predict the next action and show that esthetician chips down the down the bottom any of those search and chips immediately opens the appropriate Android app straight to the Lady Gaga page. Just think about what a good like vomiting through your app dollar to find out the app when you're not typing Lady Gaga there again, and then you're not trying to find out what it

all of this happened to magically so that it just freezes up across the Android platform. Beatles exploring how we can actually start face apps that you use a lot directly in Google search. So I use Fandango to book tickets. If I type of movie like Avengers shows me videos projection chips for Fandango right of the bottom. So clicking on any of those suggestion chips will basically take it directly into find angle. Where can I purchase the ticket? Android walls last year in Android o be brought up this concept

of predicting the next app that you might use; that's the top row you can see up there. This actually had a 60% prediction accuracy rate; it was wildly successful. Now what we're doing is moving from predicting the next app to predicting the next action that you might take. In the blue highlight there, you see two suggestions, and these are two suggested actions that show up on my phone at around 5 to 5:30 in the evening, based on the frequency with which I do various tasks and actions. So here's my routine at 5: I call my spouse, and then I argue with her about who has to pick up the kids. I lose that argument, and then I navigate to my kids' school to pick up the kids. So those two actions are what's showing up as suggestions. Now, suggestions are contextual and routine-based. For example, if you live in San Francisco and you use a transit app at the transit station, then when you approach the station it knows that particular context, and it'll automatically show a suggestion for the transit

app. There are many types of predictions we can make based on different inputs of context and routine. We're also working on combining App Actions with things like Smart Text Selection. In an email or in the browser, if I select a piece of text, Smart Text Selection uses machine learning to predict the full entity. So if I tap on one word in this particular example, it expands the selection to the full name of a restaurant and shows me the action for reserving a table using OpenTable. Now, when I tap on OpenTable, it immediately takes me to that restaurant's page in the OpenTable app, where I can book the table immediately. So these are all the ways in which actions surface across many different places for the apps you have installed on your device, which leads to more re-engagement, because we understand the capabilities you surface for your apps, and we'll continue to expand on that. But what about apps that are not installed? App Actions also show up on the Play Store. On the Play Store

page, if I type a query like "Lady Gaga" or "flight tickets," it shows me various actions, both for apps that are installed on my device and for apps that are not installed. This allows for great discovery, because we actually understand the capabilities of the apps that can act upon the query you've provided. In this case, for "Lady Gaga," you get a bunch of suggestions for apps that can do everything from playing music to showing lyrics. And if that particular show-lyrics app is not installed, then when I tap on it, it will give me the option to install the app and then take me to the deep link for Lady Gaga. Now, here's what I think is very interesting: this is part of the larger Actions on Google framework and development mechanism, which includes intents, which are how you actually express the capabilities of your app. That same intent mechanism can be used for App Actions, conversational actions, vertical programs, and so on and so forth. It enables you to create actions

across multiple platforms, operating systems, and surfaces, and we'll go a lot more into that. Thanks, Krishna. Those were some really cool examples of how the actions of your Android app can be featured as suggestions across Android surfaces. Now, let's take a look at some of the steps for creating App Actions to achieve just that. First, you'll be using tools you're all familiar with for building and publishing Android apps, namely Android Studio and the Play developer console. You'll create an actions.xml file, and this is the central place for all of the actions that your Android app can support. There are two key pieces of information that you'll need to provide for each action: the intent and the fulfillment, which describe the what and the how of your action. So let's dig deeper into some of these concepts, and we'll start with built-in intents, which are how you indicate what your action does. My team at Google has built and published a catalog of built-in intents, and as Krishna mentioned, this is one of the core foundational elements of App Actions. If you think about the way users ask for information, there's a myriad of linguistic variations that they can

use to construct their query. For example, they can say "calming activities," or they can say "breathing exercises to relax before my presentation," or they can ask for "10-minute meditation techniques." All of these queries indicate that the user would like to de-stress, or they'd like to calm down. So we have designed built-in intents in such a way that they abstract away all of the natural linguistic variation, and we pass on only the relevant information from the user's query to your application, using parameters. So to give Google a deep understanding of what your Android app can do, what you need to do is register for the built-in intents that are relevant to your app, and this will help Google show the right actions to users at the right time. Here are some of the intents that we're working on now; the ones that have a star next to them are available for developers to try out today in preview, and throughout the remainder of the year we'll be continuously extending this catalog to cover as many of your use cases as we can. If you have an Android app where we're not covering the use case currently, please do give us feedback; we take it very seriously, and we'll show you how to do that at the end of this talk.
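To make that abstraction concrete, here is a toy sketch in Python of the idea that many phrasings collapse to a single intent plus typed parameters. The intent name, the patterns, and the matcher are entirely hypothetical illustrations, not any real Google API:

```python
# Toy illustration: many phrasings collapse to one built-in intent
# plus typed parameters. Names and patterns are hypothetical.
import re

# Hypothetical catalog entry: phrasings that signal the same user need.
CALM_PATTERNS = [
    r"calming activities",
    r"breathing exercises to relax.*",
    r"(\d+)-minute meditation techniques?",
]

def match_builtin_intent(query: str):
    """Return (intent_name, params) if the query maps to our toy intent."""
    for pattern in CALM_PATTERNS:
        m = re.fullmatch(pattern, query.lower())
        if m:
            # Pass only the typed parameter on; the phrasing itself is gone.
            params = {"duration_minutes": int(m.group(1))} if m.groups() else {}
            return "actions.intent.CALM_DOWN", params  # hypothetical name
    return None, {}

print(match_builtin_intent("10-minute meditation techniques"))
# → ('actions.intent.CALM_DOWN', {'duration_minutes': 10})
```

All three example phrasings from the talk resolve to the same intent; only the relevant parameter (here, a duration) reaches the app.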

Now let's talk about fulfillment, which describes how a user would invoke a specific action within your app. When users express an action they'd like to accomplish, a task or an intent they have, you can help them fulfill it by specifying deep links into your Android app. In your actions.xml, you can define the mapping between the built-in intents and the deep link URLs, and this will enable Google to show the right content from your Android app to your users, and deep-link them directly into the experience they seek at that specific moment. So we have two models for fulfillment. In the first model, which is the URL template model, we construct the deep link URL based on the user's query parameters, and your actions.xml tells us how to map the parameters from the built-in intent to the URL parameters. This model is ideal for actions and apps with deep link APIs. In the second, the content-driven model, we discover the fulfillment URL through your web content or the structured data that you give us, and then, based on the user's query, we find the relevant content and use your actions.xml to connect that content to the appropriate intents.
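As a rough illustration of the URL template model, the sketch below substitutes intent parameters into a deep-link template. The `{name}` placeholder syntax and the URL are hypothetical, not the exact actions.xml template grammar:

```python
# Sketch of the URL-template fulfillment model: intent parameters from
# the user's query are substituted into a deep-link URL template.
# The {name} placeholder syntax here is illustrative only.
from urllib.parse import quote

def expand_url_template(template: str, params: dict) -> str:
    """Replace {name} placeholders with URL-encoded parameter values."""
    url = template
    for name, value in params.items():
        url = url.replace("{" + name + "}", quote(value))
    return url

deep_link = expand_url_template(
    "https://example.com/search?query={course}",  # hypothetical deep link
    {"course": "machine learning"},
)
print(deep_link)  # → https://example.com/search?query=machine%20learning
```

The point is that the developer only declares the mapping; constructing the final deep link from the parsed query happens on Google's side.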

So what's an example of how this would work? We'll use the Coursera Android app, and we'll show you how they registered for the action of taking a course. The reason they want to do this is so that when a user comes to Android wanting information about courses or education, we can ensure that the Coursera app is shown to them as a suggestion, and for this demo we'll also show you how the suggestion shows up on the Google Assistant. So for the built-in intent, we created one called "take course," and it takes a single parameter of the type Course. This is the parameter that, if you were to register for this particular intent, you would need to handle from the user's query. So, for example, the user might say "take a machine learning course with Coursera," and the Assistant will know that "machine learning" is of type Course name. Or if they ask "find data science courses on Coursera," the Assistant will match "data science" to the correct Course type field, in this case "about," because that's what the course is about.

Now, you might wonder how exactly Google knows this particular mapping. In Coursera's case, each course page is annotated with schema.org markup; there's a page called "Machine Learning," and it is associated with a specific URL. This is an example of the content-based fulfillment that we talked about a couple of slides back.
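For illustration, schema.org markup of the kind described might look roughly like this JSON-LD annotation on a course page (a hypothetical example using the real schema.org `Course` type, not Coursera's actual markup):

```json
{
  "@context": "https://schema.org",
  "@type": "Course",
  "name": "Machine Learning",
  "description": "An introductory machine learning course.",
  "provider": {
    "@type": "Organization",
    "name": "Coursera"
  },
  "url": "https://www.coursera.org/learn/machine-learning"
}
```

Structured data like this is what lets the content-driven fulfillment model associate a course name in a query with a concrete page URL.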

and we tie the two together in a really simple way. First, you register for the built-in intent "take course," and then we take the course parameter and map it to the corresponding course web page URL. And that's it; it's that simple. So now let's see how this App Action shows up on my Pixel device in the Assistant. Let's go to the Assistant and ask about machine learning. Great. What we see here is a basic card about machine learning from the Assistant, but now, below in the suggestions, we see the "take course" action that we just created, and we know that it's from Coursera because of the app's icon. Now let's see what happens when I tap on it. Awesome. It takes me to the Coursera app, directly into the machine learning course page that I just asked about, and from here I can do things like enroll, and I can also explore the rest of the app. From a user's perspective, what was really refreshing was how quick and simple that experience was: in just two taps, I was able to enroll in my course right away.
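The actions.xml wiring described above might be sketched roughly as follows. The intent name, element and attribute names, and template syntax are illustrative of the preview described in the talk, not a definitive schema:

```xml
<!-- Hypothetical actions.xml sketch: register a built-in intent and map
     its course parameter to a fulfillment URL. Names are illustrative. -->
<actions>
  <action intentName="actions.intent.TAKE_COURSE">
    <fulfillment urlTemplate="https://www.coursera.org/search{?query}">
      <parameter-mapping
          intentParameter="course.name"
          urlParameter="query" />
    </fulfillment>
  </action>
</actions>
```

The file simply declares the what (the built-in intent) and the how (the deep link the parameters flow into); everything else is handled by the platform.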

And for Coursera it was also fairly simple: just by creating a single actions.xml file, Coursera can now get users to discover and re-engage with their Android app across multiple touch points on an Android device. But there's one caveat here: today, this will only work on Android devices. We know that users are increasingly turning to new device types to accomplish their daily tasks, and in this next part we'll talk about how you can go beyond Android and reach the users who are using these new devices that don't run Android. Thanks, Mariya. Very cool: with just a single file, actions.xml, as an

example, you can get your actions to show up at various touch points across the operating system. That works great for Android devices, but how can you bring the content and capabilities of your app to the other devices that are coming onto the market? You cannot go to an electronics store without seeing smart speakers, smart TVs, smart displays, and smart watches, and many of these are Google Assistant enabled but are not Android devices. How can you, as a service provider, bring your services and your functionality to all these new devices, and why should you even care about that? Google has been doing a bunch of user research on how people interact, get information, and get their tasks done throughout the day, and our research has shown that the ways people interact with devices throughout the day are very different, and the types of tasks they do are also very different. For example, in the morning, when you're cooking breakfast and you're in a rush to get out, you might quickly catch up on the news, check the forecast, and

check your calendar, and then rush out. When you're commuting, you might be checking the news, or listening to music, or navigating, and that's really different from when you're at home in the evening, relaxing, cooking dinner, watching TV with your family, and you want to find the latest buzzy Netflix show. The types of devices you use are different, the types of information you want to get are very different, and the contexts you're in, whether you're relaxed on the couch or you're driving, those contexts are also very different. And increasingly, users are starting to use many different types of devices. When you're cooking breakfast, when your hands are tied up or greasy, you don't necessarily want to pull out your brand spanking new expensive phone. And when you're commuting, you might want to get some information on the latest news, or on how your day looks, but your hands are busy, or at least they should be; you shouldn't be using an Android phone in that context. When you're going for a jog, you might want to check your calories, or you might want to check the steps that you've

done. But again, you may not necessarily be carrying your phone. And when you're in front of the TV and want to find the latest Netflix show, the phone is not necessarily the best fit for the context. Users are increasingly becoming more sophisticated in how they interact with devices, and they expect that their devices will provide the right information to them, as well as help them accomplish their tasks, in the most hassle-free and most natural way possible. They don't want to torture themselves trying to use a device that's not right for a particular context. And the Google Assistant team has been spending a lot of time thinking about these critical user journeys and being there for the user in all of these contexts. The Google Assistant enables you to interact with users in completely new ways, using a combination of voice, rich UI, and touch UI, and this sort of multimodal behavior enables you to create fundamentally new experiences, reaching and engaging your users in different ways in these different contexts. And the Google Assistant is now on 500-million-plus devices, everything from phones to smart speakers to

smart TVs to smart displays and headphones. This means you can reach your user in the right context, in the right place, wherever they are, and thereby increase the breadth of adoption of your services across all of these different contexts. Now, building a conversational response and interaction seems complex, but we have built a cutting-edge technology stack to take away a lot of that complexity. We've built natural language processing, individual voice recognition, machine learning algorithms for inflection, different voice types, etcetera, and using Google's core assets like search, the Knowledge Graph, identity, and payments, we've put together a technology stack so that we can make it easier for you as developers. Can you imagine building all of this by yourself for every single app? It's just not feasible. This is our technology stack, and we invite you to build on top of it and focus on building compelling new experiences for your users. Conversational actions are, again, part of the larger Actions on Google framework. You will use the same intents Mariya

talked about, the same built-in intent mechanism you use for App Actions, for conversational actions as well; it's an element that enables your Android apps and services to work more seamlessly with your conversational experiences. But you're probably thinking now, "Hold on, is Google asking me to rebuild my whole app for voice?" Come on. OK, so I'm not asking you to rebuild your app. I want to bring up this concept of a companion conversational action. A companion conversational action is basically providing a snippet of information, or helping the user complete a specific task, that a user of your service might need in a different context. The best way to think about this is through an example. Let's take the case of a pretty popular soccer app, which has rich information on teams in European leagues, league tables, points, all that good stuff. Fantastic. I use it every day during the World Cup, right? So what happens when I go to a completely different context? I'm driving to work, and I just want to catch up on the latest scores, on what happened yesterday. Keep in mind, an Android app such as this,

which is not part of messaging or media, will not show full screen on Android Auto. So how do you actually get this information to the user and extend your reach to this new context? What they have done is find the piece of their service, such as checking the latest score, that makes the most sense for the user in that completely different context, and they created a simple conversational action which provides the response for "what are the results of the Premier League," or "what happened in the match yesterday," and so on. So they created a rich conversational response, which can be used across all Assistant devices. Let's take another example: Todoist. It is a popular to-do app, a rich to-do app with more than 10 million downloads. It has all sorts of detailed task information and different ways you can manage and plan tasks, etcetera. But what do you do when you're cooking breakfast and you just want to find out what tasks you have due today? Here, Todoist looked at the critical user journeys that their users have, and figured out that "what tasks do I have due today" is a

service that goes across multiple contexts and multiple devices, and they built a conversational action to provide a single, easy response to "what do I have due today?" So what are the tools for building these conversational actions? For App Actions, you're familiar with this: you create an Android APK, you add an actions.xml file using Android Studio, and you publish it using the standard Play console mechanisms for publishing your Android app. That's it. For conversational actions, you again use the same built-in intent mechanism. You build simple conversational actions using Dialogflow, which is provided by Actions on Google, and you use the Actions console to manage and publish your conversational agent and conversational action. Let's see how all of this works together from a high-level architecture perspective. Say I'm creating an app, Krishna's to-do app. It's a task app, so

what I'm going to do is put up a web server, Node.js or Apache or whatever, connected to a database that stores all of my users' to-do tasks. Then I'm going to create an Android app, which basically talks to the web server to get the user's information and bring it back to my app, which renders it based on the UI design of my app. This same concept works for an iOS app, and it also works for a web-based interface to my app. When you then move to conversational agents, what you're going to do is create a conversational agent, which will talk to your web server using the same APIs, the same infrastructure, that you've already built. So what happens with a query like "what do I have due today"? The Assistant will take care of all of the natural language processing and query understanding; it understands what the query is, and then it invokes your agent. Your agent will then do the fulfillment for the query. To do the fulfillment, you're going to use a webhook, which connects to the web server on your back end and sends a response back to the user.
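As a minimal sketch of that webhook step, the handler below builds a Dialogflow-style JSON response. The `fulfillmentText` field follows Dialogflow's v2 webhook response format; the intent name and the back-end lookup are hypothetical stand-ins for a call to your own web server:

```python
# Minimal sketch of a conversational webhook: the agent forwards the
# parsed intent, and we answer using the same back end the app uses.
# "fulfillmentText" follows Dialogflow's v2 webhook response format;
# the intent name and fetch_tasks_due_today are hypothetical stand-ins.
import json

def fetch_tasks_due_today(user_id: str) -> list:
    # Stand-in for the HTTP call your Android app already makes.
    return ["Buy groceries", "Pick up the kids"]

def handle_webhook(request_body: str) -> str:
    request = json.loads(request_body)
    intent = request["queryResult"]["intent"]["displayName"]
    if intent == "tasks.due.today":  # hypothetical intent name
        tasks = fetch_tasks_due_today(user_id="demo-user")
        text = "You have {} tasks due today: {}.".format(
            len(tasks), ", ".join(tasks))
    else:
        text = "Sorry, I can't help with that yet."
    return json.dumps({"fulfillmentText": text})

reply = handle_webhook(json.dumps(
    {"queryResult": {"intent": {"displayName": "tasks.due.today"}}}))
print(reply)
```

The key design point from the talk holds here: the webhook is a thin adapter, and all of the actual task data comes from the same back end the Android app uses.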

The key point here is that you as a developer are using the exact same infrastructure and APIs that you already use for your apps to also provide a conversational response on all Assistant-enabled devices. Now, let's talk a little bit more about this foundational layer, and let me explain a little bit about account linking, which basically lets you create easy, seamless interactions between your Android apps and your conversational agent. Let's go back to our to-do app, Todoist, and think a little bit about account linking. As an Android developer, especially if you have some sort of service back end, you probably have some form of an account management system. Todoist uses various identity mechanisms, including Google, as an identity, and as a user, when I start using Todoist on Android, I can create my identity using Google or through some other mechanism. When you go to the Assistant and set up Todoist for the first

time on your Assistant, you have to link your Todoist identity to the Assistant. Once you do the linking, you can use your conversational agent across any Assistant-enabled device without having to log in to Todoist again. The key point here is that in the Actions console you can set up account linking so that things work seamlessly, and with one account-linking step, the user can use Todoist across any Assistant-enabled device. Now, from account linking, let's talk a little bit about Play entitlements and seamless digital subscriptions. The interesting thing is that both the Assistant and Android use Google Play for inventory, order management, and entitlements management. Because both of them use the same Google Play infrastructure, entitlements, subscriptions, and in-app purchases work seamlessly across both the Assistant and Android. Take the case of Economist Espresso. The Economist is a great magazine, and they have an app with a subscription

feature to get their premium articles. I went in, logged into my Android app, and paid for the subscription on Android. Now, when I go to the Assistant and use Economist Espresso, I immediately get access to all of the premium content, because the Assistant also knows that I have a premium subscription to Economist Espresso, and hence I'm eligible for all of the premium content. And this works in reverse also: you can purchase subscriptions in the Assistant, and they'll work in your Android app too. Again, this can be configured in the Actions console. The Assistant will first check whether the user has access to these premium subscriptions; the user can then purchase content using the same inventory and order management system, and then the premium subscriptions work across the Assistant and also across Android. So that's how these foundational pieces support a companion conversational action. Here are my principles for building a companion conversational action. You're not just an

app provider; you are a service provider, and as a service provider, you want to take your service, your functionality, to many new contexts. So think beyond the app and think of the service that you're providing. Also, what is the critical user journey for your service? How do people actually interact with your service, in what context, and on what device? And can you think of new contexts and devices in which your service actually makes sense to the user? That's how you're going to expand the reach of your service quite a bit. But please do not think of just replicating your whole app for voice; you need to think of the specific service that makes sense for the user. So my challenge to you is: what are the compelling new experiences that you can create by integrating with the Google Assistant? Please join the many providers who not only created an Android app but also expanded their reach to a number of Google Assistant enabled devices by building a conversational agent. Thanks, Krishna. That was a really high-level overview of how you can think about companion actions for your Android app, and how to really think about

the experience that you can provide to users for those times in their lives when their Android devices might not be handy. There are two such times that come to my mind right away. The first is while I'm driving, when my full attention needs to be on the road and my Android device is usually in my pocket. And the second is while I'm cooking, when my hands are usually either busy or dirty, and it's really inconvenient to have to keep going back and forth between washing my hands, unlocking my phone, reading the recipe, making my food, washing my hands,

like to go through my courses after work. So I like to listen to the course podcast while I'm driving in my car and then when I get home, I like to continue watching the video for the course that I started to listen to in the car line, either my car nor my smart device that home run Android right both of these devices. They only work with the Google Assistant. And so let's see how we helped build the Corsair companion app that works on the Google Assistant. Here's the actions XML that we showed you earlier where they registered for the tick course built in in 10, and now after

their APK was approved and published in the Play developer console, they have an option there to enhance their actions with Actions on Google, and clicking that lands us in the Actions console project claiming page. Now, this console is a really focused way to develop and manage your companion action for the Google Assistant, and we're very excited because this week we announced a major redesign of the console. It basically helps you do three main things. First, it allows you to configure the metadata and the directory listing information for your companion app.
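For reference, the actions.xml mentioned here is a short XML file in the Android project. A minimal sketch might look like the following; the intent name, deep-link URL, and parameter names below are illustrative placeholders, not Coursera's actual values:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch of an actions.xml registering a "take course" built-in intent.
     Intent name, URL template, and parameter names are illustrative. -->
<actions>
  <action intentName="actions.intent.TAKE_COURSE">
    <!-- Deep link the Assistant can use to open the course in the app. -->
    <fulfillment urlTemplate="https://example.com/course{?courseName}">
      <parameter-mapping
          intentParameter="course.name"
          urlParameter="courseName" />
    </fulfillment>
  </action>
</actions>
```

It's this file, bundled in the APK, that the claiming flow described next imports into the Actions console.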

Then it allows you to manage the development, testing, and deployment process in a very fine-grained way. And once your action is published, it gives you analytics so that you can track how your action is doing out there with real users in the real world. Now, there's a lot of magic that happens during this project claiming stage. For example, we automatically import most of the information that you provided in your actions.xml, to make it easier for you to develop your companion app. Now, let's see what the Actions console looks like for Coursera after they've

gone through the claiming process. So this is the Actions console, and we can see that it's already set up with the Coursera demo app. Let's look into the Actions section. We see here the 'take course' built-in intent that we had in Android, automatically imported from the actions.xml when we went through the claiming process. And when we click in, we'll hopefully see the magical information show up. There we go. So we see all of the triggering information, all of the important information that you would

want to know about this action. For example, the triggering phrases: these are sample invocations that users can use to invoke this particular action. And we see the parameters, as well as the fulfillment information that you provided us in your actions.xml, automatically imported into here. Now that we have this basic wiring set up, what we can do is create a rich response using Dialogflow to cover the two cases I described earlier: for Coursera to let me listen to my course podcast while I'm driving, and then to continue watching my

video when I get home. To do that, you would just add a fulfillment of conversational type here, and this will land us in Dialogflow, where you can specify the details of the fulfillment. Today I won't show you how to build fulfillments from scratch, but there are a few key components that I'd like to walk you through so that you can see how to do this for yourself. So this is the Dialogflow tool, and this is the JavaScript for our fulfillment code. We see here that we first use the 'take course' built-in intent that we walked through earlier. And then we

reference the course parameter to get the name of the course that is coming in from the user's query. We also created a function that calls the backend Coursera service to get the details of the course that the user is asking about, and then we get the response. What we can do is parse it, and for this particular example we'll just build a simple card that shows all of the course information, and then we use our media response API to return the audio or the video of the course

based on what device I'm running on. Now, one really amazing thing about Dialogflow is that it has a really deep integration with the Actions console. This provides a really nice development environment where you can do things like test how your action would work on the Google Assistant directly from the UI, just by clicking into the Integrations section. So let's see how the basic card that I just showed you we created would look in the simulator. So this is the simulator,

and we also have a test-on-device option, so you can test during your development process to see how it would work on a device. For now, let's see how this works when we invoke our Coursera demo app. I'll say: 'Ask Coursera demo to start my machine learning course.' 'Sure. Here's the test version of Coursera demo. Here's more information about machine learning. Do you want to start the course?' Right, so all of this description information down here, the title of the card, this image, the suggestion chips down there, they were all specified in the fulfillment code

that I showed you in Dialogflow. So now let's say yes to start the course podcast. 'Sure, here's your course on machine learning.' Let's say I started doing that in my car, and then when I get home, I want to continue watching my video on the Smart Display device. So let's try it here; hopefully it works. 'OK Google, continue my course.' 'Sorry, I don't know how to help with that yet.' Let me try that again. 'OK Google, continue my course.' 'Sorry...' 'OK Google, continue my course.'
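The title, image, and suggestion chips called out just above correspond to a rich response in the fulfillment. The talk's fulfillment uses the Actions on Google client library; purely as a sketch of what that response contains, here is the equivalent raw Dialogflow v2 webhook JSON built by hand (the course details and image URL are made up, not Coursera's):

```javascript
// Sketch: building the Dialogflow v2 webhook response payload for a
// basic card plus suggestion chips, as shown in the demo.
// Course title, description, and image URL are hypothetical.
function buildCourseCard(course) {
  return {
    payload: {
      google: {
        expectUserResponse: true,
        richResponse: {
          items: [
            { simpleResponse: {
                textToSpeech: `Here's more information about ${course.title}. ` +
                              'Do you want to start the course?' } },
            { basicCard: {
                title: course.title,
                formattedText: course.description,
                image: { url: course.imageUrl,
                         accessibilityText: course.title } } },
          ],
          suggestions: [{ title: 'Yes' }, { title: 'No' }],
        },
      },
    },
  };
}

const res = buildCourseCard({
  title: 'Machine Learning',
  description: 'An introduction to machine learning.',
  imageUrl: 'https://example.com/ml.png',
});
console.log(res.payload.google.richResponse.items.length); // 2
```

In the client library the same structure is expressed with `conv.ask(...)` and helper classes; the JSON above is what ends up on the wire either way.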

Okay, let's talk to Coursera instead. 'Welcome to Coursera. Do you want to continue your machine learning course?' Yes. 'Sure, here is your course.' Thank goodness. Alright, so let me recap what we just did. We showed you how to create actions.xml and how to use built-in intents to enable actions in your Android app, and then we showed you how, by doing so, your app can now be discovered across many surfaces on Android. Now, we briefly glossed over these two, but our colleagues gave a whole talk on building these two days ago. And if

you missed it, don't worry, please do go ahead and watch the video on YouTube. In the main part of the demo we walked through how to claim your Actions on Google project so that you can enable your actions to work on these devices. And finally, we showed you how to build and test a simple companion app that works great with audio and video on a Smart Display. Now, we certainly think this is very cool, we're very excited, and we've tried to make it really easy for you, so that a companion app like this is just another interface to your existing service. Building experiences that work on devices which

don't run Android is not about bringing up completely new infrastructure from scratch. It's more about extending your apps to be more action-centric and more focused on connecting with your users throughout their day, wherever they are, whether at home or on the go. So with that, let me invite you to visit our web page at actions.google.com to learn more about the concepts and the tools that we covered today. And we want to hear from you, so please do sign up at our g.co App Actions access link so that you can

give us feedback on our built-in intent catalog. Also, by signing up, you'll get notified when App Actions become available. But most importantly, we encourage you to start thinking about the experiences you can provide your users for those times in their life when their Android device might be impractical or just inconvenient to use in the moment, and then take the next step and build companion actions for devices that work with the Google Assistant. We want to thank you for joining us, and if you have any questions, please come by our office hours in the sandbox area. Please enjoy the

rest of your time at I/O.
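One last sketch, for the "audio in the car, video on the Smart Display" behavior shown in the demo: a Dialogflow fulfillment can tell the two surfaces apart by checking for the SCREEN_OUTPUT capability in the webhook request. The field names below follow the Dialogflow v2 webhook request format for Actions on Google; the example request object is a mock, not real traffic:

```javascript
// Sketch: detecting whether the current Assistant surface has a screen,
// which the demo's fulfillment uses to choose between the course podcast
// (audio) and the course video.
function hasScreen(webhookRequest) {
  const caps =
    (((webhookRequest.originalDetectIntentRequest || {}).payload || {})
      .surface || {}).capabilities || [];
  return caps.some((c) => c.name === 'actions.capability.SCREEN_OUTPUT');
}

// A Smart Display reports SCREEN_OUTPUT among its capabilities;
// an audio-only surface such as the Assistant in a car does not.
const smartDisplayReq = {
  originalDetectIntentRequest: {
    payload: { surface: { capabilities: [
      { name: 'actions.capability.AUDIO_OUTPUT' },
      { name: 'actions.capability.SCREEN_OUTPUT' },
    ] } },
  },
};
console.log(hasScreen(smartDisplayReq)); // true
```

The fulfillment would then return a media response pointing at the audio or video asset accordingly.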
