Duration 38:39

What's new in AR

James Birney
Product Manager at Google
Eitan Marder-Eppstein
Engineering Manager at Google
2018 Google I/O
May 8, 2018, Mountain View, USA

About speakers

James Birney
Product Manager at Google
Eitan Marder-Eppstein
Engineering Manager at Google

James is a Product Manager on some of ARCore's exciting new features. Prior to the AR team, James spent five years leading two engineering teams that built revenue critical tools for the AdWords sales teams. Before Google, James worked at Cisco as a Portfolio Manager. James holds a BBA from the Ross School of Business at the University of Michigan (Go Blue!).


Eitan Marder-Eppstein is an engineering manager at Google on ARCore; his team writes software enabling great augmented reality applications. Prior to joining Google, Eitan was President of hiDOF, applying advanced algorithms from robotics to real-world applications. He also worked at Willow Garage as a core contributor to the Robot Operating System (ROS), writing autonomous navigation software for mobile robots. Eitan holds an M.S. and B.S. in Computer Science from Washington University in St. Louis.


About the talk

Learn how to create shared AR experiences across iOS and Android and how to build apps using the new APIs revealed in the Google Keynote: Cloud Anchors and the Augmented Images API. You'll come out understanding how to implement them, how they work in each environment, and what opportunities they unlock for your users.


How's everyone doing today? All right, well, welcome to Google I/O. My name is Eitan Marder-Eppstein, I'm an engineering manager at Google, and I work on augmented reality. Whether you're already familiar with augmented reality or could use a quick refresher, I'm really excited to tell you about our platform for augmented reality and the capabilities it brings to these devices.

All right, so I need my clicker, so I'm actually going to go over here to get the presentation started, but off we go. Smartphone AR stems from this observation: over the last decade our phones have gotten immensely more powerful, with big improvements in CPUs and GPUs, but the ability of phones to see and understand our environments, and really make sense of the world around them, was until very recently largely unchanged and limited. If you pointed your phone at this table, it would let you take a picture of the table, or even a video of your friend climbing over the table, but your phone wouldn't really have any understanding of the geometry of the table, or of its own position relative to the table as it moved through space.

So what augmented reality seeks to do on smartphones is to take all of these amazing advancements in computing power and leverage them to bring new capabilities to your phone: to take your phone beyond just the screen, beyond its own little box, and expand it to understand the world around it. Now when my phone looks at this table, it can see that there's a surface there and that there are chairs next to it, and as I move through the environment, my phone can actually track its position as it moves.

At Google we think that augmented reality is really exciting, and we've been excited to see some of the things that you've built. We've grouped the places where augmented reality can be really great for applications into two main buckets. The first bucket is that augmented reality can be useful on smartphones. Recently I was remodeling my kitchen, and if you've ever remodeled anything in your house, you know where this is going.

Measurements are a real pain. What I needed to do was measure for a subway-tile backsplash for the kitchen, and instead of taking out a tape measure I actually pulled out my phone, went to my counter, and measured from point A to point B without moving any of my appliances, which I normally would have had to move to get an accurate measurement with a tape measure. So AR can be useful in that way, just by providing a better geometric understanding of your environment. AR can also be useful for shopping applications. Recently we had some very old chairs at my house, and my partner and I were looking to replace them with something like these chairs here, and we were getting into a debate over which chairs we liked. With augmented reality we were able to take a 3D model of a chair, place it in our environment, and see its exact size, scale, and color, so we could have our argument about which chair we would actually have at home rather than at the store, be more targeted about how we made our purchase, and even buy the furniture online and feel much more comfortable doing it.

But AR can also be fun. Imagine a character running across the floor, jumping onto this chair and onto this table, or sitting in one of these chairs and having the floor drop out from under me to create an ice fishing game. Ice fishing sounds a little bit boring, but I can tell you that in this game it's actually a lot of fun. AR can also be used for creative expression. With the phone in your pocket you now have the ability to go out and create things that previously could only be created by professionals. You can generate computer-generated content on the go, on the fly: you can take your favorite character and put them in a scene with your friend, or you can take hot dogs or your favorite food items and put them on the table in front of you. You have this amazing video editing capability in your pocket. For those of you who have seen our AR Stickers application on the Google Pixel phone, you know what I'm talking about, and for those who haven't, please check it out. It's really cool to have this creation power in your pocket.

So that's great: AR is useful, and AR can be fun. But how do you actually build applications for AR? How do you get involved as developers? This is the developer conference, after all. How many of you are familiar with ARCore? All right, about half of you. ARCore is Google's development platform for augmented reality. We want to make it easy for you to build applications that take advantage of these new capabilities that phones provide, the ability of phones to see and understand their environments, and to build applications that actually react to this understanding.

ARCore launched a few months ago, and it provides three main capabilities to let you do this. The first is something we call motion tracking. Consider the example of the Scarecrow from The Wizard of Oz: we want to place the Scarecrow at a taco stand and make it seem like he's waiting in line, because everyone loves tacos. As I look at the Scarecrow with my phone, ARCore actually understands the phone's position relative to a virtual object that I placed in space.

As I move a meter forward, the phone knows that I've moved a meter in that direction, and as I turn left, the phone knows that too: it's able to track its own motion as it moves through space. If I combine that with my desire to place the Scarecrow a meter in front of me, I can put the Scarecrow right there, and as I move my phone around I can change where I'm rendering the Scarecrow in the virtual scene to match my physical environment. That allows you to register virtual objects to your physical scene in a very natural and intuitive way.
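To make that idea concrete, here is a minimal sketch using the ARCore Android (Java) API; the helper name and the fixed one-meter offset are illustrative choices, not code from the talk.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

/** Places an anchor roughly one meter in front of the current camera view. */
static Anchor placeOneMeterAhead(Session session, Frame frame) {
    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
        return null; // Motion tracking is not ready yet.
    }
    // The display-oriented camera pose looks down -Z, so translate -1 m along Z.
    Pose oneMeterAhead = frame.getCamera().getDisplayOrientedPose()
            .compose(Pose.makeTranslation(0f, 0f, -1f));
    // The anchor stays fixed in the world as tracking continues to refine.
    return session.createAnchor(oneMeterAhead);
}
```

Your renderer would then draw the Scarecrow at this anchor's pose every frame, so the character stays put as the phone moves.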

The second capability that ARCore provides is light estimation. Continuing our Wizard of Oz theme, we've got the Cowardly Lion, and when we turn off the lights we want to make the lion afraid, because he's cowardly. Here ARCore is looking at the camera feed and estimating the real-world lighting of your environment, and with that estimate you can light your characters in a realistic fashion, helping you build a more immersive experience that looks natural, because the virtual objects that you're putting in your scene look correct.
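For reference, a minimal Java sketch of reading that per-frame estimate follows; how you feed the value into your renderer is up to your own rendering code, and the neutral fallback value is an assumption.

```java
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;

/**
 * Reads ARCore's per-frame estimate of scene brightness. The returned value
 * (roughly 0.0 for dark scenes up to about 1.0 for bright ones) can be fed
 * into whatever renderer you use to dim or brighten virtual characters.
 */
static float currentPixelIntensity(Frame frame) {
    LightEstimate estimate = frame.getLightEstimate();
    if (estimate.getState() != LightEstimate.State.VALID) {
        return 1.0f; // No estimate yet; fall back to neutral lighting.
    }
    return estimate.getPixelIntensity();
}
```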

You can see the tone on the lion change as the room goes from light to dark, and you can even script interactions for your characters, in this case making the lion afraid when the lights go off.

The third capability that ARCore provides is environment understanding. As ARCore moves around the world and estimates the lighting of the environment, it is also trying to recognize surfaces. It might recognize the plane below me, which is the ground, or this surface here, which is the table, or maybe even the vertical surface behind me, and that allows you to place objects that are grounded in reality. If we want to place the Android character on this table, I can detect the surface and actually place my virtual character on a physical object in the world.

So those are the three capabilities: motion tracking, light estimation, and environment understanding. When you combine them, you can build experiences that were previously impossible, experiences that bring the virtual and physical worlds together and meld them into a new reality, letting people see and experience your application in a new and different light. We're really excited about this and about the opportunity for you to bring apps to our ecosystem.
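As an illustration of placing content on a detected surface, here is a hedged sketch using a screen-tap hit test in the Java API; the helper name and the idea of attaching your Android model to the returned anchor are assumptions about how a typical app would wire this up, not code shown on stage.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Trackable;

/** Returns an anchor on the first detected plane under a screen tap, or null. */
static Anchor anchorOnTappedPlane(Frame frame, float tapX, float tapY) {
    for (HitResult hit : frame.hitTest(tapX, tapY)) {
        Trackable trackable = hit.getTrackable();
        // Only accept hits that land inside a detected plane's polygon.
        if (trackable instanceof Plane
                && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
            return hit.createAnchor(); // Attach your 3D model to this anchor.
        }
    }
    return null; // No suitable surface found yet.
}
```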

We've worked really hard to bring ARCore support to as many devices as possible, and with help from our partners in the Android OEM ecosystem, ARCore is supported today on over a hundred million devices, and we're working to increase that number every single day. We believe that augmented reality is the next shift in computing, and that soon everyone will take for granted that this power is in their devices. So that's our scale, but we're also interested in scaling the capabilities of ARCore. We want to teach ARCore to do new and interesting things, and that's what the rest of this talk is about.

Today we're announcing some new things in ARCore, and they fall broadly into two categories. The first is new capabilities for ARCore, improving what these devices can do: those are Augmented Images and Cloud Anchors, and we'll cover both in this talk today. We're also announcing some new tools for ARCore.

One new tool is augmented reality on the web, which we think is really exciting; you can check out that talk later today at 12:30 p.m. Another is an easier way to write 3D applications for Android and AR specifically: we've introduced the Sceneform library, a helper library for 3D rendering on Android, and we encourage you to check out that talk at 5:30 today.

So enough preamble; we're now going to get into the meat of it and talk about what's new in ARCore, and I'm going to kick it off with our first feature, Augmented Images. Augmented Images stems from your feedback. As you develop augmented reality applications, we've heard you ask us: hey, AR is great, but wouldn't it be better if we could also trigger AR experiences off of 2D images in our environment, like movie posters or textbooks? Augmented Images provides a mechanism to take a 2D texture in the world and make it more engaging by expanding it into a 3D, interactive object. Consider the case where we have a new children's toy, a castle toy, and we have told ARCore that we want it to recognize the surface of the castle toy's box.

Now, as part of the product, you can hold your phone up to the box and get an immersive experience coming out of it, a more engaging experience for your product. Augmented Images lets you take this flat surface and turn it into 3D, which we think is really exciting, and it's based on your feedback: you told us that you wanted this feature, and now we have it. So that's the feature in a nutshell, but I want to tell you how it works and how you can use it in your applications. Augmented Images fundamentally works in three major steps.

The first step is to tell ARCore which images you're interested in, and there are two ways to do this. The first way is to tell ARCore at runtime that you want it to detect certain images. For example, you could download an image from a server in your application and tell ARCore at runtime: please load this image, learn how to detect it in the scene, and tell me when you do. The second option is to tell ARCore in advance: we provide tools that let you, on your desktop computer, take up to a thousand images and train ARCore on them offline, so that it can recognize any of those thousand images when your application runs on device.

All right. The next step is that, now that we've trained ARCore to recognize these images, we actually want to detect them: we want to show ARCore the scene and have it detect the images we trained. As ARCore moves around the environment with your phone, it also looks for textures in the environment and tries to match them against the textures you trained on, and when it finds a match, ARCore gives you that information through the third step: it gives you a trackable object. For those of you who are familiar with ARCore, trackables are our notion of the physical objects in space that ARCore knows about. Up to this point that has meant planes, like these surfaces, both horizontal and vertical, but ARCore can also give you points of interest in the environment that you can attach to. An augmented image is just another trackable object, so you use it exactly like you would use any plane or point, and you can attach your virtual content to the detection of that physical object in the world.

That's it, really: three simple steps. Number one, tell ARCore what you're looking for; number two, have ARCore detect those objects in the scene; and number three, attach your virtual content to those physical objects. And because this is a developer conference, I want to show you those same steps in code. We're going to go through them really quickly, and while the snippets here are for Android, the same concepts apply in Unity and Unreal and across all of our development environments. Step number one is that you need to add images to ARCore's memory, to tell it which images you're interested in. Here we create a new augmented image database and add an image to it, and we're doing this at runtime on the phone. This is a little bit expensive, since you pay a computational cost for each image you add, so a little later I'll also show you how to create the database ahead of time on your computer.
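A minimal sketch of that runtime path in Java might look like the following; the bitmap source and the image name "poster" are placeholders, and a real app would typically update its existing Config rather than building a fresh one.

```java
import android.graphics.Bitmap;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

/** Step 1 (runtime variant): teach ARCore a single image it should detect. */
static void loadRuntimeImage(Session session, Bitmap posterBitmap) {
    // Build a fresh database and add one image. addImage() is the
    // computationally expensive call mentioned in the talk.
    AugmentedImageDatabase db = new AugmentedImageDatabase(session);
    db.addImage("poster", posterBitmap);

    // Point the session config at the new database and re-configure.
    Config config = new Config(session);
    config.setAugmentedImageDatabase(db);
    session.configure(config);
}
```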

Once ARCore has a database of images it can detect, we get to the second step: ARCore is always looking for those images for you, and you can get the results from the AR frame, each and every frame that ARCore sees. That gives you a list of all the augmented images in the scene, and now you want to attach virtual content to them, which brings me to the third step. For step number three, you just take the augmented image you want and create an anchor off of it, and then you attach your virtual content to that anchor, exactly as you would for the plane detection or point detection you've been used to in the past. That's it: three simple steps.
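Here is a rough Java sketch of those two steps together; the method name and where you hand the anchor to your renderer are illustrative.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.AugmentedImage;
import com.google.ar.core.Frame;
import com.google.ar.core.TrackingState;

/** Steps 2 and 3: find tracked images in this frame and anchor content to them. */
static void onFrame(Frame frame) {
    // Step 2: ARCore reports augmented images as trackables on every frame.
    for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
        if (image.getTrackingState() == TrackingState.TRACKING) {
            // Step 3: create an anchor at the center of the detected image
            // and hang your 3D content off of it.
            Anchor anchor = image.createAnchor(image.getCenterPose());
            // ... pass 'anchor' to your renderer here ...
        }
    }
}
```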

If you want to do the pre-computation on your computer instead, this is what you run: there's a command called build-db, and you can pass up to a thousand images into it. It builds an image database in advance that you can then load in ARCore with a couple of lines of code. Loading the database from a file is computationally efficient, because ARCore has already done the work it needs to recognize those images later, and from there you're off and running with the same two remaining steps we just covered: detecting the image and placing content relative to it.
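The offline flow might look roughly like this; the exact arcoreimg flags shown in the comment are from memory, so check the tool's help output, and the file-path handling here is simplified.

```java
import java.io.FileInputStream;
import java.io.IOException;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Database built offline with the arcoreimg tool, e.g. something like:
//   arcoreimg build-db --input_image_list_path=images.txt --output_db_path=myimages.imgdb
// (flag names may differ by SDK version).
static void loadPrebuiltDatabase(Session session, String dbPath) throws IOException {
    try (FileInputStream stream = new FileInputStream(dbPath)) {
        AugmentedImageDatabase db = AugmentedImageDatabase.deserialize(session, stream);
        Config config = new Config(session);
        config.setAugmentedImageDatabase(db);
        session.configure(config);
    }
}
```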

All right, pretty simple. Now I want to show you a demo of this in action. We're going to switch to the Pixel phone here and run this Augmented Images demo. We've trained ARCore to recognize this poster on the wall, so when I look at the poster, you can see it fade out and go from 2D into 3D, and as I move, the perspective I see changes. I've got a 3D object coming out of a 2D texture: nothing has really changed in the world, but I can make it more engaging and immersive.

Now I want to talk a little bit about use cases. Posters are great for demos, but we think Augmented Images has a lot more potential. The first use case we're excited about is education: imagine a textbook coming to life in front of you, or a museum tour where the artwork on the wall jumps out at you and gives you more information about the art, or maybe shows the artist's progression as they sketched a painting. We also think Augmented Images is useful for advertising.

Advertising is all about engagement. Imagine being at a movie theater, holding your phone up to a movie poster, and having content come out of it or having it tell you showtimes; or imagine being at a bus stop with a little time to kill and engaging with the ad on the side of the bus shelter. Augmented Images can also be useful for the products you're advertising. You can build products that meld the physical and digital worlds: it could be the castle toy, where an experience comes out of the box itself, or it could be a how-to guide for your coffee machine when you're trying to make coffee for the first time with an expensive espresso machine and have no idea what to do. So we think Augmented Images expands the capabilities of AR in general, and we're really excited about it.

And we're not done yet. We're going to talk about one more feature today, and for that I'm going to bring up the product manager who works with me, and he's going to talk to you about Cloud Anchors. Thanks very much.

So, real quick before we get started: you've been sitting for a while, and I really like doing this, so we're going to do the wave real quick going across the room. All right, you ready? Laptops ready? All right: three, two, one. Yay, ARCore! It works. All right, thank you.

So, like Eitan mentioned, my name is James Birney. I'm a product manager on ARCore, specifically on Cloud Anchors. Who caught the Cloud Anchors announcement yesterday? All right, good, that's slightly more than half. Awesome. That's what we're going to cover in this section, and hopefully by the time we get through talking about Cloud Anchors you're going to be really excited and want to immediately start building.

Before we jump into Cloud Anchors, it's really important to start with where AR is today. Can I get a quick hand if you've built an AR app before? All right, that's roughly half of you. For the other half, here's what happens. Let's say that together we're going to build an app where we place some dinosaurs: we'll have a T-Rex over here and maybe a Triceratops over here, and they're going to interact.

The way we would do that in an AR app today is to plant an anchor, and then the T-Rex and the Triceratops would be placed at relative offsets from that anchor; the anchor becomes the reference frame for your AR app. Now let's say Eitan were to come back up on stage (he's not going to, because that's a long walk) and he creates a separate dinosaur app over here and places a bunch of pterodactyls. Again, he plants an anchor, and the pterodactyls are placed relative to that anchor. What's missing is that Eitan's app is running in a different reality, a different augmented reality, than the app we have over here, and the reason why is that those two anchors can't talk to each other.

This is what Cloud Anchors solves: we give you the ability to create a shared reference frame. As before, we have the anchors and the offsets to our T-Rex and his pterodactyls, but now there can be a common anchor in the middle, and all the AR content, everything from pterodactyls to T-Rexes, is able to interact and play together.

That lets you create these really fun experiences where not only is my content interacting with Eitan's content, but I can control Eitan's content and he can control mine. That's pretty cool. So far that's kind of abstract, with me literally moving my hands around on stage. A more concrete example is our Just a Line app which, if you haven't seen it before, is an experimental app that we at Google built. It literally draws a simple line in space, and what we added to it is the ability to have not just one artist but multiple artists drawing in the same space. So I'm going to show you an extended version of the video they showed you really quickly yesterday, where you can see multiple artists drawing together. Hopefully you can see from this video the powerful experience you get out of this: now you're able to interact with your friends and draw together, and when one person draws a line, you can build on top of it. Give me a second here for the video to finish and for you to absorb what's going on, because it's a new concept.

Okay, so let's talk a little bit about how we create these cloud anchors. We've done an awful lot of work to make it very simple, so that it's only a few steps; let me walk you through them. Step one: in this example we have our stick woman, and her name is going to be Alice. Alice is going to place a cloud anchor. The verb we use for creating a cloud anchor is hosting, because we're going to host that native anchor up to the cloud, along with the visual features of the environment. Let's say Alice is standing here and, as she's looking at the table, she places a cloud anchor (or rather, the app places one for her) on the stage right here, next to our beautiful succulent. You like her? Okay, thank you, I appreciate the one person. What the phone extracts from the environment are all the points where these leaves come together, the points the phone sees as contrast points where the colors or the lighting change: the edge of the table, the edge of the tablecloth, every point where the leaves change. Those are the visual features.

They get extracted, uploaded to the cloud, saved, and processed, and what Alice gets back a couple of seconds later is her cloud anchor. The cloud anchor has one really important attribute: the cloud anchor ID. You can think about the cloud anchor ID the same way you think about a file name. Say you're going to save a file to Google Drive; when you save it, you need to give it a file name, right? With cloud anchors, we create that file name, that ID, for you, and that ID is how you reference the anchor later. It would be really hard to find a file without knowing its name, and the cloud anchor ID is the same concept. Here's how that comes into play: all Alice needs to do to let Bob, our stick man over there, connect to her cloud anchor is to send the cloud anchor ID over to Bob. That's all she needs to send. Once Bob has the cloud anchor ID, he uses it to call a verb named resolve.

Resolving adds that cloud anchor to Bob's reference frame. Let's say Bob is standing right here as well. He looks at the same area; his visual features get uploaded to the cloud; the cloud matches them against the visual features Alice previously uploaded; and we give Bob back a cloud anchor positioned relative to where his device is. So even though the two devices are in different locations, we create the cloud anchor at a consistent physical location. That's the magic: because there is one consistent physical location, you now have a shared reference frame, and at that point we can place, again, let's use dinosaurs because everybody loves dinosaurs, our dinosaurs relative to that cloud anchor and start our shared AR experience. Hopefully that makes sense.

Okay, here Eitan comes back, and I'm going to tie it all together. We created a very fancy visualization: the orange dots that come up are the visual features we were talking about, uploaded to the cloud; Bob uploads his visual features to the cloud; they get matched; and then the two of them end up with the same shared reference frame.

Once that shared reference frame is created, and if you wait a second for the GIF to loop around, you'll see the spaceship show up, and then the two of them can follow the spaceship around the room. Once they're paired, the devices can go anywhere in the room and stay in the same reference frame; they can interact together.

All right, let's keep going one level deeper, like Inception, and look at some sample code, same format as before. Before we get to the two methods for hosting and resolving, it's really important to enable the feature: when you're setting up ARCore, go into the session config and turn the cloud anchors feature on. You need to do this on all devices, but it's pretty straightforward. Then, on the first device, Alice's device, the one that creates the cloud anchor, the main method we need to call is hostCloudAnchor. You can feed any pre-existing native anchor into hostCloudAnchor; as Eitan mentioned before, you would normally create one from a horizontal plane, or now also from a vertical plane, and you pass that anchor in.
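A hedged Java sketch of enabling the feature and hosting an anchor follows; the helper names are mine, and a real app would poll the hosting state once per frame rather than calling these in isolation.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Enable the Cloud Anchors feature; do this on every participating device.
static void enableCloudAnchors(Session session) {
    Config config = new Config(session);
    config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
    session.configure(config);
}

// Alice's device: host an existing local anchor (e.g. one created on a plane).
static Anchor hostAnchor(Session session, Anchor localAnchor) {
    return session.hostCloudAnchor(localAnchor);
}

// Poll on later frames until hosting finishes, then share the ID.
static String cloudAnchorIdIfReady(Anchor hostedAnchor) {
    if (hostedAnchor.getCloudAnchorState() == Anchor.CloudAnchorState.SUCCESS) {
        return hostedAnchor.getCloudAnchorId(); // Send this string to other users.
    }
    return null; // Still in progress (or in an error state).
}
```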

That call completes asynchronously in a couple of seconds, and what comes back is your cloud anchor with, as we talked about, the all-important cloud anchor ID. From there it's completely up to you what means of device-to-device communication you use to share that ID. The demo we're going to show you in a second uses Firebase, and there are two other demos in the sandbox (I encourage you to check those out) that use Firebase as well; it's a great way to communicate between devices, but you can use any means you want.
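Purely as an illustration of one transport option, here is a sketch using the Firebase Realtime Database; the "rooms" path and room-code scheme are hypothetical, not something ARCore prescribes.

```java
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

// One possible transport: publish the cloud anchor ID under a shared room code
// so other participants can read it. Any channel (your own server, a QR code,
// Nearby, etc.) would work just as well.
static void shareCloudAnchorId(String roomCode, String cloudAnchorId) {
    DatabaseReference rooms = FirebaseDatabase.getInstance().getReference("rooms");
    rooms.child(roomCode).child("cloudAnchorId").setValue(cloudAnchorId);
}
```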

Then, on Bob's device, and this is a really important point, it's not limited to just Bob: we could also have Bob, Jerry, Johnny, and Eitan; there can be as many users as we want. All any of them needs to do to join that cloud anchor is receive the cloud anchor ID, the one Alice just sent over, and then resolve that cloud anchor. Resolving is dead simple: all you need to do is pass in the cloud anchor ID.
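And the matching resolve side, again as a rough sketch with an illustrative helper name:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Session;

// Bob's device (or any number of other devices): resolve the hosted anchor
// from the ID that Alice shared, then poll its state on later frames.
static Anchor resolveSharedAnchor(Session session, String cloudAnchorId) {
    Anchor resolved = session.resolveCloudAnchor(cloudAnchorId);
    // When resolved.getCloudAnchorState() reaches SUCCESS, the anchor's pose
    // sits at the same physical spot Alice anchored, so content placed
    // relative to it lines up across devices.
    return resolved;
}
```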

Under the hood, we take the visual features from what that user is currently looking at (assuming the user is looking roughly where Alice was), upload those features, and give you the cloud anchor back, and at that point you're good to go: you can start placing assets relative to that cloud anchor.

So, quick question: what operating systems were those code examples running on? The really important point here is that cloud anchors work on both Android, meaning any ARCore-enabled Android device, and iOS, meaning any ARKit-enabled device, which for today means iPhones. We believe this is incredibly important to making shared AR a reality. There's no reason we should discriminate over which of our friends can play a game with us based on which operating system they run on their phones. Whether Eitan has an iPhone isn't really relevant to whether or not we're friends; if he has an iPhone, he should be able to play shared AR with me, right? So now I'm going to invite Eitan up on stage.

We're going to give you a live demo, because it's one thing to say that everything works cross-platform, but it's another thing to show you. All right, maybe one last poll just to get started: who thinks I'm going to win this game? All right, I've just joined the room, and now I'm going to set up my board; Eitan has set his up, and if you get close you can see the same state being reflected on both devices at the same physical location. So I'm going to press start. Here are our futuristic-looking light boards, and the goal of the game, in case anybody wants to bet on it, is to turn the other person's board your color. I feel like Eitan has been sandbagging me in all of our practice sessions, because he's doing much better than he has in the past.

Let's see. Oh no, I was so close. Hold on. All right, one more shot. Did I get him? All right, thank you. And just to reiterate: that was an iPhone Eitan was using, and this is a Pixel 2, but it could have been any ARCore-enabled Android device on one side and any ARKit-enabled device on the other. And there's my clicker.

So let's talk about use cases. That was gaming, an example of gaming working really well, but shared AR doesn't need to stop at gaming. We think there are a whole lot of other categories where it can make a big difference in the world. Oops, can you help me please? Pretty please. Thank you. Okay, so there are four categories we'll quickly talk about. The first is the education space. Let me ask a question instead: raise your hand after I give you two options. Option A, you learn about what it's like to explore Mars and the Mars missions from a textbook; option B, you learn from an interactive 3D model of the rover that you can play with together with your friends.

All right, option B it is: a big improvement in how people learn. This is an example that NASA built, and it doesn't need to stop at space exploration (although that's a pretty big area to explore); you could do the same in any visual subject, such as biology. There are a couple of cool demos where you can explore the human body together; I'll leave it at that.

Next, creative expression. You already saw the Just a Line example, where we draw a simple line in space, but we can go beyond that. Take, for example, this block-building app, where you can build a full block structure together and then 3D print it later; it's very, very cool. And you can imagine what this would look like with AR Stickers. Raise your hand if you've played with AR Stickers. Imagine that as you're placing Stormtroopers, or, help me out, the Demogorgon, someone else can place characters too and the fight can happen across your different phones. That would be a very fun experience. Then there's gaming: now you can do ice fishing with your friends.

Haven't you always wanted to do that? Believe me, it actually is a lot more fun than it sounds when you play ice fishing with your friends; it's particularly fun on a hot day in San Francisco to be able to look down at the sidewalk and turn it into an ice fishing pool. Beyond ice fishing, you can imagine playing laser tag with your friends, all with just your phone, so you don't need to buy special gear: two people quickly pair, do the host and resolve, and then you're off and playing laser tag with as many of your friends as you like, because cloud anchors are not limited to two devices; you can use any number of devices.

And then shopping. How many of you have bought something, had your partner veto it when it actually showed up, and then had to return it? Show of hands. Yeah, that's a big pain, right? You have to find a UPS Store or a FedEx store or mail it back; that's not a good experience. It's a lot better if you can preview it with your partner.

Now, with cloud anchors, if I'm placing a speaker system here, I can have my wife also look at the speaker system from her phone, and there's a feeling of consistency and of trust that you build. If you're the advertiser or the e-commerce site and two users look at the product and it shows up consistently for both of them, you build the trust that the product being previewed is actually going to look that way when it arrives, because it's showing up the same on multiple devices. All right, so that's everything for cloud anchors; now let's talk about getting started.

ARCore, no surprise, already supports Unity and Unreal, your standard game engines, and obviously we support Android Studio for Android native development. And since cloud anchors are cross-platform, we also provide an SDK so you can do your development in Xcode. All four of these environments are live as of yesterday at 1 p.m. Thank you. For the folks here at I/O, you have a bunch of resources at your disposal.

Please take advantage of them. There are three awesome demos in the sandbox. If you like playing the light board game, and especially if you want to play Eitan at it, our sandbox is right over there in the AR sandbox, and Eitan will be there until sometime on Friday. We also have the Just a Line demo over in the experiments sandbox, so please check that out, and the demo Eitan showed earlier, as well as two others, are available in the AR sandbox. They're really fun; please go ahead and play around with them. I suspect they'll give you a bunch of very cool ideas for what you can build.

We've also got code labs, with over 80 workstations set up, so please play around with those. Every workstation is paired with an Android device, so not only can you go through the code, you can actually compile it onto the phone and see how what you just built works on a real device. And we have office hours; please take advantage of those, because we have some incredibly knowledgeable staff ready to answer any questions you have.

One quick shameless plug: our team, the ARCore team, is incredibly busy giving talks this week. Please take advantage of those; we've put an awful lot of work into giving you very concise explanations of AR. There are two more today and two more tomorrow. And after I/O, developers.google.com/ar has all the extra resources, and all the code labs are available there as well. Again, all four of our SDKs are available as of yesterday. So thank you very much.

Appreciate your time.
