Duration 40:12

Building AR apps with the Sceneform SDK

Lucy Abramyan
Software Engineer at Google
2018 Google I/O
May 9, 2018, Mountain View, USA

About speakers

Lucy Abramyan
Software Engineer at Google
Romain Guy
Senior Staff Software Engineer at Google

Lucy Abramyan is a software engineer on Daydream, where she leads development of APIs that make rendering more intuitive for Android app developers. Prior to Google, Lucy worked at the Jet Propulsion Laboratory researching immersive display technologies and writing software for Mars rovers. Lucy has a BS in Computer Science from Harvey Mudd College and an MS in Computer Science from the University of Southern California.


Romain is an engineer at Google. He worked on the Android Framework team, leading the UI toolkit and UI renderer, and now works on new Kotlin and graphics-related projects.


About the talk

Sceneform SDK is a new library for Android that enables the rapid creation and integration of AR experiences in your app. It combines ARCore and a powerful physically-based 3D renderer. In this session, you'll learn how to use the Sceneform SDK, and how to use its material system to create virtual objects that integrate seamlessly with the environment.

Transcript

Welcome, everyone. I'm here with the team behind Sceneform to show you some of the stuff we've been working on. I'm going to start off by telling you about something very personal to me, something I'm very passionate about. It's not just one thing; it's a combination of things that I love. Ever since I was a child, I've marveled at space. Isn't it amazing to think, just for a second, about the diversity of the planets and everything going on around the solar system right now? Wouldn't it be amazing if we could see it up close? There's just one problem: humans can't go to another planet, not even Mars, just yet. Or can we? And that brings me to the second part of my passion: augmented reality. AR is delightful. It lets you interact with the world in a whole new experience that is catered to your environment. ARCore is Google's platform for augmented reality applications. But even if we use ARCore to help us understand the environment, what do we do about 3D rendering?

While it's not exactly rocket science, 3D rendering today has a steep learning curve. But if you're like me, you're still really passionate about space and AR and 3D rendering and graphics and matrix math, so you just start coding, and coding, and coding, and then you quickly realize there's a lot of code. Some of us have been doing this kind of work for about ten years at Google, and it's a pain, which proves the point: it's difficult.

We've also noticed that there's so much in common across AR apps: things like streaming the camera image to the background, or even just making an object appear on the screen. We wanted to take care of all that so you don't have to go through the pain. That's why we developed the Sceneform SDK: to make AR development quick, simple, and familiar to any Android developer. Sceneform is a 3D framework that makes it easy for you to build ARCore apps. It includes an Android Studio plugin that lets you import and even edit 3D models. The API offers a high-level way of working with rendering, and it's tightly integrated with ARCore. It makes it especially easy to build AR apps because it integrates with the Android view framework: you can easily add AR to an existing app or create one from scratch.

The New York Times used Sceneform in their new AR articles. You can download the New York Times app today, search for augmented reality, and not only read about David Bowie but also walk around a mannequin wearing his costume. Otto, the online furnishing retailer, used Sceneform to let customers see what a piece of furniture looks like in their living room before they buy it. If you're in a European country, you can download the app and try it out. And to everyone who worked through all the early versions of the API: thank you.

That solar system I was talking about, the one I wanted to bring to you? Here it is, right inside your living room, and trust me, without the thousands of lines of rendering code. You can see this code online right now in our GitHub repo; links are at the end of the presentation. So let's walk through how we built this app. First we'll cover some common concepts of AR apps and show you code snippets using Sceneform. After we cover the basics, Romain will go into detail about physically based materials and give you all sorts of rendering knowledge to help you optimize and make beautiful 3D objects in your app.

The Sceneform API consists of two concepts: the scene and the view. The scene represents the objects you are adding to the world, like the 3D models that you want to place in your AR app. The view is where your scene will be drawn; in this case it's your device's screen, and wherever the device is in the world, the renderer will draw the scene from that perspective. The view ties into the Android view framework, and that's your hook into the app as a developer.

You build your scene by defining the spatial relationships of objects. To do this, Sceneform provides a high-level scene graph API for defining the hierarchy of objects and their spatial relationships. An analogy that I like is the Android view hierarchy, except that it's in 3D, and instead of views we use nodes. Each node contains all the information that Sceneform needs to render it and interact with it, and nodes can be added to other nodes, forming parent-child relationships.

In our solar system example, planets orbit around the Sun, and the moons of those planets orbit around the planets. A natural way to define the solar system in a scene graph is to make the Sun the root node, add the planets as the Sun's children, and add the moons as the planets' children. That way, if you want to animate the Earth orbiting the Sun, the Moon will simply follow the Earth; you don't have to do the complicated math to figure out the Moon's relationship to the Sun while the Earth is orbiting it.

This video shows the same solar system example we saw a moment ago, but with touch interaction added. Nodes become interactive when you add touch listeners, and we propagate touch events through the scene graph the same way Android touch events are propagated through the view hierarchy.
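As a rough sketch of what adding a tap listener to a node looks like in Java (the planet node, helper method, and log line are illustrative, not code shown in the talk):

```java
import android.util.Log;
import android.view.MotionEvent;
import com.google.ar.sceneform.HitTestResult;
import com.google.ar.sceneform.Node;

// Tap events are delivered to the touched node and then bubble up the scene graph,
// much like Android touch events in the view hierarchy.
void makeInteractive(Node planet) {
    planet.setOnTapListener((HitTestResult hitTestResult, MotionEvent motionEvent) ->
            Log.d("SolarSystem", "Tapped node: " + hitTestResult.getNode().getName()));
}
```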

Nodes can contain 3D models, like the planets and the Sun, or 2D Android views. You can create an Android view just as you would in your layout editor, put it in the world, and interact with it like you would in any other app; it's a first-class citizen of the scene.

So how does an Android developer usually get started? With Android Studio, of course. I won't go too deep into this because there is another talk tomorrow morning, "Build, iterate, and launch," that will walk through all of it. But I want to note that we built a plugin for Android Studio that lets you drop in 3D models created with standard modeling tools like Maya, or downloaded from Poly. You go through the import flow, and it converts the model into the SFA and SFB formats. These formats are particular to Sceneform; the SFB is the binary that gets bundled into your app, so drop it into your res/raw (or assets) folder and you can ship it with your app. One thing we want to note: yes, you can view and edit your models right there in Android Studio, and rendering is consistent with the device, so what you see in the studio is what you see on the device.

One more thing to note, and again there will be more details tomorrow: the importer acts as a Gradle plugin, so for every asset that's imported into your project you get a new Gradle task. That means that if your designers give you a new version of an asset, all you have to do is replace the file in your project; the next time you build, it will be re-converted automatically and you'll be up to date. You don't have to go through a manual wizard every time you get a new version of an asset.

So, on to the code. Let's start with the onCreate method in your activity. The first thing to do is find the AR fragment, Sceneform's ArFragment. It takes care of setting up the ARCore session for you and also manages Sceneform's lifecycle. Naturally, it contains the AR view, and from it you can hold a reference to the scene. I want to note here that you don't have to use ArFragment; you can use the view, and therefore the scene, directly.
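A minimal sketch of that setup, assuming a layout that hosts the fragment (the layout and fragment id are placeholders; newer Sceneform releases use AndroidX while older ones use the support library):

```java
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import com.google.ar.sceneform.Scene;
import com.google.ar.sceneform.ux.ArFragment;

public class SolarActivity extends AppCompatActivity {

    private ArFragment arFragment;
    private Scene scene;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_solar);   // hypothetical layout containing an ArFragment

        // ArFragment sets up the ARCore session and manages Sceneform's lifecycle for us.
        arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ar_fragment);

        // Keep a reference to the scene so we can attach nodes to it later.
        scene = arFragment.getArSceneView().getScene();
    }
}
```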

One of the powerful things about Sceneform is that because the AR view is just a regular Android view (a SurfaceView, in fact), you can drop it wherever you want in your application; it doesn't have to be full screen. More importantly, we don't take over your application the way an engine like Unity does, where your app effectively becomes a Unity application. It's a very easy way to embed AR inside an Android application that already exists. The New York Times is a great example of how far you can push this: they show AR through a web view by putting the AR view behind a transparent web view, and as you scroll, the web view transitions into the AR view, so you can then switch to the full-screen AR experience you see here.

So let's start loading our models. In this case I've left a TODO for you to load the rest of them, but I'll show you how to load a 3D model in Sceneform. We have two types of them; we call them renderables, not models — sorry. Renderables are the things that get rendered on the screen. I know, we're very creative with the naming. In this case we want to load the Sun model, which was dropped into our res/raw folder, so we set that as the source and call build. I also want to point out that these builders return a CompletableFuture, so the loading happens in the background, and you can accept and handle the result as you would with any CompletableFuture. I should also mention Kotlin here: we could do all of this in Kotlin, but all of the code snippets will be in Java. If you are inclined to write it in Kotlin, I would love to see your code.
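A sketch of that loading step, living in the same activity as the earlier snippet (the field name, log tag, and R.raw.sun resource are illustrative; the resource stands in for the SFB generated by the import plugin):

```java
import android.util.Log;
import com.google.ar.sceneform.rendering.ModelRenderable;

private ModelRenderable sunRenderable;

private void loadModels() {
    // build() returns a CompletableFuture<ModelRenderable>, so the work happens
    // in the background and we handle the result when it completes.
    ModelRenderable.builder()
            .setSource(this, R.raw.sun)
            .build()
            .thenAccept(renderable -> sunRenderable = renderable)
            .exceptionally(throwable -> {
                Log.e("SolarSystem", "Unable to load the Sun renderable", throwable);
                return null;
            });
    // TODO: load the Earth, Moon, and remaining planets the same way.
}
```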

Remember the graph we showed earlier representing the Sun and the planets; keep that structure in mind, because we're going to create that same graph. We've loaded all the models, so now let's build the solar system. We start with the Sun: create a new node and set its renderable to the Sun model we just loaded. Next we create the Earth node and set its parent to the Sun, starting to build out the scene graph. Notice that I've defined a SUN_TO_EARTH_METERS constant; in this case I've used 0.5 meters because I wanted to fit the solar system inside the living room. We set the Earth's local position relative to its parent, the Sun, set the renderable we loaded, and continue. Now the Moon: create a node, set its parent to the Earth, and notice that the Moon's local position is EARTH_TO_MOON_METERS (I set it to 0.1); it has nothing to do with the Sun. Set its renderable to the Moon renderable, and you're set.
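Putting that together as a sketch in Java (the 0.5 m and 0.1 m constants come from the talk; earthRenderable and moonRenderable are assumed to have been loaded the same way as the Sun):

```java
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Vector3;

private static final float SUN_TO_EARTH_METERS = 0.5f;
private static final float EARTH_TO_MOON_METERS = 0.1f;

private Node buildSolarSystem() {
    Node sun = new Node();
    sun.setRenderable(sunRenderable);

    Node earth = new Node();
    earth.setParent(sun);                                             // child of the Sun
    earth.setLocalPosition(new Vector3(SUN_TO_EARTH_METERS, 0f, 0f)); // relative to its parent
    earth.setRenderable(earthRenderable);

    Node moon = new Node();
    moon.setParent(earth);                                            // child of the Earth
    moon.setLocalPosition(new Vector3(EARTH_TO_MOON_METERS, 0f, 0f));
    moon.setRenderable(moonRenderable);

    // The root node carries the whole graph: Sun -> Earth -> Moon.
    return sun;
}
```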

Now, how do you animate the orbits? You can just use standard Android property animation; we've created evaluators for Sceneform's math types that you can use (a rough sketch follows this paragraph).
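The talk doesn't show the animation code, so this is only a sketch; the QuaternionEvaluator class name follows the Sceneform samples and should be treated as an assumption:

```java
import android.animation.ObjectAnimator;
import android.view.animation.LinearInterpolator;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Quaternion;
import com.google.ar.sceneform.math.QuaternionEvaluator;
import com.google.ar.sceneform.math.Vector3;

private void startOrbit(Node node, long periodMillis) {
    // Rotate the node's localRotation through a full turn around +Y; children such as
    // the Earth (offset from the Sun) orbit as the parent rotates.
    Quaternion[] keyFrames = {
            Quaternion.axisAngle(new Vector3(0f, 1f, 0f), 0f),
            Quaternion.axisAngle(new Vector3(0f, 1f, 0f), 120f),
            Quaternion.axisAngle(new Vector3(0f, 1f, 0f), 240f),
            Quaternion.axisAngle(new Vector3(0f, 1f, 0f), 360f),
    };

    ObjectAnimator orbit = new ObjectAnimator();
    orbit.setTarget(node);
    orbit.setPropertyName("localRotation");        // resolved via Node#setLocalRotation(Quaternion)
    orbit.setObjectValues((Object[]) keyFrames);
    orbit.setEvaluator(new QuaternionEvaluator()); // assumed evaluator for Quaternion values
    orbit.setInterpolator(new LinearInterpolator());
    orbit.setDuration(periodMillis);
    orbit.setRepeatCount(ObjectAnimator.INFINITE);
    orbit.start();
}
```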

Once we return the root node, we have the entire graph: our solar system. To bring the solar system into the scene, all we have to do is parent it to the scene, and that's it; the Sun's parent is the scene, and the solar system is now in your room. Or we can do something else: anchors. If you aren't familiar with ARCore anchors, they are how you attach your content to the world, and ARCore makes sure it stays anchored there. You can get anchors from ARCore's hit test, based on the user's touch events, or, with our new Cloud Anchors, you can use an anchor that a friend created on another device. To work with anchors, we've created an extension of Node called AnchorNode; it takes an anchor and brings it into the scene graph. Set the AnchorNode's parent to the scene, then set the Sun's parent to the AnchorNode.
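A sketch of that wiring, using the tap-on-plane listener that ArFragment exposes (the listener hookup is a common Sceneform pattern, not code shown on the slides):

```java
import android.view.MotionEvent;
import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.Node;

private void placeSolarSystemOnTap() {
    arFragment.setOnTapArPlaneListener((HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
        // ARCore keeps this anchor fixed to the tapped point in the real world.
        Anchor anchor = hitResult.createAnchor();

        // AnchorNode brings the ARCore anchor into the Sceneform scene graph.
        AnchorNode anchorNode = new AnchorNode(anchor);
        anchorNode.setParent(scene);

        // scene -> anchorNode -> sun -> planets -> moons
        Node solarSystem = buildSolarSystem();
        solarSystem.setParent(anchorNode);
    });
}
```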

Notice how we've rearranged things a little: before, the scene had the Sun as its root node, but now we have the scene, then the AnchorNode, and then the Sun and the rest of the solar system. I mentioned ModelRenderables before, so now let's talk about 2D ViewRenderables. To add a 2D view to your app — in this case a little info card that pops up about a quarter of a meter above a planet and displays some information — I've created a node that holds the ViewRenderable and parented it to the planet, so it floats a quarter of a meter above the planet, wherever that planet is.

Start by building a ViewRenderable, but notice that instead of the res/raw source used for a ModelRenderable, I set the source to the view, and that's all there is to it. Now you have a 2D view just as you would in any other application, and you can do things with it like set its text.
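A sketch of that info card (the layout resource and the assumption that its root is a TextView are illustrative):

```java
import android.widget.TextView;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Vector3;
import com.google.ar.sceneform.rendering.ViewRenderable;

private static final float INFO_CARD_HEIGHT_METERS = 0.25f;

private void addInfoCard(Node planet, String planetName) {
    ViewRenderable.builder()
            .setView(this, R.layout.planet_info_card)   // hypothetical layout with a TextView root
            .build()
            .thenAccept(viewRenderable -> {
                // Use the inflated view like any other Android view.
                TextView label = (TextView) viewRenderable.getView();
                label.setText(planetName);

                // Float the card a quarter of a meter above the planet.
                Node infoCard = new Node();
                infoCard.setParent(planet);
                infoCard.setLocalPosition(new Vector3(0f, INFO_CARD_HEIGHT_METERS, 0f));
                infoCard.setRenderable(viewRenderable);
            });
}
```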

Finally, if you want to drag, scale, and rotate objects, we made it really easy for you with TransformableNode. A TransformableNode is an extension of Node that also understands touch events and gestures like dragging, scaling, and rotating. So in this case, if instead of creating a plain Sun node I create a transformable Sun node and do everything else the same as before, we can now actually drag and move the solar system. I should mention that there was a talk earlier today about UX interactions in AR, so if you want to know more about the best practices, please go back and watch that talk.
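A sketch of that substitution (the select() call and the use of the fragment's transformation system follow the usual Sceneform UX pattern and are assumptions, not code from the slides):

```java
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.ux.TransformableNode;

private Node buildDraggableSolarSystem() {
    // TransformableNode interprets drag/pinch/twist gestures via the fragment's
    // TransformationSystem; everything else stays the same as the plain Node version.
    TransformableNode sun = new TransformableNode(arFragment.getTransformationSystem());
    sun.setRenderable(sunRenderable);
    sun.select();   // make it the currently selected node so gestures apply to it

    // Parent the planets under this node exactly as before, and the whole
    // solar system can now be dragged, scaled, and rotated.
    return sun;
}
```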

Now I'll hand it over to Romain to talk about materials. This is a deceptively simple part of Sceneform, but before you can understand how to create your own custom materials, you have to understand the concepts behind physically based rendering. When you run the import plugin in Android Studio, you end up with two files: an SFA and an SFB. The SFB is the binary that goes into your application, and that's what we load at runtime. The SFA is effectively a JSON description of your asset, something like this example for the Earth's moon. You can see the base color, the normal, and so on; those are textures that are defined elsewhere in the SFA. I won't go into the details of the syntax and structure of the SFA, mostly because there's excellent documentation online, and also because the talk tomorrow morning covers it; you should check that out. What I want to talk about is what kind of textures you need to create: the base color, the normal, and those two weird ones, the metallic and roughness textures.

Before we do that, we have to talk about physically based rendering. Over the past three or four years it has spread from the VFX industry, and it's now used by a lot of AAA games, but it's still fairly uncommon on mobile. The basic idea is that we rely on physical principles to define all the behaviors and all the equations used in the rendering system. That means separating the lighting code from the code that defines the surfaces, and it has an impact on the materials themselves: you have to take into account laws of physics like energy conservation. We also use physical light units: when you declare the sun as a dynamic light in the scene, we use lux, and for the other lights in your scene, like a light bulb, you can use lumens. Physical units help us validate our rendering, and they also make your life easier, because these are quantities we deal with every day and they feel natural to us.

I'll show you an example. This is a sphere, and you can see that there are reflections on it, stronger toward the silhouette. This is a physical effect called the Fresnel effect, named after a 19th-century physicist: the closer you are to the edge of the sphere, at grazing angles, the more you see the reflections. It's a natural phenomenon you can see everywhere. Here is a photo I took at Lake Tahoe: close to where I was standing you can see through the water, and the further away you look, the more you see the reflections on the water's surface. You can see this on every object around you. This is what happens in the real world, and we recreate these kinds of physical behaviors in our rendering engine to make things look as realistic as possible.

We also chose a workflow for materials called the metallic-roughness workflow. You'll find a lot of information about it out there, and it's available in many third-party tools: if you've used Unity 5 you have access to it, Unreal Engine 4 uses it, and recent versions of Blender have access to it as well. The way it works is that when you define a new material, you're really trying to describe a surface, and to define a surface you need three things, which we'll look at in a little bit. The first one defines the metalness of the surface (not even a real word): whether or not the object is a metal, and we're going to see why that's very important. Then you define the color of the object, which we call the base color, as opposed to the diffuse color or specular color, terms you might have seen in previous, older engines; there's a reason for that. Finally, you define the roughness, which essentially says how shiny the object is. If you want to go further, there are other things you can define to give your surfaces more detail and a more natural appearance: the first is the normal map, which helps break up the evenness of the surface, and the second is called the ambient occlusion, or occlusion, map.

So first, what is a non-metal? I'll spare you the equations. In the diagram we have an object, with light coming from a light source and hitting its surface. The light that hits the object gets split into two components. The first one, shown in white, is the reflected part: it creates all the reflections, like the ones you could see on the orange ball earlier. The rest of the light is refracted into the object, and most objects around us absorb some of that light: it enters the object, gets scattered, bounces around inside, and some of it comes back out, but not all of it. That is what gives objects their colors. In this particular case the object appears orange, which means the green and blue components of the light were absorbed inside the object and only the orange light comes back out. The light that comes back out is called the diffuse light, and the reflected light is the specular light: the specular is for the reflections, the diffuse is for everything else.

So this is an example of a non-metallic orange ball, and you can see the reflections on it. It was rendered in an environment captured inside a classroom with overhead white lights, and even though the object itself is orange, the lights appear white. That's because the reflections just bounce off; they never enter the object, so they don't take the color of the object into account, while the rest of the object appears orange, as expected. Now, when you have a metallic object: metals are conductors, and when energy, in this case light, hits them, the part of the light that is refracted into the surface gets absorbed. It gets transmitted into the object and never scattered back out, so there is no diffuse light; you only get reflections. However, those reflections take on the color of the object, so you don't get white reflections anymore; you get orange reflections instead. So we can take the ball we just saw and turn it into a metal. Look again at the overhead lights of the classroom: they now appear orange, because the rest of the light's spectrum was absorbed inside the object. And as a side effect of the Fresnel effect we just talked about, if you look at the edges of the sphere, the reflections there are not orange anymore; they take on the color of the environment.

Rendering engineers are obsessed with their work, so come Christmas time they look at the Christmas tree and take a picture of one of the ornaments. I was holding one of my phones to the side, lighting a green metallic ball with the orange wallpaper on my phone, and you can see that at the edge of the ball the reflections aren't green anymore; they take on the orange light coming from that light source. It may look strange the first time you see it in a rendering engine, but this is an effect that's perfectly natural and happens everywhere around you.

So, metalness. I mentioned this is the first thing you should decide, and the reason is that the way you define the color of your object, the base color, depends on whether or not the object is metallic. It dramatically changes the appearance of the surface, so always decide first whether you're building a metal or a non-metal. When it comes time to create the actual texture, the metalness of the object is defined as a grayscale texture with values between 0 and 255: at 0 it's not a metal, and at 255, white, the object is a metal. Most of the time the value should be either 0 or 255; the values in between are mostly used for transitions, because your texture might contain, say, a metal that's partly painted, and the paint itself is not a metal, so at the edges of the paint you want a nice falloff, a nice transition from metal to non-metal. Sometimes a real-life surface happens to be a mixture and you can use intermediate values as well, but most of the time you won't have to deal with that. One thing to note: when a metal gets rusty, the rust is not metallic, so if you're trying to recreate a rusty surface, the rusty parts should be non-metal. That's something you'll see more of tomorrow.

You don't have to create textures for everything; you can also use common sense, and very often you can get away with not having a texture for the metalness of the object: you just say it either is or isn't a metal. Next, the base color. The color of the object defines either the diffuse color, for non-metallic surfaces, or the color of the reflections for metallic objects. What's quite difficult when you create a base color texture is that it must be completely devoid of any lighting or shadowing information, and that can be hard, because human beings never see the actual color of an object; we only ever see objects through lighting. But you can quickly get used to it. Whenever you use a tool like Photoshop or Affinity Photo to build your textures, make sure you're working in the sRGB color space; most of those tools do that by default, but just in case, make sure you and your artists are working in that color space.

Here's how to pick colors for objects based on real-world data. Most non-metallic objects use most of the brightness range, so whenever you pick a color in the color picker for a non-metallic object, the values of the RGB color should be between roughly 10 and 240; nothing is as dark as 0 and nothing is as bright as 255 when we deal with non-metals. Metals, on the other hand, are always bright — dark metals basically don't exist — so you should stay toward the top of the range. I mentioned that you should not have any lighting information inside the base color of the object, and you can see here a set of swatches taken from real-world materials: gold, which in real life appears quite yellow and saturated, has a base color that is actually fairly pale. Really, all the colors in a base color map tend to be very pale compared to what you perceive. Here's another example: on the right you can see a rendering of bricks, and on the left the base color texture. You can see the difference: once lit, all the contrast and saturation appear, yet none of that information is in the original texture that was used to create the material.

So again, work with your artists: make sure they're familiar with the metallic-roughness workflow, or at least make sure they understand that the base color map should not contain any lighting or shadowing. Now, the roughness. The roughness defines how shiny an object's surface is. A simple way to think about it: if your surface is infinitely smooth (no real-world object is perfectly smooth), rays of light that arrive parallel to each other bounce off parallel to each other as well, so you get sharp reflections. Rough objects, on the other hand, have microfacets on their surfaces; you can think of them as very tiny mirrors that are not all oriented in the same direction, so when parallel rays of light come in, they bounce off in random directions, and that causes blurry reflections. Here are examples of that: at the top a little metal ball with the roughness increasing from 0 all the way to 1, and at the bottom a non-metal ball with the roughness going from 0 to 1. You can see the effect: we start with very sharp reflections, and as we get closer to 1 the reflections become so blurry that you can't even perceive them as reflections anymore; they just spread across the entire visible surface. This is a very powerful feature because it lets you create things like polished metals, or metals that have been handled for a while and have become almost matte.

The roughness is defined the same way as the metallic property: a grayscale texture with values between 0 and 255. At 0 the surface is glossy and shiny; at 255 it's extremely rough and you won't be able to see the reflections anymore. Just be aware that there may be differences between tools: if you set a roughness of, say, 100 in Blender, it may not look exactly the same in Sceneform, because different engines do these computations in different ways. You shouldn't worry too much about this; just tweak the asset until it looks right in Sceneform. Sometimes, instead of roughness, tools expose glossiness, which is simply the opposite, so you can just invert the texture in Photoshop, for instance, to get the roughness map.

Next, we want to add some detail. To save on performance and memory, we try to use smooth surfaces: when you build your mesh out of triangles, you keep the surfaces smooth, so in our example the bricks are completely flat. To add detail we can use a normal map. A normal map looks like this, and when you apply it to the object you get small-scale shadowing and much more detail on the surface. There's a ton of information available online, so we'll skip the details before we run out of time; the only thing to know is that the colors inside a normal map encode a vector, a direction — it's not a color.

The next one is ambient occlusion. Here we are with our bricks, with the base color, roughness, metallic, and normal maps properly applied, but we're lacking the larger-scale shadowing information: a brick wall has depth, so it should be casting shadows on itself. Because we don't have access to the triangles to compute this, we create a black-and-white texture that tells us where the shadows should be. That's called ambient occlusion, or occlusion. It's very simple; here's the before and after, and you can see it adds a lot of detail and depth to your object. The ambient occlusion map is just a grayscale texture: 0 means the pixel is completely in the dark (you should rarely use exactly 0), and at 255 there's no shadowing at all, so it doesn't affect the lighting. I won't go into much more detail, but any time you create an object that has cracks or crevices that can be occluded, you should be using an ambient occlusion map.

Now we have a full set of textures for the material: the base color, the roughness, the metallic, the normal, and the ambient occlusion. If we put them all together, as in this example, you can create an object where everything varies from pixel to pixel. Here we have a metal ball, but for some reason some of the tiles on it are not metal anymore, and their reflections are simply gone. With just those textures you can create impressive variations from pixel to pixel and recreate most real-world materials in a photorealistic manner.

One thing you can do to optimize your materials — and if you use the 3D format glTF, which I'm sure they'll talk about in more detail tomorrow — is to pack the channels into a single texture. Ambient occlusion, roughness, and metallic are grayscale images, so they can each fit in one of the channels of an RGB image. This is really easy to do in any good photo editor, like GIMP, Affinity Photo, or Photoshop, and you end up with only one texture instead of three, which speeds up your load times and speeds up the rendering as well.

Now I want to talk about performance a little bit. One of the features we have in the rendering engine is something called dynamic resolution. We always watch the time spent on the GPU to render every frame, and instead of dropping frames when there's too much to render, we adapt the resolution of the rendering: we smoothly adjust it on both the vertical and horizontal axes, sometimes first one axis, then the other, sometimes both at the same time. What this means for you is that if you're building an AR scene and you make it too complicated — too many objects, materials that are too complex — we're not going to drop frames; we always favor performance over anything else, but we are going to lower the resolution of your scene. Because today's displays have a really high pixel density, it's hard to tell when this is happening, and I'm sure most of you won't even notice while using the app, but it basically boils down to this: do you want a complex scene at lower resolution, or a simple scene at higher resolution? The maximum resolution we use is 1080p; even on devices like the Pixel 2 XL we're not going to use the full resolution of the display, because it's just too many pixels for the GPU to drive while keeping the rendering smooth.

I also want to call out geometry complexity. It's tempting, especially for your artists, to add a lot of triangles to create pretty, smooth surfaces, but we're running all of this on a mobile phone, so you should be careful with the complexity of your objects.

To give you a rough idea: for a hero object, one the user can get really close to, you should have at most maybe 10,000 triangles, and even then, if you can avoid using that many triangles it would be great. And if you do use 10,000 triangles for one object, make sure there's only one of them and not a hundred of them; otherwise performance is going to suffer. Tell your artists to simplify the models as much as possible: a triangle that's smaller than a pixel still creates a lot of work. I won't go into the details here, because you probably don't care, but the GPU might end up doing the work four times, and we really don't want that.

Then there's the complexity of the scene itself. Say you have a model of a city and you want to put a lot of objects in it — every building, every car, maybe even pedestrians. What we recommend is that you have at most around a hundred objects visible at a time on screen, because beyond that the CPU becomes the bottleneck, and that's not something dynamic resolution can help with. We avoid rendering anything that's not on the screen, and we have a lot of optimizations around that, but if you have too many objects on screen you'll be CPU-bound, there's not much we can do about it, and you'll start dropping frames — and dropping frames is bad in AR.

I mentioned the format called glTF, which our importer supports. glTF is a standard driven by Khronos, the committee behind OpenGL and Vulkan; a lot of tools can export it, and websites like Sketchfab have a lot of assets in that format. One of the reasons I like glTF is that the standard packs the occlusion, the roughness, and the metallic inside a single RGB texture, which is something you should be doing for performance reasons anyway. So if you can, ask your artists to give you glTF.

One thing we didn't show you how to do with the APIs: you can add lights to your scene, and you can have many, many lights. This is a demo of the rendering engine running on my Pixel 2; I think we had something like a hundred and twenty-eight lights in the scene.

You can see that you can have many of them and still run at 60 frames per second. What's very important is that if you add many lights to your scene, you make sure they don't overlap too much, because if two lights touch the same triangle, we have to do the lighting work twice; put a hundred lights on the same triangle and we'll basically shade it a hundred times. So use a lot of lights if you want, but make sure they don't overlap; to help with this, our APIs let you give each light a maximum sphere of influence (a sketch follows this paragraph).
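A sketch of creating a point light with a bounded sphere of influence; the color, intensity, and radius values are illustrative, and the exact builder methods should be checked against the Sceneform documentation:

```java
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.rendering.Color;
import com.google.ar.sceneform.rendering.Light;

private void addLampLight(Node lampNode) {
    Light pointLight = Light.builder(Light.Type.POINT)
            .setColor(new Color(1.0f, 0.85f, 0.7f))   // warm white
            .setIntensity(800.0f)                      // tune per scene
            .setFalloffRadius(1.5f)                    // sphere of influence, in meters
            .build();

    // Attaching the light to a node places it at that node's position in the scene.
    lampNode.setLight(pointLight);
}
```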

Finally, ViewRenderables. They're extremely convenient: you create views the way you do in the rest of your application and put them in the scene, which is super useful. But every view is rendered in software, and for every view we have to allocate what's called a SurfaceTexture, which costs memory, CPU time, and GPU time, so try to reuse views as much as possible; don't put too many of them on the screen, and don't allocate too many of them at the same time. And finally, all the usual performance advice for Android applications applies to AR apps too: don't do too much work, be mindful of the size of your APK, and all that good stuff.

With that, we're out of time. There's the "build, iterate, and launch" talk tomorrow morning, and there was another one earlier today about AR interactions, so go back and watch the recording. We'll also be at office hours. The sample code is available online so you can create a scene like the one you saw. That's it; if you have questions, you can find us after this talk, or tomorrow — we'll be around to answer all your questions.
