Duration 39:30

Get started with TensorFlow high-level APIs

Josh Gordon
Developer Advocate at Google
2018 Google I/O
May 8, 2018, Mountain View, USA

About speakers

Josh Gordon
Developer Advocate at Google
Laurence Moroney
Staff Developer Advocate at Google

Josh Gordon works as a developer advocate for TensorFlow, and teaches Deep Learning at Pace University. He has over a decade of machine learning experience to share.


Laurence is a developer advocate at Google working on machine learning and artificial intelligence. He's the author of dozens of programming books, and hundreds of articles. When not Googling, he's author of a best-selling Science Fiction book series, and a produced screenwriter.


About the talk

TensorFlow eager execution and high-level APIs allow developers to train models easily and effectively. This session will introduce these features, and how they can be used to simplify TensorFlow model building. Learn about loading datasets, training a classifier, and making predictions, all while highlighting the fundamentals of machine learning.


Hi, everyone, and good morning. Thanks for coming so early. My name is Josh Gordon, this is Laurence Moroney, and we're here today to talk with you about TensorFlow's high-level APIs. I have a lot of good news for you, and I hope this talk will be concrete and useful. One area I'm particularly passionate about is making machine learning as accessible as possible to as many people as possible, and the TensorFlow team has been investing very heavily in the same thing. We've spent a lot of energy making TensorFlow easier to use, and I'd like to show you the easiest way to get started with TensorFlow today.

There are three concrete things I'd like to walk you through. The very first: even if you're brand new to TensorFlow, brand new to machine learning, even new to Python, one area that seems silly but is non-trivial for a lot of people is actually just installing TensorFlow and its dependencies. I know for Python developers it's just pip install tensorflow, but that can be hard for people who are brand new. So I'm going to show you something called Colab, and I'll walk you through it. It's basically a Jupyter notebook server running in the cloud. It's free of charge, it has TensorFlow pre-installed, and it even gives you access to a GPU. It's awesome. I'll walk you through how to use it and how to get started with TensorFlow.

The next thing, and my personal favorite, the thing I'd strongly, strongly recommend you use, is something called Keras, and Keras is built right into TensorFlow. It's great; I can't tell you how much fun I've had with it. So I'll walk you through writing hello world in Keras. The same API is also usable from TensorFlow.js. And then I'm going to point you to some educational resources to learn more.

Cool. So these are the APIs I want to briefly introduce. Keras you can think of as Lego-style building blocks for defining models. Then there's tf.data. When a lot of people start learning, they get really hung up on modeling: they learn that a neural network is composed of layers, they find out they can adjust the number of neurons per layer, and there are different hyperparameters like the optimizer and so on, and they spend a lot of time on what I call modeling. But something that's really, really important and doesn't get enough attention is how to get your data into the network, which is non-trivial. That's tf.data, a relatively easy to use but also very high-performance way of writing input pipelines. And then I'm going to show you eager execution. If you're new to TensorFlow, when you hear the words "eager execution", just ignore the jargon and think of it as the thing you should always do: you write code and you run it. It's how you develop and debug your TensorFlow programs, and it makes TensorFlow feel just like regular Python.

This is a short talk, so I'm not going to go into all the details of how TensorFlow works under the hood, but this is the right way to do it if you're learning TensorFlow today. Briefly, this is what I would do if you want to try TensorFlow with Keras, tf.data, and eager execution in the fastest possible way. I should tell you that all of these APIs are fully implemented and working well; we're just now starting to write all the samples and docs around them. So the samples I was able to put together for this talk are quite rough, but stay tuned and check back over the next few months as we flesh this out.

Let me show you how to dive right in. Can we switch to the laptop for a minute, please? This website will bring you to the GitHub site, and if you scroll down to the README you'll see a sequence of a few notebooks. I just want to show you how easy it is to get started: if you click on one, it opens up immediately in Colab, and now you have a Jupyter notebook running entirely in the cloud. You hit Connect to connect to a kernel, and then you can start running the cells; I'll walk you through this in more detail in a few minutes. If you go through the first notebook, it will show you how to write your first neural network using Keras. There's a little bit of code, but the notebook is very short. The next notebook shows you how to do the same thing using Keras in combination with tf.data and eager execution, and then we'll look at the rest. If you're ready to get started, it will take you about five minutes end to end to try this out.

All right, let's switch back to the slides, please, and let me give you a little more depth on what's happening in these notebooks. Using the Keras API, this is the complete code, minus a few lines of pre-processing to format the data, to write, train, evaluate, and make predictions with your first neural network in TensorFlow. If you were using TensorFlow about a year ago, this would have been substantially more code. There are other great high-level APIs, including ones that are really wonderful for production, but at least for learning ML I'd strongly recommend this one.

I'll walk you through exactly what all of these lines are doing. One point I want to make is that this code is concepts-heavy but code-light, so writing the code itself should no longer be a barrier to getting started with ML. My hope is that far fewer people will spend energy and time on syntax and debugging, and instead spend more time and energy thinking about what they're trying to do and why.

Before I dive in, I want to make a really, really important point. Machine learning is a broad field, and by far one of the biggest mistakes a lot of students make when they're learning is this: they learn how to train an image classifier, like we're going to do in a moment, they write the classifier and start evaluating it, they see numbers like 99% accuracy, and then they find that by tweaking the network they can get to 99.6% or whatever. In reality, that almost never matters. The most important thing you can spend your time on is designing the experiment, and what I mean by that is concretely thinking about: what are you trying to predict, and why? How will it be used in practice? What could go wrong? Where does the data come from? Thinking through the design of your system, as you would for any type of software, is much, much more important than messing around trying to squeeze out higher accuracy. I'm not going to say more about that today, but always think about the what and the why in addition to the how.

I only have about 45 minutes, which is not enough time to fully introduce TensorFlow, but I do want to call out that the community is the most important thing about TensorFlow. In addition to the thousand-plus folks who have contributed code, there are many, many more who are doing things like teaching, organizing events, and writing articles, and those things are incredibly valuable too.

Just a few new things I wanted to call out. This came out two days ago: there's another library called TensorFlow.js, which is TensorFlow in JavaScript, and it uses the same APIs I'm showing you today. I'm not going to demo it live, but check it out at home; there's a really great demo you can play with on our blog, and you can find the code on the website. Another thing: there's a pair of links here which I'd really encourage you to try. This is Magenta, a project using TensorFlow for experiments in art and music, and what you see here is computer-assisted drawing. Magenta is helping me draw a duck: I'm trying to draw a duck with a mouse, and Magenta is basically autocomplete for drawing. There's also a great game called Quick, Draw!, which I'd encourage you to try; it's great for kids.

One last stop before I get to my code, and it's a personal note. One of the reasons I care about machine learning so much is because of a project called CellBot, a cell-sorting project from 2004. It was in a biology lab, and CellBot used a neural network to identify specific types of cells in solution and then sort them. What was so cool to me as a student working on it was that machine learning is not just for computer scientists. The fact that biologists were trying to do something useful with technology that originated in our field was really meaningful to me. How can we use machine learning for medicine, for the arts, for healthcare, just to do useful things in the world?

That's kind of the thinking behind everything I do with ML: what can we do to help people, basically. Here's the URL for Colab: colab.research.google.com. Colab is short for Colaboratory. It's inspired a lot by Google Docs; it's a Google Docs-style code editor, and you can save Jupyter notebooks in Google Drive and download them back as regular Jupyter notebooks, so there's no lock-in. I know a lot of people, by the way, are not coming from Python, so I just want to spend a minute introducing Jupyter notebooks. There are two types of cells. There are markdown cells, which contain markdown; if you edit a markdown cell, like I'm doing with "hello world", and execute it, it just renders the markdown. The other type is the code cell, and if you edit the code and execute the cell, it executes the code.

The really cool thing as well is that Colab gives you access to GPUs. I have spoken with a lot of people who find it very difficult to set up a GPU, or who are in, say, a university environment and don't have access to expensive GPUs. If you're using Colab, you can actually connect to a GPU in our data center and test out your code running on GPUs. You can make matplotlib scatter plots of your data and the graphs render right inline with your notebook, which is really important because it's a great way to share results. Another thing I want to mention quickly is that you can pip install libraries. You're basically running in a container sitting somewhere on Google Cloud, but you can install whatever libraries you need; here I'm pip installing matplotlib, so you effectively have root access to the machine.

The last thing, which a lot of people miss but is super useful, is the code snippets, conveniently hidden in the table of contents. What I'm doing in this notebook is training on some data, and then I want to download the results off of Colab back to my laptop. To find out how to do that, I open the snippets panel and start searching for "downloading data", and Colab, as you can see on the screen, has a snippet of code that I can copy and paste directly into the notebook and run. It has examples of how to install libraries and everything like that. It's super useful; I don't have time to cover everything Colab can do.

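To make that concrete, here's a minimal sketch of the kinds of Colab cells involved. The library and the file name are just examples, not necessarily what's in the demo notebook:

    # Colab notebook cells (run in Colab, not as a standalone script).

    # Cell 1: install any extra library you need; you have root access to the VM.
    !pip install matplotlib

    # Cell 2: plots render inline, right below the cell.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 10, 100)
    plt.scatter(x, np.sin(x))
    plt.show()

    # Cell 3: download a file from the Colab VM back to your laptop
    # (essentially what the built-in "downloading data" snippet gives you).
    from google.colab import files

    np.savetxt('results.csv', np.sin(x), delimiter=',')
    files.download('results.csv')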
So, this is what it looks like to write your first neural network. I talked about this earlier, but no one has ever looked at the code for their first neural network, regardless of how smart they are (it doesn't matter if you have a PhD in physics or whatever), and just gotten it. It's impossible; it takes months to learn this stuff. So it's completely normal to see lots of concepts and have no idea what they mean. Let me at least introduce them, and then I'll point you to courses you can use to learn more. You'll see lots of parameters when we define these networks, and really only one or two are worth spending your time on; I'll point out which ones as we go.

There are only five steps to write hello world in TensorFlow using Keras, and the good news is that steps 3, 4, and 5 are literally one line of code each. That's all the work of training the network, evaluating your accuracy, and making predictions; the first two steps are where most of the concepts are. Let's see what this looks like. We're using MNIST. I know for people who are machine learning experts it's old, but it's simple, so we don't have to worry too much about the data itself. It's the hello world of computer vision: a dataset of 60,000-plus very low-resolution handwritten digits, and our goal is to train an image classifier to classify a digit it hasn't seen before.

We import TensorFlow, and you can see on the second line that we're importing MNIST. This is easy because a loader for this dataset is baked in. And if you're new to using Keras with TensorFlow, you can see that it's just included: you write tf.keras and you have access to the complete Keras API. There's nothing else to install, and it's awesome. By the way, there are many, many awesome advanced things you can do with Keras and TensorFlow; here I'm just trying to show you the easiest possible path. The data comes already split for us into train and test: train is 60,000 images and test is 10,000. At the top right there's a diagram of the format of the images; you can also look at the notebooks in that workshop directory.

The best thing you can do when you import a dataset is to spend a little time asking really basic questions. Literally: when you import the data, print it out, print out the shape, print out a single image, look at the format. What's the data type: is it floating point, is it an integer? What are the dimensions? How many examples do I have? Spending a little time here will save you a lot of headache later on, so always ask those really basic questions. At the bottom right: there are many neural networks that work directly with 2D images, but here we're going to make a simplification and just unroll the image. Instead of a 2D image, we literally unstack the rows; these images happen to be 28 by 28 pixels, and when we unstack them we get a vector that's 28 times 28, or 784, pixels in a line. It's a simplification. I didn't show the code to reshape the data, which is just a call to reshape, but it's in the notebook.

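Here's a minimal sketch of that loading, inspecting, and reshaping step. The variable names are mine, not necessarily the notebook's:

    import numpy as np
    import tensorflow as tf

    # Load MNIST; the loader is baked into tf.keras.
    (train_images, train_labels), (test_images, test_labels) = \
        tf.keras.datasets.mnist.load_data()

    # Ask really basic questions about the data before doing anything else.
    print(train_images.shape)   # (60000, 28, 28)
    print(train_images.dtype)   # uint8
    print(train_labels[:10])    # the first ten labels

    # Unroll each 28x28 image into a flat vector of 784 pixels,
    # and scale the pixel values from 0-255 down to 0-1.
    train_images = train_images.reshape(-1, 784).astype(np.float32) / 255.0
    test_images = test_images.reshape(-1, 784).astype(np.float32) / 255.0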
One funny thing about neural networks. There are many different types of classifiers, and for most of my background I was really invested in tree-based models, before deep learning was a thing. The reason I like tree-based models, like random forests, is that I can look at a tree and my brain intuitively understands exactly what the tree is doing to classify the data; I get it, it clicks. Neural networks can be a little bit counterintuitive, and they're hard to describe in 30 seconds, but let me tell you what we're doing, and you can take a course to learn more.

We're defining a fully connected deep neural network, and there are two layers that you can see. First, we say we're going to define our model with the sequential API, which is the simplest way to define a model; it literally means the model is a stack of layers. The first layer we stack on is a dense layer, and dense means fully connected. The diagram shows a one-layer network, and in the notebook there's an exercise where you can add a second one. The output layer has 10 units, and it gathers the evidence that the image we feed through the network corresponds to each of the ten digits; that last layer is a little bit different from the others. This is the complete code to define the network, so it's concise (there's a sketch of it just below).

There are a couple of points I want to make. Broadly, the more layers you add to your network and the more neurons or units per layer, the more capacity your network has, meaning the more kinds of patterns it can recognize. The problem is that the more your network can recognize, the more likely it is to simply memorize the training data. In machine learning there's always this tension between memorization and generalization: you don't want to just memorize the training data, you want to learn patterns that are useful for classifying digits you haven't seen before.

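Here's a minimal sketch of that model definition. The size of the hidden dense layer is an illustrative choice, not necessarily what the notebook uses:

    # Step 1: define the model as a stack of layers with the sequential API.
    model = tf.keras.models.Sequential([
        # A fully connected ("dense") hidden layer.
        tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
        # The output layer: one unit per digit; softmax turns the
        # evidence into a probability distribution over the ten digits.
        tf.keras.layers.Dense(10, activation='softmax'),
    ])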
So it's very easy to get very high accuracy on the training set by building a deep neural network with many neurons and training it for a long time, but that's not necessarily the right thing to do. When you're messing around with these architectures, start as simply as possible and then slowly expand from there.

The next thing you have to do is compile your network. It's the last step in building it, and it's just one line, with two concepts in it. I'll get to the optimizer in a second. The loss function is basically the objective that your network is trying to optimize. The good news is that while "categorical cross-entropy" is a fancy term, what it means is simple. Your network makes a prediction: you feed an image of a two through the network and it gives you a probability distribution over all the digits it could be, so maybe a zero with 10% probability, a one with 5% probability, and hopefully a two with something like 90% probability. The loss compares what the network predicted to what you wanted it to predict, and what you wanted it to predict is all of the evidence on the true digit.

Then there's a whole bag of different optimizers you can use. You don't have to worry too much about what RMSprop means, because the defaults are good. For this type of classification problem you're basically always using categorical cross-entropy, and RMSprop with all its default parameters is a perfectly good optimizer to start with. So there are good defaults.

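Here's a minimal sketch of that compile step. This version assumes the labels are plain integers 0 through 9, so it uses the sparse variant of categorical cross-entropy; if the notebook one-hot encodes the labels, it would use 'categorical_crossentropy' instead:

    # Step 2: compile the model by picking a loss, an optimizer, and metrics.
    model.compile(optimizer='rmsprop',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])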
Broadly, the way the network is trained is gradient descent, and here's how I think about it. In a neural network, in those dense layers, there are many different weights connecting the pixels to the neurons, and you can think of all those weights as parameters. The example I have in my head is linear regression. In linear regression you have some points on a plot and you're trying to find the best-fit line, and if you think back to high school, the equation for a line is y = mx + b, so you have two parameters you're trying to learn: m, the slope, and b, the intercept. For different values of those parameters you can calculate how well your line fits the data. You can look at your error, and there are different ways to measure the error, but maybe it's the sum of the distances from your line to the points. By adjusting m and b you can find the best-fit line. Neural networks are trained in a similar way, except that instead of m and b you have hundreds of thousands of parameters to learn, and those parameters connect the pixels to the neurons and so on. The way they're trained is gradient descent; there's a fancy diagram on the right, but the details aren't that relevant here.

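To make the analogy concrete, here's a tiny, self-contained sketch of gradient descent fitting y = mx + b with NumPy. It's the same recipe a neural network uses, just with two parameters instead of hundreds of thousands; the data points and learning rate are made up for illustration:

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.0, 3.1, 4.9, 7.2, 9.0])   # roughly y = 2x + 1

    m, b = 0.0, 0.0          # start from arbitrary parameter values
    learning_rate = 0.01

    for step in range(2000):
        predictions = m * x + b
        error = predictions - y
        loss = (error ** 2).mean()        # the "loss": mean squared error
        # Gradients of the loss with respect to m and b.
        grad_m = 2 * (error * x).mean()
        grad_b = 2 * error.mean()
        # Nudge each parameter a little in the direction that lowers the loss.
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b

    print(m, b)   # ends up close to 2 and 1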
The parameters all start at random values and are slowly adjusted over time until the network becomes better at recognizing digits. Anyway, I'll point you to more educational resources in a second.

So here's the cool part. Building your model is where there are many, many machine learning concepts you have to spend time learning; the next three steps are basically just the mechanics of running an experiment. Here's how you train the model. It's one line, and fit is synonymous with train; we're training using the training images and the training labels. There's really only one parameter here that matters, and the good news is that the concept is a little simpler: epochs. An epoch means one sweep over all the training data. We have 60,000 images, so one epoch means we've trained the network on all of them once.

Training a network is a little bit like tuning a guitar. Say your guitar starts untuned and you want to tune a string to hit a particular note. You start tuning it, twisting the peg a bit at a time, until the guitar plays the right note. If you keep tuning past that point, it's no longer going to play the right note, and eventually the string is going to snap. So you have to stop tuning at a certain point, and you stop based on the sound. Neural networks work the same way. There's basically one very simple plot, which I'd recommend you look at while training your network. On the y-axis is this fancy thing called loss; loss is just a fancy word for error, and what we're trying to do is minimize our error. On the x-axis is how long we've been tuning the network, in number of epochs. If you look, after a few epochs the loss is low; that's when the guitar hits the right note. If you keep training, the loss starts increasing again. So to find the right number of epochs, you literally make a plot like this and look for the point of lowest loss, usually on validation data.

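Here's a minimal sketch of that training step and the loss-versus-epochs plot. The number of epochs and the validation split are illustrative choices:

    import matplotlib.pyplot as plt

    # Step 3: train ("fit") the model. One epoch is one sweep over the data.
    history = model.fit(train_images, train_labels,
                        epochs=10, validation_split=0.1)

    # The one plot worth making: loss versus epochs. Stop training around
    # the point where the validation loss is lowest.
    plt.plot(history.history['loss'], label='training loss')
    plt.plot(history.history['val_loss'], label='validation loss')
    plt.xlabel('epoch')
    plt.legend()
    plt.show()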
After that, you can evaluate the model. Evaluate just means giving your network some new data to classify and looking at the accuracy and other metrics. That's also just one line of code, a call to evaluate, and it gives you back your loss and your accuracy; when you compile a model you can specify other metrics you'd like returned as well. Then, to make predictions on new data, it's also just one line: you say model.predict and give it an image. The syntax there is a little funny. As it happens, predict expects what's called a batch of data, because it's meant to predict on multiple images at once, so you take your one image and wrap it in a list; that's just syntax to make it happy. Here we're making a prediction on the very first image from the test data, and what the model gives back is a probability for each of the ten digits. At the very last layer of the network you have evidence sitting in all the output nodes, and if you take np.argmax to find the largest element, that's the prediction. Here it's predicting that the image is a 7.

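A minimal sketch of those last two steps, continuing the example above:

    import numpy as np

    # Step 4: evaluate by running the test set through the network.
    test_loss, test_accuracy = model.evaluate(test_images, test_labels)
    print(test_accuracy)

    # Step 5: predict. The model expects a batch, so wrap the single image
    # in an extra dimension; back comes a probability for each digit.
    probabilities = model.predict(test_images[0:1])
    print(np.argmax(probabilities[0]))   # the predicted digit, e.g. 7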
Now I need to give you a really quick overview of tf.data, and then I'll hand over to Laurence to show you more about eager execution. Here's something to be aware of: in TensorFlow there are many different ways of doing things, and you should always do the simplest thing that's adequate for your problem. When you're starting out with ML, and probably in 90% of settings, don't worry about performance; just try to minimize complexity. Earlier we wrote our hello world using the in-memory NumPy arrays, and that's fine. But if we were working with a much larger dataset, or reading images off disk, or pulling them from the cloud, there's a lot of complexity there and the format starts to matter. For example, there's latency when you pull images over a network, so maybe you want to hit a bunch of different servers at once, or use several threads. tf.data has all sorts of functionality to help with things like this.

Also, one funny thing has happened: GPUs and TPUs have become surprisingly fast, and the bottleneck in training a lot of models is whether you can keep the accelerator fed. There's a thing called GPU starvation, where the matrix multiplies are so fast that the GPU is just sitting around waiting for data, and tf.data has lots of utilities, like prefetching to the GPU, to help if you need them.

For our hello world, we imported MNIST just using Keras, and purely for demonstration purposes I'm going to wrap it in a tf.data Dataset. I create the Dataset with from_tensor_slices, which is a fancy name that basically says: this is a list of things, and I want every element of the list to be an item of my dataset. Next (tf.data has a very nice, clean API) I say I also want you to shuffle the data. These can be very long lists of things, even an infinite stream of data, and you don't want to shuffle the whole stream, so shuffle takes a buffer size; here I have a buffer of 1,000 elements that get shuffled as the data streams in. And then I batch it up, which means that when I draw from this dataset I get back a batch of 32 things. Now I want to show you very briefly how you use it. This is the complete code, by the way: you can just write, for images and labels in the dataset, print them out.

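Roughly, the pipeline looks like this; the buffer and batch sizes follow the talk, and the array names follow the earlier sketch:

    # Wrap the in-memory NumPy arrays in a tf.data pipeline. Overkill for
    # MNIST, but the same code scales to data on disk or in the cloud.
    dataset = tf.data.Dataset.from_tensor_slices((train_images, train_labels))
    dataset = dataset.shuffle(buffer_size=1000)   # shuffle as a stream
    dataset = dataset.batch(32)                   # each element is a batch of 32

    # With eager execution enabled, you can iterate over it like any
    # Python iterable; no session, no placeholders.
    for images, labels in dataset:
        print(images.shape, labels.shape)   # (32, 784) (32,)
        break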
That looks obvious, and it's the way things should work, but it didn't always work this way. If you have previous experience with TensorFlow, take a look and notice that nowhere in this code is there a session, there are no placeholders, and I'm not mentioning the word graph. I'm just writing regular Python code and it just works. This is eager execution. The only thing you have to do is that second line of code; you need to call it at the start of your program, at the top of the notebook, and from then on you're running TensorFlow imperatively, which, if you're starting out, just feels like regular Python. This is the right way to use TensorFlow, and it makes the whole game much easier. There are lots of reasons why you would use graphs, and we still love graphs; there are great talks from the TensorFlow Developer Summit that go into them in lots of detail. But at least when you're hacking, definitely, definitely use eager. Okay, with that, over to Laurence to show you more about eager execution.

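For reference, that "second line of code" is the one-time switch that turns eager execution on in TensorFlow 1.x; a minimal sketch:

    import tensorflow as tf

    # One call at the very top of the program, before any other TensorFlow
    # code runs (not needed in TF 2.x, where eager execution is the default).
    tf.enable_eager_execution()

    # Ops now run immediately and return concrete values; no graph, no session.
    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    print(tf.matmul(x, x))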
Thanks, Josh. So, I think getting started with TensorFlow can be a little bit difficult if you're coming from a software developer background; getting started with any kind of ML is, right? I mean, I hadn't touched this math since high school and now I have to understand it again; people talk to me about linear regression and my head explodes. But it's something you can pick up. Secondly, there's the programming model: what is the programming model all about? I come from a background of traditional programming: you feed in the rules, you feed in the data, and you get answers out. So, for example, if you're writing something like an activity detector (am I walking, am I running, am I biking, am I driving?), you'd write rules about that, probably something based around speed. I'm a very slow runner, so if it detects something going at one and a half miles an hour, it decides I'm running; but Josh is a really fast runner, so if he were using my rules it would probably think he's on a bicycle. You feed in the data, you feed in those rules, and this is what we've been doing for years as programmers.

But machine learning kind of flips the axes on this a little bit and changes it to this: you feed in the answers and you feed in the data, and you get back the rules. That's what all machine learning is. So when we talk about training a model, when we talk about building these models, what we end up getting is this binary blob that you can run inference on, and that binary blob has effectively learned the rules that give you back what you need. In my activity-detection example, if I walk a lot and label that I'm walking while it measures my sensor data, and if I run a lot (which would be nice) and I feed it that data, and if I bike a lot and drive a lot and go on trains and planes, I'm feeding in all of this data and feeding in the answers, telling it: right now I'm walking, right now I'm biking, right now I'm on an airplane. The idea behind machine learning is that it builds that binary blob for me, which I can then run inference on, and by running inference it deduces what I'm doing right now from all of those learned rules. So instead of me trying to write all these if-then, if-then, if-then rules, I train a system with data and with answers, and it gives me back the rules.

Another example: what if we wanted to build something that determines, from a picture, what's a cat and what's a dog? Now, if I were doing this with if-then rules, I could probably do something like: if it loves you unconditionally no matter what you do, it's probably a dog, and if it's plotting your murder, it's probably a cat. But it's really hard to infer that from images, and I couldn't think of what the if-then rules would be. Maybe I'd look at an image and pick out pointy ears for a cat, but guess what, this dog has pointy ears. So this is where machine learning starts opening up new scenarios, and new ways that you as a programmer can bring value to your employer and your business.

So let's talk about the answers in this case. The answers here are: this is a cat, and this thing is a dog. This little guy doesn't really have pointy ears, but it's clearly a cat. This one is a walking tongue, but it's also a dog. And this one is what happens if you feed a Mogwai after midnight, I think, but it's actually one of the dogs in this cats-and-dogs dataset.

So here I've got the data, and here I'm giving the answers; I'm telling the machine what all of these things are. Now, if I want to do this as a programmer, all I have to do is train a neural network by giving it the answers (what we call labels) and giving it the data. But as a programmer, here's where it was tough for me, and I've only fairly recently come to TensorFlow. First of all, I had to start working in Python. Any Python developers here? A few of you. It's a lovely language, but you tend to work in a text editor instead of an IDE with things like step-through debugging, and because I came from a background of Visual Studio and Xcode and Android Studio, that was a big learning curve for me. And then, if you started with TensorFlow using graph-based execution, it was very strange, because all your code would build up a graph and then you'd have to run that graph. And it's like: what if I've got bugs in reading my data, or I'm not reading it properly, or I haven't shuffled the data properly? It's really important, as Josh was mentioning, to shuffle your data properly, or you can introduce bias into your training. And I'm not a very good programmer, so I generally like to write two or three lines of code at a time, step through them, make sure they work, then write another two or three, step through them, make sure they work, that kind of thing. And that was really difficult for me to do as a TensorFlow developer.

So if we can switch to the laptop, I want to show why I'm really excited about eager mode. And the screensaver kicked in, sorry. Why I'm really excited about eager mode in TensorFlow: I've taken a notebook Josh has written with this cats-versus-dogs example (you can go and get this code online and run it in the notebook right now), and I'm running it in PyCharm because, like I said, I'm not a very good developer, so I want to write a few lines of code, execute them, then write another few lines. If you're not familiar with PyCharm, there's a free Community Edition, which is what I'm running here, and if you've used Android Studio it will probably look very, very familiar. What I can do is go to my cats-versus-dogs script, choose Debug, and it starts executing, and look, I hit a breakpoint.

This, to me, is magical: when I hit my breakpoint, I can start stepping through my code and see what's going on. So, you know what, I'm going to set another breakpoint down here, execute my code down to that breakpoint, and as I step through I can start seeing that I'm doing things correctly: okay, my data directories are set up properly, I've defined my labels. And I'll put another breakpoint there, run down to it, and start looking at what's going on in here. So I have my training images, and if I hover over that, you'll see it's a little too big to print, but in the debugger I can now see that it has started loading up these arrays of images, and I can see that image zero here (sorry if it's a little small) was cat number 921. And this is what eager execution is giving me.

If I keep stepping through... whoops, sorry, I hit a breakpoint I didn't want to hit, so I'm just going to go down to here and continue running. All right, if I then step over this, it should show me a cat. Okay, so I've got this cat and I'm loading it in, but is this the right cat? Have I read my code badly, have I got the wrong one? So I can go: hey, look, here's my training data, I can go look at my cats, I can find cat number 921, and hopefully it's the same cat, so I know my code is working right. Does anybody else develop like this, in little steps? Oh good, just me. I find it really, really difficult to write a lot of code in one sitting, so I just go baby step by baby step. And cat number 921 is here, and there it is, it's the same cat. Excellent, I know my code is working, I know I'm loading properly from my data.

And as I go through here I can see my load-image code. What it's doing is reading the image, creating a tensor out of that image, and resizing it to a smaller image that we're going to use in training. When you come from a world where the hello world of machine learning is handwriting recognition (you know, for me, hello world was just printing "hello world"), these can seem like very complicated scenarios, very complex areas, and this breaks it down for me as a developer: I can start poking through, I can look at data in the IDE, and I can really understand what's going on. So I'm just going to execute my code a little more. Look at these two operations, reading the image and decoding it: they work exactly like they would if you were using NumPy, because we're running eagerly.

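A minimal sketch of what that load-image step can look like when running eagerly; the path, image size, and function name here are illustrative, not necessarily what's in the notebook:

    import tensorflow as tf

    tf.enable_eager_execution()   # TF 1.x; not needed in TF 2.x

    def load_image(path):
        data = tf.read_file(path)                          # read the raw bytes
        image = tf.image.decode_jpeg(data, channels=3)     # decode into a tensor
        image = tf.image.resize_images(image, [150, 150])  # shrink for training
        return image / 255.0                               # scale pixels to 0-1

    # Because we're running eagerly, this returns a concrete tensor you can
    # inspect in a debugger (the path is just an example).
    image = load_image('train/cats/cat.921.jpg')
    print(image.shape)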
This code is imperative and executes right as you hit each line. You'll also notice something really cool: if I hover over it, I can see the data and its shape, and it's concrete, not symbolic, because we're running eagerly. And of course, if you're in Python you'll use the console; one of the nice things in PyCharm is that I've got a console window, so if my existing code is printing to the console, I can see it. I'm just going to step a little further down, because I want to get to the training. One of the nice things Josh was talking about is that this is just an editor and just Python: I can step through and take a look at what's going on in my training, and I can look at it batch by batch. If I've got 6,000 images that I'm loading, I split them into smaller batches, and once I've done all of those batches, that's an epoch. Now I can go through, look at my actual epochs, look at my training, step into it, and see what's going on. As another quick aside, to show you how fast these APIs are evolving: you'll notice that we're explicitly getting the NumPy value of the images and the labels. In newer builds you no longer need to do that, but thank goodness it still works.

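Not necessarily what the notebook does, but here's a minimal sketch of the kind of batch-by-batch training loop you can set breakpoints in when running eagerly, assuming the model and dataset defined earlier:

    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)

    for epoch in range(5):                   # an epoch is one pass over the data
        for images, labels in dataset:       # one batch at a time
            with tf.GradientTape() as tape:
                predictions = model(images)  # probabilities from the softmax layer
                loss = tf.reduce_mean(
                    tf.keras.losses.sparse_categorical_crossentropy(labels, predictions))
            grads = tape.gradient(loss, model.trainable_weights)
            optimizer.apply_gradients(zip(grads, model.trainable_weights))
        # Eager tensors expose their values directly via .numpy().
        print('epoch', epoch, 'last batch loss', loss.numpy())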
One of the really nice things, of course, is that TensorFlow is open source. If you want to know what's going on with your training (maybe something is going wrong, maybe you've loaded data that's causing it to crash or something along those lines), then as you're executing these things, let's see if I can do it, I'm going to hit pause at some point... there: I have now jumped into the TensorFlow source code. So not only can I step through my own source to see what's going on, I can actually step into the TensorFlow source. Maybe I've done something to trigger a bug in TensorFlow, or maybe I'm doing something wrong and TensorFlow isn't handling it properly, and I can fix it and then contribute back to TensorFlow. All of this is made possible by eager execution in TensorFlow. To turn it on, as Josh mentioned, all you've got to do is use TensorFlow version 1.7 or later and enable eager execution at the top of your program. Can we switch back to the slides now, please? And, as promised, back to Josh.

Thank you very much, Laurence. So, in the little time we have left, a couple of resources. First, we have a Machine Learning Crash Course. It's pretty good and it's short, probably a day or two of your time. It won't teach you everything there is to know about ML, but it's a very solid basic introduction to things like what loss is and what gradient descent is. The book I'd recommend, and I'm going to badly mispronounce his name, is by a Googler, the author of Keras, François Chollet. It's a wonderful, wonderful book called Deep Learning with Python. There are many, many books with very similar titles, but the one you want is written by François and published by Manning, and it comes with a collection of Jupyter notebooks that are all freely available. It will basically teach you how to use Keras, and because Keras is part of TensorFlow, all of that code will work directly in TensorFlow with no changes other than the imports. That will give you a really solid foundation in Keras, and later, if you're interested, you can learn how to use things like tf.data and other advanced features of TensorFlow to do even more. It's a wonderful starting point.

One more takeaway: I also wanted to show TensorFlow.js in this talk, but we only have 40 minutes. With a Keras model you can literally call model.save (it's one line of code) with some path like 'foo.h5', and now you've saved your model to disk. Then, if you go to the TensorFlow.js site, there's a tutorial showing you how to import that saved Keras model into the browser relatively easily, so the same API is compatible across these platforms.

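The save step itself is one line; the file name here is just an example:

    # Save the trained Keras model to a single HDF5 file.
    model.save('my_model.h5')

    # Later, or somewhere else, load it back. The TensorFlow.js tutorial
    # starts from a file saved exactly like this.
    restored = tf.keras.models.load_model('my_model.h5')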
So, basically: check out Colab regardless of whether you're using TensorFlow or not; it's a really, really valuable resource, and it's also wonderful for education. If you're teaching a workshop or anything like that, students can just jump right in. Check out the workshops. Over the next three months or so we're going to be updating a lot of our tutorials, the TensorFlow tutorials, to Keras, but in the meantime I'm hacking some samples together so you can see what it looks like. There's a demo on js.tensorflow.org, like this one, where we can actually train it to recognize your face looking left, right, up, and down, and then you play Pac-Man by moving like this. It's really cool. So take a look; it's all open source. Thank you very much. We'll be around afterwards for as long as you want, to take questions. I really appreciate your time, and I hope this stuff is useful to you. Thank you.
