Cloud AI Services: What they are and how to use them

Karl Weinmeister
Manager, Cloud AI Advocacy at Google

About speaker

Karl Weinmeister
Manager, Cloud AI Advocacy at Google

Karl is a Developer Advocacy Manager from Google’s Developer Relations Artificial Intelligence and Machine Learning team. Karl has worked extensively in cloud and mobile, and was a contributor to one of the first AI-based crossword puzzle solvers that is still referenced today.

About the talk

Powerful machine learning capabilities are available as Cloud AI services. You can embed off-the-shelf models for vision and natural language processing into your applications with APIs. AutoML can help you build a custom model with a minimum of code. If you do want to build a custom model by hand, there are serverless tools to develop, train, and deploy your model. In this session, we will discuss why and when to use AI services versus training your own model. We will cover differences in data requirements, performance, integration, and ease of use.

Transcript

All right, so let's go ahead and get started. I'm going to talk today about AI in the cloud; of course, I'm with Google, so that's what I know best. You're going to hear about all the different ways that you can leverage cloud services, from APIs all the way to custom model development, and everywhere in between with AutoML. So whether you're an experienced data scientist or a developer who's looking at bringing some machine learning into your applications, I hope there's something helpful for you today. We'll see

some of these tools in action: we'll first cover APIs, then move on to how to build your own custom model, and finally AutoML. Let's start with use cases. The reason I start with this is that with machine learning, as we all know, there's a lot of promise, and there's also some hype, so it's useful to know where you can get a lot of value out of machine learning applications. There are a few clusters of use cases that we find for machine learning. One is predictive analytics, and that is looking at historical data and trying to make predictions and recommendations for the future. To give you some

examples here: fraud detection is looking at past transactions and using that information to look at a new transaction and make a determination on it. Preventive maintenance is another common use case with industrial equipment: by looking at sensors and other ways that we collect information, can we prevent a catastrophic failure before it happens? A couple of others here quickly: we might look at sales of different products or at different stores, and ask what might happen next month or next quarter so we can make better

decisions. Another area is unstructured data, so maybe videos, images, text data, and making some sense of those: triaging and clustering that information. Automation is another common use case, where we look at a step of the workflow that might be error-prone or tedious, and ask how we can use machine learning to act on the data, make a good recommendation, and automate part of the process. Examples here might be paper documents that we can now digitize easily.

Another area is triaging information. Maybe we have a trouble ticket or a customer inquiry, and we can figure out which customer service agent might be best for that issue based on what's in the text. Those are just a few examples of things you could do. Finally, personalization: how can we get to know our customer better and tailor their experience with the data that we have? Okay. So now that we've discussed use cases, let's talk about the tools. The essence of this presentation is that there's a spectrum of possibilities:

on one hand, we can work at the lower level, at the machine learning infrastructure level. On the other hand, we can talk about APIs that don't require you to bring your own data for building a model; you're leveraging an existing model and simply passing in information and getting some outputs out of it. And then in the middle is the idea of a platform of tools to create a model, train a model, test it, et cetera, which can help automate some of the process. So let's start

first with pre-trained models with APIs. A few examples of capabilities range from vision to natural language to translation; these are ready for you to accomplish common machine learning tasks. Let's start with the Natural Language API. Actually, let's do a demo, since that just makes it a little bit more fun. I'm going to go to the cloud.google.com/natural-language site. Just for fun, I put information about this conference into the web demo, plugged that in here, and

submitted it. What's really happening is that this web demo is calling the API under the hood, and we get results back. There are four categories that you see here. The first is entity extraction, and this is where we take text and figure out what the main objects or concepts in the text are. What comes back is a ranked list of these items with a salience score, which tells you how central each item is to the meaning of the document. The various words are tagged with whether they're a person, a location, etc.

Sentiment analysis extracts the feeling about the document, ranging from 1 for positive to -1 for negative. Let's just look at some of the sentences: we see the document as a whole scores positive, and then we can see it at the individual sentence level. Where the text says we can set up a dynamic registration, and that it's easy to use and easy to monitor, you see a positive sentiment, et cetera. And then you also see that each score has a magnitude, which tells you not just what the score is but the strength of

that assessment. The next category is syntax: we can parse the sentence to figure out the relationships between the words, the part of speech, whether a word is plural or singular; all of that comes back. And finally, we can categorize the text into a common set of categories. We'll talk later about the case where you need to do some kind of custom categorization. So that's the Natural Language API. To use it, I'll show you a few examples in different programming languages. You would need to do a little bit of authorization, and then you would

pass in your document. What you're doing here in the curl command, if you're familiar with Linux, is just directly calling the REST API: the URL is the Google language API URL, you call the classifyText operation on it, and you'll get back some JSON that your application parses.
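
For readers who'd rather use an SDK than raw curl, here is a minimal sketch of the same classifyText call using the google-cloud-language Python package; the sample text is illustrative, and authorization is assumed to be set up via GOOGLE_APPLICATION_CREDENTIALS.

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

# classifyText needs a reasonably long input to assign categories.
text = ("Google Cloud offers machine learning services for vision, natural "
        "language, and translation that developers can call through simple "
        "APIs without training their own models.")
document = language_v1.Document(
    content=text, type_=language_v1.Document.Type.PLAIN_TEXT
)

# Returns a list of content categories with confidence scores.
response = client.classify_text(request={"document": document})
for category in response.categories:
    print(category.name, category.confidence)
```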

Next, let's look at the Vision API. The Vision API provides many services; I'll just name a couple of the more common use cases. Image classification, as you see in the top left, is where you're able to look at an image and get labels about what's in that image. Object detection finds the location of various objects in the image. OCR is where we're trying to extract text. There are other APIs to consider as well. So let's take a look at a kind of fun example here, where we have an amusement park in Sydney called Luna Park. Coming back, correctly, first and foremost it's identified as a landmark, but you'll see the other key parts of the picture: the sky, daytime, architecture, et cetera. So you just get the set of

labels and their confidences coming back from the API. With OCR, you notice that it's able to pull the words out of a picture. There's even landmark detection, so it can cross-reference with Google Maps; it's noticed at 85% confidence where this picture is located, and it gives you back that location. How do you use it? I'm going to show you the REST API again, and we'll move on to how to use the SDKs in a moment. But just if you're

looking at the structure underneath: you have an API endpoint, you're making a request, and you pass in the type of information you want in your request. Then your response will have the information that corresponds to all the features you're looking for in that picture, whether you're getting labels back or looking for logos or any of the different things you can do; that will come back in the response. How do you do that in the SDK? Here's an example with the Python SDK: you import the

package, then you instantiate a client, point to an image located in a storage bucket somewhere, and at the end, where it all happens, we take that client and call the label detection method on it, passing the image location. You're going to get back a set of labels and scores for those labels.
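
A minimal sketch of that label-detection flow with the google-cloud-vision Python package; the bucket path below is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Point at an image sitting in a Cloud Storage bucket (placeholder path).
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://my-bucket/luna-park.jpg")
)

# label_detection returns a ranked list of labels with confidence scores.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, label.score)
```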

The Video Intelligence API is very similar to the Vision API, except it's looking at a sequence of frames, and there are some other capabilities it has, things like tracking objects across frames, or speech transcription, so it can extract text out of the audio. Here's an example in JavaScript: we instantiate a client again, locate the video file, tell it what capabilities we want (label detection here), and then call the annotate video method on that client. What comes back is very similar to the Vision API, but because it's a video, we also get information about when each label enters and exits the video.
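
The slide shows JavaScript; here is a rough Python equivalent of the same call, assuming the google-cloud-videointelligence package and a placeholder video path.

```python
from google.cloud import videointelligence

client = videointelligence.VideoIntelligenceServiceClient()

# Video annotation runs as a long-running operation.
operation = client.annotate_video(
    request={
        "input_uri": "gs://my-bucket/dog-video.mp4",  # placeholder
        "features": [videointelligence.Feature.LABEL_DETECTION],
    }
)
result = operation.result(timeout=300)

# Each label comes with the segments where it appears in the video.
for annotation in result.annotation_results[0].segment_label_annotations:
    for segment in annotation.segments:
        start = segment.segment.start_time_offset.total_seconds()
        end = segment.segment.end_time_offset.total_seconds()
        print(annotation.entity.description, start, end)
```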

Here we see that the dog starts at 3 seconds and exits at 5 seconds, right, some additional context. And, you know, you can do audio-to-text with the Video Intelligence API, as we said, but the Speech-to-Text API will do that as well. Alright, so we talked about pre-trained APIs; hopefully that gives you a feel for what you can do with those. What if you want to do something that is custom, that isn't a predefined categorical label? Let's look into that. Consider first a case where we're trying to figure out what this

picture is. It goes into the Vision API, and with 99% probability I see that it is a cat. Great. But what if I'm asking a different question: what type of cat is this? The predefined model that I just told you about doesn't provide that, so we'll need to create our own model for it. What would be the steps to do that? Let's walk through this. The first step is you would need to gather and pre-process your data. You might take enough pictures, with different angles and different lighting, of the types of cats, and do your best to

get enough of them in a balanced set, so that you have a similar number of each type. Then what you might do is split your data, where you might take, say, 80% of your data as a training set and set aside 20% as a test set. You build a model out of your training set, leaving some data aside so that you can get an objective evaluation of the model against images it's never seen before.
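
A quick sketch of that 80/20 split; scikit-learn's train_test_split is one common way to do it, and the image_paths and labels lists are assumed to be gathered already.

```python
from sklearn.model_selection import train_test_split

train_paths, test_paths, train_labels, test_labels = train_test_split(
    image_paths,      # file paths for the cat pictures (assumed gathered)
    labels,           # one label per image, e.g. the cat type
    test_size=0.2,    # hold out 20% for an objective evaluation
    stratify=labels,  # keep a similar number of each type in both sets
    random_state=42,  # reproducible split
)
```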

Okay, so let's look at building a model. Here we'll just show a couple of key parts of the code. This is actually going to be a little more complicated than your network has to be, because I'm showing a convolutional neural network, which is typically used for image data. A lot of this code is in the TensorFlow GitHub repository, and you can do things like transfer learning, where you don't have to reinvent the wheel. You might import some of the TensorFlow libraries and packages here. Then what we're doing is building the model, with each

layer of the neural network in sequence. We have a couple of convolutional layers; these are essentially small matrices that convolve, or kind of go across the image, and extract a signal from it, while applying some other functions. We flatten it, we go through another layer, and then we finally end up, if you see at the very bottom, at num_classes, so the number of types of cats. Say there are 10 that we're classifying: we end up with 10 outputs, each a probability score between 0 and 1,

predicting that type. Training adjusts those weights and biases until we find the right parameters that are going to predict best. To do that training process in the TensorFlow Keras framework, you would call model.fit, passing in your training data and then some other parameters. Now you're ready to evaluate your model: once it's been fit, you're going to look at how it performs on the test data to get some accuracy numbers.
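
A condensed sketch of the kind of Keras model being described; the input size, layer widths, and the 10 cat types are illustrative values, and the train/test arrays are assumed to exist.

```python
import tensorflow as tf

num_classes = 10  # say, 10 types of cats

model = tf.keras.Sequential([
    # Convolutional layers: small matrices that slide across the image
    # and extract signal, with pooling to downsample.
    tf.keras.layers.Conv2D(32, 3, activation="relu",
                           input_shape=(150, 150, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    # Flatten, go through a dense layer, and end with one output per class.
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(train_images, train_labels, epochs=10)  # training data assumed
model.evaluate(test_images, test_labels)          # accuracy on held-out data
probabilities = model.predict(test_images[:1])    # per-class scores in [0, 1]
```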

And then you're ready to deploy your model. You might end up with a model file, and there are a couple of styles here. You could deploy your model to the cloud, where you could use it in a server-side application, a web application, or something like that. Or maybe, if it's a mobile application where you don't want that traffic going to the cloud, you could deploy on device. Then how do you take advantage of the model? You can call a method like predict, passing in the images; images at the end of the day are really just matrices with the RGB values of the colors. So we talked about building the model. Let's just take a step back: there's a lot more than just getting a model to 99% accuracy on whatever you

wanted to do. There are lots of other aspects. I mean, how do you test the model? How do you connect it to the systems that are going to use the model? How do you validate the data that you use? What about retraining your model with new information to create new models, after you've done the research phase of your project and you've reached a good point with your model? Okay. So the AI Platform provides you the capability to build your models, with notebooks and training services

and a whole lot more, as well as pipelines to connect all the steps together and make this a true production deployment. I'll briefly walk you through a few of these things. You might create a notebook server; here we're seeing where you create a virtual machine preloaded with your favorite library, whether that's TensorFlow, PyTorch, or something else. You pick how many GPUs and the size of the machine, and you're able to go right into JupyterLab. There's also serverless training. Maybe the size of your job

doesn't fit onto one virtual machine and you want to run a large job. You could use a serverless infrastructure, which will distribute your training jobs; you don't have to worry about the back-end infrastructure. It will do the training and then, perhaps when you're finished, put the model into a storage bucket so that you can use it in the next step of your pipeline. Along with your training job, you can pass in the range of hyperparameters that you want to search across to find the optimal parameters when you're training your model. You can also host the model in the

cloud. This is where something like the prediction service comes in. It gives you an online endpoint so that you can perform inference in the cloud. It has auto-scaling capability, so it can scale up and then scale back down to zero as needed. It handles all the logging and monitoring for you, and there's a variety of different options in terms of AI hardware to leverage.
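
A rough sketch of calling such an online endpoint from Python, using the pattern AI Platform documented with the googleapiclient discovery client; the project and model names are placeholders, and the instance format depends on your model.

```python
from googleapiclient import discovery

service = discovery.build("ml", "v1")
name = "projects/my-project/models/my-model"  # placeholder names

response = service.projects().predict(
    name=name,
    body={"instances": [[0.1, 0.2, 0.3]]},  # model-specific input format
).execute()
print(response["predictions"])
```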

That all sounds pretty cool, but what if you're just getting started, or you just want a little bit of a fast track here? Is there a simpler way? Yes, there is: Cloud AutoML. The process that I was showing you before, it takes care of for you. Of course, you start by identifying the problem and gathering the data, but beyond that point it handles everything, from splitting the data into train and test, to building the model, finding the optimal one, all these kinds of things, including your feature engineering. It's going to train the model and deploy it for you as well, so you can have a REST endpoint for it, and then you can just make predictions with your AutoML model. Let's look at a couple of examples. Let's start

with vision. With AutoML Vision, you start by creating a dataset and telling it what kind of problem you want to solve; you see a few options: is there one main label for each picture, is it multi-label, or is it object detection? Then you might import your images. Here what you're seeing is that you might import a CSV file; that CSV file would have the location of each image along with the label of the image. It will go through and import the images, which might take a few minutes.
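
A sketch of what a few rows of that import CSV might look like, pairing each image's storage location with its label (bucket and file names are placeholders):

```
gs://my-bucket/flowers/daisy_001.jpg,daisy
gs://my-bucket/flowers/rose_017.jpg,rose
gs://my-bucket/flowers/tulip_203.jpg,tulip
```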

Then you're going to end up with the screen here, where we can view the dataset. Here we see the various flowers that have been imported; we can see the labels for them, and we can adjust some labels if we need to. It gives you an overview of the data. So now, with that data, we can train our model. Here we've trained a couple of models, and you can see some accuracy statistics on those models. You can then deploy your model, where you create an endpoint to serve

it to your users, or export it. Okay, now on to AutoML Natural Language. Remember, before, when we talked about pre-trained models, we took that example text about the conference and did entity extraction, sentiment analysis, et cetera. If you want to create your own customized version of those, how would you do that? Here's a simple, kind of fun example on Kaggle, which is a data science competition site: we have the movies dataset, with 45,000 movies and 26 million ratings. Let's see what we can do with that. Say we wanted to take a

movie description and classify the genre of the movie; we could do something like that with AutoML Natural Language. Another thing might be custom sentiment. Here we might have terminology for our specific problem or industry; say we're looking at the airline industry. Maybe you want to tag some of these terms: "waiting on the tarmac" as negative, "so much legroom" as positive. You're kind of looking for some of those phrases which, you know, the standard sentiment analysis would most likely cover in general pretty well, but

you can really fine-tune it and create your own model, if you'd like, with this capability. Same thing with entity extraction. In the examples you've seen, some of the entities are pretty general: we say, you know, Laura is a person, she wants a consumer good. But maybe you want to tag those a little more specifically: it's a beverage, a restaurant, et cetera. You could do that with a custom model. Why would you do that? There's an example in the UI where we're looking at various venues, and we want to take the menu and parse out:

you know, is this a food item? Is this the name of the restaurant? Is this a heading on the menu? Given enough of these menus, can we start to look at a new menu and figure out what the entities are on that menu? All right, the final area of AutoML I want to talk to you about today is AutoML Tables. Let's look at a fun example, or really a business example, here. Another Kaggle dataset is the Spotify music genre dataset, which has plenty of

music. What we want to try to do is predict the genre of the music, like we did with the movies, based on the information in the database. When we listen to something, we might have a heuristic, like our brain just comes to conclusions. Here, what we're doing is looking at some quantitative data: maybe it's the loudness, the key signature, the tempo. All together, this data can provide us some information about that piece of music and what genre it's likely part of.

So we could build a model and pass the data into AutoML to do a classification problem, to figure out which genre it belongs to. A different type of problem is predicting the price of an item. Imagine that we're going to list something for sale and we're not sure what price makes sense; say we're listing a piece of clothing and we put in the brand. What you want to do here is look at historical data and then guess what the right price would be. So we define a target column in AutoML Tables. We actually did this in a Kaggle competition, and

what I want to show here is the question: how well does AutoML do? If you've heard of Kaggle, you're aware that there are a lot of great data scientists on this platform, and these competitions can last for weeks; folks are working nights and weekends, improving their models every day, moving up the leaderboard. So what you see here on the y-axis of this chart is the error rate, and on the x-axis what you're seeing

is the leaderboard rank. The trend shows us that as we get closer and closer to the top of the leaderboard, the error is going down, right? We're trying to get as close to zero as we can. But what you see is that it flattens out at about a thousand, and what that's telling us is that there's a certain point where the problem is pretty close to being solved; there are just minor variations among the data scientists past that point. And so when we applied AutoML Tables to it, what we saw is that within one

hour it did great. You can constrain the search to say, okay, I'm going to give it a budget of 1 hour or 8 hours or whatever it might be to perform this search for the optimal model. We found that in one hour it hit that flattening point, and then, as you provide more time, it can continue to work down that curve. So this just puts in perspective what AutoML Tables can do. In summary, we saw three different kinds of cloud

AI services today: APIs, the platform, and AutoML. I think we have a little bit more time, so I'll do a couple of quick demos of some other things. This one is using the Natural Language API. What we're going to show is the API on clothing reviews. So let's look at these reviews. The first thing we're going to do is import the CSV; I'm going to skip over that. This is in a Jupyter notebook. We pull back the

first five rows and just take a look at those. What's in here? You see the text of a review and some other information, like what category of clothing it is. Now let's apply a little filter: we want reviews that are long enough to be expressive; we don't want a review with, you know, two words in it, right? So here I'm basically applying a filter to that column and keeping just the longer reviews, and I've already done a little bit of cleanup, so we don't need that other column. Let's pick one of these reviews as an example; I'm going to pick the third one here.

"I had such high hopes for this dress. I really wanted it to work, but it was too small," you know, and it goes on from there. So this is a real review. Let's do some sentiment analysis on it. You see where we import the packages, instantiate the client, and call analyze sentiment on it, and it returns with that sentiment. So you get a sentiment score of -0.25 coming back and a magnitude of 3.5. Okay, not too bad.
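
A minimal sketch of that single-review sentiment call with the google-cloud-language package; review_text stands in for the review above.

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content=review_text,  # the review picked above
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

sentiment = client.analyze_sentiment(
    request={"document": document}
).document_sentiment
print(sentiment.score, sentiment.magnitude)  # e.g. a mildly negative score
```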

So let's now think about, you know, if we're running this business and we're trying to look across our whole product portfolio: how do I make sense of all these reviews? Maybe I can get some insights into which clothing items have high customer satisfaction and which don't, and where I need to fix some issues. So let's take a sampling of a hundred of these items, call the API for each, and do some analysis across multiple areas. What I'm doing here is calling analyze sentiment and appending the scores and

magnitudes to my original dataset. What that looks like is here: you know, the categories and all that, and now you see that I have two more columns, score and magnitude. So we're really showing data augmentation, using machine learning to give us knowledge about our data. Now let's look across the dataset and do some visualization. What I'm doing here is using the matplotlib library to do some analysis.
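
A sketch of that augmentation-plus-plot step, assuming the reviews live in a pandas DataFrame df with hypothetical text and category columns, and a get_sentiment helper wrapping the call shown earlier.

```python
import matplotlib.pyplot as plt

# Score a sample of 100 reviews and append the results as new columns.
sample = df.sample(100, random_state=0).copy()
results = [get_sentiment(text) for text in sample["text"]]
sample["score"] = [r.score for r in results]
sample["magnitude"] = [r.magnitude for r in results]

# One box per clothing category, showing the spread of sentiment scores.
sample.boxplot(column="score", by="category", rot=45)
plt.show()
```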

This is a box plot; if you haven't seen one before, it's a nice way to show the distribution of data. Here we have each of the categories; the line in the middle is the median, the top of the box is the 75th percentile, the bottom the 25th, and then the min and the max are shown. In one glance you can see the spread of the data, which can be helpful where you say, okay, it looks like for swimwear we're doing great, but for pants there might be some issues in our product. So we're using sentiment analysis to get more insights about our products. Next

is entity extraction. Again, here's our text; we call analyze entities, and I'm just going to pull back the first five entities. So you see here what the main words are, right? We've got "hopes," "dress," along with a salience score. You also see at which character index you can find them in the text.
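
A minimal sketch of the entities call, with the same package and document setup as the sentiment example; the encoding type is set so the character offsets come back.

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content=review_text,
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

response = client.analyze_entities(
    request={"document": document,
             "encoding_type": language_v1.EncodingType.UTF8}
)
for entity in response.entities[:5]:
    offset = entity.mentions[0].text.begin_offset  # character index in text
    print(entity.name, entity.salience, offset)
```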

So that's the Natural Language API. All right, I think I have a few more minutes, this time for tabular data. Here's the scenario: the Fire Department of New York has this dataset. I chose it because it only has a few columns, so we don't have to understand too deeply what's happening. It gives us response times for how quickly they responded to various incidents over time. You see a column for the month; the type of incident, whether it was a false alarm or a medical emergency; where in the city it happened; how many incidents there were; and finally what we're trying to predict, the average response time. So we would go into Tables here and import that CSV. I've already done this, so we're

ready. Once you import the data, it will give you some statistics. You see how many variables are categorical, timestamp, numeric, et cetera, plus quality indicators: percent missing, percent invalid, and correlation with the target. Here's what you want to look for: if you have something near zero, maybe, we don't know for sure, but maybe it's not a useful feature in the model. If it's near one, that might be a warning that you have some leakage, where that feature contains information that you

really wouldn't know prior to making the prediction. So this gives you a quick glance at things; you can click on any of these and see the distribution of the data and more information like that. Okay, so then what you can do is just go ahead and train your model. You pick the budget that you want to put into it, say one hour, and which features you want to include. You also have some advanced options, if you choose, around the optimization objective, or what you might call the loss

function, depending on whether you have outliers or not and what you want to use for that. And it has early stopping, so if it's not making improvement, it's not going to use up the budget. So you go ahead; we're not going to do this all today, but it would train the model, and then you would end up with one of these models showing up, with some accuracy statistics. What you're seeing here is the mean absolute error: 9 seconds off. Not bad, given that the average response time was about 4 and 1/2 minutes or so. So

that's not bad, and you can click and get more detail there, including things like explainability, so we can look at the model and see which factors were the most important in the prediction. It looks like incident classification was number one; so whether it's a false alarm or a medical emergency, whatever type it is, makes a big difference. Then where in the city it happened was second, and the time was the third most important feature. Finally, how do you use this? You've got a few different options. You have

batch prediction, where you can take a whole bunch of rows from a CSV or from BigQuery and run predictions on those. You can do online prediction, where you call an endpoint, or, here, just do a prediction in the website. And finally, you could even export your model into a Docker container to use it in another application. All right, so that ends my demo.
