Duration 14:27

Overview of AI Notebooks on Google Cloud

Suds Narasimhan
Product Manager at Google Cloud
Google Cloud Next 2020
July 14, 2020, Online, San Francisco, CA, USA

About speaker

Suds Narasimhan
Product Manager at Google Cloud

Experienced product manager for cloud computing services on AWS and Azure, with expertise in pricing, licensing, product positioning, go-to-market strategy, and developer experience. Exceptional ability to lead international, cross-functional teams to deliver on mission-critical initiatives. Extensive experience in the technology, health care, and financial services industries. Award-winning team leader.


About the talk

Notebooks are the go-to tool for data scientists to develop and deploy AI/ML models. Watch an overview of AI Platform Notebooks, an enterprise-ready managed Jupyter notebook service.

Speaker: Suds Narasimhan

Watch more:

Google Cloud Next ’20: OnAir → https://goo.gle/next2020

Subscribe to the GCP Channel → https://goo.gle/GCP

#GoogleCloudNext

AI125


Transcript

Hello, my name is Suds Narasimhan. I'm a product manager in Google Cloud for Cloud AI Notebooks. Today I'm going to walk you through an overview of Cloud AI Notebooks, our managed JupyterLab notebook service that lets enterprises develop models on Google Cloud and deploy them into production. Artificial intelligence and machine learning have been among the key disruptive technologies that enterprises have adopted over the last four or five years, and AI and ML look set to keep disrupting the enterprise for the next ten. The key to unlocking value with artificial intelligence starts with developing models, and the simplest way to develop a model is to use notebooks, and within notebooks to use open-source frameworks. TensorFlow, PyTorch, and many other OSS distributions exist today that can help you quickly develop a model and gain insights. As a data point, more than 8 million Jupyter notebooks exist today, and that count is only going to keep growing. Google has been at the forefront of bringing notebooks to different segments of customers over the past decade. Colab, for example, is targeted at enthusiasts, people learning to be data scientists, researchers, and students, and it provides a collaborative environment for data science. AI Platform Notebooks, on the other hand, is targeted at enterprises, which means security is built in, something enterprises really care about. Today we're going to spend some time looking at AI Platform Notebooks and how it can speed up your model development.

As we noted, enterprises want insights in days, not weeks. This means data scientists want to focus on the difficult job of model development, not on the peripheral tasks of running a notebook and the infrastructure associated with it. Enterprises encounter several challenges in running notebooks and open-source frameworks. The first is managing the open-source frameworks themselves and making sure the operating systems, the frameworks, and their dependencies work correctly together. Hosting notebooks and maintaining them on top of these frameworks is an additional pain point. Customers also want to make sure that access to their data and to the models developed in the notebooks is secure and limited to the right personnel. And since AI and ML involve accessing data across the enterprise landscape, it is important for a notebook service to connect with all of the data sources that reside within an enterprise today. Our AI Platform Notebooks service solves these problems by providing three key features.

First, we completely manage the installation of the Jupyter notebook environment, along with all the open-source frameworks and dependencies, so you don't have to manage any of that life cycle. Second, our notebooks ship with security features such as VPC Service Controls, CMEK, and other GCP horizontal features built in. Third, Cloud AI Notebooks is integrated with several data services, like BigQuery, which we will dive into a little later.

Notebooks are really easy to get started with. All you have to do is go to the GCP console, find Notebooks under AI Platform, and create a new instance with the deep learning framework you need and the type of VM you want to run the notebook on. So what makes AI Notebooks a fully managed experience? As part of AI Platform Notebooks, we pre-install all the required libraries in all the environments: we support popular open-source frameworks such as TensorFlow and PyTorch and mathematical libraries like NumPy and SciPy, and we build open-source tooling right into our Deep Learning VMs.

Our notebooks run on top of these Deep Learning VMs and take advantage of those libraries. Customers can configure the notebooks through the console, but they can also get a link to access the Jupyter notebook through a proxy outside of the GCP console. AI Notebooks is designed to work with different kinds of compute instances within GCP. Along with the Deep Learning VMs we just spoke about, which are GCE instances that come pre-installed with deep learning frameworks, the notebooks are also built to work with custom containers. So if there is a specific deep learning library you want inside a custom container, you can build that container and run a notebook on top of it. We will talk a little more about our data integration in the slides that follow, so you'll be able to query data where it is stored and use it to train your models.

Let's take a look at how AI Platform Notebooks is architected to work with a variety of compute instances. We just talked about how it works with Deep Learning VMs, which are GCE instances with deep learning frameworks pre-installed in the VM. AI Platform Notebooks also works with Dataproc clusters and with containers, which means that if there is a particular deep learning framework you want, you can create a custom container for it and have AI Platform Notebooks work with that container. You can, for example, query data from BigQuery and use that data as the basis for training your model, and the notebooks also integrate with source control systems, so it's easy to push and pull code to and from your repositories.
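
As a rough illustration of the BigQuery integration, a query along these lines can run straight from a notebook cell; the client library is typically pre-installed on the Deep Learning VM images, and the public dataset below is only a stand-in for your own table:

```python
# Minimal sketch: query BigQuery from a notebook cell and pull the result into pandas.
from google.cloud import bigquery

client = bigquery.Client()  # picks up the notebook VM's project and credentials

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""
df = client.query(query).to_dataframe()  # results land in a pandas DataFrame for training
df.head()
```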

Why would you want to use AI Platform Notebooks? It is really easy to get started with your model development: since you don't have to deal with installing the deep learning frameworks or setting up the notebook environment, you can get going very quickly. In addition, since the notebook environment for code development is already very familiar to data scientists, there is little to relearn. AI Platform Notebooks also lets you scale and control costs effectively. You could start with a very small instance for model development and, as you ramp up your training, move to a bigger VM, or better yet use a GPU or much more powerful machines to train your models. AI Platform Notebooks comes with security built in; as we discussed, VPC Service Controls as well as CMEK are built right into AI Platform Notebooks. And AI Platform Notebooks is integrated with other services within GCP, so it plays within an ecosystem of analytics tools as well as AI Platform.
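
As a quick check after moving to a GPU-backed machine type, a cell like the following (a minimal sketch assuming a TensorFlow Deep Learning VM image with GPU drivers installed) confirms the accelerator is visible from the notebook:

```python
# Verify that the notebook kernel can see the attached GPU after scaling up.
import tensorflow as tf

print("TensorFlow:", tf.__version__)
print("GPUs visible:", tf.config.list_physical_devices("GPU"))
```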

That makes it really easy to build, train, and deploy models. Let's talk about some recent updates we have made to our notebook service in the last six months. We recently announced an integration with Kaggle. As you may know, Kaggle is a community of five million data scientists who write and share code. With this integration between Kaggle and AI Platform Notebooks, customers have access to a virtually limitless and customizable compute environment that lets them scale up their work easily. We've already seen thousands of customers move from Kaggle to Cloud AI Notebooks using this integration, and we expect these numbers to grow in the coming months.

Another recent innovation we announced was the integration of Dataproc with our Smart Analytics framework. The Smart Analytics framework brings Apache Beam and Dataproc together with AI Platform Notebooks. Data scientists often need to analyze terabytes of data before starting to work on a model, and with this built-in integration, Spark and deep learning can be authored from one place while bringing enterprise controls to Spark-based notebook interfaces. If you want to learn more about Apache Beam in notebooks, there's a fantastic tutorial you get when you launch a notebook using the Smart Analytics framework and choose the Apache Beam experience. Getting started with the Smart Analytics framework is super easy: all you have to do is make the Smart Analytics framework choice and pick an Apache Beam or Dataproc distribution, instead of a TensorFlow or PyTorch distribution as you would for a deep learning environment. The goal of Cloud AI Notebooks is to help you develop models quickly and to easily scale and streamline those models within your production workflow.
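
To give a flavor of authoring Spark and deep learning from one place, a Dataproc-backed notebook kernel could read BigQuery data with Spark roughly as follows; this sketch assumes the spark-bigquery connector is available on the cluster, and the public table is only illustrative:

```python
# Sketch: read a BigQuery table into Spark from a Dataproc-backed notebook kernel.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("notebook-spark-sketch").getOrCreate()

names = (
    spark.read.format("bigquery")  # requires the spark-bigquery connector on the classpath
    .option("table", "bigquery-public-data.usa_names.usa_1910_2013")
    .load()
)

# Simple aggregation before handing features off to model training.
names.groupBy("name").sum("number").orderBy("sum(number)", ascending=False).show(10)
```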

Let's take a quick look at some examples of how you can go about doing that. AI Notebooks plugs into your data pipelines, which makes feature engineering as well as data labeling very easy. It helps you model your data efficiently, with what-if analysis and embedded AutoML capabilities, and you can evaluate your models through feature attributions, what-if analysis, and continuous evaluation. We've also built algorithm-side optimizations, AutoML and neural architecture search, right into Cloud AI Notebooks, which makes model development simpler and faster. All of this sits on top of an open, multi-cloud platform that also has access to some of the most powerful compute you will find anywhere in the world, for example TPUs.

Cloud AI Notebooks makes it easy for you to scale and streamline your model development because these notebooks work in an ecosystem of data services and AI services. Cloud AI Notebooks can be easily integrated into your data pipelines: if you ingest data using Data Fusion or transform data using Dataproc and load it into BigQuery, you can easily access that data and label it. As previously mentioned, the notebooks also make it easy to build, develop, and train models, with pre-built algorithms, AutoML, and access to our training service attached. Cloud AI Notebooks also makes it easy to deploy these models, by verifying your model performance using Explainable AI and by deploying the model directly to our prediction service. The prediction service also ships with a continuous evaluation feature that helps you monitor the model once it's in production.
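
Once a model is deployed to the prediction service, calling it from a notebook can look roughly like this; a minimal sketch using the AI Platform prediction API via the Python client, where the project ID, model name, and input values are placeholders rather than anything from the talk:

```python
# Sketch: send an online prediction request to a deployed AI Platform model.
from googleapiclient import discovery

project = "my-gcp-project"  # placeholder project ID
model = "my_model"          # placeholder deployed model name

service = discovery.build("ml", "v1")  # AI Platform Training & Prediction API
name = f"projects/{project}/models/{model}"  # optionally append /versions/<version>

# Each instance must match the input signature of the deployed model.
response = (
    service.projects()
    .predict(name=name, body={"instances": [[1.0, 2.0, 3.0, 4.0]]})
    .execute()
)
print(response.get("predictions", response.get("error")))
```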

All of these components are built with access to pipelines that make it easier to containerize and monitor the entire workflow end to end. Also built into AI Platform Notebooks is Explainable AI, a set of libraries that help you figure out why your model made a particular prediction. For example, on the right you can see a color-coded map of which features mattered most in making a particular prediction. Explainable AI can be used with tabular, image, and text TensorFlow models, together with the built-in What-If Tool. With the What-If Tool, you can change the values of the attributes you are building the model with and see how much that changes the prediction of a particular model. The Explainable AI libraries are already available in some of our deep learning distributions today, and they'll be available in more in the coming weeks.
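
The What-If Tool itself ships as a notebook widget; as a plain-Python illustration of the idea it implements, here is a hand-rolled sketch that perturbs one feature and records how the prediction moves (the model object and feature values are assumed, not taken from the talk):

```python
# Not the What-If Tool: just a hand-rolled sketch of what-if analysis.
# `model` is assumed to be any trained model exposing predict() (e.g. scikit-learn
# or Keras); `baseline` is a single example's feature vector.
import numpy as np

def what_if(model, baseline, feature_index, candidate_values):
    """Return (value, prediction) pairs after perturbing one feature."""
    results = []
    for value in candidate_values:
        example = np.array(baseline, dtype=float)
        example[feature_index] = value          # perturb a single attribute
        pred = model.predict(example.reshape(1, -1))[0]
        results.append((value, pred))
    return results

# Example usage (hypothetical model and values):
# for value, pred in what_if(model, baseline=[5.1, 3.5, 1.4, 0.2],
#                            feature_index=2, candidate_values=[1.0, 3.0, 5.0]):
#     print(f"feature[2]={value} -> prediction={pred}")
```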

Of course, AI Platform Notebooks runs on AI-optimized infrastructure within GCP. For example, our v3 TPU pods deliver 84% faster performance in both object detection and machine translation compared to on-premises infrastructure. This AI-optimized infrastructure is available only on GCP among the public clouds, and it clearly outperforms on-premises systems.

Before looking ahead at our roadmap beyond 2020, it's worth looking back at where we came from. We launched AI Platform Notebooks in beta in April 2019, and since launch we've seen tremendous interest among enterprises in developing models with the product. We saw significant adoption over the last year and successfully took the product to GA in March 2020. In 2020 we plan to build in features around machine learning operations, such as tracking and collaboration, and innovation across our entire analytics estate. You can read more on the Google Cloud blog, get started with the five-minute tutorial available on the product page, and find in-depth documentation of all the features on the product page as well. Thank you for joining us today. I hope you enjoyed this presentation, have a good rest of the day, and enjoy our other Next '20 sessions.
