Duration 15:53

Edge TPU live demo: Coral Dev Board & Microcontrollers

Pete Warden
Software Engineer at Google
June Tate-Gans
Lead Software Engineer, Coral at Google

TensorFlow Dev Summit 2019
March 6, 2019, Sunnyvale, CA, USA
Transcript

Thanks so much, and I'm really excited to be here to show you a new project that I think is pretty cool. First, let me tell you the backstory. When I first joined Google back in 2014, as you can imagine there were a whole bunch of internal projects that I, as a member of the public, hadn't known about, and some of them blew my mind. One in particular came from the speech team: an engineer there showed me a speech model that was only 13 kilobytes in size. I already had experience with image networks, and even the smallest of those was multiple megabytes, so this 13-kilobyte model was just amazing to me. He explained why it needed to be that small: it ran on the DSPs and other embedded chips inside smartphones, so the phone could listen for wake words like "Hey Google" while the main CPU was powered off to save battery. Those DSPs often had only tens of kilobytes of RAM and flash storage, so the model had to fit in that, and they couldn't rely on the cloud either—just keeping a radio connection alive to send data over would have been prohibitively expensive in power.

That conversation really struck me, as did the continued work we did with the speech team, because they had so much experience with all sorts of different approaches to speech. They had spent a lot of time and energy experimenting, and knew which methods actually worked. I was left wondering whether those techniques would be useful for other embedded applications as well, and it left me really wanting to see if we could build support for these kinds of devices into TensorFlow itself, so that more people could get access. Only people in the speech community really knew about the groundbreaking work being done, and I wanted to share it much more widely.

So today I'm pleased to announce that we're releasing the first experimental support for embedded platforms in TensorFlow Lite. To show you what I mean, here is a demonstration board that I've actually been carrying in my pocket. This is a prototype of a development board built by SparkFun. It has a Cortex-M4 processor with 384 kilobytes of RAM and a megabyte of flash storage, and the processor was built by Ambiq to be extremely low power—drawing less than a milliwatt in a lot of cases—so it's potentially able to run on a single coin battery like this one for many days. Now I'm going to take my life in my hands by trying a live demo. Let's see if this actually works. It will be extremely hard to see unless we dim the lights. What I'm going to do is say a word and see if it lights up the little yellow LED—you can see the blue LED is already flashing.

Yes. Yes! I knew I was taking my life into my hands there. As you can see, it's managing to do a decent job of recognizing when I say the word "yes" while ignoring unrelated conversation. So why is this useful? First, this is running entirely on the embedded chip, so it doesn't need any internet connection. That makes it useful as the wake-word component of a voice interface system in itself, and the footprint of the TensorFlow Lite for Microcontrollers code is only about 20 kilobytes on top of that, so it's within the capabilities of a lot of different embedded devices.

Secondly, this is open source, so you can grab the code, build it, and modify it yourself. I'm showing it here on this particular board, but it actually works on a whole bunch of different embedded chips, and we really want to see it on more, so we're keen to collaborate with the community on getting more devices supported.

You can also train your own model. Just recognizing "yes" on its own isn't all that useful, but the key thing is that this comes with a tutorial you can use to train your own models, and with an open dataset of a hundred thousand utterances of about 20 common words to use as your training set. If you go to the first link shown here and contribute your voice to that open dataset, you'll help increase the size and quality of the data we can make available.
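Incidentally, keyword spotters like the one in this demo usually don't act on a single model output: they average scores over a short window of recent inferences and only report a word when its averaged score crosses a threshold, which suppresses spurious one-frame detections. Here is a minimal sketch of that smoothing idea in plain Python—the class name, window size, and threshold are all illustrative, not the actual demo code:

```python
from collections import deque

class CommandRecognizer:
    """Smooths per-frame scores and reports a word only when its
    averaged score crosses a threshold (illustrative sketch)."""

    def __init__(self, labels, window=5, threshold=0.7):
        self.labels = labels
        self.window = deque(maxlen=window)   # recent score vectors
        self.threshold = threshold

    def process(self, scores):
        """scores: one score per label for the latest audio frame.
        Returns the detected label, or None if nothing is confident yet."""
        self.window.append(scores)
        n = len(self.window)
        averaged = [sum(frame[i] for frame in self.window) / n
                    for i in range(len(self.labels))]
        best = max(range(len(self.labels)), key=lambda i: averaged[i])
        return self.labels[best] if averaged[best] >= self.threshold else None

rec = CommandRecognizer(["silence", "yes", "no"])
# A sustained run of "yes" frames eventually triggers a detection,
# while a single noisy frame does not.
detections = [rec.process(s) for s in [
    [0.9, 0.1, 0.0],
    [0.2, 0.8, 0.0],
    [0.1, 0.9, 0.0],
    [0.1, 0.9, 0.0],
    [0.0, 1.0, 0.0],
]]
# detections → ["silence", None, None, None, "yes"]
```

The on-device version implements the same idea in C++ over the microcontroller model's per-frame outputs.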

So that would be awesome. And you can use the same approach for a lot of different audio recognition tasks—recognizing different kinds of sounds—and even for similar signal-processing problems.

So how can you try this out for yourself? If you're in the audience here, at the end of today you'll find a gift box, and you'll actually have one of these boards in it. All you should need to do is remove the little tab on the battery and it should automatically boot up, pre-flashed with this "yes" example, so you can try it out for yourself and let me know how it goes. Just say yes to TensorFlow—I'd like to see that. We also include all the cables, so you should be able to program it yourself through the serial port. Now, these are the very first boards built, and they have a wiring issue that drains the battery, so it lasts more like hours than days. That will be fixed in the final version, and you should be able to develop with these in exactly the same way you will with the final shipping product. If you're watching at home, you can order one of these from SparkFun right now for, I think, $15, and there are instructions for other platforms in the documentation. We're trying to support as many platforms as possible, and we welcome collaboration with everybody across the community to help unlock all the creativity that I know is out there—I'm really hoping to spend a lot of my time reviewing pull requests.

And finally, this was my first project like this, so I needed a lot of help from a lot of people to bring this prototype together, including the TensorFlow Lite team, who were lifesavers—we literally got these boards in our hands in the middle of the day yesterday—and the folks at Ambiq, who designed this processor and helped us get everything going, along with a lot of other people as well.

So this is still a very early experiment, but I really can't wait to see what people build with it. One final note: I'll be around to talk MCUs with anybody who's interested at the breakout session on day two, so I'm really looking forward to chatting with everyone. Thanks.

[Host] We really hope that you try this out. It's still early days, but we think it will be really impactful for everybody. Now, before we go—and I promise this is the last thing you'll hear from me—I want to welcome June, who's going to talk about how, by using the Edge TPU delegate, we're able to train these teachable machines.

Hi, my name is June Tate-Gans. I'm one of the lead software engineers inside Google's new Coral group, and I've been asked to give a talk about the Edge TPU-based teachable machine demo. First, let me tell you what Coral is. Coral is a platform for products with on-device machine learning using TensorFlow and TensorFlow Lite, and it refers to products such as a single-board computer and a USB stick.

So what is the Edge TPU? It's a Google-designed ASIC that accelerates inference directly on the device. It's very fast, it keeps data local to the endpoint, and this allows for a whole new range of applications of machine learning. The first Coral product is the Coral Dev Board. This is a single-board computer with a removable SOM that runs Linux and Android, and the board itself has a gigabyte of RAM, a quad-core A53 SoC, Wi-Fi and Bluetooth—and, of course, the Edge TPU.

The second is our Coral USB Accelerator. This is just the Edge TPU, connected via USB-C to whatever development system you need—a Raspberry Pi or a Linux workstation.

Now, the teachable machine demo shows off a form of edge training. There are three ways to do this: weight imprinting, last-layer retraining, and k-nearest neighbors; for this demo we're actually using the k-nearest-neighbors approach. In this animated GIF you can see that the Edge TPU enables very high classification rates. The frame rate you see here is actually the rate at which the TPU was classifying the images, not just displaying them, and in this case you can see we're getting about 30 frames per second—essentially real-time classification.

I actually have one of our teachable machine demos here, so let's turn to it. On this board we have our Edge TPU development board assembled with a camera and a series of buttons. Each button corresponds to a class and lights up when the model identifies that object from the camera. Every time I press one of these buttons, it takes a picture and associates that picture with that particular class, and because it's running inference on the Edge TPU, the light comes on immediately. Now that it's finished booting, the first thing I have to do is train it on the background, so I'll press this blue button, and you can see the light immediately turns on, because it's classifying in real time. Now, if I train one of the other buttons using something like a tangerine... okay, so now you can see it can classify between this tangerine and the background. Further, I can even grab objects such as this TensorFlow Lite sticker, which is almost the same color. Even though it's a similar color, you can see it still discerns the TensorFlow Lite logo from the tangerine. So you can imagine, in a manufacturing context, operators with no knowledge of machine learning could retrain and adapt a system easily and quickly using this exact technique.

So that's about it for the demo. Before I go, I should say that we're also giving away some Edge TPU accelerators. For those of you here today, we'll have one available for you as well, and for those of you on the livestream, you can purchase one at coral.withgoogle.com.
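The k-nearest-neighbors flavor of edge training described above can be sketched in a few lines: each button press stores an embedding vector with a label, and a new frame is classified by a majority vote among its closest stored embeddings. A minimal plain-Python sketch—the class name and the tiny 3-D embeddings are illustrative only; in the real demo the embeddings come from a model running on the Edge TPU:

```python
import math
from collections import Counter

class KNNTeachable:
    """Tiny k-NN classifier over stored embeddings (illustrative sketch)."""

    def __init__(self, k=3):
        self.k = k
        self.examples = []  # (embedding, label) pairs

    def learn(self, embedding, label):
        # "Pressing a button": associate this embedding with a class.
        self.examples.append((embedding, label))

    def classify(self, embedding):
        # Majority vote among the k nearest stored embeddings.
        nearest = sorted(self.examples,
                        key=lambda ex: math.dist(embedding, ex[0]))[:self.k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

model = KNNTeachable(k=3)
for emb in ([0.1, 0.1, 0.1], [0.2, 0.1, 0.0], [0.1, 0.2, 0.1]):
    model.learn(emb, "background")
for emb in ([0.9, 0.8, 0.1], [0.8, 0.9, 0.2], [0.9, 0.9, 0.1]):
    model.learn(emb, "tangerine")

label = model.classify([0.85, 0.85, 0.15])
# label → "tangerine"
```

Because "learning" is just appending an example and classification is a nearest-neighbor lookup, no gradient training happens on-device—which is what makes the button-press workflow feel instantaneous.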
