Duration 41:41

Web performance made easy

Ewa Gasperowicz
Developer Programs Engineer at Google
Addy Osmani
Engineering Manager at Google
2018 Google I/O
May 8, 2018, Mountain View, USA

About speakers

Ewa Gasperowicz
Developer Programs Engineer at Google
Addy Osmani
Engineering Manager at Google

Ewa is a Developer Programs Engineer on the Google Web team: a frontend engineer focused mainly on making the web faster and more user friendly. She is interested in Progressive Web App development and analysis, and always keen to investigate common scenarios and emerging app design patterns.


Addy is an Engineering Manager working with Chrome & Web DevRel with a focus on making the web fast.


About the talk

The web has made great progress in enabling fast experiences, but building a fast site today isn’t trivial. At Google, we analyze a lot of sites and have learned what makes them load quickly and run smoothly. In this talk, learn how to find and fix the most common web performance bottlenecks to improve your UX by using tools like Lighthouse and DevTools, and discover the latest browser APIs (e.g. Service Workers, Async Images, Fetch Priorities) to get more control over your loading experience.


Hey folks, my name is Addy, and this is Ewa. We work on Chrome, and we're here to give you a whirlwind tour of tips and tricks for making your sites load fast. So let's get started. The internet gets heavier and heavier every year. If we check in on the state of the mobile web, we can see that the median page weighs in at about 1.5 megabytes, with the majority of that being JavaScript and images. That's quite a lot, and there are other reasons why a web page might feel sluggish and heavy too: network latency, CPU limitations, parsing and render blocking all contribute to the complicated performance puzzle.

We've been pretty busy over the past year trying to figure out how to fit the rich content of the mobile web into the smallest package possible. In this talk, Ewa and I are going to show you what you can do today, but we'll also take a peek into the future and highlight what might be possible really, really soon. So, I'm not the biggest fan of suitcases. The last time I was at the airport, waiting at the baggage carousel for my bags, I looked around for my suitcase and all I could find was this. It went past me the first few times it came around, so I was really happy to finally get it back.

The experience I had at that baggage carousel is a lot like the experience our users have when they're waiting for a page to load. In fact, speed sits at the very top of the user experience hierarchy of needs, and this isn't too surprising, because you can't really do a whole lot until a page finishes loading: you can't derive value from the page, and you can't admire its aesthetics.

We know that performance matters, but it can sometimes feel like a secret discovering where to start optimizing, so we've been working on making this problem a little easier for a while. Today we have an exciting announcement to share with you, something we like to call a Paul Irish in your pocket: we're happy to announce a new, expanded set of performance audits in Lighthouse. Lighthouse, as you know, is part of Chrome DevTools; it runs an audit of your website and gives you hints on how to make it better.

It's been around for a while, but as some of you heard earlier this morning, I'm happy to share some of the newest audits that have landed in Lighthouse over the past quarters, and we'll try to show them to you in action during the rest of the talk. As the internet saying goes, web performance is as easy as drawing a horse, right? You just need to follow some steps carefully, and there is some truth in this picture: the good news is that you can get pretty far by following some simple steps.

In order to show you those steps, we prepared something special for you. Ewa and I are really big fans of Google Doodles. Google has been doing them since 1998, and we thought it would be fun to create a nice little app where we can show you some of our favorite interactive Doodles from over the years. We call it the Oodle Theater. Here it is in action: we can kick off with a quick run-through, and as you can see, we can start browsing through the application, looking at Doodles from over the years, and we can click through to one and even start playing it.

You can go and play some of these in your own time, just not during this talk, please. So let's begin our journey into the wild west of web performance, armed, of course, with our one tool of choice: Lighthouse. The initial performance of our app, as you can see on this Lighthouse report, was pretty terrible. On a 3G network, the user needed to wait 15 seconds for the first meaningful paint, or for the app to get interactive.

Lighthouse highlighted a whole host of issues with our site, and our overall performance score of 23 reflected exactly that. The page weighed in at 3.4 megabytes; we desperately needed to cut some fat. This started our first performance challenge: finding things that we can easily remove without affecting the overall experience. What is the easiest thing to remove? Usually things that do not really contribute to the code at all, like whitespace or comments. Lighthouse highlights this opportunity in the Minify CSS and Minify JavaScript audits.

Most minification can happen automatically as part of your build process. To get minification we simply used the UglifyJS plugin; minification is a pretty common step these days, so you should be able to find a ready-made solution for whichever build process you happen to use. Another useful audit in that space is Enable text compression. There is no reason to send uncompressed files, and most CDNs support compression out of the box these days. We were using Firebase Hosting to host our code, and Firebase enables gzip by default, so by the sheer virtue of hosting our code on a reasonable CDN we got that for free.

Gzip is not the only way of compressing, though; other mechanisms, like Zopfli and Brotli, are getting traction as well. Brotli enjoys support in most browsers these days, and you can use a binary to pre-compress your assets before sending them to the server. That's the way to make sure that our code is nice and compact and ready to be sent to the user.
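As a rough illustration of that pre-compression step, here is a minimal Node sketch, assuming a hypothetical dist build folder; it relies on the Brotli support built into Node's zlib module (Node 11.7 or newer) and simply writes a .br file next to each compressible asset.

    // Pre-compress build output with Brotli so the server can send .br files directly.
    const fs = require('fs');
    const path = require('path');
    const zlib = require('zlib');

    const distDir = 'dist'; // hypothetical build output folder

    for (const file of fs.readdirSync(distDir)) {
      if (!/\.(js|css|html|svg)$/.test(file)) continue;
      const source = fs.readFileSync(path.join(distDir, file));
      const compressed = zlib.brotliCompressSync(source, {
        params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 }, // max quality is fine for a build step
      });
      fs.writeFileSync(path.join(distDir, file + '.br'), compressed);
    }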

Our next thought was: don't send the same bytes twice if it's not necessary. The inefficient cache policy audit in Lighthouse helped us notice that we could be optimizing our caching strategies in order to achieve exactly that. By setting a max-age expiration header on our server, we make sure that on a repeated visit the user can reuse the resources they have downloaded before. Ideally, you should cache as many resources as securely possible, for the longest possible period of time, and provide validation tokens for efficient revalidation of the resources that got updated.
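To make that caching advice concrete, here is a hedged sketch of what such a policy could look like on a small Express server (Oodle itself runs on Firebase Hosting, so this is an assumed setup, not the app's actual config): long-lived, immutable caching for fingerprinted assets, and revalidation for HTML.

    const express = require('express');
    const app = express();

    // Fingerprinted assets (e.g. main.abc123.js) never change, so cache them "forever".
    app.use('/static', express.static('build/static', {
      immutable: true,
      maxAge: '1y',
    }));

    // HTML must be revalidated on every visit so users pick up new releases;
    // express.static still sends an ETag, which acts as the validation token.
    app.use(express.static('build', {
      setHeaders(res, filePath) {
        if (filePath.endsWith('.html')) {
          res.setHeader('Cache-Control', 'no-cache');
        }
      },
    }));

    app.listen(8080);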

All the changes we made so far were very straightforward, right? They required almost no code changes at all; it was really low-hanging fruit with very little risk of breaking anything. So remember: always minify your code, preferably automatically during your build; always compress your assets, either by picking the right CDN or by adding an optimization module to your server; and use efficient cache policies to optimize repeated visits. Next, we were ready to remove a less obvious part of the unnecessary downloads, for example unused code.

As it happens, and you might be surprised by this, unused code may linger in the dark corners of your code base, idle and long forgotten, yet making it to the user's phone with each load of the app. Especially if you work on your app for a longer period of time, your team or your dependencies change, and sometimes an orphaned library gets left behind. That's exactly what happened to us. At the beginning, we were using the Material Components library to quickly prototype our app. In time, we moved to a more custom look and feel, and we forgot entirely about that library. Fortunately, the code coverage check helped us rediscover it in our bundle.

You can check your code coverage stats in DevTools, both for the runtime and for the load time of your application. In the bottom screenshot you can see the two big red stripes: we had over 95 percent of our CSS unused, and a big bunch of JavaScript as well. Lighthouse also picked this issue up in the unused CSS rules audit; it showed a potential saving of over 400 kilobytes.
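If you would rather track unused bytes outside of the DevTools UI, a rough sketch of automating the same coverage check with Puppeteer might look like this (the URL is a placeholder):

    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      await page.coverage.startJSCoverage();
      await page.coverage.startCSSCoverage();
      await page.goto('https://example.com', { waitUntil: 'networkidle0' });

      const entries = [
        ...(await page.coverage.stopJSCoverage()),
        ...(await page.coverage.stopCSSCoverage()),
      ];

      for (const entry of entries) {
        if (!entry.text) continue;
        // Used bytes are the sum of the ranges that actually executed or matched.
        const used = entry.ranges.reduce((sum, range) => sum + range.end - range.start, 0);
        console.log(`${entry.url}: ${Math.round((used / entry.text.length) * 100)}% used`);
      }

      await browser.close();
    })();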

So we came back to our code and kicked out both the JavaScript and the CSS parts of that library. This brought our CSS bundle down twentyfold, which is pretty good for a tiny, two-line commit, right? Of course, it made our performance score go up, and the time to interactive got much better as well. However, with changes like this, it's not enough to only check your metrics and scores. Removing actual code is never risk-free, so you should always look out for potential regressions. Remember that our code was 95 percent unused; there was still that remaining 5 percent somewhere. Apparently, one of our components was still using styles from that library, the little arrows in the Doodle slider, so we had to incorporate those styles back manually. Removing code is fine, as long as you have a proper testing setup in place to help you guard against potential visual regressions.

So remember: the code coverage panel in DevTools and the unused CSS audit in Lighthouse are your friends when it comes to spotting and removing dead code. Do this check regularly to keep your code base clean and tidy, and test your changes thoroughly before deployment. So far so good: all of those changes made our app a little bit lighter, but it was still too slow, so Addy took it a bit further. How did it go? It went so, so good. You know, web pages are like heavy suitcases: you have some stuff that's really important, then you have crap, and then even more crap.

Large resources can slow down your page loads, they can cost your users money, and they can have a big impact on their data plans, so it's really important to be mindful of this. Lighthouse was able to detect that we had an issue with some of our network payloads using the enormous network payloads audit. Here we saw that we had over three megabytes' worth of code being shipped down, which is quite a lot, especially on mobile. At the very top of this list, Lighthouse highlighted that we had a JavaScript vendor bundle that was two megabytes of uncompressed code we were trying to ship down. This is also a problem highlighted by webpack.

Now, as we like to say, the fastest request is the one that's not made. Ideally, you should be measuring the value of every single asset you're serving down to your users, measuring the performance of those assets, and making a call on whether it's worth actually shipping them down with the initial experience, because sometimes these assets can be deferred, lazily loaded, or processed during idle time. In our case, because we're dealing with a lot of JavaScript bundles, we were fortunate, because the JavaScript community has a rich set of bundle auditing tools.

We started off with webpack-bundle-analyzer, which informed us that we were including a dependency called unicode that was 1.6 megabytes of parsed JavaScript, which is quite a lot. We then went over to our editor and, using the Import Cost plugin for Visual Code, were able to visualize the cost of every module we were importing. This allowed us to discover which component was including code that was referencing this module.
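Wiring webpack-bundle-analyzer into a build is usually just one plugin entry; a minimal sketch, assuming you already have a webpack configuration, could look like this:

    // webpack.config.js (sketch)
    const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

    module.exports = {
      // ...the rest of your existing configuration
      plugins: [
        new BundleAnalyzerPlugin({
          analyzerMode: 'static', // write report.html instead of starting a local server
          openAnalyzer: false,
        }),
      ],
    };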

We then switched over to another tool, BundlePhobia. This is a tool which allows you to enter the name of any npm package and see what its minified and gzipped size is estimated to be. We found a nice alternative for the slug module we were using that only weighed 2.2 kilobytes, so we switched it out. This had a big impact on our performance. Between this change and discovering other opportunities to trim down our JavaScript bundle size, we saved 2.1 megabytes of code. We saw a 65 percent improvement overall, once you factor in the gzipped and minified size of these bundles, and we found that this is really worth doing as a process.

In general, on sites and apps like the Oodle Theater, which has games and a lot of interactive multimedia content, it's important to keep the application shell as lightweight as possible. We found that auditing our assets and measuring their performance impact made a really big difference, so just making sure that you're auditing your assets fairly regularly can have a big impact. There's another thing that can have a really big impact, and that is JavaScript. We all love JavaScript, but it's easy to end up with a little too much of it, and JavaScript is your most expensive asset on mobile.

If you're sending down large bundles of JavaScript, your users can end up waiting a long while before anything meaningful actually happens. Let's first understand why JavaScript costs so much. This is how a browser processes JavaScript: first of all, it has to download the script, then the JavaScript engine has to parse that code, compile it, and execute it. These phases don't take a whole lot of time on a high-end device like a desktop machine or a laptop, maybe even a high-end phone, but on a median mobile phone this process can take anywhere between five and ten times longer. This is what delays interactivity, so it's important to try to trim this down.

To help you discover these issues with your app, we introduced a new JavaScript boot-up time audit to Lighthouse, and in the case of the Oodle app, it told us that we had 1.8 seconds of time being spent in JavaScript boot-up. We were effectively shipping one monolithic JavaScript bundle. One technique for working around this is code splitting.

Code splitting is this notion of, instead of giving your users a whole pizza's worth of JavaScript, what if you only gave them one slice at a time, as they needed it? Code splitting can be applied at the route level or at the component level. It works great with React and React Loadable, Vue.js, Angular, Polymer, Preact and multiple other libraries. So we incorporated code splitting into our application: we switched over from static imports to dynamic imports, allowing us to asynchronously lazy load code as we needed it.
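A minimal sketch of that switch, with hypothetical module and element names, looks like this; the route's chunk is only fetched when the user actually asks for it.

    // Before: a static import pulls the About page into the main bundle.
    // import { AboutPage } from './about/about-page.js';

    // After: the bundler emits a separate chunk and the browser fetches it on demand.
    async function showAbout() {
      const { AboutPage } = await import('./about/about-page.js'); // hypothetical module
      new AboutPage().render(document.querySelector('#outlet'));
    }

    document.querySelector('#about-link').addEventListener('click', (event) => {
      event.preventDefault();
      showAbout();
    });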

The impact this had was shrinking down the size of our bundles and decreasing our JavaScript boot-up time to 0.78 seconds, making the app 56 percent faster. In general, if you have a JavaScript-heavy experience, only send the user the code that they need: take advantage of concepts like code splitting, explore ideas like tree shaking, and look into ways of trimming down your library sizes if you happen to be using webpack. As much as JavaScript can be an issue, we know that unoptimized images can also be a problem, so Ewa is going to talk about optimizing images.

The internet loves images, and so do we. In Oodle we are using them in the background, in the foreground and pretty much everywhere. Unfortunately, Lighthouse was much less enthusiastic about this than we were. We failed on three image-related audits: we were not sizing our images correctly, and we could also get some gains from using other image formats, like WebP. A general approach here is to add an image optimization step to your build process, with libraries like, for example, imagemin. This way you make sure that the images added in the future also get optimized automatically.
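A small build script along those lines, sketched against the imagemin v7-style API with two of its common plugins (paths and quality values are placeholders), could look like this:

    // build/optimize-images.js (sketch)
    const imagemin = require('imagemin');
    const imageminMozjpeg = require('imagemin-mozjpeg');
    const imageminPngquant = require('imagemin-pngquant');

    (async () => {
      const files = await imagemin(['src/images/*.{jpg,png}'], {
        destination: 'dist/images',
        plugins: [
          imageminMozjpeg({ quality: 75 }),
          imageminPngquant({ quality: [0.6, 0.8] }),
        ],
      });
      console.log(`Optimized ${files.length} images`);
    })();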

Alternatively, third-party solutions like Cloudinary or Fastly offer you comprehensive image optimization as a service, so to save yourself some time and headache you can simply host your images on those services. If you don't want to do that because of cost or latency issues, projects like Thumbor or Imageflow offer self-hosted alternatives. Here you can see a single image optimization outcome: our background PNG was flagged in webpack as big, and rightly so. After sizing it correctly to the viewport and running it through imagemin, it went down by around 200 kilobytes, which was acceptable, and we repeated the process for multiple images.

That works pretty well for static images, but as much as we love GIFs, especially the ones with cats in them, they can get really expensive. Surprisingly, the GIF format was never intended as an animation platform, so switching to a more suitable video format offers you large savings in terms of file size.

On the homepage of Oodle, according to Lighthouse, we could be saving over seven megabytes by switching to a more efficient video format. Our clip weighed about 7.3 megabytes, way too much for any reasonable website, so instead we turned it into a video element with two source files, an MP4 and a WebM, for wider browser support. Here you can see how we used the FFmpeg tool to convert our animated GIF into an MP4 file, and, for example, the ImageOptim API can do such a conversion for you as well. We managed to save over 80 percent of the overall weight thanks to this conversion, which brought us down to around one megabyte to push down the wire.

Still, one megabyte is a lot of data, especially for users on restricted data plans. With the Network Information API, we could use the effectiveType property to realize they're on a slow connection and give them a much, much smaller JPEG instead. This interface uses the effective round-trip time and downlink values to estimate the type of network the user is on, and it simply returns a string: slow-2g, 2g, 3g or 4g. Depending on this value, on below-4G connections we can replace the video element with a lightweight image.
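Here is a minimal sketch of that check; the container id and asset URLs are made up for illustration, and the connection object is guarded because the Network Information API isn't available in every browser.

    // Serve a small poster image instead of the looping video on slow connections.
    const connection =
      navigator.connection || navigator.mozConnection || navigator.webkitConnection;
    const slow = connection && ['slow-2g', '2g', '3g'].includes(connection.effectiveType);

    const hero = document.querySelector('#hero'); // hypothetical container element
    if (slow) {
      hero.innerHTML = '<img src="/assets/intro-small.jpg" alt="Oodle intro">';
    } else {
      hero.innerHTML =
        '<video autoplay muted loop playsinline>' +
        '<source src="/assets/intro.webm" type="video/webm">' +
        '<source src="/assets/intro.mp4" type="video/mp4">' +
        '</video>';
    }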

We do lose a little bit of the experience here, but at least the site is usable on a slow connection as well. Last but not least, there's the very common problem of offscreen images. Carousels, sliders, or really long pages often load images even though the user cannot see them on the page straight away. You can check this behavior for yourself in the Network panel: if you see a lot of images incoming while only a few are visible on the page, it means that maybe you should consider lazy loading them instead.

Lazy loading is not yet supported natively in the browser, so we have to use JavaScript to add this capability. We used the lazysizes library to add lazy loading behavior to our Oodle covers. It's a smart library, because it not only reacts to the visibility changes of an element, it also proactively prefetches elements that are near the viewport for the optimal user experience. It does this using IntersectionObserver, which gives you very efficient visibility lookups. After this change, our images are being fetched on demand instead of straight away.
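To give a sense of what sits underneath a library like that, here is a minimal hand-rolled lazy loading sketch using IntersectionObserver; the data-src convention is just an assumed markup pattern for this example.

    // Images carry their real URL in data-src, e.g. <img data-src="/doodles/cover.jpg" alt="...">.
    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target;
        img.src = img.dataset.src; // kicks off the actual request
        obs.unobserve(img);        // each image only needs to be loaded once
      }
    }, { rootMargin: '200px' });   // start fetching a bit before the image scrolls into view

    document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));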

Okay, that was a lot about images, so just remember: always optimize images before pushing them to the user; use responsive image techniques to serve the right sizes; use efficient image and video formats wherever possible, especially for animated content; and finally, lazy load whatever is not immediately visible to the user. If you want to dig deeper into this topic, here's a present for you: a very handy and comprehensive guide written by Addy, which you can find at the images.guide URL.

Cool. So let's now talk about resources that are discovered and delivered late by the browser. Not every byte that's shipped down the wire to the browser has the same degree of importance, and the browser knows this. A lot of browsers have heuristics to decide what they should be fetching first: should they be fetching CSS before images or scripts? As authors of the page, being able to inform the browser about what's actually important to us is valuable. Thankfully, over the last couple of years, browser vendors have been adding a number of features to help us with this, things like link rel=preconnect, preload or prefetch. These web platform features help the browser fetch the right thing at the right time, and they can be a little bit more efficient than some of the custom, script-based loading logic approaches.

The first thing Lighthouse tells us to do is avoid multiple costly round trips to any origin. In the case of the Oodle app, we're heavily using Google Fonts. Whenever you drop a Google Fonts stylesheet into your page, it's going to connect up to two subdomains, and what Lighthouse is telling us is that if we were able to warm up that connection, we could save anywhere up to 300 milliseconds on our initial connection time. Taking advantage of link rel=preconnect, we can effectively mask that connection latency. Especially with something like Google Fonts, where our font-face CSS is hosted on googleapis.com and our font resources are hosted on gstatic.com, this can have a really big impact, and we shaved off a few hundred milliseconds by doing it.

The next thing Lighthouse suggests is that we preload key requests. link rel=preload is really powerful: it informs the browser that a resource is needed as part of the current navigation, and it gets the browser fetching it as soon as possible. Here, Lighthouse is telling us that we should be preloading our key web font resources, because we're loading in two web fonts.

Preloading a web font looks like this: you specify rel=preload, you pass in as=font, and then you specify the type of font you're trying to load in, such as woff2. The impact this can have on your page is quite stark. Normally, without link rel=preload, if web fonts happen to be critical to your page, what the browser has to do is first fetch your HTML, then parse your CSS, and somewhere much later down the line it will finally go and fetch your web fonts.
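In markup these are just link tags in the document head; as a sketch, here is the same idea expressed from JavaScript, with a hypothetical self-hosted font URL alongside the two Google Fonts origins mentioned above.

    // Equivalent of <link rel="preconnect"> and <link rel="preload"> tags, added at runtime.
    function addLink(rel, href, options = {}) {
      const link = document.createElement('link');
      link.rel = rel;
      link.href = href;
      Object.assign(link, options);
      document.head.appendChild(link);
    }

    // Warm up the two Google Fonts origins early.
    addLink('preconnect', 'https://fonts.googleapis.com');
    addLink('preconnect', 'https://fonts.gstatic.com', { crossOrigin: 'anonymous' });

    // Tell the browser this font is needed for the current navigation.
    addLink('preload', '/fonts/oodle-sans.woff2', { // hypothetical self-hosted font file
      as: 'font',
      type: 'font/woff2',
      crossOrigin: 'anonymous',
    });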

Using link rel=preload, as soon as the browser has parsed your HTML, it can go and fetch those web fonts much earlier on. In the case of our app, this was able to shave a second off the time it took for us to render text using our web fonts. Now, it's not quite that straightforward if you're going to try preloading fonts served from Google Fonts; there is one gotcha. The Google Fonts URLs that we specify in the font-face blocks in our stylesheets happen to be something the Fonts team updates fairly regularly; those URLs can expire or get updated on a regular frequency. So what we suggest you do, if you want complete control over your font loading experience, is to self-host your web fonts.

This can be great, because it gives you access to things like link rel=preload. In our case, we found the Google Web Fonts Helper tool really useful in helping us take some of those web fonts offline and set them up locally. Now, whether your critical resources are web fonts or they happen to be JavaScript, try to help the browser deliver them as soon as possible. If you're connecting up to multiple origins that are critical, consider using link rel=preconnect. If you have an asset that's really important for the current page, use link rel=preload; we also happen to have a good preload plugin available for webpack. And if you have an asset that might be needed for a future navigation, consider using link rel=prefetch, which also has good support in the latest versions of webpack.

Now, we've got something special to share with you today. In addition to features like resource hints and preload, we've been working on a brand new experimental browser feature we're calling Priority Hints. This is a new feature that allows you to hint to the browser how important a resource is. It exposes a new attribute, importance, with the values low, high or auto.

This allows us to convey a lowering of priority for less important resources, such as non-critical styles, images or fetch API calls, to reduce contention with more important things, like our hero images. In the case of Oodle, before we had added lazy loading to our images, here is what the browser was doing: we had this image carousel with all of our Doodles, and the browser was fetching all of the images at the very start of the carousel with a high priority, early on. Unfortunately, it was the images in the middle of the carousel that were most important to the user.

So what we did was set the importance of those background images to very low and the foreground ones to very high, and this had a two-second impact over slow 3G on how quickly we were able to fetch and render those images, which is a nice, positive experience. We're hoping to make this feature available for you to try out in a few weeks.
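Because Priority Hints were experimental and behind a flag at this point, treat the following as a rough sketch of the syntax described above rather than a stable API; the selectors and the API endpoint are invented for illustration.

    // De-prioritize the offscreen carousel backdrops, boost the hero image.
    document.querySelectorAll('.carousel img.backdrop').forEach((img) => {
      img.setAttribute('importance', 'low');
    });
    const hero = document.querySelector('img.hero');
    if (hero) hero.setAttribute('importance', 'high');

    // The proposal also covers fetch(), e.g. for a non-critical API call.
    fetch('/api/related-doodles', { importance: 'low' })
      .then((response) => response.json())
      .then((doodles) => console.log(doodles.length, 'related doodles'));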

The next thing I want to talk about is typography. Typography is fundamental to good design, and if you're using web fonts, you ideally don't want to block rendering of your text, and you definitely don't want to show invisible text. We now highlight this in Lighthouse with the avoid invisible text while web fonts are loading audit. If you load your web fonts using a font-face block, you're letting the browser decide what to do if it takes a long time for that web font to fetch. Some browsers wait up to three seconds before falling back to the system font, and they'll eventually swap in the web font once it's downloaded. We're trying to avoid this invisible text: in this case, we wouldn't have been able to see this week's classic Doodles if the web font had taken too long.

Thankfully, with a new feature called font-display, you get a lot more control over this process. font-display lets you decide how web fonts will render or fall back based on how long they take to load. In this case we're using font-display: swap. Swap means the browser draws your text immediately with the fallback font, and if the web font takes a while to load, it swaps it in once the font face is available. In the case of Oodle, this allowed us to paint some meaningful text very early on and transition over to the web font once it was ready.
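The usual place for this is a single font-display: swap line inside your @font-face rule. If you load fonts from JavaScript instead, here is a hedged sketch using the CSS Font Loading API in browsers that support the display descriptor; the family name and URL are placeholders.

    // Register a web font that swaps in, instead of blocking text rendering.
    const oodleSans = new FontFace(
      'Oodle Sans',                                    // hypothetical family name
      'url(/fonts/oodle-sans.woff2) format("woff2")',
      { display: 'swap' }
    );

    document.fonts.add(oodleSans);
    oodleSans.load().then(() => {
      // Text was already visible in the fallback font; the browser now swaps in the web font.
      console.log('Web font is ready.');
    });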

In general, if you happen to be using web fonts, as a large percentage of the web does, have a good web font loading strategy in place. There are a lot of web platform features you can use to optimize the loading experience for fonts, but also check out Zach Leatherman's web font recipes repo, because it's really great. Now let's talk about render-blocking resources. Displaying visible text as early as possible is really important, but we can go even further than that: there are other parts of the app that we could push earlier in the download chain to provide at least some basic user experience a little bit earlier.

Here, on the Lighthouse timeline strip, you can see that during those first few seconds, when all the resources are loading, the user cannot really see any content. Downloading and processing external stylesheets is blocking our rendering process from making any progress. So what can we do about it? We can try to optimize our critical rendering path by delivering some of those styles a bit earlier. If we extract the styles that are responsible for the initial render and inline them in our HTML, the browser is able to render the page straight away without waiting for the external stylesheets to arrive.

In our case, we used an npm module called Critical to extract and inline the critical content into index.html during the build step. This module did most of the heavy lifting for us, but even so, getting it to work smoothly across different routes was a little bit tricky. If you're not careful, or if your site structure is really complex, it might be really difficult to introduce this type of pattern if you did not plan for an app shell architecture from the beginning.
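A build-step sketch with the Critical package looks roughly like this; the exact option names vary between versions, and the viewport size is just an assumed mobile target.

    // build/critical-css.js (sketch)
    const critical = require('critical');

    critical.generate({
      base: 'dist/',          // folder that holds the built site
      src: 'index.html',
      target: 'index.html',   // write the result back with critical CSS inlined
      inline: true,
      width: 375,             // viewport used to decide which styles are "critical"
      height: 667,
    });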

This is why it's so important to take performance into consideration early on: if you don't design for performance from the start, there's a high chance you'll run into issues when trying to add it later. In the end, our effort paid off: we managed to make it work, and the app started delivering content much earlier, improving our first meaningful paint significantly. To sum up: to unblock your rendering process, consider inlining your critical styles in the head of your document, and preloading or asynchronously loading the rest later on. For non-critical scripts, consider marking them with the async or defer attributes, or lazy loading them later in the life cycle.

Okay, so that's the whole story of how we drew the horse and tried to fit it into a suitcase. Let's take a look at the results. This is how our app loaded on a median mobile device on a 3G network, before and after the optimization. That's pretty nice progress. All of this progress was fueled by us continuously checking and following the Lighthouse report. If you'd like to check out how we technically implemented all of these changes, feel free to take a look at our repo, especially at the PRs that landed there.

The performance score went from 23 to 91. That's pretty nice, right? However, we were never aiming to simply make Lighthouse happy. The high-level metrics included in the report, like time to interactive or perceptual speed index, are a proxy for improved user experience, for how much shorter the waiting time for the content is. This is the story of our little app, but does it scale to much larger businesses? Let's take a look at Nikkei, Japan's largest media company. They have a site with 450 million users accessing it, and they spent a lot of time optimizing their old mobile site, turning it into a new PWA.

The impact of these performance optimizations was huge: the new site got 14 seconds faster to interactivity, they saw a 230 percent increase in organic traffic, a 58 percent increase in conversion rate, and their daily active users went up as well. But what did they actually do to optimize their performance? Well, if we take a look at the full list of things Nikkei did, they should look a little bit familiar, because many of them are things that we covered today.

Nikkei did things like optimizing their JavaScript bundles, which they were able to shrink down by 50 percent. One of the changes they made was to use webpack to optimize the performance of their third-party scripts in addition to their first-party ones. They were able to use link rel=prefetch on their daily edition pages, improving next-page performance load by 75 percent. In addition to this, they also took advantage of the tip that Ewa was just walking through, a critical CSS optimization, where you squeeze enough of your critical styles into the roughly 14 kilobytes of content that fit into your first round trip.

By doing so, you can actually improve first meaningful paint by quite a lot; here, they were able to shave a second off their first meaningful paint by taking advantage of this optimization. And finally, Nikkei took advantage of the PRPL pattern. PRPL is a pattern that was first coined by the Polymer team a few years ago, and it stands for Push, Render, Pre-cache and Lazy load. So what they're doing is pushing their critical resources using link rel=preload and server push, rendering their main article content pretty quickly, and pre-caching their top stories using a service worker, which gives them offline access to read articles as well. And they're lazy loading the code that isn't needed right away; they're also using skeleton screens to improve the perceived performance of these pages.

So we talked about how performance can help businesses give their users a better experience. For our users, we have one more thing that we think could help in the future. We believe that machine learning represents an exciting opportunity for the future in many areas, so one idea we had was: what if we take these two worlds, machine learning and web performance, and blend them together? Maybe it could lead us to some really, really interesting solutions.

Today we want to tell you about our experiments with machine learning and explain why we think it has a lot of potential. It's just an idea that we hope will spark more experimentation in the future. The thesis is that data can really guide the user experiences we're creating. Today, we make a lot of arbitrary decisions about what the user might want or need, and therefore what is worth being prefetched, preloaded or pre-cached.

If we guess well, we're able to prioritize a small number of resources, but it's really hard to scale this to a whole website. At the same time, we have plenty of data available about the typical behavior of the users visiting our sites, so how can we use that data? We can actually use it to better inform our optimizations today. Using the Google Analytics Reporting API, we can take a look at the next top page and exit percentages for any URL on our site.

Here it is for developers.google.com/web: we can see that a lot of users on the documentation actually end up going over to the Lighthouse pages, so we could potentially prefetch that page ahead of time. Folks on the PWA docs usually check out the PWA checklist next, and we could use this data to improve the performance of those navigations. This gives us the notion of data-driven loading for improving the performance of a website, but there is one piece missing here. Having a good probability model is important, because we don't want to waste our users' data by aggressively over-prefetching content.

We can take advantage of that Google Analytics data and use machine learning models, like Markov chains or neural networks, to implement those probability models. This is a lot less subjective and error-prone than manually deciding what we should be prefetching or preloading for our site. We can then wire this all up using link rel=prefetch inside of our site, so that as a user browses through it, we're able to fetch and cache the things they'll need ahead of time, improving our page-load performance. We can actually go even further than this. Earlier, we talked about code splitting and lazy loading a single-page app with route chunks in a bundler.

So instead of just prefetching pages, we could go more granular and prefetch the individual resources for a page, or for a number of pages. To progress all of these ideas, we've been focused on trying to make some of them a little more low-friction for web developers to adopt, and so today we're happy to announce a new initiative we're calling Guess.js. Guess.js is a project focused on data-driven user experiences for the web. We hope that it's going to inspire exploration of using data to improve web performance, and go beyond that. It's all open source and available on GitHub today.

It was built in collaboration with the open source community by Minko Gechev, Kyle Mathews from Gatsby, Katie Hempenius and a number of others. Let's take a look at what Guess.js provides out of the box. The first thing it provides is a Google Analytics module for analyzing your users' navigations. It has a parser for popular frameworks that is able to map your URLs back to your framework's routes, and a comprehensive webpack plugin that can do machine-learning training on that data to determine probabilities and bundle the JavaScript for your routes into chunks accordingly, with a potential future clustering of those chunks together for you.
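The minimal setup described in the Guess.js README is a single webpack plugin pointed at your Google Analytics view; the view ID below is a placeholder.

    // webpack.config.js (sketch)
    const { GuessPlugin } = require('guess-webpack');

    module.exports = {
      // ...your existing configuration, with route-level code splitting in place
      plugins: [
        new GuessPlugin({
          GA: 'XXXXXX', // placeholder Google Analytics view ID to pull navigation reports from
        }),
      ],
    };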

We've also got experimental support for applying these concepts to static sites. But that's enough talk; what about showing you how this works in practice? Here we have a demo. What we're going to do is load it up with DevTools open, in mobile emulation over slow 3G. Because we have confidence scores for which pages are going to be used, we can toggle on a visualization of this: in pink we have pages we have high confidence the user is going to want, in green we have low confidence, and in yellow we have mild confidence.

We can now start to navigate through this application. Let's say we go to the cheese section; we can visualize this again, and we can see that cheesecake is a page that users will often navigate to. As you can see, it loaded instantly, because it's already in the user's cache by the time they go to it. You can see the different confidence levels we have for different pages as we navigate through the site. Even this last page, the custard one, loaded really, really quickly using these techniques.

Now, prefetching is a great idea in practice, but we want to be mindful of our users' data plans, and this is where we use navigator.connection.effectiveType to make sure we're only prefetching things when we think the connection can handle it. So this is our demo of Guess.js using Gatsby; by the way, it also happens to be a PWA with a great performance score. Thank you to both Minko and Kyle for helping with this. Check out Guess.js and let us know what you think.

So, today we talked about quite a few things. At the end of the day, performance is about inclusivity; it's about people. All of us have experienced slow page loads on the go, and we've got an opportunity today to consider giving our users a more delightful experience, one that loads really, really quickly. We hope that you took something away from this talk. Performance is a journey, and lots of small changes can lead to really big gains. So check out some of the things we talked about today, and come talk to us in the web sandbox if you've got any questions. That's it from us. Thank you.
