Deliver search-friendly JavaScript-powered websites

Tom Greenaway
Senior Developer Advocate at Google
John Mueller
Software Engineer at Google
2018 Google I/O
May 9, 2018, Mountain View, USA

About speakers

Tom Greenaway
Senior Developer Advocate at Google
John Mueller
Software Engineer at Google

Tom Greenaway is a developer advocate in the partner developer relations team. He works with external partners and a variety of Google product teams to create showcase integrations of what’s possible with Google’s technologies. The indexability of client-side rendered HTML and JavaScript-powered websites is a particular area of interest for him. Before joining Google he ran his own game development studio and created the award-winning mobile game “Duet” which was downloaded over 15 million times.


About the talk

Making a site that is engaging for users and performs well in Google Search requires combining specific server-side and client-side technologies. Learn about the best practices for building and deploying indexable sites and web applications with JavaScript frameworks. Lessons from this session can be implemented with most modern setups, whether you use Angular, Polymer, React, or other frameworks to build a full-fledged PWA or just use them for parts of a site. The session also discusses search-friendly design patterns and SEO best practices.


Good morning, everyone. My name is Tom Greenaway and I'm a Partner Developer Advocate from Google Sydney, with a focus on the indexability of progressive web applications, and I'm joined by John, a webmaster trends analyst from Zurich. It's great to see so many of you here, even at this early hour.

Now, as you might imagine, John and I have a lot of experience with the work web developers must do to ensure their websites are indexable, which is another way of saying whether a web page can be found and understood by search engines. But search engines don't see all web pages the same way; some pages are more complex than others. Today we'll focus on modern JavaScript-powered websites and how they can be properly indexed by search crawlers, and especially Google Search, including a new Google Search policy, a new approach for rendering HTML to search crawlers, and even a new Google Search Console tool.

A long time ago, before I joined Google, I was building e-commerce sites, and I personally felt there was a lot of mystery at times behind Google Search, especially on the topic of indexability. I'd wonder: why do some of my pages appear in Google Search and some don't? What's the difference between them? Will my JavaScript be rendered correctly and indexed? And is lazy-loading an image safe to do? These are really critical questions, and as developers ourselves we understand the frustration behind this mystery. So today John and I are going to do something we very rarely do at Google: pull back the curtain a little bit and reveal some new pieces of information about how Google Search sees the web and indexes it. With this new knowledge and a few new tools, you'll have concrete steps you can take to ensure the JavaScript-powered websites you're building are indexable by Google Search.

First, I want to remind you what the focus of this talk is: modern JavaScript-powered websites. Typically these websites are powered by a JavaScript framework such as Angular, Polymer, React, or Vue.js, and it doesn't matter which: these are all great web development frameworks that are easy to use and help you build your sites faster, which is great for users. But it's important to recognize that some of these frameworks use a single-page app configuration model, a single HTML file that pulls in a bunch of JavaScript, and that can make a lot of things simpler.

But JavaScript-powered websites can be a problem for search engines. For example, here's what the default template for a new Angular project looks like. As you can see, it's pretty basic: it shows you how to use Angular to render a header, an image, and a few links. Nice and simple. How could this possibly be a problem from an indexability perspective? Well, let's take a peek behind the scenes at the HTML that's actually delivered from the server. When you viewed the page in the browser, the default sample project had text, imagery, and links, but you wouldn't know that from looking at the initial HTML. It's completely devoid of any content: there's nothing in the body of the page except some script tags and an empty application root element, so there's nothing here to index. And to be clear, Angular isn't the only framework that serves an empty response on its initial server-side render; React and Vue.js have similar issues by default. So what does this mean for the indexability of these websites from the perspective of Google Search?
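To make that concrete, here is a minimal sketch of what such an initial server response might look like (the element and bundle names are illustrative; Angular's default template uses an app-root element):

    <!doctype html>
    <html>
    <head>
      <title>My App</title>
    </head>
    <body>
      <!-- Nothing indexable here: the content is built entirely by JavaScript -->
      <app-root></app-root>
      <script src="runtime.js"></script>
      <script src="polyfills.js"></script>
      <script src="main.js"></script>
    </body>
    </html>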

Google Search? About to answer that question better will take a little step back and talk about the web in general why search engines exist and why search Melissa necessary? That's a good question to start with his how big is the web while we can tell it was actually found other 130 trillion documents on the web. So in other words, it's really big Hennessy know the animal search engines including Google is to provide a list of relevant search results based on a user's search query results. We need an index similar to the

catalog of a gigantic library and give me the size of the web. That's a really complex top. Set Builders index to power out the attention. We need another tool I slept for a while and traditionally is that for all I was basically just a computer and a piece of software that phone to key steps 1 a.m. To find a piece of content to be called in to do this the content must be retrievable buy a URL and once we haven't URL we get its content and we stick to the HTML the index of page and find you linked to crawl as well. And that's the cycle repeats. So it's okay that Fest at The Crawling and

Oh, and yes, as an Australian I felt it was imperative that I include some spiders in my talk, so this is the cutest one I could find. John, what do you think? OK, well, I have a few more in the deck, so maybe he'll come around.

To ensure crawling is possible, there are some key things to keep in mind. Firstly, the URLs need to be reachable: there shouldn't be any issues when the crawler wants to request your web pages and retrieve the resources necessary for indexing them from your website. Secondly, if there are multiple documents that contain the same content, we need a way to identify the original source. And finally, we also want websites to have clean, unique URLs. Originally this was pretty straightforward on the web, but then single-page apps made things more complex.

For the reachability of URLs, there's a simple, standard way to help search engines find content that you're probably familiar with: the robots.txt file, which describes which URLs to crawl and which to ignore. And I say URLs, because these rules can prevent your JavaScript from being crawled too, which could affect your indexability. A robots.txt file can also give us a link to a sitemap. Sitemaps help search engines by providing a recommended set of URLs to crawl initially for a site. There's no guarantee these URLs will get crawled; they're just one signal among others that are considered as well.
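As a sketch, a robots.txt file along these lines allows crawling of everything except one private path and points crawlers at a sitemap (the paths and domain are placeholders):

    # robots.txt at https://example.com/robots.txt
    User-agent: *
    Disallow: /admin/
    # Note: don't accidentally disallow the JavaScript your pages need for rendering.

    Sitemap: https://example.com/sitemap.xml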

Now let's talk about the duplicate content scenario and how search crawlers deal with it. Sometimes websites want multiple pages to have the same content, even across websites. For example, some bloggers will publish articles on their own website and cross-post them to services like Medium to increase the reach of that content; this is called content syndication. But then which URL do you prefer to have indexed? The canonical metadata syntax here in the HTML allows duplicated documents to communicate to crawlers where the original, authoritative source for the content lives. We call that source document the canonical document.
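For example, a syndicated copy of an article could point back to the original with a canonical link element in its head (the URL is a placeholder):

    <link rel="canonical" href="https://example.com/original-article">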

Finally, there's the question of clean, unique URLs. The web started out quite simple: a URL mapped to a document served with some HTML. But then AJAX changed everything. Suddenly websites could execute JavaScript that fetched new content from the server without reloading the browser page, yet developers still wanted a way to support back and forward navigation and history as well. So a trick was invented that leveraged something called the fragment identifier. Its intended purpose was deep linking into the sub-content of a page, like a subsection of an encyclopedia article, and because fragment identifiers were supported by browsers for history and navigation, developers could trick the browser into fetching new content dynamically without reloading the page while still supporting the history and navigation we love about the web. But using the fragment identifier for two purposes, scroll positions on pages and also deep linking into content, wasn't very elegant, so we moved away from that. Instead, another approach was proposed: the fragment identifier followed by an exclamation mark, which we called the hashbang. This way you could discern the difference between a traditional URL using the fragment identifier for sub-content on a page and a fragment identifier being used by JavaScript. This was recommended for a while. However, nowadays there's a modern JavaScript API that makes these old techniques less necessary: the History API.

The History API is great because it enables managing the history state of the URL without requiring complete reloads of the browser, all through JavaScript. So we get the best of both worlds: dynamically fetched content with clean, traditional URLs. And I can tell you that, from Google's perspective, we no longer index the single-hash workaround, and we discourage the use of the hashbang trick as well.
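As a small sketch of how the History API can back client-side navigation with clean URLs (loadContent is a hypothetical function that fetches and renders the new content; it isn't from the talk):

    // Intercept a link click, update the URL, and fetch content dynamically.
    document.querySelector('a[href="/sessions"]').addEventListener('click', function (event) {
      event.preventDefault();
      history.pushState({ view: 'sessions' }, '', '/sessions'); // clean URL, no reload
      loadContent('/sessions');                                 // hypothetical loader
    });

    // Support the back and forward buttons.
    window.addEventListener('popstate', function () {
      loadContent(location.pathname);
    });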

OK, so that was crawling. Now let's dig into the indexing step. Crawlers ideally want to find all the content on your website; if a crawler can't see some content, how is it going to index it? The crawlable content of a page includes all the text, imagery, and video, and even hidden elements such as structured metadata. In other words, it's everything, and don't forget about embedded content: you want crawlers to see that embedded content too.

Now, this might seem really obvious, but I want to emphasize that at Google we take HTTP codes pretty seriously, especially 404 Not Found codes. If crawlers find a page that has a 404 status code, they probably won't even bother indexing it. And lastly, of course, a crawler wants to find all the links on a page as well, as these links allow the crawler to find new pages to crawl. There's a nuance here, though: how do search crawlers like Google find links? I can't speak for all crawlers, but I can say that at Google we only analyze one thing: anchor tags with href attributes. That's it. For example, a span with a click handler won't get crawled, because it's not an anchor, and an anchor without an href attribute won't get crawled either. But if you want to use JavaScript, such as the History API I mentioned earlier, to navigate the page purely on the client and fetch new content dynamically, you can do that, so long as you use anchor tags with href attributes. That last point matters because most crawlers, Google included, will not simulate navigation of a page to find links; only anchor tags with href attributes will be followed for linking.
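To illustrate with markup in the spirit of the slides (the paths and the navigate handler are placeholders):

    <!-- Crawled: an anchor tag with an href attribute -->
    <a href="/sessions/one">Session one</a>

    <!-- Also crawled: the href is present even though JavaScript handles the click -->
    <a href="/sessions/two" onclick="navigate(event)">Session two</a>

    <!-- Not crawled: not an anchor tag -->
    <span onclick="navigate('/sessions/three')">Session three</span>

    <!-- Not crawled: an anchor tag without an href attribute -->
    <a onclick="navigate('/sessions/four')">Session four</a>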

But wait, is that really everything? In order to have sifted through the HTML and indexed the page, we need to have the HTML in the first place. In the early days of the web, the server simply gave it to us, but nowadays that's not really the case. So let's insert a step between crawling and indexing: rendering. We need to recognize that search crawlers themselves might need to take on this rendering task as well, because otherwise they can't understand the websites we're building, since these sites render their HTML in the browser itself using JavaScript and templating frameworks. And when I say rendering, I don't mean drawing pixels to the screen; I'm talking about the actual construction of the HTML itself. Ultimately that rendering can happen on the server, on the client, or a combination of the two can be used, and we call that hybrid rendering. If the content is rendered on the server, we get the HTML immediately, but if it's rendered on the client, things get a little bit trickier, and that's the challenge we'll discuss today.

One last thing: you might be wondering what Google Search actually runs on. Is it a single computer with crawling and indexing software running on it? Maybe in the '90s that was the case, but nowadays, due to the sheer size of the web, Google Search is comprised of thousands of machines running distributed software that's constantly crunching data to understand all of this continuously expanding information on the web. And to be honest, I think we sometimes take that for granted. I recently learned that with the Knowledge Graph we have a sense of how more than 1 billion things in the real world are connected, with billions of facts between them.

OK, now that we know the principles of how search works, let's see how the three key steps, crawling, rendering, and indexing, all connect, because one crucial thing to understand is the cycle of how Googlebot works, or how it should ideally work.

As you can see, we want these three steps to hand over to one another instantly, and as soon as the content is crawled and rendered, we want to index it, to keep the Google Search index as fresh as possible. This sounds simple, right? Well, it would be if all the content were rendered on the server and complete when we crawl it. But if a site uses client-side rendering, that's not going to be the case, just like the Angular sample I showed you earlier. So what does Google do in this situation?

Googlebot is able to render pages when it encounters JavaScript, but rendering at the scale of the web requires a lot of time and computational resources, and make no mistake, this is a serious challenge for search crawlers, Googlebot included. And so we come to the important piece of information about Google Search that we'd like to share with you today, which is that currently the rendering of JavaScript-powered websites in Google Search is deferred until Googlebot has the resources available to process that content. You might be thinking: OK, but what does that really mean? I'll show you.

In reality, Googlebot's process looks a bit different from the ideal. We crawl a page, we fetch the server-side rendered content, and then we run some initial indexing on that document. But rendering JavaScript-powered web pages takes processing power and memory, and while Googlebot is very, very powerful, it doesn't have infinite resources. So if the page has JavaScript in it, the rendering is deferred until we have the resources ready to render the client-side content, and then we index the content further. In other words, Googlebot might index a page before rendering is complete, and the final render can actually arrive several days later. When that final render does arrive, we perform another wave of indexing on that client-side rendered content. This effectively means that if your site uses a heavy amount of client-side JavaScript for rendering, you can be tripped up at times when your content is being indexed, due to the nature of this two-phase indexing process.

Ultimately, what I'm really trying to say is: because Googlebot runs two waves of indexing across your content, it's possible some details might be missed. For example, if your site is a progressive web application built around a single-page app model, then it's likely all your unique URLs share some base template of resources that are then filled in with content via AJAX or fetch requests. If that's the case, ask yourself: did the initial server-side rendered version of the page have the correct canonical URL included in it? Because if you're relying on that tag to be rendered by the client, we'll actually miss it completely, since the second wave of indexing doesn't check for the canonical tag at all. Additionally, if the user requests a URL that doesn't exist and you attempt to use JavaScript to send the user a 404 page, then we're going to miss that too.

Now, John's going to talk more about these issues later in the talk. The important thing to take away right now is that these really aren't minor issues; they're real issues that can affect your indexability: canonical tags, HTTP error codes, and how Google understands the content on your web pages.

I also want to be clear that not all web pages on a website necessarily need to be indexed. For example, on the Google I/O website we wanted the individual session pages to be indexed, but originally the canonical tags were rendered on the client and the URLs were fragment-identifier based. So the pages were rebuilt with clean URLs and server-side rendered canonical tags to ensure the session descriptions would be properly indexed, because we care about that content, and to ensure these documents would be crawled, they were added to the sitemap as well. But what about the single-page app that allows for filtering sessions on the page? That didn't need to be indexed, so it can keep using client-side rendering. So ask yourself this: do the pages I care about, from the perspective of content and indexing, use client-side rendering?

OK. So now you know that when building a client-side rendered website you must tread carefully. As the web and the industry have gotten bigger, sites, teams, and companies have become more complex, and often the people building a website aren't the same people promoting or marketing that website. These are challenges we're facing together as an industry, from both Google's perspective and yours as developers. After all, you want your content indexed by search engines, and so do we. So this seems like a good moment to change track. John, do you want to take over and tell everyone about the Google Search policy changes and some of the best practices they can apply, so we can meet this challenge together?

Sure. Thanks, Tom. That was a great summary of how search works.

Though I still don't know about those pictures of spiders; they're scary. In reality, Googlebot is actually quite friendly. Anyway, as Tom mentioned, the indexing of modern JavaScript-powered websites is a challenge, both for Google as a search engine and for you all as developers of the modern web, and while development on our side is still ongoing, we'd like to help you tackle the challenge in a more systematic way. For that, we have three things: the policy change that we mentioned briefly before, some new tools that are available to help you diagnose these issues a little bit better, and lastly a bunch of best practices to help you make better JavaScript-powered websites that also work well in Search.

We've already talked briefly about client-side rendering and server-side rendering. Client-side rendering is the traditional state, where the JavaScript is processed on the client, meaning the user's browser, or on a search engine. With server-side rendering, your server processes the JavaScript and serves mostly static HTML to search engines. This also has speed advantages: especially on lower-end mobile devices, JavaScript can take a bit of time to run. For both of these, we index the state as it's ultimately seen in the browser; that's what we pick up, and we try to render pages when we need to do that.

There's a third type of rendering that we've talked about in the past. It starts the same way, in that pre-rendered HTML is sent to the client, so you have the same speed advantages. However, on interaction, or after the initial page load, the server adds JavaScript on top of that. As with server-side rendering, our job as a search engine is pretty easy here: we just pick up the pre-rendered HTML content. We call this hybrid rendering, and it's our long-term recommendation; we think this is probably where things will end up in the long run. In practice, though, implementing it can still be a bit tricky. A quick call-out to Angular, since we featured them in the beginning as an example of a page that was hard to pick up: they've built a hybrid rendering mode with Angular Universal that makes this a little bit easier. Over time, I imagine more frameworks will have something similar to make it easier for you to do this in practice.

However, at least at the moment, if your server isn't already running JavaScript, this is something you're going to have to deal with as well. So for Search we have another option we'd like to introduce: we call it dynamic rendering. In a nutshell, dynamic rendering is the principle of sending normal client-side rendered content to users, and sending fully server-side rendered content to search engines and to other crawlers that need it. This is the policy change we mentioned before. We call it dynamic because your site dynamically detects whether the requester is a search engine crawler, like Googlebot, and only then sends server-side rendered content directly to the client. You can include other web services here as well that can't deal with rendering, for example social media services or chat services, anything that tries to extract structured information from your pages. For all other requesters, your normal users, you would serve your normal hybrid or client-side rendered code. This gives you the best of both worlds, and it makes it easy for you to migrate to hybrid rendering for users over time as well.

One thing to note: this is not a requirement for JavaScript sites to be indexed. As you'll see later, Googlebot can render most pages already. For dynamic rendering, our recommendation is to add a new tool or step in your server infrastructure to act as a dynamic renderer. This reads your normal client-side content and sends a pre-rendered version to search engine crawlers. So how might you implement that? We have two options that can help you get started.

The first is Puppeteer, a Node.js library that wraps a headless version of Google Chrome underneath. This allows you to render pages on your own. Another option is Rendertron, which you can run as a software-as-a-service that renders and caches your content on your side as well. Both of these are open source, so you could make your own version, or use something from a third party that does something similar. For more information on these, I'd recommend checking out the I/O session on Headless Chrome; I believe there's a recording of that already. Either way, keep in mind that headless rendering can be pretty resource-intensive, so we recommend doing it out of band from your normal web serving and implementing caching as you need it.
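As a minimal sketch of the Puppeteer approach, this renders a URL in headless Chrome and returns the serialized HTML (caching and server wiring are omitted; this is an assumption about how you might glue it together, not code from the talk):

    // npm install puppeteer
    const puppeteer = require('puppeteer');

    // Render a URL in headless Chrome and return the final, post-JavaScript HTML.
    async function renderPage(url) {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      // Wait until the network is idle so client-side rendering has finished.
      await page.goto(url, { waitUntil: 'networkidle0' });
      const html = await page.content(); // serialized DOM after rendering
      await browser.close();
      return html;
    }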

Let's take a quick look at what your server infrastructure might look like with a dynamic renderer integrated. Requests from Googlebot come in on one side; they're sent to your normal server and then, perhaps through a reverse proxy, they're sent to the dynamic renderer. There, the renderer requests and renders the complete final page and sends that back to the search engine. So, without needing to implement and maintain any new code, this setup can enable a website that's designed only for client-side rendering to perform dynamic rendering of the content for Googlebot and other appropriate clients. If you think about it, this solves the problems Tom mentioned before, and now we can be confident that the important content of our web pages is available to Googlebot when indexing.

So how might you recognize Googlebot requests? It's actually pretty easy: the simplest way is to look for "Googlebot" in the user-agent string, and you can do something similar for the other services that you want to serve pre-rendered content to. For Googlebot, as well as some others, you can also do a reverse DNS lookup if you want to be sure that you're serving pre-rendered content just to legitimate clients.
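A sketch of that user-agent check as Express-style middleware, reusing the renderPage function sketched above (the bot list and URL are illustrative, and app is assumed to be an existing Express app):

    // Serve pre-rendered HTML to known crawlers; everyone else gets the normal app.
    const BOT_UA = /googlebot|bingbot|twitterbot|facebookexternalhit/i;

    app.use(async function (req, res, next) {
      if (BOT_UA.test(req.headers['user-agent'] || '')) {
        const html = await renderPage('https://example.com' + req.originalUrl);
        res.send(html); // server-side rendered content for crawlers
      } else {
        next();         // normal client-side rendered app for users
      }
    });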

One thing to watch out for here: if you serve adaptive content to smartphone users versus desktop users, or you redirect users to different URLs depending on the device they use, you must make sure that dynamic rendering also returns device-focused content. In other words, smartphone users should see the mobile version of the page and desktop users should see the desktop version. If you're using responsive design, meaning the same HTML with CSS conditionally changing the way the content is shown to users, this is one thing you don't need to watch out for, because the HTML is exactly the same.

What's not immediately clear from the user agent is that Googlebot is currently using a somewhat older browser to render pages: it uses Chrome 41, which was released in 2015. The most visible implication for developers is that newer JavaScript versions and coding conventions, like arrow functions, aren't supported by Googlebot. Anything that was added after Chrome 41 currently isn't supported; you can check these on a site like caniuse.com. And while you could theoretically install an older version of Chromium to test with, we don't recommend doing that, for obvious security reasons. Additionally, there are some APIs that Googlebot doesn't support because they don't provide additional value for Search; we'll look at those later too.
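For example, an arrow function would need to be transpiled to an ES5 equivalent before Chrome 41 could run it (sessions is just a placeholder array; Babel is one commonly used transpiler):

    // ES2015+: not supported by Chrome 41
    const titles = sessions.map(session => session.title);

    // ES5 equivalent, as a transpiler would produce
    var titles = sessions.map(function (session) {
      return session.title;
    });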

All right, so you might be thinking: this sounds like a lot of work, and Google manages to render a lot of pages properly, so do I really need to do this? Well, there are a few reasons to watch out. The first is if your site is large and rapidly changing, for example a news website with a lot of new content that keeps coming out regularly and requires quick indexing. As Tom showed, rendering is deferred from indexing, so on a large, dynamic website the new content could otherwise take a while to be indexed. Secondly, if you rely on modern JavaScript functionality, for example if you have any libraries that can't be transpiled back to ES5, then dynamic rendering can help you there. That said, we continue to recommend using proper graceful degradation techniques so that even older clients have access to your content. And finally, there's a third reason to look into this: social media. If your site relies on sharing through social media or through chat applications, and these services require access to your page's content, then dynamic rendering can help you there too.

So when might you not use dynamic rendering? I think the main aspect here is balancing the time and effort needed to implement and run this against the gains that you receive. Remember, implementation and maintenance of dynamic rendering can use a significant amount of server resources. And if you don't see a high frequency of changes on your site, you may not need to implement anything special at all; Googlebot should be able to render most pages just fine. Like I mentioned, if your pages render properly, they probably don't need dynamic rendering.

Let's take a look at a few tools that can help you figure out where you stand. When diagnosing rendering, we recommend doing so incrementally: first checking the raw HTTP response, and then checking the rendered version, either on mobile alone or on mobile and desktop if you serve different content. Let's take a quick look at these.

To look at the raw HTTP response, one way is to use Google Search Console. To gain access to Search Console and a few of the other features there, you first need to verify ownership of your website. That's really easy to do, and there are a few ways to do it, so I'd recommend doing it regardless of what you're working on. Once verified, you can use a tool called Fetch as Google, which shows the HTTP response that was received by Googlebot, including the response code on top and the HTML that was provided before any rendering was done. This is a great way to double-check what's happening on your server, especially if you're using dynamic rendering to serve different content to Googlebot.

Once you've checked the raw response, I recommend checking how the page is actually rendered. The tool I use for this is the mobile-friendly test. It's a really fast way of checking Google's rendering of a page, and as the name suggests, it's made for mobile devices. As you might know, over time our indexing will be primarily focused on the mobile version of a page, something we call mobile-first indexing, so it's good to already start focusing on the mobile version when you're testing rendering. We recommend testing a few pages of each kind of page within your website. For example, if you have an e-commerce site, check the homepage, some of the category pages, and some of the detail pages. You don't need to check every page on the whole website, because a lot of the time the templates will be pretty similar. If your pages render well here, then chances are pretty high that they'll render well in Google Search too.

One downside here is that you just see a screenshot; you don't see the rendered HTML. So what's one way to check the rendered HTML? Well, as of just yesterday, we've added a way to review the rendered HTML. This is also in the mobile-friendly test: it shows you what was created after rendering with the mobile Googlebot, including all of the markup for links, for images, for structured data, and any invisible elements that might be on the page after rendering.

So what do you do if the page just doesn't render properly at all? We also just launched a way to get full information about loading issues on a page. This part, also in the mobile-friendly test, shows you all of the resources that were blocked by Googlebot; these could be JavaScript files or API responses.

A lot of the time, not everything needs to be crawled, kind of like Tom mentioned. For example, if you have tracking pixels on a page, Googlebot doesn't really need to render those. But if you use an API to pull in content from somewhere else and that API endpoint is blocked by robots.txt, we can't pull in that content at all. The list of all of these issues is also available in Search Console.

When pages fail in a browser, I usually check the developer console for more information to see details on exceptions. New for I/O, and one of the most requested features from people who make JavaScript-powered sites for Search, we're now also showing the console log when we try to render something. This allows you to check for all kinds of issues, for example if you're shipping ES6 or if you just have other issues with the JavaScript when we try to run it. This makes my life so much easier, because I don't have to help people with all of these detailed rendering issues as much.

Desktop is also a topic that still comes up, as you've seen, and some sites still serve desktop content like that, so you can run all of these diagnostics in the Rich Results test as well; that tool shows the desktop version of these pages.

So now that we've seen how to diagnose issues, what kind of issues have we actually run across with modern JavaScript-powered sites? What patterns do you need to watch out for and handle well on your side? Remember that Tom mentioned at the beginning of the talk something about lazy-loading images and being unsure if they're indexable. Well, it turns out they're indexable only sometimes, so it's good to look at how lazy-loading is implemented: Googlebot may be able to trigger it, and with that, may be able to pick up these images for indexing.

For example, if the images are above the fold and your lazy-loading loads those images automatically, then we'll probably see them. However, if you want to be sure that Googlebot is able to pick up lazy-loaded images, one way to do that is to use a noscript tag: wrap a normal image element in a noscript tag, and we'll be able to pick that up for image search directly.
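A sketch of that noscript pattern (the data-src attribute is just one common convention for lazy loaders):

    <!-- Lazy-loaded by JavaScript for users -->
    <img class="lazy" data-src="/images/session.jpg" alt="Session photo">

    <!-- Plain fallback that crawlers can always pick up -->
    <noscript>
      <img src="/images/session.jpg" alt="Session photo">
    </noscript>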

Another approach is to use structured data on a page: when we see structured data that refers to an image, we can also pick that up for image search.
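For instance, a JSON-LD block that references the image is one way to do this (the type and values here are placeholders using schema.org vocabulary):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "ImageObject",
      "contentUrl": "https://example.com/images/session.jpg",
      "name": "Session photo"
    }
    </script>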

One thing to watch out for, though: images that are referenced only through CSS are currently not indexed; we only index images that are embedded with structured data or with image tags. Now, what about tabs that load their content after you click on them, or infinite scroll patterns on a site? Googlebot generally won't interact with a page, so it won't be able to see this content. There are two ways you can still get it to Googlebot: either preload the content and just use CSS to toggle its visibility on and off, so Googlebot can see it in the preloaded version, or alternatively use separate URLs and navigate the user, and Googlebot, to those pages individually.
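A sketch of the first option: both tabs are present in the initial DOM, and clicking only toggles CSS visibility (the class and id names are illustrative):

    <div class="tab-content" id="description">This text is in the preloaded HTML...</div>
    <div class="tab-content hidden" id="reviews">...and so is this, so crawlers can see both.</div>

    <style>.hidden { display: none; }</style>

    <script>
      // Toggle visibility only; nothing is fetched on click.
      function showTab(id) {
        document.querySelectorAll('.tab-content').forEach(function (el) {
          el.classList.toggle('hidden', el.id !== id);
        });
      }
    </script>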

Next, performance. Googlebot is a patient bot, but there are a lot of pages we have to crawl, so we have to be efficient and go through pages fairly quickly. If pages are slow to load or render, Googlebot might miss some of the content, and since embedded resources are aggressively cached for Search rendering, timeouts are really hard to test for. For these problems, we recommend making performant and efficient web pages, which you're hopefully already doing for your users anyway. In particular, limit the number of embedded resources and avoid artificial delays such as timed interstitials. You can test pages with the usual set of tools, and roughly test rendering with the mobile-friendly testing tool. Timeouts are a little bit different for indexing, but in general, if pages work in the mobile-friendly test, they'll usually work for Search indexing too.

Additionally, Googlebot wants to see the page as a new user would see it, so it crawls and renders pages in a stateless way. Anything that tries to store something locally would not be supported, so if you use any of these technologies, make sure to use graceful degradation techniques to allow anyone to view your pages, even if these APIs are not supported.
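A sketch of such a graceful degradation check for local storage; a stateless client like Googlebot simply falls through to the default:

    // Feature-detect storage and fall back gracefully when it's unavailable.
    function getPreferredTheme() {
      try {
        return window.localStorage.getItem('theme') || 'light';
      } catch (e) {
        return 'light'; // stateless or restricted client: use the default
      }
    }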

And that was it with regard to critical best practices. Now it's time to recap what we've seen. First, we recommend checking for proper implementation of the best practices we talked about; lazy-loaded images in particular are really common. Second, test a sample of your pages with the mobile-friendly test, and use the other testing tools as well. Remember, you don't need to test all of your pages; just make sure you have all of the templates covered. And finally, if your site is large and quickly changing, or you can't reasonably fix rendering across the site, then maybe consider using dynamic rendering techniques to serve Googlebot and other crawlers a pre-rendered version of your page. And if you do decide to use dynamic rendering, make sure to double-check the results there as well.

One thing to keep in mind: indexing isn't the same as ranking. But generally speaking, pages do need to be indexed before their content can appear in Search at all. Tom, do you think that covers about everything? Well, it was a lot to take in, John, some amazing content. But I guess one question I have, and I think maybe other people in the audience have it as well: is it always going to be this way, John? That's a great question, Tom. I don't know, but I think things never stay the same.

As we mentioned at the beginning, this is a challenge that's important to us within Google Search. We want our search results to reflect the web as it is, regardless of the type of website that's used. So our long-term vision is that you, the developers, shouldn't need to worry as much about this for search crawlers. Circling back to the diagram that Tom showed at the beginning, with deferred rendering: one change we want to make is to move rendering closer to crawling and indexing. Another change we want to make is to have Googlebot use a more modern version of Chrome over time. Both of these will take a bit of time. I don't like making long-term predictions, but I suspect it will be at least until the end of the year until this works a little bit better. We trust that rendering will become more and more common across all kinds of web services, and less critical for modern sites, but whatever happens, the best practices that we talked about will continue to be important here as well. How does that sound?

That sounds really great, John. I think that covers everything, and I hope everyone in the room has learned some new approaches and tools that are useful for making your modern JavaScript-powered websites work well in Google Search. If you have any questions, we'll be in the mobile web sandbox area together with the Search Console team, and alternatively you can always reach out to us online, be it at our live office-hours hangouts or in the webmaster help forum. So thanks, everyone, for your time. Thank you.
