Google never does anything without a reason. Sometimes it’s just a matter of waiting patiently to figure out what that reason is.
In May, Google added a fetch and render tool to Google Webmaster Tools, built to show how Googlebot renders web pages. At the time, it was unclear why the company was introducing the tool, though it hinted at future plans that would involve fetch and render.
On Oct. 27, we got a definitive answer.
That fetch and render tool foreshadowed new guidelines which say that blocking your CSS or JavaScript files from being crawled can hurt your search rankings and indexing. When you allow Googlebot to access these files, along with your images, it can read your pages correctly. When you don’t, you hamper the way Google’s algorithms render your content, and your page rankings can decline as a result.
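In practice, that blocking usually happens in robots.txt. Here’s a minimal sketch of the fix, assuming hypothetical /css/, /js/, and /images/ directories; substitute the paths your site actually uses:

```
# Before: a common robots.txt pattern that now hurts indexing,
# because Googlebot can't fetch the files it needs to render the page:
#
#   User-agent: *
#   Disallow: /css/
#   Disallow: /js/

# After: explicitly let crawlers reach the resources used for rendering.
User-agent: *
Allow: /css/
Allow: /js/
Allow: /images/
```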
So the tool released a few months earlier was essentially a warmup: you can use it to make sure Googlebot is rendering your web pages correctly.
It’s all part of the drive toward better user experience that ultimately lies behind the changes Google has made.
The Nitty-Gritty of the Changes
Google says the change was made to bring its indexing system closer to a modern browser, which has CSS and JavaScript turned on. So, as always, Google’s claim is that it’s doing this for the greater good: it wants to make sure it’s reading pages just like the people who will be looking for your content.
That’s a big change from before, when Google’s indexing systems were more like text-only browsers; Google cites Lynx as an example. But the search engine says that approach no longer makes sense, since its indexing is now based on rendering pages the way a modern browser does.
The search engine offers a few suggestions for optimal indexing, including:
- Getting rid of unnecessary downloads
- Merging your CSS and JavaScript files
- Using the progressive enhancement guidelines in your web design (see the sketch below)
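To illustrate that last point, here is a minimal progressive-enhancement sketch; the element names and content are hypothetical. The idea is that the content itself is plain HTML that any crawler can read, and JavaScript only layers behavior on top, so nothing important is lost if a script is blocked or fails to run:

```html
<!-- Baseline: the content is plain HTML that any crawler or
     text-only browser can read, no script required. -->
<article id="report">
  <h2>Quarterly Report</h2>
  <p>The full report text lives in the markup itself.</p>
</article>

<script>
  // Enhancement layer: add a show/hide toggle only when JavaScript runs.
  // If this script is blocked or fails, nothing is lost: the article
  // above stays fully visible and fully crawlable.
  var report = document.getElementById('report');
  var toggle = document.createElement('button');
  toggle.textContent = 'Hide report';
  toggle.addEventListener('click', function () {
    var hidden = report.style.display === 'none';
    report.style.display = hidden ? '' : 'none';
    toggle.textContent = hidden ? 'Hide report' : 'Show report';
  });
  report.parentNode.insertBefore(toggle, report);
</script>
```

Built this way, a page degrades gracefully: Googlebot and a text-only browser both see the same article text that your visitors do.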
What This Means
With any Google change, the real question is: what does this mean? How will it affect webmasters, and what sort of impact could it have on SEO?
Clearly, the answer to that second question is that sites that do not adhere to the suggested guidelines will see their search results suffer. Make sure your webmaster fully understands what Google is asking for, and discuss what changes should be implemented and how they could affect your Google rankings.
Your aim is to create crawlable content, and that means doing whatever Google suggests. Use the fetch and render tool to make sure everything on your site is in order. It will crawl and display your site just as it would appear in your target audience’s browsers.
The tool gathers all of your resources: CSS files, JavaScript files, images. Then it runs the code to render your page’s layout as an image. Once that image comes up, you can do some detective work. Is Googlebot seeing the page the same way it renders in your browser?
If yes, you are in good shape. If no, you need to figure out what tweaks to make so that Google sees the same thing you do. Here are potential problems that could be making your site’s content non-crawlable:
- Your website is blocking JavaScript or CSS
- Your server can’t handle the number of crawl requests you receive
- Your JavaScript is removing content from your pages (illustrated after this list)
- Your JavaScript is too complex and is stopping the pages from rendering correctly
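The third problem deserves a concrete example. Here’s a hypothetical sketch of the anti-pattern: a script that throws away server-rendered text and rebuilds it client-side (the /api/article endpoint is made up for illustration). If the script is blocked, or the request it makes fails, Googlebot may be left indexing an empty shell:

```html
<div id="content">
  <p>Server-rendered article text that any crawler can read immediately.</p>
</div>

<script>
  // Anti-pattern: throw away the server-rendered content and rebuild it
  // client-side from a hypothetical /api/article endpoint.
  var content = document.getElementById('content');
  content.innerHTML = ''; // the crawlable text is now gone

  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/api/article'); // hypothetical endpoint
  xhr.onload = function () {
    // The text only reappears if this script runs and the request
    // succeeds; otherwise the page stays an empty shell.
    content.innerHTML = xhr.responseText;
  };
  xhr.send();
</script>
```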
Why These Changes, Why Now
Google always has intent behind what it does, and here’s my read on its intent with these changes: it’s making user experience a bigger factor in its search rankings. Think about it. The new emphasis on page loading and rendering is a major step in that direction.
That has also prompted speculation that the company could start using mobile user experience in its rankings as well. In recent months, as mobile usage begins to overtake desktop, many have predicted that Google will shift its search engine optimization focus to the mobile web.
So could this be one of the first steps toward those big changes? Perhaps. I always think it’s dangerous to try to get too many steps ahead of Google; the search engine likes to reverse course and throw people off from time to time. It does not like it when SEOs make changes in anticipation of its actions, preferring to dictate the course itself. That said, I do think the idea behind the crawlable-content changes makes sense: you have to keep up with the times.
But others could argue that keeping up with the times is exactly what Google will be doing by putting greater emphasis on mobile user experience.
The Bottom Line
Like any change from Google, this one will require adjustment and a fair bit of vigilance. I think it’s mostly a sign of things to come. User experience is really important to Google these days, and you would be wise to start looking at your mobile site in those terms. Make sure that you are doing everything you can to make your site mobile friendly, while still presenting a great desktop experience.
That way if Google does actually start penalizing based on poor mobile user experience, you will already be two steps ahead.
By Adrienne Erin