In the beginning there was text and that’s all that Google could see and Google saw that it was good.
Then came the day when Google got with the times and realized that images, styling, and scripts might be nice to see as well.
So what happened? In very simplistic terms, Google's crawler bot started as a text-only browser. The text in the body of your page was the only thing that mattered. Not images. Not styling. Not features. Just text.
This was for two main reasons:
- When Google started, pages were just simpler. Little beyond the text mattered on a webpage.
- Rendering a full page takes a proportionally huge amount of computing power compared to fetching just the text.
However, as the web evolved and CSS, JavaScript, and images became integral to nearly every site, Google had to start taking them into account.
Although we don’t know exactly how Google ranks sites, it’s safe to say that keeping up with the times is going to help you show up higher in Google.
What does this mean for you?
Since Google used to see the web as text only, it was common practice to block Google from crawling anything else. Google stated a while back (http://googlewebmastercentral.blogspot.ca/2014/10/updating-our-technical...) that you should let them crawl anything that contributes to the look of your site: remove any CSS or JS files from your robots.txt file. They have now started sending warnings to webmasters about this.
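As a minimal sketch of what to look for in your own file (the paths here are hypothetical; your site's directory names will differ), a blocking robots.txt and an unblocked one might look like this:

```
# Before: these Disallow rules stop Googlebot from fetching
# the stylesheets and scripts it needs to render your page
# (example paths only)
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# After: the Disallow lines are removed, and CSS/JS are
# explicitly allowed anywhere on the site
User-agent: *
Allow: /*.css$
Allow: /*.js$
```

Note that the `*` wildcard and `$` end-of-URL marker are supported by Googlebot, though not by every crawler. Simply deleting the Disallow lines for CSS and JS directories also works, since anything not disallowed is crawlable by default.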
This is how one of our clients, Eikon, looked to Google before the CSS and JS were unblocked. It doesn’t really give Google a good view of how the site looks or what the user experience might be like.
Here’s what it looked like after unblocking. Which do you think looks better to Google?
TL;DR: Unblock CSS and JS files in your robots.txt file. If you don’t know how, give us a call and we’ll help you out.