Clarification on the Lack of Clarity on Blocked JavaScript, CSS, and Image Files
Not all is revealed but work is progressing.
Thanks to John Mueller for this update.
H/T MaAnna Stephenson
#robottedresources
Originally shared by John Mueller
I see a bunch of posts about the robotted resources message that we're sending out. I haven't had time to go through & review them all (so include URLs if you can :)), but I'll spend some time double-checking the reports tomorrow.
Looking back a lot of years, blocking CSS & JS is something that used to make sense when search engines weren't that smart, and ended up indexing & ranking those files in search. For a long time now, that's no longer the case (happy to get reports if you do see it though), and nowadays it's extremely helpful to be able to access CSS & JS files so that we understand what a page really looks like. Is it mobile-friendly? Is there content that we'd miss without JavaScript? Blocked CSS & JS files are some of the bigger problems that we run across in the meantime.
Search Console gives you tools to find blocked content across your site, to test individual pages to see if all embedded content is accessible, and to test your robots.txt files to find why a specific URL is blocked (and to submit an updated robots.txt so that we can crawl it ASAP). We've also documented this problem in various ways over time, for example in https://developers.google.com/webmasters/mobile-sites/mobile-seo/common-mistakes/blocked-resources
That said, we hear you that these messages aren't as clear as they could be, and we'll double-check your reports (here on G+, on Twitter, in the forums, etc.) to see what we can do to improve that.
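If you want a quick local check along the lines of the Search Console robots.txt tester Mueller mentions, here is a minimal sketch using Python's standard-library urllib.robotparser. The domain, asset paths, and user-agent string below are placeholders, not values from this post.

# Minimal sketch: approximate the robots.txt tester locally.
# The site and asset paths are placeholders -- substitute the URLs
# flagged in your own blocked-resources report.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                 # placeholder domain
ASSETS = [                                       # placeholder CSS/JS paths
    "/wp-content/themes/mytheme/style.css",
    "/wp-includes/js/jquery/jquery.js",
]

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()                                    # fetch and parse the live robots.txt

for path in ASSETS:
    url = SITE + path
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict:8} {url}")

Keep in mind that urllib.robotparser only implements the classic prefix-matching rules, so directives using * or $ wildcards may not be evaluated exactly as Googlebot would evaluate them; treat the Search Console tester as the authoritative check.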
Do I get it right that robots.txt files should be made accessible, Zara Altair? Thanks for sharing, very useful.
Nina Trankova That is my understanding.
Nina Trankova there is no way I'd give a total open door in robots.txt to Google or any other bot. That turns into both a security and performance issue. Here's more info on it http://www.blogaid.net/googlebot-css-and-js-block-warning-checks-fixes
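For readers who prefer this selective approach over opening robots.txt completely, here is a hedged sketch, using the same standard-library module as above, for checking a draft robots.txt before publishing it. The rules and URLs are hypothetical examples, not recommendations from the thread.

from urllib.robotparser import RobotFileParser

# Draft rules: keep sensitive directories blocked while leaving theme
# assets reachable. These paths are hypothetical placeholders.
draft = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())                 # parse the draft without publishing it

checks = {
    "https://www.example.com/wp-content/themes/mytheme/style.css": True,   # should stay allowed
    "https://www.example.com/wp-admin/options.php": False,                 # should stay blocked
}

for url, expect_allowed in checks.items():
    allowed = parser.can_fetch("Googlebot", url)
    marker = "ok" if allowed == expect_allowed else "!!"
    print(f"{marker} {'allowed' if allowed else 'blocked'} {url}")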
Don't forget that many browsers now support plugins that scan the page first and block scripts by default. So no matter what is on the page or how it is configured, it's up to the user to decide what is allowed & what is blocked. The robots.txt file, as well as the META info of the site's index, can be modified for individual pages/directories as well and may help to satisfy those areas you need to optimize. Totally agree with MaAnna Stephenson. I would sooner have a secure site & optimized user experience over scroogling & monopolizing datamining for bots any day.
Besides, despite the race for top ranking, the secret is relevance to what your audience is actually searching for.
Great article btw!
Totally agree, Justin Case. What Google wants and what's good for your site seem to be getting more at odds lately. I'm disappointed with them holding out that ranking carrot as a heavy-handed way to shove it on us too. Especially with most site owners not understanding the fundamentals of SEO and ranking in the first place. I see it every day in site audits. Missing all the basics and chasing these carrots.
Yup exactly MaAnna Stephenson
Can't blame them for trying. It's equally dismaying how many website owners either are illiterate and take these "recommendations" as gospel, or simply don't weigh the "cost" of sacrificing security for the convenience of not thinking about the consequences until it's too late. Then they have the audacity to complain about it!
MaAnna Stephenson and Justin Case, security is prime!