By Till Ahrens on Friday, 31 July 2015
Posted in Technical Issues
Replies 17
Likes 0
Views 610
Votes 0
I have the problem that several pages of my website, as well as EasyBlog pages, are blocked for the robots. Google Webmaster Tools is telling me I have altogether 54 blocked resources on my website. What can I do to unblock them?

here are some of the blocked links:

http://www.astrologiehamburg.com/cache/widgetkit/widgetkit-62d95882.js,50
http://www.astrologiehamburg.com/cache/widgetkit/widgetkit-26460cdf.css,50
http://www.astrologiehamburg.com/components/com_easyblog/assets/images/default_blogger.png,18
http://www.astrologiehamburg.com/components/com_easyblog/assets/css/common.css,17
http://www.astrologiehamburg.com/components/com_easyblog/themes/default/css/styles.css,17
http://www.astrologiehamburg.com/components/com_easyblog/themes/hako/styles/style.min.css,12
http://www.astrologiehamburg.com/components/com_easyblog/assets/images/loader.gif,10
http://www.astrologiehamburg.com/components/com_easyblog/assets/images/default_category.png,4
http://www.astrologiehamburg.com/templates/protostar/css/template.css,3
http://www.astrologiehamburg.com/images/headers/blue-flower.jpg,3
http://www.astrologiehamburg.com/media/jui/js/jquery-migrate.min.js,2
http://www.astrologiehamburg.com/media/jui/js/jquery-noconflict.js,2
http://www.astrologiehamburg.com/media/jui/js/bootstrap.min.js,2
http://www.astrologiehamburg.com/templates/protostar/js/template.js,2
http://www.astrologiehamburg.com/media/jui/js/jquery.min.js,2
http://www.astrologiehamburg.com/media/system/images/arrow.png,2
http://www.astrologiehamburg.com/components/com_easyblog/themes/nickel/styles/style.min.css,1
http://www.astrologiehamburg.com/media/system/js/html5fallback.js,1
http://www.astrologiehamburg.com/media/foundry/3.1/config/670907d32bcfefe50f98ffecdaa08447.js,1
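Every blocked path in the list above sits under a folder that Joomla's stock robots.txt disallows (`/cache/`, `/components/`, and in older Joomla versions also `/media/`, `/templates/`, `/images/`). A quick way to reproduce the blocking locally is Python's `urllib.robotparser`; the rules below are an assumed subset of the site's robots.txt, not the real file:

```python
from urllib import robotparser

# Assumed subset of the site's robots.txt -- these Disallow lines ship
# with (older) Joomla default installs; the live file may differ.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cache/
Disallow: /components/
Disallow: /media/
Disallow: /templates/
Disallow: /images/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for url in (
    "http://www.astrologiehamburg.com/cache/widgetkit/widgetkit-62d95882.js",
    "http://www.astrologiehamburg.com/components/com_easyblog/assets/css/common.css",
    "http://www.astrologiehamburg.com/till-ahrens-blog/entry/zeitqualitaet-und-jahresvorschau-2015.html",
):
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(verdict, url)
```

Under these assumed rules the two asset URLs come back blocked while the blog entry itself is allowed, which matches the pattern in the list above: assets, not articles.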
It is actually correct to block all of these asset files. Why do you want to unblock them at all? Robots don't need to see any of these asset files.
·
Friday, 31 July 2015 13:47
·
0 Likes
·
0 Votes
·
0 Comments
·
Seems like Google Webmaster Tools is having problems with it and asking me to solve it. I don't get it...
For example, this EasyBlog link is blocked too:
http://www.astrologiehamburg.com/till-ahrens-blog/entry/zeitqualitaet-und-jahresvorschau-2015.html
Shouldn't this be readable by the robots?
·
Friday, 31 July 2015 13:57
·
0 Likes
·
0 Votes
·
0 Comments
·
The message I got is: Googlebot cannot access CSS and JS files on http://www.astrologiehamburg.com/
·
Friday, 31 July 2015 14:01
·
0 Likes
·
0 Votes
·
0 Comments
·
To: Webmaster of http://www.astrologiehamburg.com/

Google's systems have recently detected an issue with your home page that affects how our algorithms render and index your content. Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google determine whether your site is functioning properly. Blocking access to these files can lead to poorer rankings.
·
Friday, 31 July 2015 14:02
·
0 Likes
·
0 Votes
·
0 Comments
·
I don't really know why the robots would need to access your CSS and JS files, but you can always remove those entries from your site's robots.txt file.
·
Friday, 31 July 2015 14:57
·
0 Likes
·
0 Votes
·
0 Comments
·
Hm... But the weird thing is that the robots obviously cannot access my EasyBlog pages either, like the link I gave you above. So this is really necessary if I want my blog posts to be found on Google, isn't it?
·
Friday, 31 July 2015 15:11
·
0 Likes
·
0 Votes
·
0 Comments
·
What are the errors being generated by Webmaster Tools for your EasyBlog pages?
·
Friday, 31 July 2015 15:16
·
0 Likes
·
0 Votes
·
0 Comments
·
Hm... Suddenly all the errors are gone. I asked Google to crawl my site again. Did you do anything? I don't really understand it...

Google said: Remove the restrictions on your CSS and JavaScript files from your robots.txt, test the changes with the robots.txt Tester, then upload the revised robots.txt file to your site and submit it to Search Console.

But how can I do this? Which folders are the CSS and JavaScript files in?
Is there any disadvantage if I allow the robots to read the components and cache folders and so on?
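On the "which folder" question: robots.txt lives in the web root next to index.php, so there is only the one file to edit. A minimal sketch of what the change could look like, assuming the site still carries Joomla's stock Disallow lines (the live file may differ):

```text
# public_html/robots.txt -- sketch, not the site's actual file
User-agent: *
Disallow: /cache/
Disallow: /components/
# ... keep the remaining Joomla Disallow lines ...

# Re-open just the rendering assets Googlebot complained about:
Allow: /cache/widgetkit/
Allow: /components/com_easyblog/assets/
Allow: /media/
Allow: /templates/
```

Allowing crawlers to read these asset folders has no real downside beyond a little extra crawl traffic: the files are publicly served anyway, and the HTML pages themselves stay governed by their own rules.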
·
Friday, 31 July 2015 15:40
·
0 Likes
·
0 Votes
·
0 Comments
·
I just checked Google. It's a problem that quite a lot of people seem to have. Most articles advise you to allow the robots to access the CSS and JavaScript files so you don't get a worse ranking in Google.

I should add this to the robots.txt file:
User-Agent: Googlebot
Allow: /*.js
Allow: /*.css

What do you think?
And where do I add it?
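One caveat with blanket Allow lines: under the longest-match rule that Google documents (later standardized in RFC 9309), the rule whose path pattern matches the most octets wins, and Allow only breaks ties. A short `Allow: /*.js` can therefore lose to a longer `Disallow: /components/`. A simplified Python sketch of that matching logic, using hypothetical rule sets:

```python
import re

def _matches(pattern, path):
    # Googlebot-style pattern: '*' is a wildcard, '$' anchors the end.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

def allowed(rules, path):
    """rules: list of ('allow'|'disallow', pattern). Longest matching
    pattern wins; on a tie, 'allow' wins (simplified RFC 9309 rule)."""
    best = ("allow", "")  # no matching rule -> allowed
    for kind, pattern in rules:
        if _matches(pattern, path):
            if len(pattern) > len(best[1]) or (
                len(pattern) == len(best[1]) and kind == "allow"
            ):
                best = (kind, pattern)
    return best[0] == "allow"

rules = [
    ("disallow", "/cache/"),
    ("disallow", "/components/"),
    ("allow", "/*.js"),
    ("allow", "/*.css"),
]
# '/*.js' (5 chars) loses to the longer 'Disallow: /cache/' (7 chars):
print(allowed(rules, "/cache/widgetkit/widgetkit-62d95882.js"))  # False

# Longer, folder-specific Allow rules out-rank the Disallow lines:
better = rules + [
    ("allow", "/cache/*.js"),
    ("allow", "/components/*.css"),
]
print(allowed(better, "/cache/widgetkit/widgetkit-62d95882.js"))  # True
print(allowed(better, "/components/com_easyblog/assets/images/default_blogger.png"))  # False
```

Note that the PNG stays blocked even with the extra Allow rules, because they only cover `.js` and `.css`; that would explain why an image like default_blogger.png can keep erroring after scripts and stylesheets have been opened up.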
·
Friday, 31 July 2015 15:47
·
0 Likes
·
0 Votes
·
0 Comments
·
Ah... For example, Google wants to access this one, and it still gives me an error:
http://www.astrologiehamburg.com/components/com_easyblog/assets/images/default_blogger.png,18
·
Friday, 31 July 2015 15:56
·
0 Likes
·
0 Votes
·
0 Comments
·
Hey Till,

Hm, this is interesting. I guess Google just rolled out a new algorithm for crawling sites, because until now it was never considered a good idea to allow Googlebot to crawl JS and CSS files.

As for those errors, I guess it's just Google displaying warnings about specific CSS rules being used.
·
Saturday, 01 August 2015 03:04
·
0 Likes
·
0 Votes
·
0 Comments
·
What do you suggest me to do?
·
Sunday, 02 August 2015 13:24
·
0 Likes
·
0 Votes
·
0 Comments
·
I think you can just ignore those CSS warnings; the bot is probably just being strict about the CSS rules being used. Even Joomla's own CSS files are throwing warnings.
·
Sunday, 02 August 2015 13:39
·
0 Likes
·
0 Votes
·
0 Comments
·
okay, thanks.
Then we can close this thread.
·
Sunday, 02 August 2015 14:14
·
0 Likes
·
0 Votes
·
0 Comments
·
No problem Till
·
Sunday, 02 August 2015 16:17
·
0 Likes
·
0 Votes
·
0 Comments
·