Robots.txt blocking internal resources in WordPress
-
Hi all,
We recently migrated a WordPress website from staging to live, but the robots.txt file was deleted in the process. I've created the following new one:
User-agent: *
Allow: /
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php

However, in the Site Audit on Semrush, I now get a warning that a lot of pages have issues with internal resources blocked by the robots.txt file. These blocked internal resources are all cached and minified assets: links, images, and scripts.
Does this mean that Google won't crawl some parts of these pages with blocked resources correctly and thus won't be able to follow these links and index the images? In other words, is this any cause for concern regarding SEO?
Of course I can change the robots.txt again, but will URLs like https://example.com/wp-content/cache/minify/df983.js end up in the index?
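If it helps, you can sanity-check which URLs a robots.txt file blocks locally with Python's standard-library `urllib.robotparser`. One caveat: Python applies the first matching rule in file order, while Google uses the longest matching path, so in this sketch the `Allow` line is listed first and the redundant `Allow: /` is dropped (the paths and the example URL are the ones from the question):

```python
from urllib import robotparser

# The live rules, minus the redundant "Allow: /" line. The Allow rule is
# listed first because Python's parser applies the first matching rule
# in file order (Google instead picks the longest matching path).
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The minified asset from the question is caught by Disallow: /wp-content/cache/
print(rp.can_fetch("*", "https://example.com/wp-content/cache/minify/df983.js"))  # False
# admin-ajax.php is explicitly allowed, and ordinary pages are unaffected
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/some-page/"))  # True
```

So yes, as written those cached CSS/JS files are off-limits to crawlers that honor robots.txt.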
Thanks for your thoughts!
-
Thanks for the answer!
Last question: is /wp-admin/admin-ajax.php an important part that has to be crawled? I found this explanation: https://wordpress.stackexchange.com/questions/190993/why-use-admin-ajax-php-and-how-does-it-work/191073#191073
However, on this specific website there is no HTML at all when I check that file's source, only a single line containing 0.
-
I would leave all the disallows out except for the /wp-admin/ section. For example, I'd rewrite the robots.txt file to read:
User-agent: *
Disallow: /wp-admin/

Also, you generally want Google to crawl your cached content. In the event your servers go down, Google may still be able to serve your content from its cache.
I hope that helps. Let me know how that works out for you!
-
Thanks for the clear answer.
I've changed the robots.txt to:
User-agent: *
Allow: /
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php

This should avoid problems with (parts of) cached content not being indexed.
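A quick local check of the revised rules with Python's standard-library `urllib.robotparser` confirms the cached assets are now crawlable (note Python applies the first matching rule in file order, unlike Google's longest-match, so the `Allow` line is listed first and the redundant `Allow: /` is dropped in this sketch):

```python
from urllib import robotparser

# The revised rules, with the Allow rule listed first for Python's
# first-match parser; Google's longest-match gives the same answers here.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/themes/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Cached assets are no longer blocked...
print(rp.can_fetch("*", "https://example.com/wp-content/cache/minify/df983.js"))  # True
# ...but theme files (a hypothetical theme stylesheet here) still are,
# and theme CSS/JS is exactly the kind of rendering resource Google wants.
print(rp.can_fetch("*", "https://example.com/wp-content/themes/sometheme/style.css"))  # False
```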
Or should I leave all the Disallows out?
-
Hey there --
Blocking resources with the robots.txt file prevents search engines from crawling that content; if your goal is to keep content out of the index, a noindex directive is better suited.
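For completeness, the noindex approaches look like this. The X-Robots-Tag header (shown here for Apache via .htaccess, assuming mod_headers is enabled) is the one that works for non-HTML resources such as CSS and JS files, which can't carry a meta tag:

```
<!-- In the <head> of an HTML page that may be crawled but should not be indexed -->
<meta name="robots" content="noindex">

# .htaccess: noindex for non-HTML files via an HTTP header (requires mod_headers)
<FilesMatch "\.(js|css)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Keep in mind a noindex directive is only seen if the URL isn't also disallowed in robots.txt, since the crawler has to fetch the resource to read it.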
Older best practice was to block access to the /wp-includes/ and /wp-content/ directories and the like, but that's no longer necessary.
Today, Google will fetch all your styling and JavaScript files so they can render your pages completely. Search engines now try to understand your page's layout and presentation as a key part of how they evaluate quality.
So, yes, blocking those resources might have some impact on your SEO.
Also, if you're using a plugin to cache content, you want Google to be able to crawl that cached content. And in my experience, Googlebot does a good job of not indexing /wp-content/ sections anyway.
So your example file, https://example.com/wp-content/cache/minify/df983.js, shouldn't end up in the index.
Hope this helps some.