Indexed, though blocked by robots.txt: do we need to do anything?
-
Hi,
We have intentionally blocked some website files that had been indexed for years. Now we are receiving the message "Indexed, though blocked by robots.txt" in GSC. As far as I know we can ignore this? Is any action required here? We thought about blocking them with noindex meta tags, but these are PDF files.
Thanks
-
Hi there!
What Google is telling you is that URLs you probably don't want indexed are still being indexed, or, the other way around, that important pages are blocked from crawling but are being indexed anyway for other reasons.
If I may ask, why did you block those files through robots.txt?
The two most likely answers are:
1- You wanted to remove those files from search results. If this is your case, you've only solved part of the problem. What you should have done is first, while still allowing robots to crawl those URLs, apply noindex rules (keep in mind that noindex can be set in the HTTP header, since non-HTML files can't carry a meta robots tag), and then, after enough time for them to be recrawled and dropped, block them in robots.txt. See the sketch after this list.
2- You wanted to optimize how Googlebot spends its crawl time (crawl budget). If this is your case, then you've done it correctly and there is nothing to worry about.
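For PDFs this has to happen at the web server level, since there is no HTML to put a meta tag in. A minimal sketch of the idea, assuming an Apache server with mod_headers enabled (the directives would differ on nginx or IIS, and the /pdfs/ path below is just a placeholder for wherever the files actually live):

# .htaccess (or the site's Apache config): send a noindex header with every PDF,
# so Googlebot sees it on the next crawl and drops the files from the index.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

Only once the PDFs have been recrawled and have dropped out of the index should the robots.txt block go back in, for example:

User-agent: *
Disallow: /pdfs/

If robots.txt blocks the files first, Googlebot never fetches them, never sees the noindex header, and the "Indexed, though blocked by robots.txt" report stays.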
Hope this helps. Best of luck.
GR
Related Questions
-
The evolution of Google's 'Quality' filters - Do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO... I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On this basis I'd expect that the filters are now less heavy-handed and recognise that product pages with little or no product description are still a quality user experience for people who want to buy that product. Therefore my question is this: do thin product pages still need noindex, given that more often than not they are a quality search result for those using a product-specific search query? Has anyone experienced a penalty recently (in the last 12 months) on an ecommerce site because of a high number of thin product pages?
Algorithm Updates | QubaSEO
-
Indexing of Search Pages
I have a question about indexing the search pages of an ecommerce site, or any website. I read that Google doesn't recommend this and that sites shouldn't allow indexing of their search pages. I recently attended an SEO event (BrightonSEO) and one of the talks was on search pages and how big players like eBay and Amazon do index their search pages; in fact, they are a core part of the pages that are indexed. eBay has to do it, as their product pages are on a time frame, and Amazon only allows certain category search pages to be indexed. Reviewing my competitors, they are indexing search pages and this is why they have thousands, even millions, of web pages indexed. What are your thoughts? I thought search pages were too dynamic (URL strings) and that they wouldn't have a unique page title, meta description or rich content to act as a well-optimised page. Am I missing a trick here? Cyto
Algorithm Updates | Bio-RadAbs
-
Google Index
Hi all, I just submitted my URL and linked pages, along with the XML sitemap, for indexing. How long does it take Google to index my new pages?
Algorithm Updates | businessowner
-
Is it still a rule that Google will only index pages up to three tiers deep? Or has this changed?
I haven't looked into this in a while; it used to be that you didn't want to bury pages more than three clicks from the main page. What is the rule now for getting deep pages indexed?
Algorithm Updates | seoessentials
-
Struggling with Google Bot Blocks - Please help!
I own a site called www.wheretobuybeauty.com.au. After months and months we still have a serious issue: all pages have blocked URLs according to Google Webmaster Tools, and the 404 errors are returning a 200 header code according to the email below. Do you agree that the 404.php code should be changed? Can you do that please?
The current state: Google Webmaster Tools Index Status shows 26,000 pages indexed and 44,000 pages blocked by robots. In late March, we implemented a change recommended by an SEO expert; he provided a new robots.txt file and advised that we should amend sitemap.xml, among other changes. We implemented those changes and then set up a re-index of the site by Google. The number of blocked URLs eventually dropped to 1,000 for a few days in May and June, but now the problem has rapidly returned. A Google search on www.google.com.au for the query 'site:wheretobuybeauty.com.au' returns 37,000 pages. The site has been re-crawled over the last 4 weeks.
About the site: this is a Linux PHP site with 55,000 URLs in sitemap.xml, submitted successfully to Webmaster Tools. The robots.txt file has been modified several times: at first we had none, then we created one, but we were advised that it needed to have this current content:
User-agent: *
Disallow:
Sitemap: http://www.wheretobuybeauty.com.au/sitemap.xml
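If the 404.php is returning a 200 status, the usual fix is to send the status code explicitly before any output. A minimal sketch of that idea, assuming a plain PHP setup with no framework in front of it (the site's actual 404.php isn't shown in the question):

<?php
// 404.php: send a real 404 status before any output is produced,
// otherwise PHP defaults to 200 and Google treats the page as a soft 404.
http_response_code(404);
?>
<!DOCTYPE html>
<html>
<head><title>Page not found</title></head>
<body>
  <h1>404 - Page not found</h1>
  <p>Sorry, the page you requested does not exist.</p>
</body>
</html>

You can check the result with curl -I against a URL that should not exist; if it still reports 200, something in front of PHP (a front controller, proxy, or CDN) is overriding the status.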
Algorithm Updates | socialgrowth
-
Local search ranking tips needed
Hi there, I've been working on my client's website for a while now. About a month ago I created a Google local business listing for him. I was wondering if there are any new tips to get his business up the rankings in local search? I've researched and only really found information relevant to the old way Google displayed local search.
Algorithm Updates | SeoSheikh
-
Google Shopping Blocking All Vitamins and Natural Products - Glitch or Deliberate Censorship?
Hi everyone. We have a client that manufactures and supplies dietary supplements all around the world. We are slightly concerned that a recent Google Shopping glitch (or change) is now seemingly excluding products from the shopping search results. This currently appears only to be happening in the US, but we are really concerned, as our client ships all over the world and the potential loss of revenue could be quite large. There is a YouTube video that demonstrates what is going on, available here: http://www.youtube.com/watch?v=zNDyS0tF4dY Just to clarify, these are products that should not be included in any of Google's "sensitive" categories as they currently stand. Taking Vitamin B12 as an example, it is recognised as a permissible dietary supplement within pretty much every regulatory framework around the world, including those governed by the US FDA, the European Commission and the Australian TGA. Therefore there would be no legal reason to prevent its inclusion in shopping results in any country. Has this just slipped under the radar, or can anyone point us to a resource that can clarify why this has happened? Thanks in advance guys!
Algorithm Updates | AduroLabs