800,000 pages blocked by robots...
-
We made some modifications to our robots.txt file, adding in many PHP and HTML pages that should not have been indexed.
Well, I'm not sure what happened, or whether there was some type of dynamic conflict between our CMS and one of these pages, but a few weeks later we checked Webmaster Tools and, to our great surprise and dismay, the number of pages blocked by robots.txt was up to about 800,000 out of the 900,000 or so we have indexed.
1. So, the first question is: has anyone experienced this before? I removed the files from robots.txt and the number of blocked pages is still climbing. I changed the robots.txt file on the 27th; it is now the 29th and the new robots.txt file has been downloaded, but the blocked page count has kept rising in spite of it.
2. I understand that even if a page is blocked by robots.txt, it can still show up in the index, but does anyone know how blocking affects ranking? i.e., while it might still show up even though it has been blocked, will Google show it at a lower rank because it was blocked by robots.txt?
Our current robots.txt just says:
User-agent: *
Disallow:
Sitemap: oursitemap
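As a sanity check, here is a small sketch using Python's built-in urllib.robotparser to confirm how a standards-compliant parser reads a file like this (the domain and URLs are placeholders, not our real site):

# Sanity check (placeholder URLs): how a standards-compliant parser reads a
# robots.txt whose Disallow directive is empty.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty "Disallow:" should allow everything for every user agent.
for url in ["http://www.example.com/", "http://www.example.com/some-page.html"]:
    print(url, "->", "allowed" if parser.can_fetch("Googlebot", url) else "blocked")

Both URLs come back as allowed, which is what we expect the live file to do as well.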
Any thoughts?
Thanks!
Craig
-
Hey Matt,
Thanks for taking the time to answer!
Well, the good news is, this caused us to find some issues with our sitemaps that we have now fixed and we might not have found them if this hadn't happened.
According to Webmaster Tools, however, we are still at 937,000 pages blocked... I don't know if they are actually blocked or not. Now that we have re-submitted our sitemap, hopefully we will start to see this change soon.
Thanks for the load time info. Yeah, we are aware that we could speed things up, and we are always working on it.
Hopefully we will start seeing that number go down very soon...
Thanks!
Craig
-
Hi Craig
Sorry for taking a while to come back to you, but I have been very busy. However, I have a couple of questions:
When you first noticed the blocked pages, had you made any changes to the site at that time, and if so, what were they?
Have you done anything that could have slowed your site down? Running your homepage through a page load speed test, I noticed that it took over 4 seconds, which isn't very quick.
I once had an issue with lots of pages being de-indexed when we made an update to our site's template and the load time increased drastically. Even if this isn't the cause of the issue, looking at optimizing your load time will help increase the speed at which your site is re-crawled. Google will be able to get through more pages in the time it allocates to crawl your site on each visit if the site is smaller and loads quicker, hence speeding up your recovery.
There are lots of tools and information on optimizing load times - here is just one - http://www.webpagetest.org
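If you want a very rough first check on response time, here is a small sketch using only Python's standard library (the URL is a placeholder, and this only times the raw HTML response, not the full page render that tools like webpagetest.org measure):

# Rough timing of the raw HTML response only (placeholder URL) - full-render
# tools such as webpagetest.org give a much more complete picture.
import time
import urllib.request

url = "http://www.example.com/"  # placeholder - swap in your real homepage

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=30) as response:
    body = response.read()
elapsed = time.perf_counter() - start

print(f"Fetched {len(body)} bytes with status {response.status} in {elapsed:.2f}s")

If that number alone is already several seconds, the full render will be slower still once images, CSS and scripts are included.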
-
Hey Matt,
Just sent you a PM with our site details.
Yeah, we are on SEOMoz, but nothing is standing out there.
We are up to 941,364 pages blocked today. I thought I saw that it had gone down a tiny bit yesterday, but was mistaken.
Thanks for taking a look!
Craig
-
Hi Craig - you are right that the directive without a slash means "allow everything". I was just trying to figure out how you could have caused this issue, because Google doesn't appear to be following your directive to crawl everything, which is why I asked about the layout. Have you tested your robots.txt in Google Webmaster Tools?
What is the location of your robots.txt?
What does your index status say in Google Webmaster Tools?
You can also just create an empty robots.txt file, which will allow everything as well, or:
User-agent: *
Allow: /
I take it that you have this website set up as a campaign in SEOMoz - has it identified any relevant issues in the latest crawl?
Would you share your web address with us or even private message me with it so I can have a look for clues as this is very interesting!
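On the location question above, it is worth confirming exactly what is being served at /robots.txt. Here is a small sketch using Python's standard library (the domain is a placeholder) that prints the HTTP status and each line, so you can see whether the directives really are on separate lines:

# Fetch the live robots.txt (placeholder domain) and print its status code
# and each line, to confirm it sits at the site root and that every
# directive is on its own line.
import urllib.request

url = "http://www.example.com/robots.txt"  # placeholder - use your real domain

with urllib.request.urlopen(url, timeout=30) as response:
    status = response.status
    text = response.read().decode("utf-8", errors="replace")

print("HTTP status:", status)
for number, line in enumerate(text.splitlines(), start=1):
    print(f"{number:>3}: {line!r}")

A 200 status and one directive per line is what you want to see; anything else would help explain Google behaving unexpectedly.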
-
Hi Matt,
Sorry for the confusion - I should have pasted that text as plain text so it wouldn't all end up on the same line.
I edited it as seen above. The user agent and disallow are on separate lines.
Today we are up to 940,000 blocked URLs in webmaster tools.
The reason I didn't delete robots.txt is that I read a suggestion that if you delete it, Google will think there was a problem accessing it and continue relying on the former version for a period of time. Not sure how much truth there is in that, but it seemed to make enough sense not to delete the file and just modify it correctly instead.
Are you saying that my command above is disallowing all? From the research I have done, you have to have a slash at the end to disallow all, i.e. Disallow: /. Having Disallow: with nothing after it is supposed to allow all, which is the goal.
From robotstxt.org:
To allow all robots complete access
User-agent: *
Disallow:
Strangely, we haven't noticed an enormous traffic drop. However, this happened right at the time that we fixed some other issues that should have caused a significant improvement, so it could just be that no positive impact has been felt and things just remained the same.
Ultimately, the fact that the blocked page count keeps rising is worrisome, or suggests that there is a bug in Google's system.
Thanks!
Craig
-
I have experienced something similar: after a site redesign, the test version was put live with a robots.txt disallowing everything. My site was de-indexed quickly, because when you block pages with robots.txt their page content won't be indexed, so it won't appear in the search results. Google may still index URLs that are disallowed if they are linked from another page online, but they will rank lower because the page content is being ignored.
I would remove your robots.txt above, as it appears to be disallowing everything, even though that command should allow everything - and there is no point in having a robots file that allows all, since that is the default behaviour without one. (To block everything you would normally use Disallow: /.) Then I would resubmit an updated sitemap in Google Webmaster Tools and you should see your pages start to be indexed again. If you don't have a sitemap, you can just wait for Google to re-crawl your site. I would also check your homepage source code to make sure there isn't a robots meta tag turned on by accident saying noindex, nofollow - I have seen this happen by accident with CMSes before.
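For that last check, here is a rough way to spot an accidental robots meta tag from the command line - just a sketch using Python's standard library and a simple regex rather than a proper HTML parser, with a placeholder URL:

# Quick-and-dirty check (placeholder URL) for a robots meta tag such as
# <meta name="robots" content="noindex, nofollow"> in the homepage source.
# A real HTML parser would be more robust than a regex, but this is enough
# to spot an accidental noindex left on by a CMS.
import re
import urllib.request

url = "http://www.example.com/"  # placeholder - use your real homepage

with urllib.request.urlopen(url, timeout=30) as response:
    html = response.read().decode("utf-8", errors="replace")

tags = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, flags=re.IGNORECASE)
if tags:
    print("Found robots meta tag(s):")
    for tag in tags:
        print(" ", tag)
else:
    print("No robots meta tag found in the homepage source.")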
Have a look here at exactly how Google handles robots.txt - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
One quick question - have you laid out your file exactly as above, with the user agent and disallow on the same line? That might be what is causing the issue. I haven't tested it, but the standard is to have them on separate lines.