800,000 pages blocked by robots...
-
We made some modifications to our robots.txt file, adding many PHP and HTML pages that should not have been indexed.
Well, not sure what happened, or whether there was some kind of dynamic conflict between our CMS and one of those pages, but a few weeks later we checked Webmaster Tools and, to our great surprise and dismay, the number of pages blocked by robots.txt was up to about 800,000 out of the 900,000 or so we have indexed.
1. First question: has anyone experienced this before? I removed the entries from robots.txt, but the number of blocked pages has kept climbing. I changed the robots.txt file on the 27th; it is now the 29th, and although the new robots.txt has been downloaded, the blocked-page count keeps rising in spite of it.
2. I understand that even if a page is blocked by robots.txt, it can still show up in the index, but does anyone know how being blocked affects ranking? That is, will Google rank a page lower because it was blocked by robots.txt?
Our current robots.txt just says:
User-agent: *
Disallow:
Sitemap: oursitemap
Any thoughts?
Thanks!
Craig
-
Hey Matt,
Thanks for taking the time to answer!
Well, the good news is, this caused us to find some issues with our sitemaps that we have now fixed and we might not have found them if this hadn't happened.
According to Webmaster Tools, however, we are still at 937,000 pages blocked... I don't know whether they are actually blocked or not. Now that we have re-submitted our sitemap, hopefully we will start to see this change soon.
Thanks for the load time info. Yeah, we are aware that we could speed things up. We are always trying to do that more and more.
Hopefully we will start seeing that number go down very soon...
Thanks!
Craig
-
Hi Craig
Sorry for taking a while to come back to you, but I have been very busy. I do have a couple of questions -
When you first noticed the blocked pages had you made any changes to the site at that time and if so what were they?
Have you done anything that could have slowed your site down? Running your homepage through a page-speed test, I noticed it took over 4 seconds to load, which isn't very quick.
I once had an issue where lots of pages were de-indexed after we updated our site's template and the load time increased drastically. Even if this isn't the cause of your issue, optimizing your load time will increase the speed at which your site is re-crawled: Google can get through more pages in the time it allocates to crawl your site on each visit if pages are smaller and load quicker, which speeds up your recovery.
There are lots of tools and information on optimizing load times - here is just one - http://www.webpagetest.org
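Alongside the tools above, you can get a rough baseline from the command line. A minimal sketch using only Python's standard library - note it measures only the raw HTML response, not images, CSS, or scripts, so it is a lower bound on real page-load time; the data: URL is just a stand-in so the snippet runs offline:

```python
import time
import urllib.request

def measure_load(url: str, timeout: float = 10.0) -> float:
    """Return the seconds taken to download the raw HTML of `url`.

    This covers only the initial HTML response, not the assets the
    page goes on to load, so treat it as a lower bound.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # drain the body so the full transfer is timed
    return time.monotonic() - start

# Demo with an inline data: URL so the sketch runs without a network;
# in practice you would pass your homepage, e.g.
#   measure_load("http://www.example.com/")
print(f"{measure_load('data:text/html,hello'):.3f}s")
```

Running it a few times against your homepage and averaging gives a more stable number than a single fetch.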
-
Hey Matt,
Just sent you a PM with our site details.
Yeah, we are on SEOMoz, but nothing standing out there.
We are up to 941,364 pages blocked today. I thought I saw that it had gone down a tiny bit yesterday, but was mistaken.
Thanks for taking a look!
Craig
-
Hi Craig - you are right that the directive without a slash means allow everything. I was just trying to figure out how you could have caused this issue, because Google doesn't appear to be following your directive to crawl everything - hence my question about the layout. Have you tested your robots.txt in Google Webmaster Tools?
What is the location of your robots.txt?
What does your index status say in Google Webmaster Tools?
You can also just create an empty robots.txt file, which will allow all as well, or:
User-agent: *
Allow: /
I take it that you have this website set up as a campaign in SEOMoz - has it identified any relevant issues in the latest crawl?
Would you share your web address with us, or even private message me with it, so I can have a look for clues? This is very interesting!
-
Hi Matt,
Sorry for the confusion, I should have pasted that text as plain text so it wouldn't end up on the same line.
I edited it as seen above. The user agent and disallow are on separate lines.
Today we are up to 940,000 blocked URLs in webmaster tools.
The reason I didn't delete robots.txt is that I read a suggestion that if you delete it, Google will think there was a problem accessing it and continue relying on the former version for a period of time. Not sure how much truth there is in that, but it seemed to make enough sense not to delete the file and instead just modify it correctly.
Are you saying that my directive above is disallowing all? From the research I have done, you have to have a slash after the colon to disallow all, i.e. Disallow: /. Having Disallow: with nothing after it is supposed to allow all, which is the goal.
From robotstxt.org:
To allow all robots complete access
User-agent: *
Disallow:

Strangely, we haven't noticed an enormous traffic drop. However, this happened right at the time we fixed some other issues that should have caused a significant improvement, so it could just be that no positive impact has been felt and things have remained the same.
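For what it's worth, the robotstxt.org reading can be checked locally with Python's built-in robots.txt parser. A quick sketch, with a placeholder URL, contrasting the empty-value and slash forms of Disallow:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_lines, url, agent="*"):
    """Parse robots.txt lines and report whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(agent, url)

# "Disallow:" with an empty value allows everything...
print(allowed(["User-agent: *", "Disallow:"],
              "http://example.com/page.html"))   # True
# ...while "Disallow: /" blocks everything.
print(allowed(["User-agent: *", "Disallow: /"],
              "http://example.com/page.html"))   # False
```

That agrees with the standard as quoted: the empty directive grants complete access.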
Ultimately, the fact that the blocked-page count keeps rising is worrisome - or it suggests there is a bug in Google's system.
Thanks!
Craig
-
I have experienced something similar: after a site redesign, the test version was put live with robots.txt disallowing all, and my site was de-indexed quickly. When you block pages with robots.txt, their content won't be indexed, so they won't appear in the search results. Google may still index URLs that are disallowed if they are linked from another page online, but they will rank lower because the page content is ignored.
I would remove your robots.txt above, as it appears to be disallowing all, even though that command should allow all. There is little point in a robots file that allows everything anyway, since that is the default behavior without one (to block everything you would usually use Disallow: /). Then resubmit an updated sitemap in Google Webmaster Tools and you should see your pages start to be indexed again. If you don't have a sitemap, you can just wait for Google to re-crawl your site.
I would also check your homepage source code to make sure there isn't a robots meta tag accidentally set to noindex, nofollow - I have seen CMSs do this by accident before.
Have a look here for exactly how Google handles robots.txt - http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
One quick question - have you laid out your file exactly as above, with the user-agent and disallow on the same line? That might be what is causing the issue. I haven't tested it, but the standard is to have them on separate lines.
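The stray robots meta tag check above can be scripted too. A minimal sketch using only the standard library, run here against an inline HTML snippet rather than a live site - in practice you would feed it your downloaded homepage HTML:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append(d.get("content", ""))

# Inline snippet showing the problem case a CMS can introduce by accident.
finder = RobotsMetaFinder()
finder.feed('<html><head>'
            '<meta name="robots" content="noindex, nofollow">'
            '</head></html>')
print(finder.directives)  # ['noindex, nofollow']
```

An empty `directives` list means no robots meta tag is present, which is what you want on pages that should be indexed.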