Recovering from Blocked Pages Debacle
-
Hi, per this thread: http://www.seomoz.org/q/800-000-pages-blocked-by-robots

We had a huge number of pages blocked by robots.txt by some dynamic file that must have integrated with our CMS somehow. In just a few weeks, hundreds of thousands of pages were "blocked." This number is now going down, but instead of by the hundreds of thousands, it is going down by the hundreds, and very sloooooowwwwllly. So we really need to speed up this process.

We have our sitemap, which we will re-submit, but I have a few questions related to it. Previously the sitemap had the <lastmod> tag set to the original date of each page, and all of these pages have been changed since then. Is there any harm in doing a mass change of the <lastmod> field? It would be an accurate reflection, but I don't want it to be caught by some spam catcher. The easy thing to do would be to just set that date to now, but then they would all have the same date.

Any other tips on how to get these pages "unblocked" faster?

Thanks! Craig
-
Hey Dan,
I am actually not so concerned about the pages being indexed. I don't really think they were ever de-indexed. Unless I am wrong, I think they were de-ranked.
I know others have said that when they "disallowed" large portions of their sites, their pages dropped in the rankings, and did not necessarily disappear. This is more what I want to see recovery from.
Thanks!
Craig
-
Craig
D'you have Screaming Frog? The BEST way to make sure you're all set is to run a crawl with Screaming Frog. By default it will obey robots.txt and not crawl anything that's blocked. Set the user agent to Googlebot.
If it crawls all the pages you want it to just fine, then you are all set!
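If you want a quick spot-check without a full crawl, Python's built-in robotparser can test a handful of URLs against your live robots.txt the same way a crawler would. Just a rough sketch - the domain and sample URLs below are placeholders for your own:

```python
# Quick spot-check: does robots.txt allow Googlebot to fetch these URLs?
# The domain and sample URLs are placeholders - substitute your own.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

sample_urls = [
    "https://www.example.com/",
    "https://www.example.com/some-category/some-page",
]
for url in sample_urls:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```

If everything prints "allowed", the directives really are gone and you're just waiting on Google to recrawl.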
-Dan
-
Thanks for jumping in, Dan. The number of blocked pages, over a month later, is still way up there. It really has barely gone down. As of today it is at 904,000.
So, we still wait and hope that:
A. That many pages aren't actually blocked (whatever "blocked" actually means).
B. The rate at which that number falls will begin to increase.
Thanks for your answer!
Craig
-
Hey There
I see this question is a bit old ... are you still having these issues? If so, when you say "going down," do you mean according to the numbers showing in Webmaster Tools?
I do know that quite often there can be a delay in the data in Webmaster Tools (especially the indexation report, which you may be referring to).
I don't think there's any harm in updating the dates to reflect the most recent version of the page, so long as they are accurate.
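If you do end up scripting that mass update, pull each page's real modification date rather than stamping everything with today's. Here's a rough sketch of the idea in Python - "sitemap.xml" is an assumed path and get_real_lastmod() is a hypothetical helper you'd wire into your CMS:

```python
# Rough sketch: rewrite each <lastmod> in a sitemap with the page's real
# last-modified date. "sitemap.xml" is an assumed path; get_real_lastmod()
# is a hypothetical hook into the CMS.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

def get_real_lastmod(url: str) -> str:
    """Hypothetical: return the page's true last-modified date from the
    CMS as a W3C date string, e.g. '2012-06-14'."""
    raise NotImplementedError

tree = ET.parse("sitemap.xml")
for url_el in tree.getroot().findall(f"{{{NS}}}url"):
    loc = url_el.find(f"{{{NS}}}loc").text
    lastmod = url_el.find(f"{{{NS}}}lastmod")
    if lastmod is None:  # add the tag if this <url> entry lacks one
        lastmod = ET.SubElement(url_el, f"{{{NS}}}lastmod")
    lastmod.text = get_real_lastmod(loc)

tree.write("sitemap-updated.xml", xml_declaration=True, encoding="utf-8")
```

That keeps every date accurate and avoids the wall of identical dates you were worried a spam filter might flag.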
Let me know if that helps or if you're all set.
-Dan
Related Questions
-
Should I deindex my pages?
I recently changed the URLs on a website to make them tidier and easier to follow. I put 301s in place to direct all the previous page names to the new ones. However, I didn't read Moz's guide, which says I should leave the old sitemap online for a few weeks afterwards. As a result, Webmaster Tools is showing duplicate page titles (which means duplicate pages) for the old versions of the pages I have renamed. Since the old versions are no longer on the sitemap, Google can no longer access them to find the 301s I have put in place. Is this a problem that will fix itself over time, or is there a way to speed up the process? I could use Webmaster Tools to remove these old URLs, but I'm not sure if this is recommended. Alternatively, I could try to recreate the old sitemap, but this would take a lot of time.
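Recreating the old sitemap doesn't have to be done by hand. If the old URLs can be recovered from server logs or the previous sitemap file, a few lines of script can emit a bare-bones temporary sitemap so Google recrawls the old addresses and discovers the 301s. A minimal sketch, with a placeholder URL list:

```python
# Minimal sketch: emit a bare-bones sitemap of the OLD URLs so Google
# recrawls them and finds the 301s. The URL list is a placeholder; in
# practice it would come from server logs or the previous sitemap.
old_urls = [
    "https://www.example.com/old-page-name",
    "https://www.example.com/another-old-page",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
lines += [f"  <url><loc>{u}</loc></url>" for u in old_urls]
lines.append("</urlset>")

with open("sitemap-old-urls.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```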
Technical SEO | | maxweb0 -
Crawl Test Report only shows home page and no inner site pages?
Hi, My site is [removed] When I first tried to set up a new campaign for the site, I received the error: Roger has detected a problem: We have detected that the root domain [removed] does not respond to web requests. Using this domain, we will be unable to crawl your site or present accurate SERP information. I then ran a Crawl Test per the FAQ. The SEOmoz crawl report only shows my home page URL and does not have any inner site pages. This is a Joomla site. What is the problem? Thanks! Dave
Technical SEO | | crave810 -
Duplicate page content
Hello, The Pro dashboard crawler bot thing that you get here reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks Ian
Technical SEO | | jwdl0 -
Is content important on the home page?
Hi, I am working on a site at the moment, www.in2town.co.uk, and I am trying to decide whether, in the second column of my site where it says UK News, I should keep it the way it is and have content under the picture, or get rid of the content under the picture and just have the main title. I want to know if the content under the picture is important for Google and for the reader, or whether it would be better just to have the title, which is an h2. Any help would be great.
Technical SEO | | ClaireH-1848860 -
How many pages should my site have?
Right now I think I only have 36. What is a good amount of pages to have? Any ideas on ways to add relevant pages to my site? I was thinking about starting a message board. Also, I have a free tech support chat room, and was thinking about posting the logs somewhere on the site. Does that sound like a good idea? Thanks.
Technical SEO | | eugenecomputergeeks0 -
How do I know which page a link is from
I've got an interesting situation. I hope you can help. I have a list of links but I'm not sure which pages of my site they are from. How do I know which page a specific link is from? Thanks in advance.
Technical SEO | | VinceWicks0 -
If you only want your home page to rank, can you use rel="canonical" on all your other pages?
If you have a lot of pages with 1 or 2 inbound links, what would be the effect of using rel="canonical" to point all those pages to the home page? Would it boost the rankings of the home page? As I understand it, your long-tail keyword traffic would start landing on the home page instead of finding what it was looking for. That would be bad, but might be worth it.
Technical SEO | | watchcases0 -
Discrepancy between # of pages and # of pages indexed
Here is some background: 1) The site in question has approximately 10,000 pages, and Google Webmaster Tools shows that 10,000 URLs (pages) were submitted. 2) Only 5,500 pages appear in the Google index. 3) Webmaster Tools shows that approximately 200 pages could not be crawled for various reasons. 4) SEOmoz shows about 1,000 pages that have long URLs or page titles (which we are correcting). 5) No other errors are being reported in either Webmaster Tools or SEOmoz. 6) This is a new site launched six weeks ago. Within two weeks of launching, Google had indexed all 10,000 pages and showed 9,800 in the index, but over the last few weeks the number of pages in the index kept dropping until it reached 5,500, where it has been stable for two weeks. Any ideas of what the issue might be? Also, is there a way to download all of the pages that are included in that index, as this might help troubleshoot?
Technical SEO | | Mont0