Recovering from Blocked Pages Debacle
-
Hi, per this thread: http://www.seomoz.org/q/800-000-pages-blocked-by-robots
We had a huge number of pages blocked by robots.txt by some dynamic file that must have integrated with our CMS somehow. In just a few weeks, hundreds of thousands of pages were "blocked." That number is now going down, but by the hundreds rather than the hundreds of thousands, and very sloooooowwwwllly. So we really need to speed this process up.
We have our sitemap, which we will re-submit, but I have a few questions related to it: previously the sitemap had the <lastmod> tag set to the original date of each page, and all of these pages have been changed since then. Is there any harm in doing a mass change of the <lastmod> field? It would be an accurate reflection, but I don't want it to be caught by some spam catcher. The easy thing to do would be to just set that date to now, but then they would all have the same date.
Any other tips on how to get these pages "unblocked" faster?
Thanks!
Craig
-
Hey Dan,
I am actually not so concerned about the pages being indexed. I don't really think they were ever de-indexed. Unless I am wrong, I think they were de-ranked.
I know others have said that when they "disallowed" large portions of their sites, their pages dropped in the rankings, and did not necessarily disappear. This is more what I want to see recovery from.
Thanks!
Craig
-
Craig
D'you have Screaming Frog? The BEST way to make sure you're all set is to run a crawl with Screaming Frog. By default it will respect robots.txt and not crawl anything being blocked. Set the user agent to Googlebot.
If it crawls all the pages you want it to just fine, then you are all set!
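If you'd rather script a quick spot-check as well, here's a minimal sketch using Python's built-in robotparser (the domain and URLs below are placeholders for your own):

```python
# Minimal sketch: check specific URLs against a live robots.txt the way
# Googlebot would. Replace the domain and URL list with your own.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

urls_to_check = [
    "https://www.example.com/products/widget-1.htm",
    "https://www.example.com/blog/some-post.htm",
]

for url in urls_to_check:
    # can_fetch() applies the same Disallow rules a crawler sees
    allowed = parser.can_fetch("Googlebot", url)
    print(("ALLOWED: " if allowed else "BLOCKED: ") + url)
```

Any URL that prints BLOCKED is one Google won't crawl under your current robots.txt.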
-Dan
-
Thanks for jumping in, Dan. The number of blocked pages, over a month later, is still way up there. It really has barely gone down. As of today it is at 904,000.
So, we still wait and hope that:
A. That many pages aren't actually blocked (whatever "blocked" actually means).
B. The rate at which that number falls will begin to increase.
Thanks for your answer!
Craig
-
Hey There
I see this question is a bit old ... are you still having these issues? If so, when you say "going down," do you mean according to the numbers showing in Webmaster Tools?
I do know that quite often there can be a delay in the data in Webmaster Tools (especially in the indexation report, which you may be referring to).
I don't think there's any harm in updating the dates to reflect the most recent version of the page, so long as they are accurate.
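If you do end up scripting that mass update, here's a rough sketch using Python's standard library (the file name is a placeholder, and ideally you'd pull each page's real modification date from your CMS rather than hardcode one as I do here):

```python
# Rough sketch: rewrite every <lastmod> in a standard sitemaps.org XML file.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keeps the output free of ns0: prefixes

tree = ET.parse("sitemap.xml")  # placeholder path

for url in tree.getroot().findall(f"{{{NS}}}url"):
    lastmod = url.find(f"{{{NS}}}lastmod")
    if lastmod is not None:
        # Set each page's real modification date (e.g. from your CMS)
        # instead of one blanket date; hardcoded here for illustration.
        lastmod.text = "2013-02-18"

tree.write("sitemap.xml", xml_declaration=True, encoding="UTF-8")
```

Accurate per-page dates also sidestep the everything-has-the-same-date worry from the original question.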
Let me know if that helps or if you're all set.
-Dan
Related Questions
-
Linking Pages - 404s
Hello, I have noticed that we have recently managed to accrue a large number of 404s that are listed as Page Title/URL of Linking Page in Moz (e.g. http://www.onexamination.com/international/), but I do not know which site they are coming from. Is there an easy way to find out, or shall we just create redirects for them all? Thanks in advance for your help. Rose
Technical SEO | bmjcai
-
Adding directories to robots disallow causes pages to have Blocked Resources
In order to eliminate duplicate/missing title tag errors for a directory (and sub-directories) under www that contains our third-party chat scripts, I added the parent directory to the robots disallow list. We are now receiving a blocked resource error (in Webmaster Tools) on all of the pages that have a link to a JavaScript file (for live chat) in the parent directory. My host is suggesting that the warning is only a notice and we can leave things as is without worrying about the pages being de-ranked/penalized. I am wondering if this is true, or if we should remove the one directory that contains the JS from the robots file and find another way to resolve the duplicate title tags?
Technical SEO | miamiman100
-
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages, coming up with a 'zero' word count in the Screaming Frog report. I need to know if they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz
-
Local City Pages
Anyone have any input on the tactics being used for a national company trying to target local city pages? For instance, you might be a national printing company trying to compete against local printers by creating a specific page for each city + print keywords.
Technical SEO | waqid
-
How to know what pages are 301 redirecting to me?
Hi! It is easy to know if somebody is spam linking your website, e.g. by looking at Open Site Explorer to analyse the link profile. But is it possible to know if a competitor of mine is redirecting a bad domain to mine with a 301 redirect, thus transferring any bad SEO reputation to me? Best Regards, Daniel
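One way to check a suspect domain yourself is to follow its redirect chain and see where it lands; here's a minimal sketch with the Python requests library (the URL is a placeholder):

```python
# Minimal sketch: follow a suspect domain's redirect chain and see where
# it ends up. The URL below is a placeholder.
import requests

resp = requests.get("http://suspect-domain.example/", timeout=10)

for hop in resp.history:
    # each hop is a redirect response (301, 302, ...)
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))

print("Final destination:", resp.status_code, resp.url)
```

If the hops are 301s and the final destination is your domain, you've found a redirecting domain.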
Technical SEO | te_c
-
Should I delete a page or remove links on a penalized page?
Hello All, If I have an internal page that has low-quality links pointing to it, or a penalty, can I just remove the page and start over versus trying to remove the links? Over time, wouldn't this page disappear along with the penalty on that page? Kind of like pruning a tree: cutting off the junk limbs so others could grow stronger, or to start fresh new ones. Example: www.domain.com Penalized internal page (say this page is penalized due to keyword stuffing, and has low-quality links pointing to it like blog comments or profiles): www.domain.com/penalized-internal-page.com Would it be effective to just delete this page (www.domain.com/penalized-internal-page.com) and start over with a new page? New internal page: www.domain.com/new-internal-page.com I would of course lose any good links pointing to that page, but it might be easier than trying to remove the old backlinks. Thoughts? Thanks! Pete
Technical SEO | Juratovic
-
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I was making a sitemap for 1oxygen.com, a site that has around 50 pages. The sitemap generator came back with over 2,000 pages. Here are two of the results:
http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm
These are actually pages somehow. In my FTP, in the first /portableconcentrators/ folder, there are about 12 HTML documents and no other folders. It looks like it is creating a page for every possible folder combination. I have no idea why those pages above actually work. Help, please???
Technical SEO | chuck-layton
-
Page Title Not Displayed in SERPS
Why would a page title not be displayed in the SERPs? Everything appears to be formatted correctly in the code, yet the company name gets displayed instead of the page title. Any general idea why this could be happening?
Technical SEO | MichaelWeisbaum