New CMS system - 100,000 old URLs - use robots.txt to block?
-
Hello.
My website has recently switched to a new CMS system.
Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in a lot of URLs.
Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel='canonical'.
Using SEOmoz's tools and Google Webmaster Tools, I've been able to locate and redirect all the pertinent, PageRank-bearing "older" URLs to their new counterparts. However, according to the 'Not Found' report in Google Webmaster Tools, there are literally over 100,000 additional URLs out there that it's still trying to find.
My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently we allow everything, only using page-level robots meta tags to disallow where necessary.
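For context, here's roughly what I have in mind for robots.txt; the directory names below are just placeholders standing in for our old CMS paths, not the real ones:

    # Hypothetical sketch - block the directories left over from the retired CMS installs
    User-agent: *
    Disallow: /old-cms-2004/
    Disallow: /old-cms-2008/
    # Everything else stays crawlable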
Thanks!
-
Great stuff. Thanks again for your advice, much appreciated!
-
It can be really tough to gauge the impact. It depends on how suddenly the 404s popped up, how many you're seeing (Webmaster Tools, for both Google and Bing, is probably the best place to check), and how that number compares to your overall index. In most cases it's a temporary problem, and the engines will sort it out and de-index the 404'ed pages.
I'd just make sure that all of these 404s are intentional and none are valuable pages or occurring because of issues with the new CMS itself. It's easy to overlook something when you're talking about 100K pages, and it could be more than just a big chunk of 404s.
-
Thanks for the advice! The previous website did have a robots.txt file with a few wildcards declared. A lot of the URLs I'm seeing are NOT indexed anymore and haven't been for many years.
So, I think the 'stop the bleeding' method will work, and I'll just have to proceed with investigating and applying 301s as necessary.
Any idea what kind of impact this is having on our rankings? I've submitted a valid sitemap, crawl paths are good, and the major 301s are in place. We've been hit particularly hard in Bing.
Thanks!
-
I've honestly had mixed luck with using Robots.txt to block pages that have already been indexed. It tends to be unreliable at a large scale (good for prevention, poor for cures). I endorsed @Optimize, though, because if Robots.txt is your only option, it can help "stop the bleeding". Sometimes, you use the best you have.
It's a bit trickier with 404s ("Not Found"). Technically, there's nothing wrong with having 404s (it's a perfectly valid signal for SEO), but if you create 100,000 of them all at once, that can sometimes raise red flags with Google. Some kind of mass removal may prevent problems from Google crawling thousands of 'Not Founds' all at once.
If these pages are isolated in a folder, then you can use Google Webmaster Tools to remove the entire folder (after you block it). This is MUCH faster than Robots.txt alone, but you need to make sure everything in the folder can be dumped out of the index.
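For example, if everything old sits under one legacy folder (the path below is hypothetical), the robots.txt block you'd put in place before requesting the removal would look something like this:

    User-agent: *
    # Block the retired folder first, then request removal of the whole
    # directory via the Remove URLs tool in Google Webmaster Tools
    Disallow: /legacy-folder/

Once that block is live, submit /legacy-folder/ as a directory removal request; that clears the folder out of the index much faster than waiting on the block alone.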
-
Absolutely. 'Not Founds' and no-content pages are a concern. Taking care of them will help your rankings.
-
Thanks a lot! I should have been a little more specific. My exact question is: if I move the crawlers' attention away from these 'Not Found' pages, will that benefit the indexation of the now-valid pages? Are the 'Not Founds' really a concern? Will this help my indexation and/or rankings?
Thanks!
-
That's a loaded question without knowing exactly what you are doing, but let me offer this advice: stop the bleeding with robots.txt. This is the easiest way to quickly resolve that many 'Not Founds'.
Then you can slowly pick away at the issue and figure out whether some of those 'Not Founds' really do have content behind them and are just pointing to the wrong area.
On a recent project we had over 200,000 additional URLs coming up 'Not Found'. We stopped the bleeding, and then slowly over the course of a month, spending a couple of hours a week, we found another 5,000 pages of content that we 301 redirected correctly and removed from the robots.txt blocks as we went. A rough sketch of that kind of redirect is below.
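As a sketch only (Apache .htaccess here, with made-up paths; adapt to your own server and URL structure):

    # One-off 301s for recovered pages that map to a single new URL
    Redirect 301 /old-cms/some-article.html http://www.example.com/articles/some-article/

    # Or pattern-match a whole renamed section with mod_rewrite
    RewriteEngine On
    RewriteRule ^old-articles/(.*)$ http://www.example.com/articles/$1 [R=301,L]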
Good luck.