New CMS system - 100,000 old URLs - use robots.txt to block?
-
Hello.
My website has recently switched to a new CMS.
Over the last 10 years or so, we've used three different CMS platforms on our current domain. As expected, this has resulted in a lot of legacy URLs.
Until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques such as rel="canonical".
Using SEOmoz's tools and Google Webmaster Tools, I've been able to locate and redirect all of the pertinent, PageRank-bearing "older" URLs to their new counterparts. However, according to the Google Webmaster Tools 'Not Found' report, there are literally over 100,000 additional URLs out there that it's trying to find.
My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently we allow everything, using only page-level robots meta tags to disallow where necessary.
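For reference, this is roughly the kind of rule set I have in mind - the directory names are placeholders standing in for our real legacy paths - along with a quick way to sanity-check it using Python's built-in parser:

```python
import urllib.robotparser

# Hypothetical rules: "/old-cms/" and "/archive/" are placeholders for
# the real legacy directories; the real file would live at /robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /old-cms/
Disallow: /archive/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Old URLs should come back blocked; current pages should stay crawlable.
print(parser.can_fetch("*", "https://www.example.com/old-cms/page-123.html"))  # False
print(parser.can_fetch("*", "https://www.example.com/products/widget"))        # True
```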
Thanks!
-
Great stuff... thanks again for your advice, much appreciated!
-
It can be really tough to gauge the impact - it depends on how suddenly the 404s popped up, how many you're seeing (Webmaster Tools, for both Google and Bing, is probably the best place to check), and how that number compares to your overall index. In most cases it's a temporary problem, and the engines will sort it out and de-index the 404'ed pages.
I'd just make sure that all of these 404s are intentional and that none are valuable pages or the result of issues with the new CMS itself. It's easy to overlook something when you're talking about 100K pages, and it could be more than just a big chunk of 404s.
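If it helps, here's a minimal sketch of that sanity check. It assumes you've exported the 'Not Found' URLs to a plain text file (the filename is a placeholder, one URL per line) and that the requests library is available; it flags anything that isn't actually returning a 404:

```python
import requests

# Placeholder filename: export the 'Not Found' URLs from Webmaster Tools,
# one URL per line.
with open("not_found_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # HEAD keeps the check lightweight; skipping redirects means an
    # existing 301 shows up as a 301, not as its destination's status.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code != 404:
        # 200 suggests the page still exists (maybe a CMS issue);
        # 301/302 means a redirect is already in place.
        print(resp.status_code, url)
```

Anything that prints here is worth a closer look before you write it off as an intentional 404.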
-
Thanks for the advice! The previous website did have a robots.txt file with a few wildcards declared. A lot of the URLs I'm seeing are NOT indexed anymore and haven't been for many years.
So, I think the 'stop the bleeding' method will work, and I'll just have to proceed with investigating and applying 301s as necessary.
Any idea what kind of impact this is having on our rankings? I've submitted a valid sitemap, crawl paths are good, and the major 301s are in place. We've been hit particularly hard in Bing.
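For what it's worth, here's roughly how I've been spot-checking that the major 301s land where they should - the URL pairs below are made-up examples; the real mapping comes from our redirect rules:

```python
import requests

# Hypothetical old-to-new pairs; substitute real URLs from your redirect map.
REDIRECT_MAP = {
    "https://www.example.com/old-cms/about.html": "https://www.example.com/about/",
    "https://www.example.com/archive/news-42": "https://www.example.com/news/42/",
}

for old_url, expected in REDIRECT_MAP.items():
    # Don't follow the redirect: we want to see the 301 itself and its target.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    ok = resp.status_code == 301 and location == expected
    print("OK " if ok else "BAD", old_url, "->", resp.status_code, location)
```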
Thanks!
-
I've honestly had mixed luck with using robots.txt to block pages that have already been indexed. It tends to be unreliable at large scale (good for prevention, poor for cures). I endorsed @Optimize, though, because if robots.txt is your only option, it can help "stop the bleeding". Sometimes you use the best you have.
It's a bit trickier with 404s ("Not Found"). Technically, there's nothing wrong with having 404s (it's a perfectly valid signal for SEO), but if you create 100,000 all at once, that can sometimes raise red flags with Google. Some kind of mass removal may prevent problems that come from Google crawling thousands of Not Founds all at once.
If these pages are isolated in a folder, then you can use Google Webmaster Tools to remove the entire folder (after you block it). This is MUCH faster than robots.txt alone, but you need to make sure everything in the folder can be dumped out of the index.
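As a rough illustration of the folder case - assuming the dead pages all live under something like /old-cms/, which is a placeholder path - a single Disallow covers everything beneath it, however deeply nested, and you can confirm that before requesting the folder removal in Webmaster Tools:

```python
import urllib.robotparser

# "/old-cms/" is a placeholder for the folder you'd block and then remove.
parser = urllib.robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /old-cms/",
])

# The prefix match blocks every URL under the folder, at any depth.
for path in ("/old-cms/page.html", "/old-cms/deep/nested/page", "/new/page"):
    blocked = not parser.can_fetch("*", "https://www.example.com" + path)
    print(path, "blocked:", blocked)
```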
-
Absolutely. Not Founds and no-content pages are a concern, and moving the crawlers' attention away from them will help your indexation and rankings.
-
Thanks a lot! I should have been a little more specific. My exact question would be: if I move the crawlers' attention away from these 'Not Found' pages, will that benefit the indexation of the now-valid pages? Are the 'Not Founds' really a concern? Will this help my indexation and/or rankings?
Thanks!
-
It's a loaded question without knowing exactly what you're doing, but let me offer this advice: stop the bleeding with robots.txt. It's the easiest way to quickly resolve that many "Not Founds".
Then you can slowly pick away at the issue and figure out whether some of the "Not Founds" really have content and are simply being sent to the wrong place.
On a recent project we had over 200,000 additional URLs reported as "Not Found". We stopped the bleeding, and then slowly over the course of a month, spending a couple of hours a week, we found another 5,000 pages of real content, redirected them correctly, and removed them from the robots.txt blocks.
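A minimal sketch of how we chipped away at it - assuming you export the 'Not Found' URL list from Webmaster Tools to a text file (the filename is a placeholder) - was simply grouping the URLs by their top-level directory so the biggest buckets got investigated first:

```python
from collections import Counter
from urllib.parse import urlparse

# Placeholder filename: the exported 'Not Found' URL list, one per line.
with open("not_found_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Bucket by first path segment so legacy sections (e.g. "/old-cms/")
# surface as the largest groups.
buckets = Counter()
for url in urls:
    segments = [s for s in urlparse(url).path.split("/") if s]
    buckets["/" + segments[0] + "/" if segments else "/"] += 1

for folder, count in buckets.most_common(10):
    print(f"{count:>7}  {folder}")
```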
Good luck.