Best course of action when removing hundreds of pages from your site?
-
We had a Legal News section on our site (we are a law firm). All we did there was rehash news stories from news sites (no original content). We decided to remove the entire Legal News section, which left us with close to 800 404s. Around the same time our rankings seemed to drop. Our webmaster implemented 301s to closely related content on our blog, and in about a week's time our rankings went back up. Our webmaster then advised us to submit each URL to Google for removal, which we did. It's been about three weeks, and our Not Found errors in WMT are over 800 and seem to be increasing daily. Moz's crawler says we have only 35 404s, and they are from our blog, not the Legal News section we removed. The last thing we want is another rankings drop.
Is this normal? What is the best course of action when removing hundreds of pages from your site?
-
Google always takes a while to update. Check a few of the 404 errors Google is reporting and see whether those pages still return errors. If so, you may have some pages left to redirect. Of course, if a page was simply removed and you have nothing relevant to redirect it to, then a 404 is exactly what should be returned. If everything looks fine, it's just the delay in Google updating its information, and it should resolve itself when Google gets around to recrawling those URLs.
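With several hundred URLs involved, it's quicker to spot-check the whole list with a script than by hand. Below is a minimal sketch in Python, assuming the Not Found URLs have been exported from WMT into a plain text file with one URL per line; the urls.txt filename is just a placeholder.

```python
import requests

# Minimal sketch: report what each URL listed as "Not Found" in WMT
# actually returns now. Assumes urls.txt holds one URL per line,
# exported from the Crawl Errors report; adjust the filename as needed.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Don't follow redirects, so a 301 shows up as 301 rather than
        # the status code of its target page.
        resp = requests.get(url, allow_redirects=False, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    print(f"{status}\t{url}")
```

URLs that print 301 are already handled; URLs that print 404 with no redirect in place are the ones worth a closer look.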
-
Hi,
It was a very bold move to drop such a significant number of pages from your site, especially if they were built up over time and attracted links. Even if the content wasn't completely original, that's not to say it didn't have some value. If I had made such a major change to a website and seen rankings drop, I would probably have reversed the change, but it's not clear whether that's still an option for you. Since I don't know the full reasoning behind the decision, I'll reserve further judgement and try to answer your question.
Returning 404s is the "right" thing to do as those pages don't exist any more, though 301-redirecting to very similar content is preferable because it keeps the benefit of any backlinks. I sense there weren't many links to worry about, though, as you're not very positive about the content that was deleted!
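If the 301s are staying in place, it's also worth confirming that each old URL actually returns a 301 and that its target resolves to a live page rather than another redirect or a 404. A rough sketch of that check in Python, assuming a plain text list of the deleted Legal News URLs (old_urls.txt is a placeholder name):

```python
import requests

# Rough sketch: for each removed URL, confirm it returns a 301 and that
# the redirect target resolves to a 200. Assumes old_urls.txt holds one
# deleted URL per line -- the filename is a placeholder.
with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"request failed  {url}  ({exc})")
        continue
    if resp.history and resp.history[0].status_code == 301:
        # resp.url is the final destination after following all redirects
        print(f"301 -> {resp.status_code}  {url}  ->  {resp.url}")
    else:
        print(f"no 301 (final status {resp.status_code})  {url}")
```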
Google will hold onto pages that return 404s for some time before removing them from its index. This is to be expected: web pages can break or disappear unintentionally, so there is a grace period to "fix" any issues before you lose your traffic.
The fact that Moz isn't picking up those 404s in its crawl suggests you're no longer linking internally to the deleted pages. Where you haven't put 301s in place, they will drop out of WMT over the next few weeks. You should also double-check that they've been removed from the sitemap you submitted to Google.
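If you want to check the sitemap programmatically, one way is to parse the XML and flag anything still pointing at the removed section. A minimal sketch, assuming a single sitemap file (not a sitemap index) at the usual /sitemap.xml location and that the old section lived under a /legal-news/ path; both the domain and the path below are placeholders to adjust:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder domain
REMOVED_PREFIX = "/legal-news/"  # assumed path of the deleted section

# Standard sitemap namespace; assumes a plain sitemap, not a sitemap index.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

leftovers = [
    loc.text for loc in root.findall(".//sm:loc", ns)
    if REMOVED_PREFIX in loc.text
]

if leftovers:
    print(f"{len(leftovers)} deleted URLs are still listed in the sitemap:")
    for url in leftovers:
        print("  " + url)
else:
    print("No deleted URLs found in the sitemap.")
```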
Hope that helps,
George
@methodicalweb