301, 404 or 410? What is the best practice?
-
Hi
I'm currently working on a project to correct some really bad practices left over from years of different SEOs.
Basically, they had made around 1,500 pages of delivery counties and towns, changing only three words on every page.
Now, apart from the duplicate content issues, this has really hammered the site with the latest round of Panda updates.
I've pulled the pages, but I'm in several frames of mind on how best to fix this.
The pages won't ever be used again, so I'm thinking a 410 code would be best, but after reading another post (http://moz.com/community/q/server-redirect-query) I'm not sure if I should just let them go to 404s if anyone ever finds them.
Incidentally, I'm disavowing over 1,100 root domains, so it's extremely unlikely anyone will find links to them out there.
-
Thanks for the responses. A 410 is a lot of work for probably little gain, so I think I'll run with just leaving the 404s.
I have done an analytics check on the URLs in question and ten had a tiny bit of traffic, so for these only I'll 301 to one relevant page.
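For the handful of URLs with residual traffic, a per-URL redirect is one line each. A minimal Apache .htaccess sketch, assuming an Apache server; the paths here are invented placeholders, not the site's real URLs:

```apache
# 301 the few old town pages that still get traffic
# to the single most relevant surviving page
Redirect 301 /delivery/london /delivery-areas
Redirect 301 /delivery/manchester /delivery-areas
```

With only ten URLs involved, explicit one-to-one rules like this are easier to audit later than a pattern match.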
Thanks again.
-
404 or 410, it does not matter much; you have removed the pages, and that is the main thing.
But to be strictly correct you should use a 410, as it tells search engines the pages are gone forever, while a 404 just means "not found".
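If you do go the 410 route on Apache, mod_rewrite's `[G]` (gone) flag handles a whole directory of removed pages in one rule. A sketch, assuming the dead pages all lived under one path prefix (the `delivery/` pattern is an assumed example):

```apache
# Serve an explicit 410 Gone for everything under the
# removed section, instead of the default 404
RewriteEngine On
RewriteRule ^delivery/ - [G,L]
```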
-
Hi Paul,
If they are unlikely to have external links pointing to them (or at least no good ones) and they are not linked internally, then I think your best bet is just to let them 404. Either way, you should fix up your 404 page to let users know there has been a site redesign (or similar) and give links to the homepage and other important pages.
You could also 410 them, which is said to remove pages from the index more quickly and to be the final word that these pages no longer exist and will never come back, but that might create more overhead than it is worth in terms of setting up different 4xx header responses for different types of pages. In practice, the difference between 404 and 410 responses seems to be very small, according to most of what I have read. Since there will be no links to them anyway, letting them 404 is an easy solution and should not create any problems.
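Pointing all not-found requests at that improved 404 page is a single directive on Apache (the filename is an assumed example):

```apache
# Show a helpful custom page, with links to the homepage
# and key sections, for anything that is not found
ErrorDocument 404 /404.html
```

Note the page itself should still return a 404 status, which `ErrorDocument` with a local path does by default; redirecting to the homepage instead would turn every dead URL into a soft 404.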
Related Questions
-
What is the best strategy for dissolving an innocently created link network with over 100 websites?
Hello Moz Community, Over many years, 120 websites were created, all under a couple of different organizations around the globe. The sites are interconnected via anchor-text and domain-name links, and some redirect to larger sites. The teachings have a central theme, and many tools, training programs, events, locations and services are offered on many different websites. Attached is a slice of a Majestic Link Graph showing the network. God bless Majestic for this new tool! We are looking for solutions that are efficient and effective in regard to usability, rankings and achievability. Thank you so much for your help! Donna
-
Best SEO benefit location (main page text or H1, H2)?
I have learned that an H1 has more value than an H2, and an H2 more than an H3, but let's say I want to place my keywords in there. Should I include them in the main body, or should I take advantage of the header tags?
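In practice it is not either/or; most pages do both. A minimal HTML sketch, with the keyword and copy invented purely for illustration:

```html
<!-- Keyword appears once in the H1, in a supporting H2,
     and naturally within the body copy -->
<h1>Wedding Bands for Hire</h1>
<h2>Why book a live wedding band?</h2>
<p>Our wedding bands cover every style, from jazz trios to full party bands.</p>
```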
White Hat / Black Hat SEO | | Sam09schulz0 -
301 redirects for 3 top-level domains using WP SEO Yoast
Hey guys, I have a custom-built website with a WP blog attached. The problem is there are three top-level domains: zenory.co.nz, zenory.com and zenory.com.au. The issue is that when I enter a domain to 301 redirect, I only get to enter one domain; usually I enter a redirect from zenory.com/blog/oldpage to zenory.com/newpage. For example, I have just moved Phone Psychic Readings from the blog over to the main site. However, there is still an issue I'm trying to clean up: I'm finding backlinks between my three domains that end up cross-linking them to each other, which I was told can look spammy to Google. For example, co.nz links many pages to com.au. I'm trying to clean this up at the moment, but while I'm in the process I find myself questioning, when I'm creating the 301 redirects from the blog: let's say I'm on the blog at zenory.co.nz/blog/oldblogpost and click on a blog post; it redirects me to zenory.com/newarticlepost, because I have redirected it to .com. How can I make sure each redirect goes back to the right domain name, to save myself from creating these cross-domain links? Would gratefully appreciate any assistance with this tricky situation. Cheers, Just
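One way around a plugin that only accepts a single absolute target is to do the redirect at the server level and capture whichever TLD was requested. A hedged Apache .htaccess sketch, assuming all three domains are served from the same install; the blog paths are hypothetical placeholders:

```apache
RewriteEngine On
# Capture the zenory TLD that was actually requested (%1)
# and keep the 301 on that same host
RewriteCond %{HTTP_HOST} ^(?:www\.)?zenory\.(co\.nz|com|com\.au)$ [NC]
RewriteRule ^blog/oldpage/?$ https://zenory.%1/newpage [R=301,L]
```

This way a visitor on zenory.co.nz stays on .co.nz and never hops across domains, which avoids the cross-domain link pattern described above.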
-
Sudden influx of 404s affecting SERPs?
Hi Mozzers, We've recently updated a site of ours that really should be doing much better than it currently is. It's got a good backlink profile (with some spammy links recently removed), has age on its side, and has been SEO'd a tremendous amount (think deep-level schema.org markup, site speed and much, much more). Because of this, we assumed thin, spammy content was the issue, so we removed those pages and created new, content-rich pages in the meantime. For instance, we removed a link-wheel page, <a>https://www.google.co.uk/search?q=site%3Asuperted.com%2Fpopular-searches</a>, which as you can see had a **lot** of results (circa 138,000), and added relevant pages for each of our entertainment 'categories'.
<a>http://www.superted.com/category.php/bands-musicians</a> - this page has some historical value, so the Mozbar shows some Page Authority here.
<a>http://www.superted.com/profiles.php/wedding-bands</a> - this is an example of a page linking from the above page. These are brand-new URLs designed to provide relevant content. The old link-wheel pages contained pure links (usually 50+ on every page) and no textual content, yet were still driving small amounts of traffic to our site.
The new pages contain quality, relevant content (i.e. our list of Wedding Bands; what else would a searcher be looking for?) but some haven't been indexed or ranked yet. So with this in mind I have a few questions:
**How do we drive traffic to these new pages?** We've started to create industry-relevant links through our own members to the top-level pages (http://www.superted.com/category.php/bands-musicians). The link profile here _should_ flow to some degree to the lower-level pages, right? We've got almost 500 'sub-categories'; getting quality links to all of these is just unrealistic in the short term.
**How long until we should be indexed?** We've seen an 800% drop in organic search traffic since removing our spammy link-wheel pages. This is to be expected to a degree, as these were the only real pages driving traffic. However, we saw this drop (and got rid of the pages) almost exactly a month ago; surely we should be re-indexed and re-evaluated by the algorithm by now?!
**Are we still being algorithmically penalised?** The old spammy pages are still indexed in Google (138,000 of them!) despite returning 404s for a month. When will these drop out of the rankings? If Google believes they still exist and we were indeed being punished for them, then it makes sense as to why we're still not ranking, but how do we get rid of them? I've tried submitting a manual URL removal via WMT, but to no avail. Should I 410 the pages?
**Have I been too hasty?** I removed the spammy pages in case they were affecting us via a penalty. There would also have been some potential duplicate content between the old and the new pages.
_popular-searches.php/event-services/videographer_ may have clashed with _profiles.php/videographer_, for example.
Should I have kept these pages while we waited for the new pages to be indexed? Any help would be extremely appreciated; I'm pulling my hair out that after following 'guidelines' we seem to have been punished in some way for it. I assumed we just needed to give Google time to re-index, but a month should surely be enough for a site with the historical SEO value of ours?
If anyone has any clues about what might be happening here, I'd be more than happy to pay for a genuine expert to take a look. If anyone has any potential ideas, I'd love to reward you with a 'good answer'. Many, many thanks in advance. Ryan.
-
Disavow links leading to 404s
Looking at the link-profile anchor text of a site I'm working on, new links keep popping up in the reports with, let's say, very distasteful anchor text. These links are obviously spam and point to old forum pages of the site that no longer exist, so the majority trigger the 404 page. I understand that a 404 response does not pass any link power, or damage, but given the nature and volume of the sites linking to the domain, would it be a good idea to completely disassociate from and disavow these domains?
-
301 Redirect ASP.NET Help
Hey, we are redesigning the site and changing a lot of URLs to make them more SEO-friendly, but some of the old URLs have PR 4-5. What is the best way to go about this? How do you do a 301 redirect for specific pages in ASP.NET? Or do you recommend something else? Thanks in advance.
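On IIS, per-page 301s can go in web.config via the URL Rewrite module (which must be installed). A hedged sketch; the rule name and page paths are hypothetical examples, not the asker's real URLs:

```xml
<!-- Requires the IIS URL Rewrite module.
     redirectType="Permanent" issues a 301. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="old-product-page" stopProcessing="true">
          <match url="^old-page\.aspx$" />
          <action type="Redirect" url="/new-seo-friendly-page"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Alternatively, from code-behind in ASP.NET 4.0+, `Response.RedirectPermanent("/new-seo-friendly-page")` issues a 301 for a single page.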
-
How best to do Location-Specific Pages for Ecommerce Post-Panda Update?
Hi, We have an ecommerce site, and currently we have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. For example: http://xxx.co.uk/rent/lawn-mower/London/100 and http://xxx.co.uk/rent/lawn-mower/Manchester/100. Much of the content on these location pages is the same or very similar, apart from a different H1 tag, title tag and, in some cases, slight variations in the on-page content; but given that these items can be hired from 200 locations, it would take years to write unique content for every location in each category. We did this originally in April because we can't compete nationally; we found it easier to compete locally, hence the creation of the location pages, and they did do well for us until now. My question is: since the last Google Panda update our traffic has dropped 40% and rankings have gone through the floor, and we are stuck with this mess. Should we 301 all of the location-specific pages for each of the categories, or just keep, say, the 10 most popular city locations and either noindex/nofollow or 301 the rest? What would people recommend? The only examples I can see on the internet of others with multiple locations use a store-finder type approach, but you can't rank for the individual product/category doing it that way. If anyone has any advice or good examples of sites that employ a good location-specific URL method, please let me know. Thanks, Sarah
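If the decision is to fold the thin location variants back into their parent category pages, a single pattern redirect can cover all of them. An Apache sketch based on the example URLs above; it assumes the `/rent/{category}/{location}/{id}` structure shown and that every location variant should collapse to its category:

```apache
# /rent/lawn-mower/London/100 -> /rent/lawn-mower
# Capture the category segment, discard location and id
RedirectMatch 301 ^/rent/([^/]+)/[^/]+/\d+$ /rent/$1
```

Any locations kept as standalone pages would need to be excluded from the pattern (or listed as explicit rules above it, since earlier rules win).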
-
Retail Site and Internal Linking Best Practices
I am in the process of recreating my company's website and, in addition to the normal retail pages, we are adding a "Learn" section with user manuals, reviews, manufacturer info, etc. It's going to be a lot of content, and there will be linking to these "Learn" pages from both product pages and other "Learn" pages. I read in an SEOmoz blog post that too much internal linking with optimized anchor text can trigger down-rankings from Google as a penalty. Well, we're talking about having 6-8 links to "Learn" pages from product pages, and interlinking many times within the "Learn" pages, as Wikipedia does. And I figured they would all have optimized text, because I think that is usually best for the end user (I personally like to know that I am clicking on "A Review of the Samsung XRK1234" rather than just "A Review of Televisions"). What is the best practice for this? Is there a suggested limit to the number of links, or to how many of them should have optimized text, for a retail site with thousands of products? Any help is greatly appreciated!