Deleted/Merged Content on Site Migration
-
Hey Moz Community!
Looking for some input on a site migration.
When redirecting old pages that aren't being moved over to the new site, do you prefer to redirect them to the homepage (or a similar page), or to serve a 404/410 instead?
What have you found works best?
-
If you want to keep any SEO value from inbound links pointing to those older pages, you'll want to 301 redirect them to a similar page (e.g., a top-level category page).
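If the site runs on Apache, those one-to-one 301s can be sketched in `.htaccess` with mod_alias. The paths below are made-up placeholders, not URLs from this thread:

```apache
# 301 (permanent) redirects from retired pages to the closest
# equivalent on the new site -- paths are illustrative only.
Redirect 301 /old-widgets/blue-widget.html /widgets/
Redirect 301 /old-widgets/red-widget.html  /widgets/
```

Each old URL gets mapped to the most relevant surviving page, so link equity flows to a page that actually matches the visitor's intent.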
According to Rand Fishkin's Moz blog post, "Are 404 Pages Always Bad for SEO?" (http://moz.com/blog/are-404-pages-always-bad-for-seo):
"When faced with 404s, my thinking is that unless the page:
A) Receives important links to it from external sources (Google Webmaster Tools is great for this)
B) Is receiving a substantive quantity of visitor traffic
and/or C) Has an obvious URL that visitors/links intended to reach
... it's OK to let it 404."
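Since the original question also mentions 410s: for pages you've removed deliberately and never intend to bring back, you can return an explicit 410 Gone, which crawlers tend to drop from the index faster than a plain 404. A minimal Apache sketch, again with placeholder paths:

```apache
# Explicitly mark deliberately removed pages as 410 Gone so crawlers
# drop them faster than a plain 404 -- paths are placeholders.
Redirect gone /discontinued-product.html
Redirect gone /2010-holiday-promo/
```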
According to Moz's redirection best-practice guide (http://moz.com/learn/seo/redirection), you want to use a 301 redirect to indicate that the content has moved permanently.

Here's a post with a great infographic that describes how to run an SEO-friendly migration:
http://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic

Hope this helps!
Thanks,
-- Jeff
-
Hello!
The pages that will no longer be available after the site is migrated will simply return a 404. In similar migrations I've done, search engines eventually stop crawling those 404s. I'll typically build a really good 404 page to reduce bounce rate and improve click-through.
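On Apache, pointing the server at that custom 404 page is one directive; the path here is a placeholder for wherever your 404 template lives:

```apache
# Serve a custom, helpful 404 page instead of the server default,
# to keep lost visitors on the site -- path is a placeholder.
ErrorDocument 404 /custom-404.html
```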
On the other hand, if there are a ton of pages that aren't being moved, I'd suggest doing 301 redirects instead; a large number of crawl errors (404s) can hurt search engine rankings. If you do redirect, point each old URL to a relevant, closely related page.
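When an entire section is being dropped, writing one rule per URL gets tedious; a pattern-based redirect can catch the whole directory. A sketch, assuming a hypothetical `/old-blog/` section with `/blog/` as the closest surviving page:

```apache
# When a whole section isn't migrated, a single pattern-based 301
# sends everything under it to the most relevant surviving page.
RedirectMatch 301 ^/old-blog/.*$ /blog/
```

The trade-off: a catch-all like this is better than mass 404s, but per-page redirects to genuinely equivalent pages preserve more relevance.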
Good luck