What is the best way to consolidate two websites into one?
-
Someone within our company's IT department just sent me some SEO advice that I believe is bogus. Can someone let me know if my initial gut-check is correct?
We have two websites selling identical catalogs of products, but branded differently (color scheme, wording, etc.).
We want to shut down the second website.
I think we should set up 301 redirects from all pages on the second site to corresponding (relevant) pages on the first. In theory, this would pass over 90% of the earned link juice from one to the other.
Here is what my IT peer said:
"We could keep www.two.com set up indefinitely and just have it as the same web site as www.one.com (so two URLs but one site). This would help alleviate any issues with search engine results, etc. (Although I believe Ryan would agree this does impact www.one.com's rankings a bit, but shouldn't be a problem as long as we don't advertise both.) Google doesn't know they are on the same site, so you could technically get away with it. And it helps in indexing multiple pages on our sites."
... but wouldn't this be a big no-no because of the massive amounts of duplicate content it would create?
-
Hey Ryan
Just finally getting to this; thanks for reaching out on Twitter and asking for my input. It seems like everyone here has already helped you out, and I agree with the advice given.
The only thing I'd add is that 301 redirects are not only for search engines but also for users, so always be sure they make sense from a UX point of view.
Also, be sure this doesn't create a chain of redirects. If 301s were already in place on the site you're redirecting from, always redirect from the original source straight to the final destination, like this:
DO THIS:
www.one.com/original-page.html --> www.two.com/real-page/
www.one.com/new-same-as-original/ --> www.two.com/real-page/
NOT THIS:
www.one.com/original-page.html --> www.one.com/new-same-as-original/ --> www.two.com/real-page/
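If the old site happens to be on Apache, a minimal sketch of those direct (non-chained) rules in its .htaccess could look something like this (hypothetical paths, assuming mod_alias is enabled):
# Hypothetical sketch: every legacy URL points straight at the final destination, no chain
Redirect 301 /original-page.html http://www.two.com/real-page/
Redirect 301 /new-same-as-original/ http://www.two.com/real-page/
Whatever the stack is, the point is that each legacy URL should reach its target in a single hop.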
Communicate this to IT as best you can; Matt Cutts has said to avoid chains of 301s whenever possible.
Hope this all helps, let us know!
-Dan
-
Thanks, Keri! Yeah... sometimes I think SEO needs to be a certification for IT folks...
-
Wow, that reminds me of the time developers told me Matt Cutts was wrong, that I should not redirect non-www to www, that any redirects were bad for search engines, and that the more pages in the search engine the better, and then wondered what the real issue behind my support request was.
I'd push for the 301s, and I'm also interested to hear what others have to say.
-
Thanks for the validation, Highland. I'm hoping others will chime in, too!
-
I agree with you. Keeping a second site with duplicate content makes no sense. Just 301 two.com to one.com (preserving URL paths, so two.com/widgets redirects to one.com/widgets) and you should have no problems.
Advertising has nothing to do with indexation. Google actively looks for domains (they are a registrar, after all) and spiders them unless you explicitly tell them not to (e.g. via robots.txt).
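If two.com runs on Apache, a minimal sketch of that domain-level 301 in its .htaccess might look like this (hypothetical rules, assuming mod_rewrite is enabled):
# Hypothetical sketch: send every request on two.com to the same path on one.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?two\.com$ [NC]
RewriteRule ^(.*)$ http://www.one.com/$1 [R=301,L]
That way two.com/widgets lands on one.com/widgets in a single hop, and the link equity those pages earned is consolidated onto their one.com equivalents.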
Related Questions
-
Google WMT/Search Console: thousands of "Links to your site" even with only one back-link from a website
Hi, I can see in my Search Console that a website is credited with thousands of links to my site, even though there is hardly more than one back-link from one of their pages to ours. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf
Intermediate & Advanced SEO | vtmoz
-
Slug best practices?
Hello, my team is trying to understand how to best construct slugs. We understand they need to be concise and easily understandable, but there seem to be vast differences between the three examples below. Are there reasons why one might be better than the others? http://www.washingtonpost.com/news/morning-mix/wp/2014/06/20/bad-boys-yum-yum-violent-criminal-or-not-this-mans-mugshot-is-heating-up-the-web/ http://hollywoodlife.com/2014/06/20/jeremy-meeks-sexy-mug-shot-felon-viral/ http://www.tmz.com/2014/06/19/mugshot-eyes-felon-sexy/
Intermediate & Advanced SEO | TheaterMania
-
Two websites to merge into one - one has already been migrated - what about the second?
Hiya Mozzers, I have just been checking a website for duplication issues (this is a new website; they have just migrated from the old website to this new "main website"), and I found a WordPress blog on a different URL duplicating the main website's blog. Should I just close down this WordPress blog and 301 redirect it to the main website's blog (equivalent blog posts to equivalent blog posts, with other indexed non-specific pages 301 redirected to the main website's blog homepage)? Thanks in advance for your help.
Intermediate & Advanced SEO | McTaggart
-
Best way to fix 404 crawl errors caused by private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors (404 Not Found). Those 44 blog pages were set to Private mode in WordPress, causing the 404s. I was reviewing the content of those 44 pages to see why those 2010 blog posts were set to Private, and I noticed that all 44 posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages in Private mode to avoid getting hit for duplicate content issues. All blog posts published after 2011 look like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages?
A. Remove the 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of the 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)
I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing them, but I'm not sure whether that would affect the site in any way. Open to recommendations before making a decision. Thanks!
Intermediate & Advanced SEO | SEOEND
-
Consolidate 150 domains to 1
Hi! Just as the question title says, we are looking at a project where we might have to consolidate 150 different domains into one (of course with a corresponding page on the new domain). We aim to preserve as much of the link juice as possible from each domain. Any advice on doing this properly? I do, of course, see a risk in simply opening the new domain and 301 redirecting each old domain to its specific page on the new domain, but is there any right or wrong way of doing this? I might add that each domain has a more or less unique link profile in terms of linking domains, number of linking domains, and so on. Our dear friend Cutts has some information on this topic, http://www.youtube.com/watch?v=l7M22teF3Ho but he only talks about 4 domains, which of course seems like a more naturally occurring scenario. But what about 150 of them? Anyone got any advice? Is this as much of a no-go as I feel it is? Thanks! Edit: The domains are all owned by the same entity, share the same GWT account, and so on.
Intermediate & Advanced SEO | bebetteronline
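One hedged way to sketch this on Apache (hypothetical domains and file paths; note that RewriteMap only works in the server or virtual-host config, not in .htaccess) is a single virtual host that answers for every old domain and 301s each URL to its mapped page on the new domain:
# Hypothetical sketch: one catch-all vhost for the 150 retiring domains
<VirtualHost *:80>
    ServerName old-domain-1.com
    # List the remaining old domains as aliases
    ServerAlias old-domain-2.com old-domain-3.com
    RewriteEngine On
    # redirect-map.txt holds "old-path new-path" pairs, one per line
    RewriteMap oldtonew "txt:/etc/apache2/redirect-map.txt"
    # Look each path up in the map; fall back to the same path if no mapping exists
    RewriteRule ^/(.*)$ http://www.newdomain.com/${oldtonew:$1|$1} [R=301,L]
</VirtualHost>
-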
Best practice?
Hi there, I have recently written an article which I have posted on an online newspaper website. I want to use this article on my blog as well; the reason it will be placed on my blog is to drive users from my email marketing activities. Would it simply be best practice to disallow Google from crawling this page, or to put a rel=canonical on the article on my blog pointing to the article on the online newspaper website? Thanks for any suggestions.
Intermediate & Advanced SEO | Paul78
-
Duplicate Content: http://www.website.com and http://website.com
I'm getting duplicate content warnings for my site because the same pages are getting crawled twice: once with http://www.website.com and once with http://website.com. I'm assuming this is a .htaccess problem, so I'll post what mine looks like. I think installing WordPress in the root domain changed some of the settings I had before. My main site is primarily in HTML with a blog at http://www.website.com/blog/post-name
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Intermediate & Advanced SEO | thirdseo
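For what it's worth, a minimal sketch of host-canonicalization rules that could sit above the WordPress block, assuming Apache mod_rewrite and that the www host is the preferred one (hypothetical; swap the hosts if the bare domain is wanted):
# Hypothetical sketch: 301 the bare domain to www so only one host gets crawled
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
-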
Best way to find all url parameters?
In reference to http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html, what is the best way to find all of the parameters that need to be addressed? Thanks!
Intermediate & Advanced SEO | nicole.healthline