Temporary Duplicate Sites - Do anything?
-
Hi Mozzers -
We are about to move one of our sites to Joomla. This is one of our main sites and it receives about 40 million visits a month, so the dev team is a little concerned about how the new site will handle the load.
Dev's solution, since we control about 2/3 of that traffic through our own internal email and cross promotions, is to launch the new site and not take down the old site. They would leave the old site on its current URL and make the new site something like new.sub.site.com. Traffic we control would continue to the old site; traffic that we detect as new would be redirected to the new site. Over time (they think about 3-4 months) they would shift all the traffic to the new site, then eventually change the URL of the new site to be the URL of the old site and be done.
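To make the proposal concrete, the routing rule amounts to something like this sketch. The hostnames and the "controlled referrer" check are hypothetical; in practice the decision would presumably live at the load balancer or CDN edge, keyed off campaign parameters or referrers:

```python
# Sketch of the proposed traffic split -- domain names and the
# "controlled referrer" rule are assumptions, not the actual setup.

OLD_SITE = "https://www.site.com"
NEW_SITE = "https://new.sub.site.com"

# Referrers we control (internal email, cross-promotions) -- hypothetical hosts
CONTROLLED_REFERRERS = {"mail.site.com", "promo.site.com"}

def choose_destination(referrer_host):
    """Route traffic we control to the old site; send everything else to the new one."""
    if referrer_host in CONTROLLED_REFERRERS:
        return OLD_SITE
    return NEW_SITE
```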
So this seems at the outset to be a whole-site duplicate content issue. I think the best course of action is to try to preserve all SEO value on the old URL, since the new URL will eventually go away and become the old URL. I could consider temporary no-crawl/noindex tags on the new site while both sites exist, but would that be risky, since that site will eventually need to drop those tags and become the only site? A temporary rel=canonical from the new site to the old site also seems like it might not be the best answer.
Any thoughts?
-
I'm going to throw in a completely different option, because in my opinion, messing with this kind of multiple version situation is going to put your huge website at massive risk of screwed up rankings and lost traffic no matter how tricky you get.
First, I'm assuming that significant high-level load testing has been done on the dev site already. If not, that's the place to start. (I'm suspecting a Joomla site for 40 million visits a month will have lots of load-balancing in place?)
Since by all indications, the sites will be identical to the visitor, I'd suggest switching to the new site, but keeping the original site immediately available in near-line status. By setting the TTL of the DNS to a very short duration while in transition, the site could be switched back to the old version within a minute or two just by updating the DNS if something goes pear-shaped on the new site.
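As a rough sketch of what that DNS setup might look like (hypothetical names and documentation-range IPs, not real values), the zone record for the site would carry a very short TTL during the transition window:

```text
; zone file fragment -- hypothetical addresses for illustration only
; during transition: 5-minute TTL so a rollback propagates fast
www.site.com.   300   IN   A   203.0.113.10    ; new site

; rollback is just repointing the same record:
; www.site.com.   300   IN   A   198.51.100.7  ; old site
```

One caveat: the TTL has to be lowered in advance of the cutover, at least as far ahead as the old TTL value, so resolvers have already expired their long-lived cached records by the time you switch.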
Then, while the old site continues to serve visitors as it always has, devs can fix whatever issue was discovered on the new site.
This would mean keeping both sites' content updated concurrently during the period of the changeover, but it sounds like you were going to have to do that anyway. There's also the small risk that some visitors would have cached DNS on their own computers and so might still get sent to the new site for a while even after the DNS had been set back to the old site, but I'd say that's a vastly smaller risk than screwing up the rankings of the whole site.
Bottom line, there are plenty of load testing/quality assurance/server over-provisioning methods for making virtually certain the new site will be able to perform before going live. Having the backup site should be a very short-term insurance policy, rather than a long-term duplication process.
That's my perspective, anyway, having done a number of large-site migrations (though certainly nothing approaching 40M visits/month).
Paul
Just for reference, I was involved in helping after just such a major migration where the multiple sites did get indexed. It took nearly a year to rectify the situation and get the rankings/traffic/usability back in order.
-
Arghhh... This sounds like a crazy situation.
If the temp site is on a temporary subdomain, you definitely don't want any of those pages seeping into the index. But 3-4 months seems like an incredibly long time to sustain this. 3-4 days seems more reasonable to handle load testing.
For example, what happens when someone links to one of the temporary pages? Unless you put a rel=canonical on the page and allow robots to crawl it, you won't gain any of that link equity.
For a shorter time period, I'd simply block all crawlers via robots.txt, add a meta "noindex, nofollow" tag to the header, and hope for the best.
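For what it's worth, those two pieces would look something like this on the temporary subdomain (hypothetical hostname; this is a sketch of the approach described, not a verified setup):

```text
# https://new.sub.site.com/robots.txt
User-agent: *
Disallow: /
```

```html
<!-- in the <head> of every page on the temporary subdomain -->
<meta name="robots" content="noindex, nofollow">
```

One thing to note: if robots.txt blocks crawling entirely, crawlers may never fetch the pages to see the meta tag, which is part of why this is only a short-term, belt-and-braces measure.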
But for 3-4 months, you're taking the chance of sending very confusing signals to search engines, or losing out on new link equity. You could still use the meta "noindex, nofollow" on the temp domain if you need to, and also include rel=canonical tags (these are separate directives and actually processed differently), but there's no guarantee of a smooth transition once you ditch the temp URLs.
So... my best advice is to convince your dev team to shorten the 3-4 month time frame. Not an easy job.
-
Wow, 40 million visitors a month is no joke and nothing to be taken lightly. If this isn't done right, the loss of traffic could be huge.
The new site should be non-indexable, and you can redirect a percentage of traffic to the new site (beta.site.com) for server load testing. Once you determine it is stable, you can move all the traffic over.
Are URLs, site structure, etc. remaining the same? I wouldn't change too much at once or you won't know what happened if something tanks.
-
Thanks for the response.
It might have been just an unfounded concern, based on a vague memory of something I read about rel=canonical on here, but I cannot find it now.
I was just concerned that if you have site A and B and rel=canonical from B to A, then eventually get rid of A and have B take on the URL of A, that the engines might interpret this oddly and have it affect domain authority.
-
Why do you think that canonical tags won't work?
That's what I would suggest. Those tags simply tell Google which is the authoritative site of the duplicates. If you are preserving the original domain, canonical to that one, and when you make the switch nothing will change. Do keep in mind that if any of your directories or file structures are altered you will want to put in redirects, but it sounds like your web team knows what they're doing here.
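As a concrete sketch (hypothetical URLs), every page on the temporary subdomain would point at its counterpart on the original domain, so the original keeps consolidating the ranking signals:

```html
<!-- on https://new.sub.site.com/some-page (temporary site) -->
<link rel="canonical" href="https://www.site.com/some-page" />
```

Since the old URL is the one that survives the migration, nothing about these tags has to be "undone" in the index when the temporary subdomain is retired.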