Creating a duplicate site for testing purposes: can it hurt the original site?
-
Hello,
We are soon going to upgrade the CMS to the latest version along with new functionality; the process may take anywhere from four to six weeks.
We need to work on the live server, so here is what we have planned:
-
Take an exact replica of the site and move it to a test domain, but on the live server.
-
Block Google, Bing, and Yahoo with user-agent disallow rules in robots.txt:

User-agent: Google
Disallow: /

User-agent: Bing
Disallow: /

User-agent: Yahoo
Disallow: /
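One aside worth checking: the user-agent tokens the big crawlers actually match are Googlebot, Bingbot, and Slurp (Yahoo), not the brand names, so a file written with "User-agent: Google" may block nothing. A minimal sketch using Python's standard-library robotparser (the staging URL is hypothetical) shows how a corrected file would be interpreted:

```python
from urllib import robotparser

# Corrected robots.txt for a hypothetical staging domain: Googlebot,
# Bingbot, and Slurp are the tokens the crawlers actually match on.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /

User-agent: Bingbot
Disallow: /

User-agent: Slurp
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for bot in ("Googlebot", "Bingbot", "Slurp", "SomeOtherBot"):
    verdict = "allowed" if rp.can_fetch(bot, "https://test.example.com/any-page") else "blocked"
    print(f"{bot}: {verdict}")
```

Note that "SomeOtherBot" comes out allowed: with no `User-agent: *` rule, any crawler you did not name can still fetch the pages, and robots.txt is advisory in any case.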
-
We will upgrade the CMS and add the new functionality, test the entire structure, check the URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain.
The upgrade process and new tools may take 1 to 1.5 months.
Our concern is this: despite blocking Google, Bing & Yahoo through user-agent disallow rules, can the URLs still be crawled by the search engines? If yes, it may hurt the original site, as the test domain will read as an entire duplicate. Or is there an alternate way around this? Many thanks.
-
-
Thanks, I am handling it through password protection and the meta noindex tag.
It's been kept out of the search engines' crawl!
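Since you are relying on the meta noindex tag, one quick sanity check after the upgrade is to confirm that every staging page still carries it (major CMS upgrades have a way of regenerating templates). A minimal sketch using only Python's standard library; the sample page string is hypothetical, and in practice you would feed in the body of each URL from your Screaming Frog crawl list:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots.append(d.get("content", ""))

def is_noindexed(html: str) -> bool:
    """True if the page carries a robots meta tag containing 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content.lower() for content in parser.robots)

# Hypothetical staging page source.
page = '<html><head><meta name="robots" content="noindex, nofollow"></head><body>Review</body></html>'
print(is_noindexed(page))  # True
```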
-
Hey Gagan,
So I think your question is: will content on your staging site still get indexed despite using robots.txt? The answer is yes, sometimes that does happen, especially if a lot of people link to it. The best way to keep content out of the index is to use the meta robots tag with noindex, nofollow. Search engines are much better about adhering to those than to robots.txt.
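Concretely, the tag Mike describes goes in the head of every page on the staging domain (and only there):

```html
<!-- staging pages only; must not be carried over to the live domain -->
<meta name="robots" content="noindex, nofollow">
```

One caveat: a crawler can only read this tag on pages it is allowed to fetch, so if the staging domain is also fully disallowed in robots.txt, the engines may never see the noindex at all. Relying on the meta tag (or password protection) rather than robots.txt avoids that conflict.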
Let us know if you run into any problems!
-Mike
-
Hi Gagan,
Google are generally more than happy for sites to test new pages, layouts and functionality. They even have some free tools for that purpose.
Content Experiments
https://support.google.com/analytics/answer/1745147?ref_topic=1745207&rd=1
I'm not sure about the viability of using Content Experiments to test a whole new site, but it would be worth looking into.
Let us know how you get on.
Neil.
-
Ahaa.. thanks, Mr. Robert, for your views.
However, can any kind of duplicate-URL issue still occur? Can Google still crawl the URLs despite being blocked through robots.txt? Can the original running site suffer in any way if we create the duplicate site?
It's a content-based site covering auto reviews, news updates, and forum & blog updates. There is no ecommerce shopping or products involved.
Our tentative time frame to add the features, test all changes, and do the major upgrade to the latest version of the CMS is approximately 45 days. Do you foresee any issue if both the original site and a duplicate on a test domain (despite being blocked by robots.txt) run simultaneously on a live server for that period?
Also, you referred to other ways of testing changes; is it possible to share them?
-
Gagan
I think this is a great and interesting question. First, you are adding functionality, etc. to a site and you are curious as to the effect of that on visitors to the site once they are on it. This is data anyone in SEO should want to see for their sites.
I would first say that you need to define the test period (assuming you already know what you want to measure) for the site. If it is a week, for example, I do not think you need to worry about whether a site with the three major engines blocked will somehow run into duplicate-content issues. (NOTE: if this is a large site and/or one with a critical revenue need, one that cannot afford any kind of slight but temporary downturn, I would look for another way to test the changes, even if I were sure there were no other issues.)
I am assuming that if this were an ecommerce site, for example, there would be the ability for a shopper to purchase on both, etc.
I would not run the test for any long period of time for a site that creates leads, revenue, etc. as I think it could cause customer confusion which can be more critical than duped content.
Let us know how it works out,
Thanks