Duplicate content from development website
-
Hi all - I've been trawling for duplicate content and I stumbled across a development URL, set up by a previous web developer, that nearly mirrors the current site (there have been a few content and structure changes since then, but otherwise it's all virtually the same). The developer didn't take it down when the site was launched.
I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do?
Thanks, Luke
-
Well, when I did it I put in one removal request for the whole domain and also put a disallow in the robots.txt for the whole site. Matt appears to be referring to putting in too many removal requests, but if you want your whole site removed you only need one, so this wouldn't be an issue - you put in your domain URL. When you say your page has no snippet, have you checked what your meta description is? This can help influence your snippet text. I would work at getting your development site removed a.s.a.p. and then see what happens with your snippet - I think there is a good chance it could be down to duplicate content issues. Have you checked what the cache for your homepage is in Google's results?
-
Hello Max!
Thank you very much for your answer!
First of all... no, I didn't have Analytics or Webmaster Tools on the development site; I just set up Google Webmaster Tools yesterday to put in the removal request. There are ~1800 pages from the dev site indexed, and I was removing them one by one until I found this article by Matt Cutts, so I stopped removing:
http://www.mattcutts.com/blog/overdoing-url-removals/
Do you think it would be a good idea to keep doing it?
As far as I have seen, the development site is not outranking the main site, but my concern is that the main site's home page is showing up in the SERPs with no snippet, so I'm wondering if it's somehow related to the duplicate content issue.
Regarding your suggestion, DEFINITELY... that's the type of thing you'd assume the development company would take care of... I already asked them to add HTTP authentication to the development site!
I really hope Google picks up the change soon!
Thank you very much for your help, I really appreciate it!
Best regards
-
Hi Max
A couple of questions to understand your situation better - do you have both Google Analytics and Google Webmaster Tools installed on your development site? Is your development site outranking your main site for any of your key terms?
In my experience, unless your development site is outranking your main site, I would add a robots.txt file to disallow all bot access and then also put in a removal request for your domain in Google Webmaster Tools. I found this fix very quick - within a matter of days everything was fixed.
However, if you are getting traffic to your development site and it is outranking your main site, and you have therefore decided that the rel canonical option is best, I would still remove your development site once the rankings swap around (as Marie pointed out, this took a week or so for her).
Regarding your development site, I would always aim to have it removed from the index, and once your issues are sorted I would put a password on the whole site so that nobody can access it other than you or someone who has the password. This will let you use your development site to its full potential without having to worry about competitors who have found the URL monitoring it, even after it is de-indexed!
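If the development server happens to run Apache, the password protection suggested above can be a simple HTTP Basic Auth block in an .htaccess file. This is a sketch only: the AuthUserFile path is an assumption, and the .htpasswd file would need to be created first (e.g. with the htpasswd utility).

```apache
# Hypothetical .htaccess at the dev site root
# Assumed path below; create the file with: htpasswd -c /home/dev/.htpasswd username
AuthType Basic
AuthName "Development site - authorised users only"
AuthUserFile /home/dev/.htpasswd
Require valid-user
```

On other servers (nginx, IIS) the equivalent directives differ, but the principle is the same: nothing on the dev host should be reachable without credentials.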
BTW, when I had this issue I had several thousand pages from my development site indexed in Google. Unfortunately I can't give you an exact time as to how long it will take to fix, as it all depends on the current crawl rates of your sites.
Hope this helps!
-
I'm having a very similar problem... the development site got crawled and has 1700+ pages indexed in Google. I'm working on redirecting every page from the development site to its equivalent on the production site.
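Assuming an Apache server, a page-for-page 301 from the dev host to production can be done with one blanket rewrite rule rather than thousands of individual redirects. The hostnames here (dev.example.com, www.example.com) are placeholders, not the actual sites:

```apache
# Hypothetical .htaccess on the development host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which is what tells Google to transfer the indexed dev URLs to their production equivalents.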
There's something else I don't understand... the home page of the production site is not showing any snippet in the SERPs. Do you think this could be caused by the duplication issue with the development site?
After redirecting from development to production, how long do you think it will take Google to reindex everything and understand that there's no duplicate content anymore?
I would really appreciate your opinions!
Best regards
-
Thanks so much Matt, Keri & Marie - brilliant advice there, really brilliant. With your help it's all removed now.
Blimey, that discovery sure set my heart racing (eeeek.)
-
Thanks Keri, great advice on using a code monitor - I have known situations where code changes were made to development sites and the robots.txt was changed or removed by mistake, causing the development site to be indexed again. Monitoring this would have helped us react to the situation so much quicker!
-
I had a similar situation where I had developed a site for a landscaping client. I thought I had gotten rid of the files, but somehow Google found them. My development site ranked #1 for his terms and his site was on something like page 6 because of the duplicate content. Here's what I did:
I didn't want to take down my site right away because his company was ranking #1 for his keywords. (Even though visitors landed on the development site, it still had his phone number for them to call.)
I added a rel canonical to the development site that told Google that the correct site to index was actually the client's site.
Within a week or so, the proper site was ranking #1. At that point I deleted the files for the development site.
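For reference, the canonical approach described above comes down to one tag in the head of each development page, pointing at the matching live page. The URLs below are placeholders, not the actual sites:

```html
<!-- On e.g. http://dev.example.com/services.html -->
<link rel="canonical" href="http://www.clientsite.com/services.html" />
```

This tells Google which copy of the duplicated page should be treated as the original, so ranking signals consolidate on the client's site before the dev files are deleted.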
-
Excellent advice here. If it's on a subdomain, the subdomain can be claimed in GWT as its own site. You can put a robots.txt on the subdomain and then request the entire subdomain be removed from the index.
You may want to go one step further and use something like PolePosition's Code Monitor, which checks the code of any page once per day and alerts you if there's a change. In a similar situation, I had it monitor the robots.txt for the live site and all the development sites where I was working, so I knew if the developers changed something and could react quickly.
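If you'd rather not use a hosted tool, the same idea can be sketched in a few lines of Python: fetch robots.txt, hash it, and compare against the hash of the last-known-good copy. The URL and the alerting mechanism are left as assumptions; you would run this daily from cron and wire the True case to an email or chat alert.

```python
import hashlib
import urllib.request


def fetch_robots(url: str) -> str:
    """Download the current robots.txt from the given site."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


def has_changed(content: str, last_seen_hash: str) -> bool:
    """Return True if the fetched file differs from the stored known-good hash."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest() != last_seen_hash


# Known-good copy: the disallow-everything robots.txt used to keep a dev site out
KNOWN_GOOD = hashlib.sha256(b"User-agent: *\nDisallow: /\n").hexdigest()

# Hypothetical daily check (dev.example.com is a placeholder):
#   if has_changed(fetch_robots("http://dev.example.com/robots.txt"), KNOWN_GOOD):
#       ...send an alert...
```

Hashing rather than storing the whole file keeps the stored state tiny, though diffing the raw text would tell you *what* changed as well.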
-
Hi Luke
I had the same problem and this is how I fixed it - I registered the development domain with GWT and then put in a removal request. I also got our developers to set up a robots.txt file to tell search engines not to index any of the site - the contents of the robots.txt file are as follows:
User-agent: *
Disallow: /
This soon fixed the issue for us. Hope this helps - obviously you don't need the robots.txt if you are just going to take the site down completely, as there will be no worry of people finding it in search engines and mistaking it for your live site, or of search engines finding duplicate content. I used this strategy because we still use the development site for testing etc. before going live.
Can I just check - is the URL on a separate domain? If it isn't and it is part of your existing domain, you can still block that URL using either a robots.txt file or a noindex, nofollow meta tag. You can also request removal of specific URLs within a site in GWT.
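If you go the meta tag route, it is one line in the head of each page you want kept out of the index:

```html
<meta name="robots" content="noindex, nofollow">
```

Note that for this tag to be seen, the page must remain crawlable - a robots.txt disallow on the same URL would stop the bot before it ever reads the tag.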