Duplicate content from development website
-
Hi all - I've been trawling for duplicate content and stumbled across a development URL, set up by a previous web developer, which nearly mirrors the current site (a few content and structure changes since then, but otherwise it's all virtually the same). The developer didn't take it down when the site was launched.
I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do?
Thanks, Luke
-
Well, when I did it I put in one removal request for the whole domain and also added a disallow for the whole site in robots.txt. Matt appears to be referring to putting in too many removal requests, but if you want your whole site removed you only need one - you just enter your domain URL - so this wouldn't be an issue. When you say your page has no snippet, have you checked what your meta description is? It can influence your snippet text. I would work at getting your development site removed a.s.a.p. and then see what happens with your snippet - I think there is a good chance it's down to duplicate content issues. Have you checked what the cache for your homepage looks like in Google's results?
-
Hello Max!
Thank you very much for your answer!
First of all... no, I didn't have Analytics or Webmaster Tools on the development site; I just set up Google Webmaster Tools yesterday to submit the removal request. There are ~1800 pages from the dev site indexed, and I was removing them one by one when I found this article by Matt Cutts, so I stopped removing:
http://www.mattcutts.com/blog/overdoing-url-removals/
Do you think it would be a good idea to keep doing it?
As far as I have seen, the development site is not outranking the main site, but my concern is that the main site's home page is showing up in the SERPs with no snippet, so I'm wondering if that's somehow related to the duplicate content issue.
Regarding your suggestion, DEFINITELY... that's the type of thing you assume the development company would take care of... I already asked them to add HTTP authentication to the development site!
I really hope Google picks up the change soon!
Thank you very much for your help, i really appreciate it!
Best regards
-
Hi Max
A couple of questions to understand your situation better - do you have both Google Analytics and Google Webmaster Tools installed on your development site? Is your development site outranking your main site for any of your key terms?
In my experience, unless your development site is outranking your main site, I would add a robots.txt file to disallow all bot access and also put in a removal request for your domain in Google Webmaster Tools. I found this fix very quick - within a matter of days everything was sorted.
However, if you are getting traffic to your development site and it is outranking your main site, and you have decided that the rel canonical option is best, I would still remove your development site once the rankings swap around (as Marie pointed out, this took a week or so for her).
In regards to your development site, I would always aim to have it removed from the index, and once your issues are sorted I would place a password on the whole site so that nobody can access it other than you or someone who has the password. This will allow you to use your development site to its full potential and not have to worry about competitors who have found the URL monitoring your development site even when it is de-indexed!
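If the server is Apache, a minimal .htaccess sketch for password-protecting the whole development site could look something like this (the .htpasswd path is just a placeholder, and other servers such as nginx or IIS do this differently):

AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user

You would then create the .htpasswd file itself with the htpasswd utility and a username/password of your choice.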
BTW, when I had this issue I had several thousand pages from my development site indexed in Google. Unfortunately I can't give you an exact time as to how long it will take to fix, as it all depends on the current crawl rates for your sites.
Hope this helps!
-
I'm having a very similar problem... the development site got crawled and has 1700+ pages indexed in Google. I'm working on redirecting every page from the development site to its equivalent on the production site.
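For reference, a rough sketch of what I mean, assuming the dev site runs on Apache and its URL paths match the production site one-to-one (the domain is just a placeholder):

Redirect 301 / https://www.my-production-site.com/

That single line in the dev site's .htaccess sends every dev URL to the same path on the production domain; if the paths don't match, page-by-page Redirect 301 rules would be needed instead.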
There's something else that I don't understand... the home page of the production site is not showing any snippet in the SERPs. Do you think this could be caused by the duplication issue with the development site?
After redirecting from development to production, how long do you think it will take Google to reindex everything and understand that there's no duplicate content anymore?
I would really appreciate your opinions!
Best regards
-
Thanks so much Matt, Keri & Marie - brilliant advice there - really brilliant. With your help it's all removed now.
Blimey, that discovery sure set my heart racing (eeeek.)
-
Thanks Keri, great advice on the use of a code monitor. I have seen situations where code changes were made to development sites and the robots.txt was changed or removed by mistake, causing the development site to be indexed again. Monitoring would have helped us react to that situation so much quicker!
-
I had a similar situation where I had developed a site for a landscaping client. I thought I had gotten rid of the files but somehow Google found them. My development site ranked #1 for his terms and his site was on something like page 6 because it was duplicate content. Here's what I did:
I didn't want to take down my site right away because his company was ranking #1 for his keywords. (Even though visitors landed on the development site, they still had his phone number to call.)
I added a rel canonical to the development site that told Google that the correct site to index was actually the client's site.
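The tag goes in the head of each development page and points at the matching page on the client's live site - something like this (hypothetical URL):

<link rel="canonical" href="https://www.clients-live-site.com/matching-page/" />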
Within a week or so, the proper site was ranking #1. At that point I deleted the files for the development site.
-
Excellent advice here. If it's on a subdomain, the subdomain can be claimed in GWT as its own site. You can put a robots.txt on the subdomain, then request that the entire subdomain be removed from the index.
You may want to go one step further and use something like PolePosition's Code Monitor, which checks the code of any page once per day and alerts you if there's a change. In a similar situation, I had it monitor the robots.txt for the live site and all the development sites where I was working, so I knew if the developers changed something and could react quickly.
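If you'd rather not use a paid tool, even a tiny script run daily from cron can handle the basic robots.txt check. A rough Python sketch - the URLs are placeholders and this is only an illustration of the idea, not the PolePosition product:

import hashlib
import urllib.request

# Placeholder URLs - swap in your live and development robots.txt locations
URLS = [
    "https://www.example.com/robots.txt",
    "https://dev.example.com/robots.txt",
]

for url in URLS:
    body = urllib.request.urlopen(url, timeout=10).read()
    digest = hashlib.sha256(body).hexdigest()
    # One stored hash per host, so a change shows up on the next daily run
    store = "robots_" + url.split("/")[2] + ".sha256"
    try:
        previous = open(store).read()
    except FileNotFoundError:
        previous = ""
    if previous and previous != digest:
        print("ALERT: robots.txt changed at " + url)
    with open(store, "w") as f:
        f.write(digest)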
-
Hi Luke
I had the same problem and this is how I fixed it - I registered the development domain with GWT and then put in a removal request. I also got our developers to set up a robots.txt file to tell search engines not to crawl any of the site - the contents of the robots.txt file are as follows:
User-agent: *
Disallow: /
This soon fixed the issue for us. Hope this helps - obviously you don't need the robots.txt if you are just going to take the site down completely, as there will be no worry about people finding it in search engines and mistaking it for your live site, or about search engines finding duplicate content. I used this strategy because we still use the development site for testing etc. before going live.
Can I just check - is the URL on a separate domain? If it isn't and it is part of your existing domain, you can still block that URL using either a robots.txt file or a noindex, nofollow meta tag. You can also request removal of specific URLs within a site in GWT.
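For reference, the meta tag version goes in the head of each page you want kept out (unlike robots.txt, which blocks crawling of whatever paths you disallow):

<meta name="robots" content="noindex, nofollow">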