Duplicate content from development website
-
Hi all - I've been trawling for duplicate content and then I stumbled across a development URL, set up by a previous web developer, which nearly mirrors the current site (a few content and structure changes since then, but otherwise it's virtually all the same). The developer didn't take it down when the site was launched.
I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do?
Thanks, Luke
-
Well, when I did it I put in one removal request for the whole domain and also put a disallow in the robots.txt for the whole site. Matt appears to be referring to putting in too many removal requests, but if you want your whole site removed you only need one, so this wouldn't be an issue - you just put in your domain URL. When you say your page has no snippet, have you checked what your meta description is, as this can help influence your snippet text? I would work at getting your development site removed ASAP and then see what happens with your snippet - I think there is a good chance it could be down to duplicate content issues. Have you checked what the cache for your homepage is in Google's results?
-
Hello Max!
Thank you very much for your answer!
First of all... no, I didn't have Analytics or Webmaster Tools on the development site; I just set up Google Webmaster Tools yesterday to put in the removal request. There are ~1800 pages from the dev site indexed and I was removing them one by one when I found this article by Matt Cutts, so I stopped removing:
http://www.mattcutts.com/blog/overdoing-url-removals/
Do you think it would be a good idea to keep doing it?
As far as I have seen, the development site is not outranking the main site, but my concern is that the main site's home page is showing up in the SERPs with no snippet, so I'm wondering if it's somehow related to the duplicate content issue.
Regarding your suggestion, DEFINITELY... that's the type of thing you assume the development company would take care of... I've already asked them to add HTTP authentication to the development site!
I really hope Google picks up the change soon!
Thank you very much for your help, I really appreciate it!
Best regards
-
Hi Max
A couple of questions to understand your situation better - do you have both Google Analytics and Google Webmaster Tools installed on your development site? Is your development site outranking your main site for any of your key terms?
In my experience, unless your development site is outranking your main site, I would add a robots.txt file to disallow all bot access and then also put in a removal request for your domain in Google Webmaster Tools. I found this fix very quick - within a matter of days everything was fixed.
However, if you are getting traffic to your development site, it is outranking your main site, and you have decided that the rel canonical option is best, I would still remove your development site once the rankings swap around (as Marie pointed out, this took a week or so for her).
In regards to your development site, I would always aim to have it removed from the index, and once you have your issues sorted I would put a password on the whole site so that nobody can access it other than you or someone who has the password. This will allow you to use your development site to its full potential without having to worry about competitors who have found the URL monitoring your development site even once it is de-indexed!
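If the development site happens to run on Apache, the password protection mentioned above can be as simple as HTTP basic auth in an .htaccess file - a minimal sketch, assuming Apache, where the paths and realm name are placeholders:

```apache
# .htaccess in the development site's document root
AuthType Basic
AuthName "Development site"
# Password file created beforehand with: htpasswd -c /path/to/.htpasswd devuser
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Other servers have equivalents, but the idea is the same: nothing on the dev site is reachable without credentials, by bots or by competitors.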
BTW when I had this issue I had several thousand pages indexed in Google from my development site. Unfortunately I can't give you an exact time as to how long it will take to fix this issue as it all depends on the current crawl rates to your sites.
Hope this helps!
-
I'm having a very similar problem... the development site got crawled and it has 1700+ pages indexed in Google. I'm working to redirect every page from the development site to its equivalent in the production site.
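For what it's worth, if your development server runs Apache with mod_rewrite, you can 301 the whole dev site to its production equivalents with one rule instead of page by page - a sketch, with placeholder hostnames:

```apache
# .htaccess on the development site: 301 every URL to its production twin
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This only works cleanly if the URL structures match, as they seem to in your case.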
There's something else that I don't understand... the home page of the production site is not showing any snippet in the SERPs... do you think this could be caused by the duplication issue with the development site?
After redirecting from development to production, how long do you think it will take Google to reindex everything and understand that there's no duplicated content anymore?
I would really appreciate your opinions!
Best regards
-
Thanks so much Matt, Keri & Marie - brilliant advice there - really brilliant. With your help it's all removed now.
Blimey, that discovery sure set my heart racing (eeeek.)
-
Thanks Keri, great advice on the use of a code monitor - I have known situations where code changes were made to development sites and the robots.txt was changed or removed by mistake, causing the development site to be indexed again. Monitoring this would have helped us react to the situation much more quickly!
-
I had a similar situation where I had developed a site for a landscaping client. I thought I had gotten rid of the files but somehow Google found them. My development site ranked #1 for his terms and his site was on something like page 6 because it was duplicate content. Here's what I did:
I didn't want to take down my site right away because his company was ranking #1 for his keywords. (Even though they landed on the development site they still had his phone number to call.)
I added a rel canonical to the development site that told Google that the correct site to index was actually the client's site.
Within a week or so, the proper site was ranking #1. At that point I deleted the files for the development site.
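For anyone following along, the rel canonical step above is just one tag in the head of each development page, pointing at the matching live URL - a sketch with placeholder URLs:

```html
<!-- In the <head> of each page on the development site -->
<link rel="canonical" href="http://www.example.com/matching-page.html" />
```

Google then treats the client's page as the one to index, which is why the rankings swapped over.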
-
Excellent advice here. If it's on a subdomain, the subdomain can be claimed in GWT as its own site. You can put a robots.txt on the subdomain then request the entire subdomain be removed from the index.
You may want to go one step further and use something like PolePosition's Code Monitor, which checks the code of any page once per day and alerts you if there's a change. In a similar situation, I had it monitor the robots.txt for the live site and all the development sites where I was working, so I knew if the developers changed something and could react quickly.
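If you don't have access to a tool like that, a rough homemade version is just a script run daily from cron that hashes robots.txt and flags a change since the last run - a sketch, where the URL and state-file path are placeholders:

```python
# Rough sketch of a daily robots.txt change monitor, meant to run from cron.
import hashlib
import pathlib
import urllib.request

def fingerprint(body: bytes) -> str:
    """Stable hash of the fetched body so any edit is easy to detect."""
    return hashlib.sha256(body).hexdigest()

def robots_changed(url: str, state_file: pathlib.Path) -> bool:
    """Fetch robots.txt, compare against the hash saved on the last run,
    store the new hash, and report whether anything changed."""
    current = fingerprint(urllib.request.urlopen(url).read())
    previous = state_file.read_text().strip() if state_file.exists() else current
    state_file.write_text(current)
    return current != previous
```

Cron could call `robots_changed("http://dev.example.com/robots.txt", pathlib.Path("dev_robots.sha"))` once a day and email you whenever it returns True.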
-
Hi Luke
I had the same problem and this is how I fixed it - I registered the development domain with GWT and then put in a removal request. I also got our developers to set up a robots.txt file to tell search engines not to index any of the site - the contents of the robots.txt file are as follows:
User-agent: *
Disallow: /
This soon fixed the issue for us. Hope this helps - obviously you don't need the robots.txt if you are just going to take the site down completely, as there will be no worry of people finding it in search engines and mistaking it for your live site, or of search engines finding duplicate content. I used this strategy because we still use the development site for testing etc. before going live.
Can I just check - is the URL on a separate domain? If it isn't, and it is part of your existing domain, you can still block that URL using either a robots.txt file or a noindex, nofollow meta tag. You can also request removal of specific URLs within a site in GWT.
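For reference, the meta tag version of the block mentioned above goes in the head of each page you want kept out of the index (whether you also want nofollow depends on your situation):

```html
<!-- In the <head> of any page that should stay out of the index -->
<meta name="robots" content="noindex, nofollow" />
```

Note that for the meta tag to be seen, the page must not also be blocked by robots.txt - the crawler has to be able to fetch the page to read the tag.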