Can you be penalized by a development server with duplicate content?
-
I developed a site for another company late last year, and after a few months of SEO done by them, they were getting good rankings for hundreds of keywords. When Penguin hit they seemed to benefit, and had many top-3 rankings.
Then their rankings dropped one day in early May. The site is still indexed and they still rank for their domain name. After some digging, they found the development server had a copy of the site (not a 100% duplicate). We neglected to hide that copy from crawlers, although no links had been built to it and we hadn't done any optimization on it (meta descriptions, etc.).
The company was justifiably upset. We contacted Google, let them know the development copy should not have been indexed, and asked them to reconsider any penalties that may have been placed on the original site. We have not heard back yet.
I am wondering if this really was the cause of the penalty though. Here are a few more facts:
Rankings built during late March / April on an aged domain with a site that went live in December.
Between April 14-16 they lost about 250 links, mostly from one domain. They acquired those links about a month before.
They went from 0 to 1,130 links between December and April, then back to around 870 currently.
According to ahrefs.com they went from 5 ranked keywords in March to 200 in April to 800 in May, now down to 500 and dropping (I believe their data lags by at least a couple of weeks).
So the bottom line is that the site appeared to suddenly rank well for about a month, then got hit with a penalty, and is no longer in the top 10 pages for most keywords.
I would love to hear opinions on whether a duplicate site with no links pointing to it could be the cause of this penalty. I have read that there is no such thing as a duplicate content penalty per se. My (amateur) opinion is that it may have had more to do with the sudden rise in rankings triggering something.
Thanks in advance.
-
What kind of links did they lose, and what was that domain? If it was something like 250 links from one domain, held for only a month, Google could conclude they were paid links, and that could earn you a penalty. Buying links is a risky business these days.
-
I have experience of this, and it wasn't nice!
I created a test copy of a site (WordPress) that I work on with a friend. It had been ranking pretty well, mainly through lots of quality curated content plus a bit of low-level link building. The link building had slowed in late 2010.
Within 12 hours of the test version of the site going 'live' (it was set to noindex in the WordPress options, which I no longer trust), the live site's rankings and traffic tanked. The test version was on a subdomain and was an exact replica of the live site. With no known links, it was somehow picked up by Google, and all 400 or so pages were in Google's index alongside the live site. Three reconsideration requests and six months later, we got back to where we were. The offending subdomain was 301-redirected to the live site within minutes of finding the problem, and during the six-month bad period all other causes were ruled out.
I now password protect any staging sites that are on the internet, just to be safe!
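To illustrate the sort of protection I mean, here is a minimal sketch as WSGI middleware; the hostnames and credentials are placeholders, and a real deployment would more likely do this at the web server level (e.g. Basic Auth in Apache or nginx):

```python
import base64

# Hypothetical staging hostnames and credentials -- substitute your own.
STAGING_HOSTS = {"staging.example.com", "dev.example.com"}
USER, PASSWORD = "staging", "s3cret"

def protect_staging(app):
    """Wrap a WSGI app: require Basic Auth and send a noindex header
    whenever the request is for a known staging hostname."""
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "").split(":")[0]
        if host not in STAGING_HOSTS:
            # Production traffic passes through untouched.
            return app(environ, start_response)

        expected = "Basic " + base64.b64encode(
            f"{USER}:{PASSWORD}".encode()).decode()
        if environ.get("HTTP_AUTHORIZATION") != expected:
            start_response("401 Unauthorized", [
                ("WWW-Authenticate", 'Basic realm="staging"'),
                ("Content-Type", "text/plain"),
            ])
            return [b"Authentication required"]

        # Belt and braces: even authenticated responses tell crawlers
        # not to index the staging copy.
        def noindex_start_response(status, headers, exc_info=None):
            headers = list(headers) + [("X-Robots-Tag", "noindex, nofollow")]
            return start_response(status, headers, exc_info)

        return app(environ, noindex_start_response)
    return middleware
```

The point of the double protection is that a password keeps crawlers (and strangers) out entirely, while the noindex header covers the case where a staging URL leaks anyway.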
-
I would not worry at all; there is no duplicate content penalty for this sort of thing. All that will happen is that one site will rank and the other will not, and the original site with the links will obviously be seen as the site to rank. Block off the dev site anyway if you are worried, but this sounds like a deeper problem than a bit of duplicate content.
-
Yes. It should always be standard practice to noindex every vhost on your development and staging servers.
Not only can duplicate content harm them, but in one case of mine the staging server was outranking the client for their own keywords! Google was clearly confused and didn't know which page to show in the SERPs. In turn this confuses visitors and leads to some angry customers.
Lastly, having open access to your staging server is a security risk for a number of reasons. It's not so serious that you must require a login, but you should definitely keep staging sites out of the SERPs to prevent others from getting easy access to them.
In the example I gave where the staging server outranked the client, the client had a great SEO campaign while the staging server had picked up several insignificant links by accident. So the site with the stronger links doesn't always win in this situation.
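As a rough sketch of how you might audit this, the function below decides from a response's headers and HTML whether a page opts out of indexing via either mechanism Google honors (the X-Robots-Tag header or a robots meta tag). It is a string-matching approximation, not a full HTML parse, and assumes double-quoted attributes:

```python
import re

def is_blocked_from_indexing(headers, html):
    """Return True if a response opts out of indexing, either via an
    X-Robots-Tag response header or a <meta name="robots"> tag whose
    content includes 'noindex'. Pure string check; a real audit would
    fetch each staging URL and run this on the response."""
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # Crude scan of meta tags; an HTML parser would be more robust.
    for match in re.finditer(r"<meta\s[^>]*>", html, re.IGNORECASE):
        tag = match.group(0).lower()
        if 'name="robots"' in tag and "noindex" in tag:
            return True
    return False
```

Running something like this against every vhost after each deploy would have caught both of the situations described above before Google did.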
-
While I have no experience with this specifically with regard to SEO and rankings, I do run a development server. If you don't mind me asking, why is your development server public? Development servers should usually sit behind some kind of password and not be accessible to search spiders.
If you are worried that this is the problem, just make the entire site noindex and that should get it out of Google eventually. It may take some time, however.
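As a quick-and-dirty sketch of serving a whole site with noindex while you wait for it to drop out, here is a handler built on Python's standard http.server; a real site would set the header in the web server or CMS configuration instead:

```python
from http.server import SimpleHTTPRequestHandler

class NoIndexHandler(SimpleHTTPRequestHandler):
    """Serve files as usual, but stamp every response with a noindex
    header so crawlers that already found the site drop it over time."""
    def end_headers(self):
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        super().end_headers()

# To run locally (port is a placeholder):
# from http.server import HTTPServer
# HTTPServer(("", 8000), NoIndexHandler).serve_forever()
```

Because the header rides on every response, it covers images, PDFs, and other non-HTML files that a meta robots tag can't reach.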
Good luck.