Can you be penalized by a development server with duplicate content?
-
I developed a site for another company late last year and after a few months of seo done by them they were getting good rankings for hundreds of keywords. When penguin hit they seemed to benefit and had many top 3 rankings.
Then their rankings dropped one day early May. Site is still indexed and they still rank for their domain. After some digging they found the development server had a copy of the site (not 100% duplicate). We neglected to hide the site from the crawlers, although there were no links built and we hadn't done any optimization like meta descriptions etc.
The company was justifiably upset. We contacted Google and let them know the site should not have been indexed, and asked they reconsider any penalties that may have been placed on the original site. We have not heard back from them as yet.
I am wondering if this really was the cause of the penalty though. Here are a few more facts:
Rankings built during late March / April on an aged domain with a site that went live in December.
Between April 14-16 they lost about 250 links, mostly from one domain. They acquired those links about a month before.
They went from 0 to 1130 links between Dec and April, then back to around 870 currently
According to ahrefs.com they went from 5 ranked keywords in March to 200 in April to 800 in May, now down to 500 and dropping (I believe their data lags by at least a couple of weeks).
So the bottom line is this: the site appeared to suddenly rank well for about a month, then got hit with a penalty, and is no longer in the top 10 pages for most keywords.
I would love to hear any opinions on whether a duplicate site that had no links could be the cause of this penalty? I have read there is no such thing as a duplicate content penalty per se. I am of the (amateur) opinion that it may have had more to do with the quick sudden rise in the rankings triggering something.
Thanks in advance.
-
What kind of links did they lose, and what was that domain? If it was something like 250 links from one domain that only lasted a month, Google could conclude they were paid links, and that could earn you a penalty. Buying links is a risky business these days.
-
I have experience of this, and it wasn't nice!
I created a test copy of a site (WordPress) that I work on with a friend. It had been ranking pretty well, mainly through lots of quality curated content plus a bit of low-level link building. The link building had slowed in late 2010.
Within 12 hours of the test version of the site going 'live' (it was set to no-index in the WordPress options, which I no longer trust), the live site's rankings and traffic tanked. The test version was on a subdomain and was an exact replica of the live site. With no known links, it was somehow picked up by Google, and all 400 or so pages were in Google's index alongside the live site. Three reconsideration requests and six months later, we got back to where we were. The offending subdomain was 301'd to the live site within minutes of finding the problem, and during the six-month bad period all other causes were ruled out.
I now password protect any staging sites that are on the internet, just to be safe!
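For anyone wanting to do the same, here is a minimal sketch of HTTP basic auth for a staging site running on Apache (the password file path and realm name are placeholders; adjust for your setup):

```apache
# .htaccess in the staging site's document root
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /etc/apache2/.htpasswd-staging
Require valid-user
```

The password file can be created with `htpasswd -c /etc/apache2/.htpasswd-staging youruser`. Crawlers then get a 401 and never see the content at all, which is safer than relying on a noindex setting that might get switched off.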
-
I would not worry at all; there is no duplicate content penalty for this sort of thing. All that will happen is that one site will rank and the other will not. The original site with the links will obviously be seen as the site to rank. Block off the dev site anyway if you are worried, but this seems like a deeper problem than a bit of duplicate content.
-
Yes. It should always be standard practice to noindex every vhost on your development and staging servers.
Not only will duplicate content harm them, but in one personal case of mine, the staging server was outranking the client for their own keywords! Google was obviously confused and didn't know which page to show in the SERPs, which in turn confused visitors and led to some angry customers.
Lastly, having open access to your staging server is a security risk for a number of reasons. It's not so serious that you need to require a login, but you should definitely keep staging sites out of SERPs to prevent others from getting easy access to them.
For comparison, the example I gave where the staging server outranked the client, the client had a great SEO campaign and the staging server had several insignificant links by accident. So the link building contest doesn't always apply in this case.
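One way to apply that noindex advice across an entire staging vhost, so it can't be forgotten on individual pages, is an `X-Robots-Tag` response header. A sketch for Apache, assuming mod_headers is enabled:

```apache
# Inside the staging site's <VirtualHost> block
Header set X-Robots-Tag "noindex, nofollow"
```

This covers every response the vhost serves, including PDFs and images that a page-level meta tag can't reach.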
-
While I have no experience with this specifically with regard to SEO and rankings, I do have a development server. If you don't mind me asking, why is your development server public? Development servers should usually be behind some kind of password and not accessible to search spiders.
If you are worried that this is the problem, just make the entire site noindex and that should get it out of Google eventually. It may take some time, however.
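For reference, the page-level version of that noindex is a meta tag in each page's head:

```html
<meta name="robots" content="noindex, nofollow">
```

In WordPress, the "discourage search engines" option is supposed to emit this tag on every page, though as another answer here notes, it's worth verifying in the page source rather than trusting the checkbox.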
Good luck.