How to recover from a duplicate subdomain penalty?
-
Two and a half weeks ago, my site was slapped with a penalty -- 60% of organic traffic disappeared over 2-3 days.
After investigating, we discovered that our site was serving the same content for all subdomains, and Google was somehow crawling and indexing two additional subdomains. We solved the issue with 301 redirects to our main site (www) a couple of days after the drop -- about two weeks ago.
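For anyone landing on this thread later, a catch-all permanent redirect of the kind described above usually looks something like this (an Apache mod_rewrite sketch -- the hostnames are placeholders, not the actual site):

```apache
# Served for the stray subdomains only; hostnames below are placeholders.
RewriteEngine On
# Any request whose Host header is not the canonical www host...
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
# ...gets a 301 to the same path on the canonical host.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The equivalent in nginx would be a `return 301` rule in the subdomain's server block.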
Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty?
This is the site: http://goo.gl/3DCbl
Thank you!
-
No recovery yet. Quick update: I put in a reconsideration request and it was denied, with the message "No manual spam actions found."
From WMT: the Total Crawled count on the bad subdomains is steady, and there are still no Removed pages, but the Not Selected count is steadily increasing -- in fact, the total of Indexed and Not Selected is greater than the Total Crawled count. How does that make sense?
Thanks.
-
Oh -- if the subdomains showed no pages indexed, and then thousands of indexed pages appeared at the exact time you dropped, you can safely assume the two are related.
I didn't realize there was such a clear correlation. The suggestions above still stand -- you might want to go one further and add a noindex on those subdomains as well. Note that robots.txt has no official noindex directive, so use a meta robots tag or an X-Robots-Tag response header (and make sure it's applied on the subdomains and not the money site!).
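If you go the response-header route, it can be a one-liner (an Apache mod_headers sketch, applied only in the subdomain's vhost or .htaccess -- the `always` keyword makes it apply to 301 responses too; the placement is an assumption about your server setup):

```apache
# Subdomain vhost/.htaccess ONLY -- never on the www site.
Header always set X-Robots-Tag "noindex, nofollow"
```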
Don't forget that in WMT you can also do a Change of Address under Configuration. You've already completed the first two steps, so you can simply tell Google exactly where the subs have moved.
There's no reason at all why these steps won't prompt Google to de-index the subs. The links, by the way, are simply a 'nudge' to get Google to look at the subdomains again and 'discover' the changes.
-
We'll give the links a shot.
We did consider that the high number of similar static pages may be viewed negatively by Google, but we were ranking very well on many long tail searches before the drop. On WMT, the subdomains show no pages indexed until the exact date range that our rankings dropped, when they spike to the tens of thousands.
What do you think is the likelihood that the subdomains are the culprit in this case?
Thanks for all of your help.
-
It's definitely hard to say with that many URLs -- I would definitely point a few at each sub's home page, however. It could be that those subdomains were cached at such long intervals that Google simply hasn't checked the site again.
Sometimes adding the sub to WMT, submitting an XML sitemap, waiting until Google acknowledges it (and tells you how many pages are indexed), and then removing the sitemap can help.
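For completeness, a minimal XML sitemap of the kind referred to above -- just enough to get the sub acknowledged in WMT (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://sub.example.com/</loc></url>
  <url><loc>http://sub.example.com/some-page</loc></url>
</urlset>
```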
If and when the subdomains are de-indexed (and there's no reason to believe they won't be), watch your positioning for a week or two afterward -- if it doesn't change, you have to consider that the drop in positioning may have another cause. For example, the way each sorting variable for the products lands on its own static page can be viewed as good for SEO, but it's slightly risky since so many pages are nearly duplicates.
-
Thanks Jared. The subdomains are www.ww and www.lnirfrx. We configured all subdomains to 301 to www. We did not receive any messages in WMT -- just the sudden drop in rankings.
I'm thinking about putting some links on a forum that I know doesn't use nofollow and is crawled several times a day. But with tens of thousands of these subdomain pages indexed, will posting just a couple of links help? I wouldn't want to post more than that, because it would look spammy.
-
Hi tact - what were your subdomains?
You mentioned that you sent in a Recon. Request - did you receive an unnatural links penalty in WMT?
If you have properly 301'd your subs so that NO subdomain page can be accessed, then pointing a few links at the redirects, as Ben said, should help them de-index faster. Make sure, though, that the 301s are properly set up (do a header check), and also make sure that no content from the sub is available unless you are certain that the redirect is applied properly (clear the subdomain of files).
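A header check can be as quick as `curl -I http://sub.example.com/some-page` and reading the status line and `Location` header. If you want to script it across many URLs, a small helper to interpret that status/Location pair might look like this (a sketch -- the function name, hostname, and verdict strings are illustrative, not from anyone's actual setup):

```python
from urllib.parse import urlsplit

def check_redirect(status, location, canonical_host="www.example.com"):
    """Classify one redirect response from a subdomain URL.

    status   -- HTTP status code the subdomain URL returned
    location -- value of the Location header (None if absent)
    Returns a short verdict string; "ok" means a clean 301 to the www host.
    """
    if status != 301:
        return "not a permanent redirect (expected 301, got %d)" % status
    if not location:
        return "301 with no Location header"
    host = urlsplit(location).hostname
    if host != canonical_host:
        return "redirects to unexpected host: %s" % host
    return "ok"
```

Feed it the values you see from `curl -I`; anything other than "ok" means the redirect needs fixing before Google will treat the move cleanly.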
-
Do some guest blogging and point links at your 301s from your guest posts. Google will see that you mean business. You'll have new links and the old pages will be deindexed quicker.
-
I would submit a sitemap and keep moving forward with creating valuable content and sharing it with the right people. It can take Google a long time to get to your message.