How to recover from duplicate subdomain penalty?
-
Two and a half weeks ago, my site was hit with a penalty -- 60% of organic traffic disappeared over 2-3 days.
After investigating, we discovered that our site was serving the same content on every subdomain, and Google was somehow crawling and indexing two additional subdomains. We fixed the issue with 301 redirects to our main site (www) a couple of days after the drop -- about two weeks ago.
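(For anyone in the same spot: the 301s usually live in the web server config, but the logic is simple enough to sketch. Below is a minimal, hypothetical WSGI-style version -- the canonical host name is assumed, not from the actual site.)

```python
# Hypothetical sketch: 301 every non-canonical host to the www site.
# Real deployments normally do this in Apache/nginx config instead.

def make_www_redirect(app, canonical_host="www.example.com"):
    """Wrap a WSGI app so any request on another host 301s to the canonical one."""
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host and host != canonical_host:
            path = environ.get("PATH_INFO", "/")
            location = "http://%s%s" % (canonical_host, path)
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return middleware
```

The key detail is that *every* subdomain path redirects to the matching www path, so no duplicate content remains reachable.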
Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty?
This is the site: http://goo.gl/3DCbl
Thank you!
-
No recovery yet. Quick update: I put in a reconsideration request and was denied -- the response said no manual spam actions were found.
From WMT: the Total Crawled count on the bad subdomains is steady, and there are still no Removed pages, but the Not Selected count is steadily increasing. In fact, the total of Indexed plus Not Selected is greater than the Total Crawled count -- how does that make sense?
Thanks.
-
Oh -- if the subdomains showed no pages indexed, and then, at the exact time you dropped, suddenly showed thousands of indexed pages, then you can definitely assume they are related.
I didn't realize there was such a clear correlation. The suggestions above still stand -- you might want to go one further and add a noindex on those subdomains, e.g. via an X-Robots-Tag header, since a Noindex directive in robots.txt is only unofficially supported (and make sure it's on the subdomains and not the money site!).
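To make the noindex idea concrete: a hypothetical WSGI-style sketch that adds `X-Robots-Tag: noindex` only on the bad subdomains. The host names here are placeholders, not the poster's real subdomains.

```python
# Hypothetical sketch: serve "X-Robots-Tag: noindex" on the bad
# subdomains only -- never on the main www site.

NOINDEX_HOSTS = {"ww.example.com", "lnirfrx.example.com"}  # assumed names

def add_noindex_header(app):
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        def wrapped_start(status, headers):
            if host in NOINDEX_HOSTS:
                headers = list(headers) + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers)
        return app(environ, wrapped_start)
    return middleware
```

Guarding on the host, rather than applying it site-wide, is what keeps the money site safe.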
Don't forget that in WMT you can also do a Change of Address under Configuration. You've already completed the first two steps, so you can simply tell Google exactly where the subs have moved.
There's no reason at all why these steps won't prompt Google to de-index the subs. The links, by the way, are simply a 'nudge' to get Google to look at the subdomains again and 'discover' the changes.
-
We'll give the links a shot.
We did consider that the high number of similar static pages might be viewed negatively by Google, but we were ranking very well on many long-tail searches before the drop. In WMT, the subdomains show no pages indexed until the exact date range when our rankings dropped, at which point they spike to the tens of thousands.
What do you think is the likelihood that the subdomains are the culprit in this case?
Thanks for all of your help.
-
It's definitely hard to say with that many URLs -- I would definitely point a few at the subs' home pages, however. It could be that those subdomains were cached at such long intervals that Google simply hasn't checked the site again.
Sometimes adding the sub to WMT, submitting an XML sitemap, waiting until Google acknowledges it (and tells you how many pages are indexed), and then removing the sitemap can help.
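If you need to knock together a sitemap for the sub quickly, it can be generated by hand -- a minimal sketch using only the standard library (URLs are hypothetical):

```python
# Minimal sitemap.xml generator -- a sketch of the "submit an xml
# sitemap" step. URLs shown are placeholders.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml string for the given URLs."""
    entries = "".join(
        "<url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        "%s</urlset>" % entries
    )
```

Write the result to `sitemap.xml` at the subdomain root, then submit it in WMT for that property.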
If and when the subdomains are de-indexed (and there's no reason to believe they won't be), watch your positioning for a week or two afterward -- if it doesn't change, you have to consider that the drop in positioning may have another cause. For example, the way each sorting variable for the products lands on its own static page can be viewed as good for SEO, but it's slightly risky, since so many pages are so close to duplicates.
-
Thanks Jared. The subdomains are www.ww and www.lnirfrx. We configured all subdomains to 301 to www. We did not receive any messages in WMT -- just the sudden drop in rankings.
I'm thinking about putting some links on a forum that I know doesn't use nofollow and is crawled several times a day. But we have tens of thousands of these subdomain pages indexed -- will posting a couple of links help? I wouldn't want to post more than that, because it would look spammy.
-
Hi tact - what were your subdomains?
You mentioned that you sent in a Recon. Request - did you receive an unnatural links penalty in WMT?
If you have properly 301'd your subs so that NO subdomain page can be accessed, then simply pointing a few links at the redirect, as Ben said, should help it de-index faster. Make sure, though, that the 301s are properly set up (do a header check), and also make sure that no content from the sub is available unless you are certain that the redirect is applied properly (clear the subdomain of files).
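On the header-check point: once you've collected each hop's status code and URL (e.g. from `curl -I` or any HTTP library), a small script can confirm the chain is clean -- all 301s (not 302s) ending on the canonical host. A sketch, with the validation logic only:

```python
# Sketch of the "do a header check": validate a redirect chain that
# you have already collected as (status_code, url) pairs.
from urllib.parse import urlparse

def check_301_chain(hops, canonical_host):
    """hops: list of (status_code, url) in the order the redirects were
    followed. Passes only if the final hop is a 200 on the canonical
    host and every earlier hop is a permanent 301 (not a 302)."""
    if not hops:
        return False
    *redirects, final = hops
    if final[0] != 200 or urlparse(final[1]).hostname != canonical_host:
        return False
    return all(status == 301 for status, _ in redirects)
```

A 302 in the chain is exactly the kind of thing this catches -- temporary redirects won't consolidate the subdomain pages into the www site the way 301s do.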
-
Do some guest blogging and point links at your 301s from your guest posts. Google will see that you mean business. You'll have new links and the old pages will be deindexed quicker.
-
I would submit a sitemap and keep moving forward with creating valuable content and sharing it to the right people. It can take Google a long time to get to your message.