How to recover from a duplicate subdomain penalty?
-
Two and a half weeks ago, my site was slapped with a penalty -- 60% of our organic traffic disappeared over 2-3 days.
After investigating, we discovered that our site was serving the same content on every subdomain, and that Google was somehow crawling and indexing two additional subdomains. We fixed the issue with 301 redirects to our main site (www) a couple of days after the drop -- about two weeks ago.
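In case it helps anyone, here's a minimal sketch of the kind of catch-all redirect we set up -- this assumes Apache with mod_rewrite, and the hostname is a placeholder, not our actual domain:

```apache
# .htaccess at the web root: 301 any non-www hostname to the main www site
# (example.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```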
Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty?
This is the site: http://goo.gl/3DCbl
Thank you!
-
No recovery yet. Quick update... I put in a reconsideration request and it was denied, with the response "No manual spam actions found."
From WMT: the Total Crawled count on the bad subdomains is steady, and there are still no Removed pages, but the Not Selected count is steadily increasing -- in fact, the total of Indexed plus Not Selected is greater than the Total Crawled count. How does this make sense?
Thanks.
-
Oh - if the subdomains showed no pages indexed, and then suddenly, at the exact time your rankings dropped, they showed thousands of indexed pages, you can safely assume the two are related.
I didn't realize there was such a clear correlation. The suggestions above still stand - you might want to go one step further and simply add a noindex directive right in the robots.txt on those subdomains (make sure it's on the subdomains and not the money site!).
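If you go that route, something like this on each subdomain's robots.txt is the idea -- with the caveat that Noindex: in robots.txt has only ever been unofficially honored by Google, so a meta robots tag or X-Robots-Tag header is the more dependable option:

```
# robots.txt served on the SUBDOMAIN only, never the money site
# Noindex: is an unofficial directive that Google honors inconsistently
User-agent: *
Noindex: /
```

Two caveats: you'd need to exempt /robots.txt from the 301 so the file can actually be served on the subdomain, and don't add a Disallow: / alongside it -- Google has to be able to crawl the subdomain pages to see the redirects at all.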
Don't forget that in WMT you can also do a Change of Address under Configuration. You've already completed the first two steps, so you can simply tell Google exactly where the subs have moved.
There's no reason at all why these steps won't prompt Google to de-index the subs. The links, by the way, are simply a 'nudge' to get Google to look at the subdomains again and 'discover' the changes.
-
We'll give the links a shot.
We did consider that the high number of similar static pages might be viewed negatively by Google, but we were ranking very well on many long-tail searches before the drop. In WMT, the subdomains show no pages indexed until the exact date range when our rankings dropped, at which point the count spikes to the tens of thousands.
What do you think is the likelihood that the subdomains are the culprit in this case?
Thanks for all of your help.
-
It's hard to say with that many URLs - I would definitely point a few at the sub's home page, however. It could be that those subdomains were cached at such long intervals that Google simply hasn't checked the site again.
Sometimes, adding the sub to WMT, submitting an XML sitemap, waiting until Google acknowledges it (and tells you how many pages are indexed), and then removing the sitemap can help.
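The sitemap itself can be bare-bones -- something like this, with the placeholder URLs swapped for the subdomain pages you want Google to recrawl:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- placeholder URLs: list the subdomain pages Google should revisit -->
  <url><loc>http://sub.example.com/</loc></url>
  <url><loc>http://sub.example.com/some-page.html</loc></url>
</urlset>
```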
If and when the subdomains are de-indexed (and there's no reason to believe they won't be), watch your positioning for a week or two afterward - if it doesn't change, you have to consider that the drop in positioning may have another cause. For example, the way each sorting variable for the products lands on its own static page can be viewed as good for SEO, but it's slightly risky, since so many pages are so close to duplicates.
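If that near-duplication turns out to be the problem, the standard mitigation (not something you've set up yet, as far as I know) is a rel=canonical from each sorted variant back to the main category page:

```html
<!-- in the <head> of each sort/filter variant (URL is a placeholder) -->
<link rel="canonical" href="http://www.example.com/category/widgets/" />
```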
-
Thanks Jared. The subdomains are www.ww and www.lnirfrx. We configured all subdomains to 301 to www. We did not receive any messages in WMT -- just the sudden drop in rankings.
I'm thinking about putting some links on a forum that I know doesn't use nofollow and is crawled several times a day. But we have tens of thousands of these subdomain pages indexed -- will posting a couple of links help? I wouldn't want to post more than that, because it would look spammy.
-
Hi tact - what were your subdomains?
You mentioned that you sent in a Reconsideration Request - did you receive an unnatural links warning in WMT?
If you have properly 301'd your subs so that NO subdomain page can be accessed, then simply pointing a few links at the redirect, as Ben said, should help them de-index faster. Make sure, though, that the 301s are properly set up (do a header check), and also make sure that no content from the sub is available unless you are certain that the redirect is applied properly (clear the subdomain of files).
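The header check is easy to script -- here's a rough sketch using Python's requests library, with placeholder URLs standing in for the real subdomain pages:

```python
import requests

# Placeholder URLs -- substitute a sample of real subdomain pages
urls = [
    "http://sub.example.com/",
    "http://sub.example.com/some-deep-page.html",
]

for url in urls:
    # allow_redirects=False so we inspect the first response, not the chain
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "(none)")
    print(f"{url} -> {resp.status_code}, Location: {location}")
    # You want a 301 pointing at the www site; a 200, a 302, or a missing
    # Location header means the redirect isn't set up properly.
```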
-
Do some guest blogging and point links at your 301s from your guest posts. Google will see that you mean business. You'll have new links, and the old pages will be de-indexed more quickly.
-
I would submit a sitemap and keep moving forward with creating valuable content and sharing it with the right people. It can take Google a long time to get to your message.