How to recover from duplicate subdomain penalty?
-
Two and a half weeks ago, my site was slapped with a penalty -- 60% of organic traffic disappeared over 2-3 days.
After investigating, we discovered that our site was serving the same content for all subdomains, and that Google had somehow found two additional subdomains it was crawling and indexing. We solved the issue with 301 redirects to our main site (www) a couple of days after the drop -- about two weeks ago.
Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty?
This is the site: http://goo.gl/3DCbl
Thank you!
-
No recovery yet. Quick update... I put in a reconsideration request and was denied, with the response "No Manual Spam Actions Found."
From WMT: the Total Crawled count on the bad subdomains is steady, and there are still no Removed pages, but the Not Selected count is steadily increasing -- in fact, the total of Indexed plus Not Selected is greater than the Total Crawled count. How does that make sense?
Thanks.
-
Oh - if the subdomains showed no pages indexed, and then, at the exact time of your drop, they suddenly showed thousands of indexed pages, then you can definitely assume the two are related.
I didn't realize there was such a clear correlation. The suggestions above still stand - you might want to go one further and add a noindex to those subdomains, via a meta robots tag or an X-Robots-Tag response header (make sure it's on the subdomains and not the money site!).
Don't forget that in WMT you can also do a Change of Address under Configuration. You've already completed the first two steps, so you can simply tell Google exactly where the subs have moved.
There's no reason at all why these steps won't prompt Google to de-index the subs. The links, by the way, are simply a 'nudge' to get Google to look at the subdomains again and 'discover' the changes.
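To make the noindex step concrete, here's a minimal sketch (not the poster's actual setup) of serving an "X-Robots-Tag: noindex" header from a subdomain, using Python's stdlib WSGI machinery as a stand-in for whatever really serves the subdomain; on Apache the equivalent would be a mod_headers directive scoped to the subdomain's vhost:

```python
# Hypothetical stand-in for the subdomain's web server: every response
# carries an X-Robots-Tag header telling crawlers not to index this host.
from wsgiref.simple_server import make_server

def subdomain_app(environ, start_response):
    """WSGI app for the subdomain; noindexes everything it serves."""
    start_response("200 OK", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("X-Robots-Tag", "noindex, nofollow"),
    ])
    return [b"<html><body>placeholder</body></html>"]

if __name__ == "__main__":
    # Bind to an arbitrary free port; uncomment serve_forever() to run it.
    with make_server("127.0.0.1", 0, subdomain_app) as httpd:
        print("would serve on port", httpd.server_port)
        # httpd.serve_forever()
```

The key point is that the header rides on the HTTP response itself, so it covers every file type on the subdomain, not just HTML pages with a meta tag.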
-
We'll give the links a shot.
We did consider that the high number of similar static pages may be viewed negatively by Google, but we were ranking very well on many long tail searches before the drop. On WMT, the subdomains show no pages indexed until the exact date range that our rankings dropped, when they spike to the tens of thousands.
What do you think is the likelihood that the subdomains are the culprit in this case?
Thanks for all of your help.
-
It's definitely hard to say with that many URLs - I would definitely point a few at the sub's home page, however. It could be that those subdomains were cached at such long intervals that Google simply hasn't checked the site again.
Sometimes, adding the sub to WMT, submitting an XML sitemap, waiting until Google acknowledges it (and tells you how many pages are indexed), and then removing the sitemap can help.
If and when the subdomains are de-indexed (and there's no reason to believe they won't be), watch your positioning for a week or two afterward - if it doesn't change, you have to consider that the drop in positioning may have another cause. For example, the way each sorting variable for the products lands on its own static page can be seen as good for SEO, but it's slightly risky, since so many of those pages are nearly duplicates.
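For what it's worth, the sitemap step above can be sketched programmatically -- a minimal generator for the sub's sitemap, with hypothetical host, paths, and date standing in for the real ones:

```python
# Minimal sitemap.xml generator sketch. The host, paths, and lastmod date
# are hypothetical placeholders, not the poster's real subdomain URLs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(host, paths, lastmod):
    """Return a sitemap document (as a string) listing host+path for each path."""
    urlset = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    for path in paths:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = host + path
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml_doc = build_sitemap(
    "http://sub.example.com",            # hypothetical subdomain
    ["/", "/products/", "/products/1"],  # hypothetical pages
    "2013-02-01",                        # hypothetical last-modified date
)
print(xml_doc)
```

Submitting a file like this in WMT gives Google a fresh, explicit list of the sub's URLs to recrawl, which is exactly the "nudge" being discussed.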
-
Thanks Jared. The subdomains are www.ww and www.lnirfrx. We configured all subdomains to 301 to www. We did not receive any messages in WMT -- just the sudden drop in rankings.
I'm thinking about putting some links on a forum that I know doesn't use nofollow and is crawled several times a day. But we have tens of thousands of these subdomain pages indexed -- will posting just a couple of links help? I wouldn't want to post more than that, because it would look spammy.
-
Hi tact - what were your subdomains?
You mentioned that you sent in a Recon. Request - did you receive an unnatural links penalty in WMT?
If you have properly 301'd your subs so that NO subdomain page can be accessed, then simply pointing a few links at the redirect, like Ben said, should help it de-index faster. Make sure, though, that the 301s are properly set up (do a header check), and also make sure that no content from the sub is available unless you are certain that the redirect is applied properly (clear the subdomain of files).
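The header check can be scripted. Here's a hedged sketch using only the stdlib, with a local throwaway server standing in for the real subdomain -- against the live site you'd pass the actual subdomain URL instead:

```python
# Sketch of a 301 header check. The local test server below is a stand-in
# for the subdomain; for a live check you'd pass the real (hypothetical)
# URL, e.g. "http://sub.example.com/some/page".
import http.server
import threading
import urllib.error
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Pretend subdomain: 301s every path to the www host, preserving the path."""
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "https://www.example.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

class NoFollowRedirects(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect the 301 itself."""
    def redirect_request(self, *args, **kwargs):
        return None

def check_redirect(url):
    """Return (status_code, Location header) of the first response for url."""
    opener = urllib.request.build_opener(NoFollowRedirects)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:
        # With redirects suppressed, the 301 surfaces as an HTTPError.
        return err.code, err.headers.get("Location")

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_redirect(
    f"http://127.0.0.1:{server.server_port}/category/page-1"
)
print(status, location)
server.shutdown()
```

What you want to see for every subdomain URL is a 301 status (not 302) and a Location header pointing at the www equivalent -- anything else means the redirect isn't set up the way the advice above assumes.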
-
Do some guest blogging and point links at your 301s from your guest posts. Google will see that you mean business. You'll have new links and the old pages will be deindexed quicker.
-
I would submit a sitemap and keep moving forward with creating valuable content and sharing it to the right people. It can take Google a long time to get to your message.