Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Subdomain vs Main Domain Penalties
-
We have a client whose main root.com domain is currently penalized by Google, but subdomain.root.com is still appearing and ranking very well. We're stumped - any ideas why?
-
Extremely helpful insight, Marie - I will be contacting you directly soon.
It appears that the duplicate content you've found (and other dupe content we've found) is actually our content that other sites have repurposed. It seems Google has decided our site is the culprit, so this is an issue we need to address. The only thought that comes to mind right away is adding an 'Author' tag and then starting what looks like a hefty cleanup project - something you are clearly an expert on, so we will most likely be working directly with you in the near future!
The 2nd-level pages that have little content and lots of links are already set to 'noindex,follow', but I'm nervous that the sheer number of these tags throughout our site could look spammy to a search engine. Of note, the 2nd-level section you found ranks quite well because it sits on the subdomain, which is interesting. Our suspicion is that ever since the 404-served-as-200 error that Google detected on Dec. 9, 2011, we have been on some sort of Google 'watch list', and any little mistake they find gets us penalized immediately.
The homepage description of our company is reused on industry directories we are listed in, so we should probably rewrite that description to be unique; adding more content to the homepage would also be a good thing and is certainly easy to do.
-
You have some significant duplicate content issues with www.ides.com. There is not a lot of text on your home page, and what is there is duplicated in many places across the web.
Your second-level pages are all just links. I would 'noindex,follow' these.
I looked at two inner pages:
http://plastics.ides.com/generics/6/alphamethylstyrene-ams - extremely thin content
Here is a Google search for text I copied from the styrene-acrylonitrile page. There are 247 pages that use this opening sentence.
My guess is that this is indeed a Panda issue, but please know that I've only taken a quick look, so I can't say for sure. What doesn't make sense is that your traffic drops don't fall on Panda dates, which they really should if Panda were the cause.
Panda definitely can affect just one part of a site (such as the root domain and not a subdomain). I would work on making these pages completely unique and also on noindexing the thin pages.
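Not part of Marie's original answer, but for anyone who wants to act on it, here is a minimal sketch of one way to flag thin pages before noindexing them. The single URL in the list is just the example page mentioned above, and the 150-word cutoff is an arbitrary assumption to tune for your own site.

```python
# Rough thin-content audit: fetch each candidate URL and count the
# visible words. Pages under the (assumed) threshold are flagged as
# candidates for a noindex,follow meta tag.
import requests
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, ignoring script and style blocks."""

    def __init__(self):
        super().__init__()
        self._skip = False
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.split())


candidate_urls = [
    "http://plastics.ides.com/generics/6/alphamethylstyrene-ams",  # example page from this thread
]
THIN_THRESHOLD = 150  # words; an assumption, not a value from the thread

for url in candidate_urls:
    resp = requests.get(url, timeout=10)
    extractor = TextExtractor()
    extractor.feed(resp.text)
    count = len(extractor.words)
    verdict = "THIN - consider noindex,follow" if count < THIN_THRESHOLD else "ok"
    print(f"{count:6d} words  {verdict}  {url}")
```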
-
Thank you Marie,
We 301 redirect any traffic going to root.com to www.root.com, and any content that we moved from www.root.com to subdomain.root.com has been completely removed from www.root.com. There doesn't appear to be any duplicate content between the two. There is some duplicate content that we handle with canonicals on subdomain.root.com - a very small portion of total pages (less than 1%).
As for your other questions: no warnings in WMT, the robots.txt file looks clean, the canonicals are in place correctly, and there is no accidental noindexing that we know of.
Here is the actual site that might help to look at:
http://www.ides.com
http://plastics.ides.com/materials
http://www.ides.com/robots.txt
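(Added for illustration, not part of the original reply.) A minimal sketch of how to confirm that the root.com to www.root.com 301 described above behaves as a single clean hop; root.com is the thread's placeholder, so substitute the real bare domain (e.g. ides.com) when running it.

```python
# Verify the non-www to www redirect: ideally a single 301 hop.
import requests

resp = requests.get("http://root.com/", allow_redirects=True, timeout=10)

for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("final:", resp.status_code, resp.url)

# What you want to see: exactly one 301 whose Location is the www
# version, then a 200 at the final URL. A 302, a multi-hop chain, or
# a 200 on the bare domain would all contradict the setup described.
```
-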
I think the answer here depends on whether or not you have actually been penalized and why the site is dropping out of the SERPs. Do you have a warning in WMT? If not, then you're probably not penalized.
It's unlikely to be Penguin because Penguin has not refreshed recently. Similarly, Panda did not refresh on the dates you mentioned. So it's not likely a penalty, but rather some type of site-structure issue.
Is there duplicate content between the subdomain and the root? If so, then Google will choose one as the owner and not show the other prominently. Any issues with robots.txt? Are the canonicals set correctly? Any chance of accidental noindexing?
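Not part of the original reply, but this checklist is easy to spot-check programmatically. A rough sketch, assuming the www.ides.com homepage from the thread as the page to inspect:

```python
# Check one URL against the list above: accidental noindex (meta tag
# or X-Robots-Tag header), the canonical target, and whether
# robots.txt is reachable at all.
import requests
from urllib.parse import urljoin
from html.parser import HTMLParser


class HeadInspector(HTMLParser):
    """Pulls the meta robots directive and rel=canonical href, if present."""

    def __init__(self):
        super().__init__()
        self.meta_robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.meta_robots = attrs.get("content")
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")


page = "http://www.ides.com/"  # example URL from this thread

resp = requests.get(page, timeout=10)
inspector = HeadInspector()
inspector.feed(resp.text)

robots = requests.get(urljoin(page, "/robots.txt"), timeout=10)

print("status:      ", resp.status_code)
print("meta robots: ", inspector.meta_robots)           # 'noindex' here would be the accidental kind
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
print("canonical:   ", inspector.canonical)             # should be the URL you actually want indexed
print("robots.txt:  ", robots.status_code)
```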
-
Subdomains and root domains are not necessarily always owned by the same person and therefore will not always be given the same penalties. As Scott mentioned, they are seen as different sites.
E.g., if I create a new WordPress account at me.wordpress.com and build a black-hat site there that gets penalized, this is not going to affect you.wordpress.com or www.wordpress.com.
-
Thank you all for your insight - good stuff, but still stumped.
Here's everything we know that might help point out why the main domain (i.e. www.root.com) was penalized by Google. We redirect root.com to www.root.com with a 301 redirect, and it is set up this way in Google Webmaster Tools too.
December 9, 2011 - the site's 404 error page was incorrectly set up to return a 200, resulting in a quick bloat of over 1 million indexed pages. The website dropped from Google immediately. The error page was fixed 2 days later. The site still appeared in Google's index via a site: query; however, it didn't reappear in Google's SERPs until May 2, 2012. (A simple automated probe like the sketch at the end of this post would catch this kind of misconfiguration early.)
October 25, 2012 - the website again drops from Google for an unknown reason. We then moved a significant portion of content from www.root.com to subdomain.root.com. Pages on subdomain.root.com began appearing within 3 days, ranking as high as they had previously on Google. From December 9, 2011 through this entire period we were correcting any errors reported in Google Webmaster Tools on a daily basis.
February 26, 2013 - the website is yet again dropped from Google, while subdomain.root.com continues to appear and rank well.
Because we moved most of the content from www.root.com to subdomain.root.com, the indexed page count for www.root.com dropped from 142,000 in October 2012 slowly down to an average of 21,400, and sits at 4,230 today. This count still fluctuates greatly every few days (probably because of the content move).
Of note, the site is NOT a content farm; it hosts legitimate, unique technical content for hundreds of clients.
Again any ideas are most welcome!
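The soft-404 problem from the December 9, 2011 entry (an error page answering 200) is the kind of thing a tiny automated check catches. A sketch, not from the thread, with the thread's placeholder domain standing in for the real site:

```python
# Soft-404 probe: request a URL that cannot exist and confirm the
# server answers 404 (or 410) rather than 200.
import uuid
import requests

probe = "http://www.root.com/this-should-not-exist-" + uuid.uuid4().hex
resp = requests.get(probe, allow_redirects=True, timeout=10)

if resp.status_code == 200:
    print("WARNING: nonexistent URL returned 200 - soft 404, junk pages can get indexed")
else:
    print("OK: nonexistent URL returned", resp.status_code)
```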
-
From my understanding, subdomains are considered completely separate from root domains unless you have a 301 redirect or canonical that tells search engines to treat the root and the subdomain as the same; for example, http://www.yourdomain.com (technically a subdomain) pointing to http://yourdomain.com.
Therefore, you could have a subdomain outrank a root domain, or, as in your case, a root domain penalized while the subdomain continues to rank well. The fact that they share an IP address shouldn't matter: penalties don't spread to every domain on an IP, since many websites sit on shared hosting that uses the same IP address.
-
This isn't necessarily surprising. Penalties and negative ranking algorithms can be applied at a page level, a subdomain level, a root domain level, etc.
For example, HubPages used subdomains to help escape from a Panda slap.
Another example: Google placed a manual penalty on a single page of BBC's website.
-
hmmm...
do they point to the same IP address?
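A trivial sketch of the check being suggested here; the hostnames are the thread's placeholders, so substitute the real ones.

```python
# Do the root domain and the subdomain resolve to the same IP address?
import socket

for host in ("www.root.com", "subdomain.root.com"):
    try:
        print(host, "->", socket.gethostbyname(host))
    except socket.gaierror as err:
        print(host, "-> lookup failed:", err)
```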
-
Related Questions
-
Forwarding a .org domain to a .com domain: any negative impact to consider?
Hello! I have a question I've been unable to find a clear answer to. My client's primary domain is a .com with a satisfactorily high DA. My client owns the .org version of its domain (which has a very low DA, I suppose due to inactivity) but has never forwarded it on. For branding/visibility/traffic reasons, I'd like to recommend they set up the .org domain to forward to the .com domain, but I wanted to ask a few questions first: 1. Does forwarding low-value DA domains to high-value DA domains have any negative authority/SEO impact? 2. If the .org domain was to be forwarded, am I correct that an SSL cert is not necessary for it if the .com domain has an SSL cert? Thanks in advance!
Technical SEO | | mollykathariner_ms1 -
Old domain to new domain
Hi, A website on server A is no longer required. The owner has redirected some URLs of this website (via a plugin) to his new website on server B - but not all URLs. So when I use the site: command on website A, I see a mixture of redirected and non-redirected URLs. Therefore two websites are still being indexed in some form and causing duplication. However, weirdly, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different from using the site: command? Anyway, how do I move to the new website completely without the old one being indexed anymore? I thought I knew this but have read so many blogs I've confused myself! Should I: redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed so a lot of URLs. What if I miss some? Or point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website rankings? Thanks for any help
Technical SEO | | AL123al0 -
Moving from a subdomain to subfolder
Hello, I am currently working on a site that is leveraging multiple subdomains. I wanted to ask whether it is recommended to migrate them into subfolders. One of the subdomains is a .shop and the other is location specific. Thanks, T
Technical SEO | | Tucker_100 -
403s vs 404s
Hey all, Recently launched a new site on S3, and old pages that I haven't been able to redirect yet are showing up as 403s instead of 404s. Is a 403 worse than a 404? They're both just basically dead-ends, right? (I have read the status code guides, yes.)
Technical SEO | | danny.wood1 -
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/"
My questions are: A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"? B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php..., right? C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case link to the actual file, "http://domain.com/products/index.php"? Are A) and B) the best practice? And C)? Thanks for all replies! 🙂 Holger
Technical SEO | | inlinear0 -
Internal search : rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking if they should be indexed or not - I know they should not, according to Google's guidelines - and they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics! I thought of blocking them in robots.txt. This solution would not keep them out of the index, and the pages appearing in Google SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose a bit of traffic too... The last idea I had was to use a rel=canonical tag pointing to the original search page (which is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (Never tried, so I'm not sure of this.) Of course I did some research on the subject, but each of my findings recommended one of the 3 methods only! One even recommended noindex plus a robots.txt block, which is pointless because the noindex would then never be seen... Is there somebody who can tell me which option is the best to keep this traffic? Thanks a million
Technical SEO | | JohannCR0 -
Mobile Domain Setup
Hi, If I want to serve only a subset of my desktop site's pages on my mobile site, or the content is significantly different (i.e. it is not one-to-one, or pages are a summarised version of the desktop), should I use m.site.com or is it still better to use site.com? Many thanks, any help appreciated.
Technical SEO | | MarkChambers0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages but that would require a lot more work.
Technical SEO | | fthead90