Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Subdomain vs Main Domain Penalties
-
We have a client whose main root.com domain is currently penalized by Google, but subdomain.root.com is appearing very well. We're stumped - any ideas why?
-
Extremely helpful insight, Marie - I will be contacting you directly soon.
It appears that the duplicate content you've found (and other duplicate content we've found) is actually our content that other sites have repurposed. It seems Google has decided our site is the culprit, so this is an issue we need to address. The only thought that comes to mind right away is adding an 'Author' tag, and then starting what looks like a hefty cleanup project - something you are clearly an expert on, so we will most likely be working directly with you in the near future.
The 2nd-level pages that have little content and lots of links are set to 'noindex,follow', but I'm nervous that the number of these tags throughout our site could be seen as spammy by a search engine. Of note, the 2nd-level page section you found ranks quite well, which is interesting since it sits on the subdomain. Our suspicion is that ever since the 404 (200 success) error that Google detected on Dec. 9, 2011, we have been on some sort of Google 'watch list', and any little thing we do incorrectly gets us penalized as soon as they find it.
The homepage description of our company is reused on the industry directories we are listed in, so perhaps we should rewrite that description to be unique. Adding more content to the homepage would also be a good thing and is certainly easy to do.
-
You have some significant duplicate content issues with www.ides.com. There is not a lot of text on your home page and what is there is duplicated in many places across the web.
Your second level pages are all just links. I would noindex, follow these.
I looked at two inner pages:
http://plastics.ides.com/generics/6/alphamethylstyrene-ams - extremely thin content
Here is a Google search for text I copied from the styrene-acrylonitrile page. There are 247 pages that use this opening sentence.
My guess is that this is indeed a Panda issue. But please know that I've only taken a quick look, so I can't say for sure. What doesn't make sense is that your traffic drops don't fall on Panda dates, which they really should if Panda were the cause.
Panda definitely can affect just one part of a site (such as a root and not a subdomain). I would work on making these pages completely unique and also noindexing the thin pages.
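For reference, noindexing can be done in the page template with a robots meta tag, or server-side. A minimal sketch of the server-side route, assuming Apache with mod_headers in the vhost/server config - the /generics/ path pattern is only a placeholder for wherever the thin, link-only pages actually live:

<IfModule mod_headers.c>
    # Placeholder pattern - match the real URL paths of the thin pages.
    <LocationMatch "^/generics/">
        # Ask search engines not to index these pages but still follow their links.
        Header set X-Robots-Tag "noindex, follow"
    </LocationMatch>
</IfModule>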
-
Thank you Marie,
We 301 redirect any traffic going to root.com to www.root.com, and any content that we moved from www.root.com to subdomain.root.com has been completely removed from www.root.com. There doesn't appear to be any duplicate content between the two. There is some duplicate content on subdomain.root.com that we handle with canonicals - a very small portion of total pages (less than 1%).
As for your other questions: no warnings in WMT, the robots.txt file looks clean, the canonicals are in place correctly, and there is no accidental noindexing that we know of.
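(For reference, that host-level 301 is typically just a few lines of mod_rewrite - a hedged sketch, assuming Apache and an .htaccess at the web root, using the same placeholder hostnames as above:)

RewriteEngine On
# Send bare root.com requests to the www host, keeping the requested path.
RewriteCond %{HTTP_HOST} ^root\.com$ [NC]
RewriteRule ^(.*)$ http://www.root.com/$1 [R=301,L]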
Here is the actual site that might help to look at:
http://www.ides.com
http://plastics.ides.com/materials
http://www.ides.com/robots.txt
-
I think the answer here depends on whether or not you have actually been penalized and why the site is dropping out of the SERPs. Do you have a warning in WMT? If not, then you're probably not penalized.
It's unlikely to be Penguin, because Penguin has not refreshed recently. Similarly, Panda did not refresh on the dates you mentioned. So it's probably not a penalty but rather some type of site structure issue.
Is there duplicate content between the subdomain and the root? If so, then Google will choose one as the owner and not show the other prominently. Any issues with robots.txt? Are the canonicals set correctly? Any chance of accidental noindexing?
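One extra place worth checking while running through that list: canonicals (and robots directives) can also be sent as HTTP headers, where they are easy to overlook. A hedged sketch of what a header-based canonical looks like in Apache with mod_headers - the file name and URL here are hypothetical:

<Files "datasheet.pdf">
    # Canonical sent as an HTTP Link header rather than in the HTML.
    Header set Link '<http://www.root.com/datasheets/datasheet.pdf>; rel="canonical"'
</Files>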
-
Subdomains and root domains are not necessarily always owned by the same person and therefore will not always be given the same penalties. As Scott mentioned, they are seen as different sites.
e.g. If I create a new WordPress account and create me.wordpress.com and then build a black hat site which gets penalized, this is not going to affect you.wordpress.com or www.wordpress.com.
-
Thank you all for your insight - good stuff, but still stumped.
Here's everything we know that might help point out why the main domain (i.e. www.root.com) was penalized by Google. We redirect root.com to www.root.com with a 301 redirect, and it is set up this way in Google Webmaster Tools too.
December 9, 2011 - the site's 404 error page was incorrectly set up to return a 200, resulting in a quick bloat of over 1 million pages. The website dropped from Google immediately. The error page was set up correctly 2 days later, and the site still appeared in Google's index via a site: query. However, the site didn't reappear in Google's SERPs until May 2, 2012.
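As an aside, one common way a 404 page ends up answering 200 is the error-page configuration itself. Assuming an Apache setup (a hedged sketch; /error.html is a placeholder), pointing ErrorDocument at a full URL makes the server redirect to the error page instead of serving it with a 404 status:

# Soft-404: a full URL makes Apache issue a redirect, so the client sees 302/200.
#   ErrorDocument 404 http://www.root.com/error.html
# Correct: a local path serves the friendly page while keeping the 404 status.
ErrorDocument 404 /error.html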
October 25, 2012 - the website again drops from Google for an unknown reason. We then moved a significant portion of content from www.root.com to subdomain.root.com. Pages from subdomain.root.com began appearing within 3 days, ranking as high as they had previously. From December 9, 2011 onwards we have been correcting any errors reported in Google Webmaster Tools on a daily basis.
February 26, 2013 - the website is dropped from Google yet again, while subdomain.root.com continues to appear and rank well.
Because we moved most of the content from www.root.com to subdomain.root.com, the indexed page count for www.root.com has fallen since October 2012 from 142,000 to an average of 21,400, and sits at 4,230 today. However, this count fluctuates greatly every few days (probably due to the content move).
Of note, the site is NOT a content farm - it hosts legitimate, unique technical content for hundreds of clients.
Again any ideas are most welcome!
-
From my understanding, subdomains are considered completely separate from root domains unless you have a 301 redirect or canonical that tells search engines you want the root and the subdomain to be treated as the same; for example, http://www.yourdomain.com (subdomain) points to http://yourdomain.com.
Therefore, you could have a subdomain outrank a root domain, or, as in your case, a root domain penalized while the subdomain continues to rank well. The fact that they share an IP address shouldn't affect all the domains under that IP, since many websites sit on shared hosting that uses the same IP address.
-
This isn't necessarily surprising. Penalties and negative ranking algorithms can be applied at a page level, a subdomain level, a root domain level, etc.
For example, HubPages used subdomains to help escape from a Panda slap.
Another example: Google placed a manual penalty on a single page of BBC's website.
-
hmmm...
do they point to the same IP address?
Related Questions
-
Subdomain 403 error
Hi Everyone, A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else? I would love to hear your thoughts.
-
DNS vs IIS redirection
I'm working on a project where a site has gone through a rebrand and is therefore also moving to a new domain name. Some pages have been merged on the new site, so it's not a lift-and-shift job, and I'm writing up a redirect plan. Their IT dept has asked whether we want the redirects done via DNS or via IIS. Which one will allow us to redirect at a page level rather than a domain level? I think IIS may be the right route, but I would love your thoughts on this please.
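For what it's worth, DNS can only repoint an entire hostname; page-level mappings have to live in the web server, which is why IIS (for example its URL Rewrite module) is usually the route for a merged-page plan. Purely for illustration, here is the idea written in Apache mod_alias syntax with made-up paths and hostnames - a hedged sketch, since a real IIS site would express the same old-to-new mapping in its own rewrite rules:

# Hypothetical page-level 301s, shown in Apache syntax for illustration only.
Redirect 301 /old-services http://www.newbrand.example/what-we-do
Redirect 301 /old-about    http://www.newbrand.example/about-us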
-
Rel canonical between mirrored domains
Hi all & happy new year! I'm new to SEO and could do with a spot of advice. I have a site that has several domains that mirror it (not good, I know...), so www.site.com, www.site.edu.sg and www.othersite.com all serve up the same content. I was planning to use rel="canonical" to avoid the duplication, but I have a concern: currently several of these mirrors rank - the .com ranks #1 on local Google search for some useful keywords, and the .edu.sg also shows up at #9 for a different page. In some cases I have multiple mirrors showing up on a single SERP. I would LIKE to rel canonical everything to the local .edu.sg domain, since this best reflects the fact that the site is for a school in Singapore, but:
-The .com is listed in DMOZ (this used to be important) and none of the volunteers there ever responded to requests to update it to the .edu.sg.
-The .com ranks higher than the com.sg page for non-local search, so I am guessing Google has some kind of algorithm to mark down obviously local domains in other geographic locations.
Any opinions on this? Should I rel canonical the .com to the .edu.sg, or vice versa? I appreciate any advice or opinion before I pull the trigger and end up shooting myself in the foot! Best regards from Singapore!
-
403s vs 404s
Hey all, Recently launched a new site on S3, and old pages that I haven't been able to redirect yet are showing up as 403s instead of 404s. Is a 403 worse than a 404? They're both just basically dead-ends, right? (I have read the status code guides, yes.)
-
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey Moz, A new client has a site that uses subdomains ("third-level" stuff like location.business.com) and "fourth-level" subdomains (location.parent.business.com). Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com except in the External tab - but maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain that were indexed in Google. We have now disabled those URLs (they return a 404 - for other reasons we cannot redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot set up Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
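If www and www1 share the same document root, one common way to give the www1 host its own blocking robots.txt is a host-conditional rewrite. A hedged sketch, assuming Apache with mod_rewrite in the shared .htaccess - robots-www1.txt is a hypothetical file that would hold the Disallow rules:

RewriteEngine On
# Serve a separate robots.txt only when the request arrives on the www1 host.
RewriteCond %{HTTP_HOST} ^www1\. [NC]
RewriteRule ^robots\.txt$ /robots-www1.txt [L]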
-
Correct linking to the /index of a site and subfolders: what's the best practice? link to: domain.com/ or domain.com/index.html ?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder such as "http://domain.com/products/index.php", I should likewise link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?
Are A) and B) best practice? And what about C)? Thanks for all replies! 🙂
Holger
-
Multiple Domains, Same IP address, redirecting to preferred domain (301) - site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as the "preferred domain" for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain.
We added some additional redirects on Apache to send the individual pages indexed under the wrong domains to the same pages under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes to the external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on a result, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301, but for some reason Google continues to crawl our site and index it under these incorrect domains.
Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the number of pages indexed under the wrong domains, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old version of these sites; they only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated, because we have tried many things without success to get this resolved.
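One server-side pattern sometimes used to make this airtight on a single IP: give the preferred domain its own virtual host and let a catch-all vhost 301 every other hostname to it, so no legacy domain can ever serve the content directly. A hedged sketch in Apache terms, assuming name-based virtual hosts (hostnames and paths are placeholders):

<VirtualHost *:80>
    ServerName www.preferred-domain.example
    DocumentRoot /var/www/site
</VirtualHost>

<VirtualHost *:80>
    # Catch-all for the legacy domains; listed after the preferred vhost
    # so the preferred hostname always matches its own block first.
    ServerName legacy-catchall.example
    ServerAlias *
    # mod_alias: the matched "/" prefix is replaced and the rest of the path kept.
    Redirect permanent / http://www.preferred-domain.example/
</VirtualHost>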