Subdomain vs Main Domain Penalties
-
We have a client whose main root.com domain is currently penalized by Google, but subdomain.root.com is ranking very well. We're stumped - any ideas why?
-
Extremely helpful insight Marie - I will be contacting you directly soon.
It appears that the duplicate content you've found (and other duplicate content we've found) is actually our content that other sites have repurposed. It seems Google has determined our site is the culprit, so this is an issue we need to address. The first thought that comes to mind is adding an 'Author' tag, then starting what looks like a hefty cleanup project - something you appear to be an expert in, so we will most likely be working directly with you in the near future!
The second-level pages that have little content and lots of links are set to 'noindex,follow', but I'm nervous that the sheer number of these tags across our site could look spammy to a search engine. Of note, the second-level page section you found ranks quite well since it is on a subdomain, which is interesting. Our suspicion is that ever since the error Google detected on Dec. 9, 2011 (our 404 page returning a 200 status), we have been on some sort of Google 'watch list', and any little thing we do incorrectly gets us penalized immediately.
The homepage description of our company is reused on the industry directories we are listed in, so perhaps we should rewrite the description to be unique. Adding more content to the homepage would also be a good thing and is certainly easily doable.
-
You have some significant duplicate content issues with www.ides.com. There is not a lot of text on your home page and what is there is duplicated in many places across the web.
Your second level pages are all just links. I would noindex, follow these.
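For anyone following along, the noindex,follow recommendation is just a single meta tag in each page's head - a generic sketch, not the site's actual markup:

```html
<!-- Placed in the <head> of each thin, link-only second-level page. -->
<!-- "noindex" keeps the page out of Google's index; -->
<!-- "follow" still lets crawlers follow (and pass equity through) its links. -->
<meta name="robots" content="noindex, follow">
```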
I looked at two inner pages:
http://plastics.ides.com/generics/6/alphamethylstyrene-ams - extremely thin content
Here is a Google search for text I copied from the styrene-acrylonitrile page. There are 247 pages that use this opening sentence.
My guess is that this is indeed a Panda issue. But please know that I've only taken a quick look, so I can't say for sure. What doesn't make sense is that your traffic drops don't happen on Panda dates, which they really should if this were Panda.
Panda definitely can affect just one part of a site (such as a root and not a subdomain). I would work on making these pages completely unique and also noindexing the thin pages.
-
Thank you Marie,
We 301 redirect any traffic going to root.com to www.root.com, and any content we moved from www.root.com to subdomain.root.com has been completely removed from www.root.com. There doesn't appear to be any duplicate content between the two. There is some duplicate content on subdomain.root.com that we handle with canonicals - a very small portion of total pages (less than 1%).
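For reference, the bare-domain-to-www 301 we use is the standard rewrite rule - this is a sketch assuming Apache with mod_rewrite and the placeholder root.com domain, so the actual config may differ:

```apache
# .htaccess sketch: 301 the bare domain to www (hypothetical domain names)
RewriteEngine On
# Match requests whose Host header is exactly root.com (case-insensitive)
RewriteCond %{HTTP_HOST} ^root\.com$ [NC]
# Permanently redirect to the www version, preserving the path
RewriteRule ^(.*)$ http://www.root.com/$1 [R=301,L]
```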
As for your other questions: no warnings in WMT, the robots.txt file looks clean, canonicals are in place correctly, and no accidental noindexing that we know of.
Here is the actual site that might help to look at:
http://www.ides.com
http://plastics.ides.com/materials
http://www.ides.com/robots.txt
-
I think the answer here depends on whether or not you have actually been penalized and why the site is dropping out of the SERPs. Do you have a warning in WMT? If not, then you're probably not penalized.
It's unlikely to be Penguin because Penguin has not refreshed recently. Similarly, Panda did not refresh on the dates you mentioned. So it's likely not a penalty but rather some type of site structure issue.
Is there duplicate content between the subdomain and the root? If so, then Google will choose one as the owner and not show the other prominently. Any issues with robots.txt? Are the canonicals set correctly? Any chance of accidental noindexing?
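On the canonicals point, a correctly set canonical is just a link element in the head of each duplicate or variant page pointing at the preferred URL - a generic example with placeholder URLs:

```html
<!-- In the <head> of the duplicate/variant page. -->
<!-- Tells Google which URL to treat as the "owner" of this content. -->
<link rel="canonical" href="http://www.root.com/preferred-page/">
```

If the canonical accidentally points at the subdomain copy (or vice versa), that alone would explain one version winning and the other disappearing.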
-
Subdomains and root domains are not necessarily always owned by the same person and therefore will not always be given the same penalties. As Scott mentioned, they are seen as different sites.
e.g. If I create a new WordPress account and create me.wordpress.com and then build a black hat site which gets penalized, this is not going to affect you.wordpress.com or www.wordpress.com.
-
Thank you all for your insight - good stuff, but still stumped.
Here's everything we know that might help explain why the main domain (i.e. www.root.com) was penalized by Google. We redirect root.com to www.root.com with a 301, and it is set up this way in Google Webmaster Tools too.
December 9, 2011 - the site's 404 error page was incorrectly set up to return a 200 (success) status, resulting in a quick bloat of over 1 million indexed pages. The website dropped from Google immediately. The error page was fixed two days later. The site still appeared in Google's index via a site: query, but it didn't reappear in Google's SERPs until May 2, 2012.
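As an aside for others hitting the same soft-404 problem: making the error page return a true 404 status is usually a one-line server fix rather than a page-content fix. A hedged sketch, assuming Apache (other servers differ):

```apache
# Apache: serve a custom error page WITH a real 404 status code.
# Pointing ErrorDocument at a LOCAL path preserves the 404 status.
# Pointing it at a full URL (http://...) makes Apache redirect,
# which is exactly the 404-returns-200 mistake described above.
ErrorDocument 404 /errors/not-found.html
```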
October 25, 2012 - the website again drops from Google for an unknown reason. We then moved a significant portion of content from www.root.com to subdomain.root.com. Pages on subdomain.root.com began appearing within three days, ranking as high as they had previously. From December 9, 2011 onward we were correcting any errors reported in Google Webmaster Tools on a daily basis.
February 26, 2013 - the website is dropped from Google yet again, while subdomain.root.com continues to appear and rank well.
Due to moving most of the content from www.root.com to subdomain.root.com, the indexed page count for www.root.com has dropped from 142,000 in October 2012 to an average of 21,400, and sits at 4,230 today. However, this count fluctuates greatly every few days (probably because of the content move).
Of note, the site is NOT a content farm - it hosts legitimate, unique technical content for hundreds of clients.
Again any ideas are most welcome!
-
From my understanding, subdomains are considered completely separate from root domains unless you have a 301 redirect or canonical that tells search engines to treat them as the same site - for example, when http://www.yourdomain.com (technically a subdomain) points to http://yourdomain.com.
Therefore, a subdomain could outrank a root domain, or, as in your case, a root domain could be penalized while the subdomain continues to rank well. The fact that they share an IP address shouldn't matter - many websites on shared hosting use the same IP address without affecting each other's rankings.
-
This isn't necessarily surprising. Penalties and negative ranking algorithms can be applied at a page level, a subdomain level, a root domain level, etc.
For example, HubPages used subdomains to help escape from a Panda slap.
Another example: Google placed a manual penalty on a single page of BBC's website.
-
hmmm...
do they point to the same IP address?