Subdomain vs Main Domain Penalties
-
We have a client whose main root.com domain is currently penalized by Google, but subdomain.root.com is ranking very well. We're stumped - any ideas why?
-
Extremely helpful insight, Marie - I will be contacting you directly soon.
It appears that the duplicate content you've found (and other duplicate content we've found) is actually our own content that other sites have repurposed. Google seems to have decided that our site is the culprit, so this is an issue we need to address. The only thought that comes to mind right away is adding an 'Author' tag and then starting what looks like a hefty cleanup project - something you are clearly an expert in, which is why we will most likely be working directly with you in the near future!
The 2nd-level pages that have little content and lots of links are set to 'noindex,follow', but I'm nervous that the sheer number of these tags across our site could look spammy to a search engine. Of note, the 2nd-level section you found ranks quite well because it sits on the subdomain, which is interesting. Our suspicion is that ever since the error Google detected on December 9, 2011 - our 404 page returning a 200 status - we have been on some sort of Google 'watch list', and any little thing we do incorrectly gets us penalized as soon as Google finds it.
The homepage description of our company is reused on the industry directories we're listed in, so perhaps we need to rewrite it to be unique. Adding more content to the homepage would also be a good thing and is certainly easily doable.
-
You have some significant duplicate content issues with www.ides.com. There is not a lot of text on your home page, and what is there is duplicated in many places across the web.
Your second-level pages are all just links. I would noindex, follow these.
I looked at two inner pages:
http://plastics.ides.com/generics/6/alphamethylstyrene-ams - extremely thin content
Here is a Google search for text I copied from the styrene-acrylonitrile page. There are 247 pages that use this opening sentence.
My guess is that this is indeed a Panda issue, but please know that I've only taken a quick look, so I can't say for sure. What doesn't make sense is that your traffic drops don't fall on Panda dates, which they really should if Panda were the cause.
Panda can definitely affect just one part of a site (such as the root domain and not a subdomain). I would work on making these pages completely unique and also noindexing the thin pages.
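If it helps to quantify that before the cleanup, here is a rough sketch of how you might flag the thin pages programmatically. This is just an illustrative Python snippet, not part of any tool mentioned here - the URL list (the two pages from this thread) and the word-count threshold are placeholders to tune for your own site.

```python
# Rough sketch: flag thin pages by approximate visible word count.
# Requires: pip install requests
import re
import requests

# Placeholder URLs - swap in an export of your second-level and datasheet pages.
urls = [
    "http://plastics.ides.com/generics/6/alphamethylstyrene-ams",
    "http://plastics.ides.com/materials",
]

THIN_WORD_COUNT = 200  # arbitrary threshold, adjust to taste

for url in urls:
    resp = requests.get(url, timeout=10)
    # Drop scripts/styles, then strip remaining tags to approximate visible text.
    text = re.sub(r"(?is)<(script|style).*?</\1>", " ", resp.text)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    words = len(text.split())
    flag = "THIN" if words < THIN_WORD_COUNT else "ok  "
    print(f"{flag} ({words} words): {url}")
```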
-
Thank you Marie,
We 301 redirect any traffic going to root.com to www.root.com, and any content that we moved from www.root.com to subdomain.root.com has been completely removed from www.root.com, so there doesn't appear to be any duplicate content between the two. There is some duplicate content that we handle with canonicals on subdomain.root.com, but it's a very small portion of total pages (less than 1%).
As for your other questions: there are no warnings in WMT, the robots.txt file looks clean, the canonicals are in place correctly, and there is no accidental noindexing that we know of.
Here is the actual site, in case it helps to take a look:
http://www.ides.com
http://plastics.ides.com/materials
http://www.ides.com/robots.txt
-
I think the answer here depends on whether you have actually been penalized and why the site is dropping out of the SERPs. Do you have a warning in WMT? If not, then you're probably not penalized.
It's unlikely to be Penguin, because Penguin hasn't refreshed recently. Similarly, Panda did not refresh on the dates you mentioned. So it's probably not a penalty but rather some type of site structure issue.
Is there duplicate content between the subdomain and the root? If so, then Google will choose one as the owner and not show the other prominently. Any issues with robots.txt? Are the canonicals set correctly? Any chance of accidental noindexing?
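If it helps to rule those last three out quickly, here is a minimal sketch (standard library plus requests; the page URL is a placeholder) that checks a URL against robots.txt, looks for a meta robots tag, and prints whatever canonical the page declares. The tag matching is deliberately naive - it's a spot check, not an audit.

```python
# Spot check: robots.txt, meta robots noindex, and declared canonical.
# Requires: pip install requests
import re
import requests
from urllib import robotparser

page = "http://www.root.com/some-page/"  # placeholder - use a page that dropped

# 1. Is the page blocked by robots.txt?
rp = robotparser.RobotFileParser()
rp.set_url("http://www.root.com/robots.txt")
rp.read()
print("Allowed by robots.txt:", rp.can_fetch("Googlebot", page))

# 2. Does the page carry a robots meta tag, and which canonical does it declare?
html = requests.get(page, timeout=10).text
robots_meta = re.search(r'(?i)<meta[^>]+name=["\']robots["\'][^>]*>', html)
canonical = re.search(r'(?i)<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html)

print("Meta robots tag:", robots_meta.group(0) if robots_meta else "none")
print("Canonical URL:  ", canonical.group(1) if canonical else "none declared")
```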
-
Subdomains and root domains are not necessarily owned by the same person, and therefore will not always be given the same penalties. As Scott mentioned, they are seen as different sites.
E.g., if I create a new WordPress account, build a black-hat site at me.wordpress.com, and get it penalized, that is not going to affect you.wordpress.com or www.wordpress.com.
-
Thank you all for your insight - good stuff, but still stumped.
Here's everything we know that might help explain why the main domain (i.e. www.root.com) was penalized by Google. We redirect root.com to www.root.com with a 301, and it is set up this way in Google Webmaster Tools as well.
December 9, 2011 - the site's 404 error page was incorrectly set up to return a 200, resulting in a quick bloat of over 1 million indexed pages. The website dropped from Google immediately. The error page was fixed two days later, and the site still appeared in Google's index via a site: query, but it didn't reappear in Google's SERPs until May 2, 2012.
October 25, 2012 - the website dropped from Google again for an unknown reason. We then moved a significant portion of content from www.root.com to subdomain.root.com, and within 3 days pages on subdomain.root.com were appearing as high as they had ranked previously. From December 9, 2011 through this entire period we were correcting any errors reported in Google Webmaster Tools on a daily basis.
February 26, 2013 - the website was dropped from Google yet again, while subdomain.root.com continues to appear and rank well.
Because most of the content was moved from www.root.com to subdomain.root.com, the indexed page count for www.root.com has fallen from 142,000 in October 2012 to an average of around 21,400, and sits at 4,230 today. The count still fluctuates greatly every few days (probably due to the content move).
Of note, the site is NOT a content farm - it hosts legitimate, unique technical content for hundreds of clients.
Again, any ideas are most welcome!
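For completeness, here's how we can keep double-checking that the December 2011 fix is still holding: a made-up URL should now return a real 404 rather than a 200 "soft 404". This is just a rough sketch with a placeholder path.

```python
# Confirm missing pages return a genuine 404, not a 200 "soft 404".
# Requires: pip install requests
import requests

url = "http://www.root.com/this-page-should-not-exist-12345/"  # placeholder path

resp = requests.get(url, timeout=10)
if resp.status_code == 404:
    print("OK: missing pages return 404")
else:
    print(f"WARNING: missing page returned {resp.status_code} - "
          "Google can treat these as soft 404s and keep crawling/indexing them")
```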
-
From my understanding, subdomains are considered completely separate from root domains unless you have a 301 redirect or canonical that tells search engines to treat the root and the subdomain as the same - for example, http://www.yourdomain.com (a subdomain) pointing to http://yourdomain.com.
Therefore, you could have a subdomain outrank a root domain or, in your case, a root domain penalized while the subdomain continues to rank well. Sharing an IP address shouldn't cause a penalty to spread to all the domains on that IP, since many websites sit on shared hosting and use the same IP address.
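If you want to see exactly how each hostname is answering, a quick sketch like this prints the raw status code and any redirect target. The domains below are placeholders standing in for the root, www and subdomain in question.

```python
# Inspect how each hostname answers: status code plus any redirect target.
# Requires: pip install requests
import requests

# Placeholder hosts - substitute your own root, www and subdomain.
hosts = [
    "http://root.com/",
    "http://www.root.com/",
    "http://subdomain.root.com/",
]

for url in hosts:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    location = resp.headers.get("Location", "-")
    print(f"{url}  ->  {resp.status_code}  Location: {location}")
```

A clean 301 from the bare root to www (or vice versa) is what you'd expect to see if the two are meant to be treated as one site.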
-
This isn't necessarily surprising. Penalties and negative ranking algorithms can be applied at the page level, the subdomain level, the root domain level, and so on.
For example, HubPages used subdomains to help escape from a Panda slap.
Another example: Google placed a manual penalty on a single page of the BBC's website.
-
hmmm...
do they point to the same IP address?
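(For what it's worth, you can answer that in a couple of lines - the hostnames below are placeholders.)

```python
# Do the root domain and the subdomain resolve to the same IP address?
import socket

hosts = ["www.root.com", "subdomain.root.com"]  # placeholders
ips = {host: socket.gethostbyname(host) for host in hosts}
print(ips)
print("Same IP:", len(set(ips.values())) == 1)
```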