Using Subdomains to Avoid Sitewide Penalties?
-
If I'm using a subdomain on my website, for instance news.mywebsite.com, and it gets penalized by Panda (or whatever animal update), would that affect the main domain and/or other subdomains?
-
It depends, in part, on whether what you're seeing is a manual penalty or an algorithmic update that is causing your site to rank lower.
Oftentimes with Penguin updates, what looks like a penalty is actually just that many of the links pointing to a site have been devalued and are no longer passing juice. This can definitely affect your whole domain if enough inbound links to enough different pages on your site have been devalued, and subdomains won't be immune to the effect. However, as Michael points out below, losses from Penguin are usually more page/keyword specific, so you won't necessarily see your whole domain or all of your subdomains hit.
With a manual penalty, whether the subdomain is hit depends on whether it shares the low-quality signals that got the main domain hit. I certainly wouldn't recommend isolating black-hat practices on a subdomain, though; a penalty on a subdomain CAN affect other subdomains and the main domain, even if it doesn't always do so. It's something to keep an eye on.
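If you want a quick sense of whether a drop is confined to one subdomain or spread sitewide, one approach is to group your affected landing pages (or inbound-link targets) by host. A minimal sketch in Python — the URL list here is hypothetical; swap in whatever your analytics or backlink export gives you:

```python
from collections import Counter
from urllib.parse import urlsplit

# Hypothetical export: one affected URL per entry, e.g. pages that
# lost rankings or targets of devalued links.
urls = [
    "http://news.mywebsite.com/story-1",
    "http://news.mywebsite.com/story-2",
    "http://www.mywebsite.com/products",
]

# Count affected pages per host, so you can see whether the hit is
# concentrated on one subdomain or spread across the whole domain.
hits_per_host = Counter(urlsplit(u).hostname for u in urls)

for host, count in hits_per_host.most_common():
    print(host, count)
```

If one host dominates the counts, the damage is likely section-specific; an even spread across hosts points to a domain-wide issue.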
-
Panda is often a sitewide penalty, while Penguin is usually isolated to specific keywords, pages, etc.
Take a look at this article; it should give you some better insight into the updates and how, and in what capacity, they can affect your site.