Are subdomains considered completely different from the root domain?
-
We have a project that is going to generate duplicate content. If we move the new content to a subdomain (e.g., product.domain.com), will it still be considered duplicate content against the root domain? Or is it like having two completely different domains?
Thanks!
-
Cyrus is right!
-
Unfortunately, it doesn't work this way. See my comment above.
-
In the case of duplicate content, you always have to tell the search engines which is the canonical (the authoritative version). It doesn't matter if they are on the same website, different sub-domains, or completely different websites.
If you leave it up to Google to choose, often the ranking power of both pages is diminished, and if you have too much duplicate content, your entire site can suffer.
Either choose the page you want to rank, and send the appropriate signals to search engines, or create different content for different pages.
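As a minimal sketch of the first option, a rel=canonical tag on the subdomain copy could look like this (the URLs here are hypothetical placeholders, not from the original question):

```html
<!-- On the duplicate page, e.g. product.domain.com/riding-mower-x -->
<!-- Placed inside the <head> element -->
<link rel="canonical" href="https://www.domain.com/riding-mower-x" />
```

This tells search engines that the www page is the authoritative version, so ranking signals should consolidate there instead of being split between the two copies.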
-
Yes, they do.
Just so you know, when you add a rel=canonical, the duplicate pages will eventually drop from the SERPs (assuming Google accepts the rel=canonical; it doesn't always).
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
-
Thanks Saijo,
That is exactly what we are going to do. We are going to use the canonical tag on the new duplicate pages referencing the existing pages that have achieved rank. Quick question on that, do you know if "juice" is transferred through a canonical tag? The new pages will be getting some pretty great links but not enough to rank highly. Will this juice be passed to the existing pages through the canonical tag?
-
There is not enough information here to give you the right answer, but here are my two cents on the topic.
Say you need these duplicate pages to exist for some reason and do NOT want them to be indexed. You can noindex them and keep them on the subdomain. They won't appear in the SERPs and aren't duplicate content, because Google ignores them. (Think of it this way: you are telling Google, "these are duplicate pages and they're not meant for the search results.")
Say you need these duplicate pages to be indexed. Then you use rel=canonical on the duplicate pages and tell Google which main page you want indexed. (Think of it this way: you are telling Google, "these are duplicate pages and you want the importance to go to the main one.") FYI: over time this means the other pages will fade away in the SERPs.
What you don't want to do: put duplicate pages up on a subdomain and make no effort to tell the bots they are duplicates. They will figure it out, and that is when they don't like it.
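As a concrete sketch, the two options above would each be one line in the duplicate page's `<head>` (the URL is a hypothetical placeholder):

```html
<!-- Option 1: keep the duplicate out of the index entirely -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: let it be indexed but point ranking signals at the main page -->
<link rel="canonical" href="https://www.example.com/main-page" />
```

Use one or the other on a given duplicate page, not both; combining noindex with a canonical sends mixed signals that Google may not honor.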
Hope that clears up your doubts. If you need a more specific answer, we will need more info on what exactly you are trying to achieve.
-
That's perfect. Thank you.
The only issue I have is that in this scenario I am going to have to use close to duplicate title tags, H1s, and URLs, since each page is going after the exact same keyword. But if you are saying that Google has no problem ranking these two pages because one of them is on a subdomain, then that solves at least one problem.
-
Google treats subdomains as a different site in a sense, and hardly ever passes any benefit from the root domain to the subdomain. Usually your root domain will naturally rank higher than the subdomain, unless your subdomain has better on-site/off-site SEO.
If both pages were equal, Google might show both, but it would still have to choose which one receives the better ranking.
-
Do not create exact copies of the site.
Google will not choose; it will just "punish" your site and drop it.
If you must create a copy of the site, make it unique.
This is very important.
-
I guess I should explain a little more. I realize that it is going to be duplicate content, but would Google have to choose between the two pages as to which one to show, or would the subdomain page be allowed to rank as well as the root-domain page? All things being equal, of course.
E.g., a tractor reseller sells John Deere tractors. They have since signed a white-label deal to be the official reseller of John Deere tractors. They have a page for riding model X at www.example.com/riding-mower-x that ranks very well. But they also have a new white-label page now for the exact same mower. They want both pages to rank and not force Google to choose which page to show. Would Google look at the subdomain page at ridingmowerx.example.com as a completely different site, so that it would not have to choose and could rank both pages?
-
Short answer: yes. As long as the content is available for crawling, it will be duplicate content.
Please elaborate on the goal of this duplication so I can try to tackle the problem.
-
It is duplicate content either way, anywhere on the web. For example, a blog posts an article, and I take that article and post it to my blog; that is considered duplicate content. Use this post by Rand to help determine when to use subdomains!