Rel canonical and duplicate subdomains
-
Hi,
I'm working with a site that has multiple subdomains of entirely duplicate content. The production site that visitors see is (a made-up illustrative example):
Then there are subdomains that different developers use to work on their own changes to the production site before those changes are pushed live:
Google ends up indexing these duplicate subdomains, which is of course not good.
If we add a canonical tag to the head section of the production page (and therefore to all of the duplicate subdomains), will that cause some kind of problem? Is it okay to have a canonical tag on a page pointing to that same page?
To complete the example...
In this example, where our production page is 123abc456.edu, our canonical tag on all pages (the production page and therefore the duplicate subdomains) would be:
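(The tag itself was stripped from the post; for illustration only, a self-referencing canonical on the production home page of this made-up domain would look something like the following, with the href being whatever canonical URL the page should resolve to:)

```html
<!-- Illustrative only: self-referencing canonical on the production page,
     using the made-up domain from this example -->
<link rel="canonical" href="http://123abc456.edu/" />
```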
Is that going to be okay and fix this without causing some new problem of a canonical tag pointing to the page it's on?
Thanks!
-
Hi Bob,
That's an excellent question that I'll have to look into and confirm. More later. Thanks!
-
Is the subdomain data stored on the server as directories?
So for example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe?
If so, you can simply have one robots.txt on your root domain blocking those directories, e.g.:
User-agent: *
Disallow: /Moe/
-
Well, Bob, it looks like you're right! I guess it will see all the pages in
as the ones to remove, and not
Also, how do we keep that robots.txt from getting pushed to production when the developer working on that branch finishes and pushes the work live?
I must confess, it still feels a little like bomb disposal.
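One common way to sidestep that deployment worry is to not keep a dev-specific robots.txt in the codebase at all, and instead have the web server answer robots.txt requests differently on dev hostnames. A sketch for Apache, where the dev subdomain names and the robots-dev.txt filename are hypothetical:

```apacheconf
# Sketch only: serve a crawler-blocking robots file on dev subdomains.
# "moe", "larry", "curly" and robots-dev.txt are hypothetical names.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(moe|larry|curly)\.123abc456\.edu$ [NC]
RewriteRule ^robots\.txt$ /robots-dev.txt [L]
```

With something like this in place, production keeps its normal robots.txt and nothing dev-specific ever needs to be pushed or un-pushed.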
-
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
Hi Bob,
Thanks for the suggestion/question. I'm thinking about that, but wouldn't putting a robots "do not crawl" directive on pages that are already indexed be a little like closing the barn door after the horses have left? Do you think it would de-index the already-crawled subdomains? Thanks!
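(A caveat worth noting here: robots.txt only blocks crawling, it doesn't remove pages from the index. Pages that are already indexed generally need a noindex directive, which crawlers must remain able to fetch in order to see, or a removal request. Something like this in the head of every dev page:)

```html
<!-- Ask search engines to drop the page from the index.
     Crawling must stay allowed so this tag can actually be seen. -->
<meta name="robots" content="noindex, nofollow" />
```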
-
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
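Blocking a whole development subdomain means serving a robots.txt at that subdomain's own root (a robots.txt on the root domain does not apply to subdomains). A minimal example, served at e.g. Moe.123abc456.edu/robots.txt:

```text
# robots.txt at the root of each development subdomain
# (each subdomain needs its own copy)
User-agent: *
Disallow: /
```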