Can I robots.txt an entire site to get rid of Duplicate content?
-
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups for the same product (B2B and B2C). Zendesk does not let me change canonicals or meta tags. If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option? Is there a better one?
I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by pointing the established page canonicals on my site to the new subdomain (the only option Zendesk offers)?
Thank you.
-
Just disallow in Robots. No need to do anything else.
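For reference, blocking an entire site is a one-file change. A minimal robots.txt, served at the root of the subdomain you want hidden (e.g. zendesk.mysite.com/robots.txt, using the hostname from the question), would be:

```text
# Block all crawlers from every URL on this host
User-agent: *
Disallow: /
```

Note that robots.txt applies per hostname, so this file only affects the subdomain it is served from, not mysite.com.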
-Andy
-
What if the site is not live yet?
-
Hi,
What you need to do is first noindex the site in question and then after a period of time, you can disallow it via the robots.txt.
The reason you do it this way is that right now you will have pages from this site indexed in Google, and these need to be removed first. You can either do this with the noindex META tag and wait for Google to spider the site and action all of the noindex requests, or, to speed things up, noindex the pages and then remove them with Webmaster Tools.
If you skip this step and go straight to robots.txt, you are just blocking Google from ever seeing the site again, so you will probably find that the already-indexed pages remain in the index - which you don't want, as that is the duplicate content you're trying to avoid.
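As a sketch of the noindex step described above (assuming your platform lets you inject markup into the page head):

```html
<!-- Place in the <head> of every page to be dropped from the index -->
<meta name="robots" content="noindex">
```

The ordering matters: leave the site crawlable until Google has recrawled the pages and processed the noindex, and only then add the robots.txt Disallow. If you block crawling first, Google can never see the noindex tag.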
-Andy
-
Thank you. I do mean use robots.txt to block crawlers.
-
Hi there!
Just for clarification, I'm really not sure what you mean by "robots.txt-ing" the site. Do you mean, should you use robots.txt to block crawlers from accessing the entire site? That would be fine, if you're not concerned about that site never ranking, ever.