Can I robots.txt an entire site to get rid of duplicate content?
-
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content, serving two separate user groups for the same product (B2B and B2C). Zendesk does not give me the option to change canonicals (or meta tags). If I block one of the Zendesk sites with robots.txt, will that cover me for duplicate content with Google? Is that a good option, or is there a better one?
I will also have to change some of the canonicals on my site (mysite.com) to point to the Zendesk URLs (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)?
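For reference, the kind of cross-domain canonical I mean would look something like this in the head of a mysite.com page (the URLs below are only placeholders, not my real article paths):
    <link rel="canonical" href="https://zendesk.mysite.com/hc/en-us/articles/12345-example-article" />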
Thank you.
-
Just disallow it in robots.txt. No need to do anything else.
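As a minimal sketch (assuming Zendesk lets you serve a custom robots.txt on that subdomain), blocking the whole site from all crawlers would just be:
    User-agent: *
    Disallow: /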
-Andy
-
What if the site is not live yet?
-
Hi,
What you need to do is first noindex the site in question, and then, after a period of time, disallow it via robots.txt.
The reason you do it in this order is that you will currently have pages from this site indexed in Google, and these need to be removed first. You can either add a noindex meta tag and wait for Google to recrawl the site and process all of the noindex directives, or, to speed things up, noindex the pages and then request their removal in Webmaster Tools.
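The noindex in question is just a meta tag in the head of each page - something like the line below, assuming Zendesk's theme templates let you add it (that may depend on your plan):
    <meta name="robots" content="noindex">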
If you skip this step, you are just blocking Google from ever crawling the site again, so any pages that are already indexed will probably remain in the index - which you don't want, as that is still duplicate content.
-Andy
-
Thank you. I do mean use robots.txt to block crawlers.
-
Hi there!
Just for clarification, I'm really not sure what you mean by "robots.txt-ing" the site. Do you mean using robots.txt to block crawlers from accessing the entire site? That would be fine, if you're not concerned about that site ever ranking.