Solving duplicate content issues with robots.txt
-
Hi,
I have a primary website, and alongside it several secondary websites that carry the same content as the primary site. This leads to duplicate content errors. Because so many URLs have duplicated content, I want to use the robots.txt file to prevent Google from indexing the secondary websites and so fix the duplicate content issue. Is that OK?
Thanks for any help!
-
Yes, I see robots.txt is the wrong way to go; I will try the canonical tag instead. Thanks for your help!
-
Using robots.txt is probably not the best way of doing it; a canonical tag or a noindex meta tag would likely be best. I think the reasons are summed up well in this article, which explains, probably better than I could, why robots.txt is a poor way of dealing with duplicate content. Hope this helps.
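For reference, the noindex option is a single meta tag in the head of each page you want kept out of the index (a minimal sketch; the page must remain crawlable, i.e. not blocked in robots.txt, or Google will never see the tag):

<meta name="robots" content="noindex, follow">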
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is using the robots.txt file OK or not?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical versions from your secondary websites, thereby eliminating the duplication?
For example, on each duplicated page of a secondary website you would add the following to the head to reference the corresponding primary page:
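(A sketch with a placeholder URL; the href should point at the matching page on your primary site:)

<link rel="canonical" href="https://www.your-primary-site.com/matching-page/" />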
Related Questions
-
Duplicate Content - Pricing Plan tables
Hey guys, we're faced with a problem that we want to solve. We're working on the designs for a few pages for a drag & drop email builder, and we will have the same pricing table on several of those pages (much like Moz does). We're worried that Google will treat this as duplicate content and not be very fond of it. Any ideas on how we could keep the same flow without potentially harming ranking efforts? And no, rewriting the content for each table is not an option; it would do nothing but confuse the heck out of our clients. 😄 Thanks, everybody!
On-Page Optimization | andy.bigbangthemes
-
Googlebot indexing URLs with ? queries in them. Is this Panda duplicate content?
I feel like I'm being hurt by Panda because of duplicate content: I have seen Googlebot indexing hundreds of URLs on my site with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off, but Googlebot is still indexing them hours later. I'm at a loss what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and one dropped over 100 positions.
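For reference, a wildcard rule in robots.txt can stop crawlers from fetching those parameter URLs (a sketch; Googlebot supports the * wildcard, though already-indexed URLs can take a while to drop out):

User-agent: *
Disallow: /*?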
On-Page Optimization | sparrowdog
-
Photo Gallery with Duplicate Content and Titles
I have a photo gallery that is generating a lot of duplicate title and duplicate page content errors, and fixing each photo individually just isn't possible right now. Should I just block the search engines from indexing the gallery to resolve the errors?
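For example, if the gallery pages all live under one path, a robots.txt rule could keep crawlers out of them (a sketch; /gallery/ is a hypothetical path, and this stops crawling rather than de-indexing pages already in the index, which a noindex meta tag handles better):

User-agent: *
Disallow: /gallery/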
On-Page Optimization | NeilBelliveau
-
Duplicate Page Content: Should We 301? Best Practices?
What would be the best way to avoid duplicate page content for these kinds of pages? Our website generates user-friendly URLs for each page, so it is the same exact page and both versions of the URL work. Example:
http://www.safari365.com/about-africa/wildebeest-migration
http://www.safari365.com/wildebeest-migration
I don't think adding code to the page will work, because it's the same page for both the incorrect and correct versions. I don't think I can use the URL parameter setting either, because the version with /about-africa/ is the correct one (correct in that it follows the site navigation). I was thinking of using .htaccess to redirect to the correct version. Will that work, and does it follow best practices? Any other suggestions that would work better?
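Something along these lines in .htaccess is what I had in mind (a sketch, assuming Apache with mod_rewrite; it 301-redirects the short URL to the /about-africa/ version, with one rule per duplicated page unless the pattern can be generalised):

RewriteEngine On
RewriteRule ^wildebeest-migration$ /about-africa/wildebeest-migration [R=301,L]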
On-Page Optimization | 365ToursSafaris
-
E-commerce site product descriptions and duplicate content
Hi everyone. I'm developing an e-commerce site using Prestashop and am concerned about duplicate content among product descriptions. My main concerns are: (1) if there are 500 or more products and the product descriptions are taken from a manufacturer's or supplier's website, the site runs into external duplicate content issues; (2) internal duplicate content is also an issue if there are multiple similar products and each one carries the same description across several pages. What would be the best approach to eliminate the possibility of a duplicate content penalty due to similar product descriptions? I've already considered noindex-ing the complete range of products to protect against duplicate content penalties, and instead having unique articles written on the site blog that discuss and link to certain products. Another consideration was noindex-ing all product pages except those for featured products, and regularly rewriting descriptions for a set number of those featured products (this still leaves internal duplication if similar product descriptions are rewritten). The product range is intended to be very large, so I'm really seeking an alternative to the insane task of rewriting many product descriptions. Any suggestions to make the SEO work efficient are very much welcome and appreciated. Thank you!
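If noindex-ing the bulk of the catalogue is the route taken, an X-Robots-Tag response header can apply it in bulk without template edits (a sketch, assuming Apache 2.4 with mod_headers; /product/ is a placeholder for however the shop's URLs are structured):

<If "%{REQUEST_URI} =~ m#^/product/#">
  Header set X-Robots-Tag "noindex, follow"
</If>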
On-Page Optimization | valuepets
-
The crawl diagnostics report indicated that my domain www.mydomain.com is a duplicate of www.mydomain.com/index.php. How can I correct this issue?
How can I fix this issue? The crawl diagnostics report indicated that www.mydomain.com is a duplicate of www.mydomain.com/index.php. Those are supposed to be the same page, not duplicates, right?
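One common fix is a 301 redirect in .htaccess from /index.php back to the root (a sketch, assuming Apache with mod_rewrite; matching on THE_REQUEST avoids a redirect loop when index.php is also the directory index):

RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ / [R=301,L]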
On-Page Optimization | jsevilla
-
Is rel=canonical used only for duplicate content?
Can rel=canonical be used to tell the search engines which page is "preferred" when there are similar pages? For instance, I have an internal page that Google is showing on the first page of the SERPs, but I would prefer the home page to rank for that term. Both the home page and the internal page have been optimized for the same keyword. What is interesting is that the internal page has very few backlinks compared to the home page, yet Google seems to favor it since the keyword is in the URL. I am afraid a 301 would drop us from the first page of the SERPs.
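For reference, the tag on the internal page would point at the home page (a sketch with a placeholder domain); note that Google treats rel=canonical as a hint and may ignore it when the two pages are not near-duplicates:

<link rel="canonical" href="https://www.example.com/" />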
On-Page Optimization | surveygizmo
-
Robots.txt file
Does it matter if we omit the robots.txt file? If the spider has to read all the pages anyway, why do we include a robots.txt file at all?
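For context: omitting robots.txt simply means everything is crawlable; the file is only needed to restrict crawling or to point crawlers at a sitemap. A minimal sketch (the /admin/ path and sitemap URL are placeholders):

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml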
On-Page Optimization | seoug_2005