Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and alongside it I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are so many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK?
Thanks for any help!
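For context, the blocking approach being asked about would be a robots.txt file at the root of each secondary site; a minimal sketch (domain hypothetical):

```text
# https://secondary-example.com/robots.txt
# Ask all crawlers not to crawl any page on this secondary site
User-agent: *
Disallow: /
```

Note that Disallow only prevents crawling; a blocked URL can still appear in the index if other sites link to it.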
-
Yes, robots.txt is the wrong way to do this; I will try using the canonical tag instead. Thanks for your help!
-
Using robots.txt is perhaps not the best way of doing it. Using a canonical tag or a noindex meta tag would likely be best. I think the reasons for this are best summed up in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
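As a sketch, the noindex route would mean adding a robots meta tag to the head of each duplicate page on the secondary sites:

```html
<!-- In the <head> of each duplicate page on a secondary site -->
<!-- Asks search engines to drop this page from the index while still following its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, the page must remain crawlable for search engines to see this tag.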
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm whether using the robots.txt file is OK or not.
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical version on your secondary websites, thereby eliminating the duplication?
For example, on each duplicated page on your secondary website, you would add the following to the head to reference the primary page:
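The snippet referred to would be a cross-domain rel=canonical link element along these lines (URLs hypothetical):

```html
<!-- In the <head> of the duplicate page on the secondary site -->
<!-- Points search engines at the primary site's copy as the preferred version -->
<link rel="canonical" href="https://www.primary-example.com/page-name/">
```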
Related Questions
-
Duplicate Content
I run a business directory where all the businesses are listed, and I am having a duplicate content issue. I have categories like A, B, and C. For a business in Category A, users can filter by different locations: by state, city, or area. So if they filter by state, and the state has 10 businesses that are all in one city, both pages (the state-filtered and the city-filtered) are the same. What can I do to avoid that: a canonical URL tag, or changing the pages' metas and body text? Please help 🙂
On-Page Optimization | Adnan4SEO
-
Does hreflang restrain my site from being penalized for duplicated content?
I am currently setting up a travel agency website. This site is going to target both American and Mexican customers, and I will be working with an /es subdirectory. Besides showing the matching language version in the SERPs, would hreflang keep my site's translated content (which is pretty much the same) from being penalized for duplicated content? Do I have to implement rel=canonical? Thank you in advance for any help you can provide.
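As a sketch of how hreflang would pair the two language versions, each page would carry reciprocal annotations in its head (URLs hypothetical):

```html
<!-- On https://example.com/tours/ (English version) -->
<link rel="alternate" hreflang="en" href="https://example.com/tours/">
<link rel="alternate" hreflang="es" href="https://example.com/es/tours/">
<!-- The Spanish page at /es/tours/ carries the same two annotations -->
```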
On-Page Optimization | kpi360
-
Any issues with my on-page SEO?
Kindly review my website and let me know if there are any issues with my on-page SEO or if I am missing anything. Home page: http://www.ayurjeewan.com Deep page: http://www.ayurjeewan.com/natural/divya-triphala-churna/
On-Page Optimization | MasonBaker
-
Wordpress blog duplicate issue
So after looking at the setup of the blog, I've found this:
http://www.trespass.co.uk/blog/
http://www.trespass.co.uk/blog/category/news/
http://www.trespass.co.uk/blog/category/general/
http://www.trespass.co.uk/blog/category/snow/
Content shown on http://www.trespass.co.uk/blog/ can also be found on the other three URLs. The permalink structure is set up as /%category%/%postname%/, which I want to change to just /%postname%/. Obviously I want to make things as SEO-friendly as possible, so any suggestions on doing this right without losing any indexed pages would be welcome. I have limited access to make changes to plugins, as these need to be done through the development company who manage our site. Cheers, Robert
On-Page Optimization | Trespass
-
Issue: Rel Canonical
My SEO report shows "Rel Canonical" issues. I have a WordPress website and each page has its own content, but I'm getting errors in my SEOmOZ report. I installed the Yoast plugin to fix the issue, but I'm still getting 29 errors. WordPress 3.4.1.
On-Page Optimization | mobiledudes
-
Duplicate page content issue I don't know how to solve
I've got a few pages (click here to see the first one, with the others as sidebar links). They are all thumbnail pages of different products. The titles are pretty different, but the page content is virtually the same for all of them, as is the meta description tag. I'm getting errors on the SEOmoz crawl for those pages. I know the meta tag shouldn't be a problem in SEO, but is the content of the pages going to cause me issues? Are the error messages from SEOmoz a result of the page content or the meta description? The pages are very similar, but they are different enough that I want to keep them on separate pages. There would also be too many links if all the thumbs were on a single page. Should I just ignore the error messages?
On-Page Optimization | JAARON
-
Will a "nofollow" "noindex" meta tag resolve a duplicate content issue?
I have a duplicate content issue. If the page has already been indexed, will a nofollow, noindex tag resolve the issue, or do I also need a rel=canonical statement?
On-Page Optimization | McKeeMarketing
-
Cross Domain Duplicate Content
Hi, my client has a series of websites: one main website and several mini websites. Articles are created and published daily and weekly; one copy will go on the main website and the others on one, two, or three of the mini sites. To combat duplication, I only ever allow one copy of an article to be indexed (I apply noindex to the articles that I don't want indexed by Google, so if three sites have the same article, two sites will have the noindex tag added to the head). I am not completely sure if this is OK, and whether there are any negative effects, apart from the articles tagged as noindex not being indexed. Are there any obvious issues? I am aware of the canonical link rel tag and know that it can be used on the same domain, but can it be used cross-domain, in place of the noindex tag? If so, is it exactly the same in structure as the same-domain canonical link rel tag? Thanks, Matt
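For what it's worth, Google does support rel=canonical across domains, and the markup is structurally identical to the same-domain form; a sketch with hypothetical URLs:

```html
<!-- In the <head> of the article copy on a mini site -->
<!-- References the main site's copy as the canonical version -->
<link rel="canonical" href="https://www.main-example.com/articles/article-title/">
```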
On-Page Optimization | mattys