Solve duplicate content issues by using robots.txt
-
Hi,
I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are so many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK?
Thanks for any help!
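To be concrete, what I have in mind is a robots.txt at the root of each secondary website, something like this sketch:

```text
# robots.txt placed at the root of each SECONDARY site
# Blocks all crawlers from the whole site.
# Note: this stops crawling, but does not guarantee removal from the index.
User-agent: *
Disallow: /
```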
-
Yes, robots.txt is a bad way to do it; I will try to use the canonical tag instead. Thanks for your help!
-
Using robots.txt is probably not the best way of doing it; using a canonical or a noindex meta tag would likely be better. The reasons for this are best summed up in this article, which explains, probably better than I could, why robots.txt is not the best way of dealing with duplicate content. Hope this helps.
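For example, a noindex meta tag on the duplicate pages would look roughly like this (a sketch, assuming you can edit the secondary sites' page templates):

```html
<!-- In the <head> of each duplicate page on the secondary sites -->
<!-- noindex: keep this page out of the index; follow: still crawl its links -->
<meta name="robots" content="noindex, follow">
```

One caveat: for the noindex to be seen, the page must not also be blocked in robots.txt, because a page that is never crawled never has its meta tags read.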
-
I have tried to use a cross-domain canonical, but it is too difficult for me. So I want to confirm: is it OK if I use the robots.txt file, or not?
Thanks
-
Why not use a cross-domain canonical, whereby you reference the pages on your primary website as the canonical versions on your secondary websites, thereby eliminating the duplication?
For example, on each duplicate page on your secondary website, you would add the following to the head to reference the primary page:
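A sketch of the tag itself (the href shown is a placeholder; you would point it at the matching page on your own primary domain):

```html
<!-- In the <head> of the duplicate page on the secondary site -->
<!-- href is a placeholder: use the URL of the matching page on the primary site -->
<link rel="canonical" href="https://www.primary-site.example/some-page/">
```

Each duplicate page needs its own tag pointing at its specific counterpart on the primary site, not a single site-wide canonical to the homepage.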
Related Questions
-
Redirect issue
http://www.themorrisagency.co.uk/wedding-band redirects to http://www.themorrisagency.co.uk/wedding-band-cost-much-hire/. There is NO canonical in place and nothing in .htaccess. I can't think where else to look to find it. If I amend it to http://www.themorrisagency.co.uk/we it still redirects. Any ideas? I have spent way too long on this now.
On-Page Optimization | agentmorris10
-
Long list of companies spread out over several pages - duplicate content?
Hi all, I am currently working with a company formation agent. They have a list of every limited company spread over hundreds of pages. What do you guys think? Is there a need for canonicals? The website is ranking pretty well, but I want to make sure there aren't any problems in the future. Here are two pages as examples:
http://www.formationsdirect.com/companysearchlist.aspx?start=MULLAGHBOY+CONSTRUCTION+LIMITED&next=1#
http://www.formationsdirect.com/companysearchlist.aspx?start=%40a+company+limited&next=1#
Also, what about the actual company pages? See an example below:
http://www.formationsdirect.com/companysearchlist.aspx?name=AMNA+CONSTRUCTION+LTD&number=06630333#.U8PW6_ldX1s
Thanks in advance
Aaron
On-Page Optimization | AaronGro0
-
Product Attribute pages and Duplicate content
Hiya, I have two queries about a jewellery shop running on WordPress and WooCommerce.
1. I am a little indecisive about how to index the product categories without creating duplicate pages, which will get me into trouble. For example, all earrings are listed on the category page: chainsofgold.co.uk/buy/earrings/ We also have product attribute pages which list all the subcategories for the earrings:
chainsofgold.co.uk/earrings/creoles/
chainsofgold.co.uk/earrings/drop/
chainsofgold.co.uk/earrings/studs/
I have the category URL and the product attribute URLs set to be indexed in my sitemaps. Will this get me into trouble by creating duplicate content with the main category page? Should I only have the main category indexed and "noindex, follow" all the product attribute pages?
2. I am also thinking about incorporating these product attribute URLs into my menu, so when people hover over earrings they are shown the types of earrings they can buy. However, I have the WooCommerce faceted navigation working on the category pages. So if someone is visiting the page chainsofgold.co.uk/buy/earrings/ they can click on the left-hand side and select "drops". The URL they will get, though, is one which is not indexed: http://www.chainsofgold.co.uk/buy/earrings/?filter_earrings=123 Can I link to those product attribute pages without the risk of being accused of creating duplicate content? Thank you for your help. Carolina
On-Page Optimization | bongoheads
-
Wordpress blog duplicate issue
So after looking at the set-up of the blog, I've found this:
http://www.trespass.co.uk/blog/
http://www.trespass.co.uk/blog/category/news/
http://www.trespass.co.uk/blog/category/general/
http://www.trespass.co.uk/blog/category/snow/
Content shown on http://www.trespass.co.uk/blog/ can also be found on the other 3 URLs. The permalink structure is set up as /%category%/%postname%/, which I want to change to just /%postname%/. Obviously I want to make things as SEO-friendly as possible, so any suggestions on doing this right without losing any indexed pages etc. are welcome. I have limited access to make changes to plugins etc., as these need to be done through the development company who manage our site. Cheers, Robert
On-Page Optimization | Trespass0
-
Duplicate Pages software
Hey guys, I was told a few hours ago about a system that can take a few of your keywords and automatically create new links and pages (in the sitemap file) for your website, so a website that was built with 20 pages (for example) will be shown to search engines as a site with hundreds of pages, something that is supposed to help SEO. Has anyone heard of such software? Is it legal? Any advice you can give on this matter? Thanks, i.
On-Page Optimization | iivgi0
-
Duplicating content on multiple domains
Hey guys, I've started working with a new client recently called Resource Investing News. I'm more a social media person, though I do have SEO experience. RIN has about 40 URLs, all of which have original news content published on them. One SEO-related issue that I can see here, though, is that the primary domain re-publishes all of the original content that the other URLs do. In other words, resourceinvestingnews.com will have an article on it that is also published on goldinvestingnews.com, with the same date stamp and a link out to the original article. E.g.:
http://resourceinvestingnews.com/42539-molybdenum-goes-far-beyond-steelmaking.html
http://molyinvestingnews.com/5301-molybdenum-steelmaking-vehicle-demand-electronics-lubricant.html
Does anyone have an idea whether this is something that should be reviewed, and/or whether the content is being negatively affected in search? Many thanks!
On-Page Optimization | blahblahblah20150
-
New CMS system - 100,000 old urls - use robots.txt to block?
Hello. My website has recently switched to a new CMS system. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel='canonical'. Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing, "older" URLs to their new counterparts. However, according to the Google Webmaster Tools 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
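For context, the kind of rule I'm considering would look something like this (the directory names are placeholders, not our real old CMS paths):

```text
# robots.txt sketch: stop crawlers from requesting directories
# left over from the retired CMS systems
User-agent: *
Disallow: /old-cms-a/
Disallow: /old-cms-b/
```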
On-Page Optimization | Blenny0
-
Duplicate Page Content Issues
How can I fix duplicate page content issues on my site, www.ifocalmedia.com? This is a WP site, and the diagnostics show I have 115 errors. I know this is damaging to my SEO campaign; how do I clear these? Any help is very welcome.
On-Page Optimization | shami0