Duplicate Page content | What to do?
-
Hello Guys,
I have some duplicate pages detected by Moz. Most of the URLs are from a registration process for users, so the URLs all look like this:
www.exemple.com/user/login?destination=node/125%23comment-form
What should I do? Should I add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools?
Thanks in advance!
Pedro Pereira
-
Hi Carly,
It needs to be done to each of the pages. In most cases, this is just a minor change to a single page template. Someone might tell you that you can add an entry to robots.txt to solve the problem, but that won't remove them from the index.
Looking at the links you provided, I'm not convinced you should deindex them all - as these are member profile pages which might have some value in terms of driving organic traffic and having unique content on them. That said I'm not party to how your site works, so this is just an observation.
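If you do decide to deindex them, the template-level fix would look something like this (a sketch only; the `{{ page.url }}` placeholder stands in for whatever variable your CMS's templating system exposes for the current URL):

```html
<!-- In the <head> of the shared member-page template.
     One edit here applies to every page rendered from it. -->
<meta name="robots" content="noindex">
<!-- Self-referencing canonical, so any parameterised variants
     consolidate to the clean member URL. -->
<link rel="canonical" href="{{ page.url }}">
```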
Hope that helps,
George
-
Hi George,
I am having a similar issue with my site, and was looking for a quick clarification.
We have several "member" pages that have been created as part of registration (thousands) and they are appearing as duplicate content. When you say add noindex and a canonical, is this something that needs to be done to every individual page, or is there something that can be done that would apply to the thousands of pages at once?
Here are a couple of examples of what the pages look like:
http://loyalty360.org/me/members/8003
http://loyalty360.org/me/members/4641
Thank you!
-
1. If you add just noindex, Google will crawl the page, drop it from the index but it will also crawl the links on that page and potentially index them too. It basically passes equity to links on the page.
2. If you add nofollow, noindex, Google will crawl the page, drop it from the index but it will not crawl the links on that page. So no equity will be passed to them. As already established, Google may still put these links in the index, but it will display the standard "blocked" message for the page description.
If the links are internal, there's no harm in them being followed unless you're opening up the crawl to expose tons of duplicate content that isn't canonicalised.
noindex is often used with nofollow, but sometimes this is simply due to a misunderstanding of what impact they each have.
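In markup terms, the two options above are just different values of the same robots meta tag in each page's `<head>` (a sketch):

```html
<!-- 1. Drop the page from the index, but still follow (and pass equity to) its links -->
<meta name="robots" content="noindex">

<!-- 2. Drop the page from the index AND stop Google following its links -->
<meta name="robots" content="noindex, nofollow">
```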
George
-
Hello,
Thanks for your response. I have learned more, which is great.
My question is: should I add a noindex only to that page, or a noindex, nofollow?
Thanks!
-
Yes, it's the worst possible scenario: they basically get trapped in the SERPs. Google won't crawl them again until you allow the crawling, then set noindex (to remove them from the SERPs), and then add nofollow,noindex back on to keep them out of the SERPs and to stop Google following any links on them.
Configuring URL parameters, again, is just a directive regarding the crawl and doesn't affect indexing status, to the best of my knowledge.
In my experience, noindex is bulletproof but nofollow / robots.txt is very often misunderstood and can lead to a lot of problems as a result. Some SEOs think they can be clever in crafting the flow of PageRank through a site. The unsurprising reality is that Google just does what it wants.
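On the "bulletproof" point: it only holds if the tag actually reaches the rendered page, so it's worth spot-checking the markup your templates emit. A minimal checker using only the Python standard library (`has_noindex` is a hypothetical helper name; fetching the HTML is left to you):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_directives.append((attrs.get("content") or "").lower())


def has_noindex(html: str) -> bool:
    """True if the page markup carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.robots_directives)
```

Run it over a sample of the pages you think are deindexed; any page where it returns False is still eligible for the index.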
George
-
Hi George,
Thanks for this, it's very interesting... the URLs do appear in search results, but their descriptions are blocked!
Did you try configuring URL parameters in WMT as a solution?
-
Hi Rafal,
The key part of that statement is "we might still find and index information about disallowed URLs...". If you read the next sentence it says: "As a result, the URL address and, potentially, other publicly available information such as anchor text in links to the site can still appear in Google search results".
If you look at moz.com/robots.txt you'll see an entry for:
Disallow: /pages/search_results*
But if you search this on Google:
site:moz.com/pages/search_results
You'll find there are 20 results in the index.
I used to agree with you, until I found out the hard way that if Google finds a link, regardless of whether it's in robots.txt or not it can put it in the index and it will remain there until you remove the nofollow restriction and noindex it, or remove it from the index using webmaster tools.
George
-
George,
I went to check with Google to make sure I am correct and I am!
"While Google won't crawl or index the content blocked by robots.txt, we might still find and index information about disallowed URLs from other places on the web." Source: https://support.google.com/webmasters/answer/6062608?hl=en
Yes, he can fix these problems on-page, but disallowing it in robots.txt will work fine too!
-
Just adding this to robots.txt will not stop the pages being indexed:
Disallow: /*login?
It just means Google won't crawl the links on that page.
I would do one of the following:
1. Add noindex to the page. PR will still be passed to the page but they will no longer appear in SERPs.
2. Add a canonical on the page to: "www.exemple.com/user/login"
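For option 2, the tag would go in the `<head>` of every login-page variant (a sketch, reusing the example URL from the question):

```html
<link rel="canonical" href="http://www.exemple.com/user/login">
```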
You're never going to try and get these pages to rank, so although it's worth fixing I wouldn't lose too much sleep on the impact of having duplicate content on registration pages (unless there are hundreds of them!).
Regards,
George
-
In GWT: Crawl => URL Parameters => Configure URL Parameters => Add Parameter
Make sure you know what you are doing as it's easy to mess up and have BIG issues.
-
Add this line to your robots.txt to prevent Google from indexing these pages:
Disallow: /*login?