Duplicate Page content | What to do?
-
Hello Guys,
I have some duplicate pages detected by Moz. Most of the URLs come from the user registration process, so they all look like this:
www.exemple.com/user/login?destination=node/125%23comment-form
What should I do? Should I add this to robots.txt? If so, how? What's the command to add in Google Webmaster Tools?
Thanks in advance!
Pedro Pereira
-
Hi Carly,
It needs to be done to each of the pages. In most cases, this is just a minor change to a single page template. Someone might tell you that you can add an entry to robots.txt to solve the problem, but that won't remove them from the index.
Looking at the links you provided, I'm not convinced you should deindex them all, as these are member profile pages which might have some value in terms of driving organic traffic and having unique content on them. That said, I'm not privy to how your site works, so this is just an observation.
Hope that helps,
George
-
Hi George,
I am having a similar issue with my site, and was looking for a quick clarification.
We have several "member" pages that were created as part of registration (thousands), and they are appearing as duplicate content. When you say add noindex and a canonical, is this something that needs to be done to every individual page, or is there something that can be done that would apply to the thousands of pages at once?
Here are a couple of examples of what the pages look like:
http://loyalty360.org/me/members/8003
http://loyalty360.org/me/members/4641
Thank you!
-
1. If you add just noindex, Google will crawl the page, drop it from the index but it will also crawl the links on that page and potentially index them too. It basically passes equity to links on the page.
2. If you add nofollow, noindex, Google will crawl the page, drop it from the index but it will not crawl the links on that page. So no equity will be passed to them. As already established, Google may still put these links in the index, but it will display the standard "blocked" message for the page description.
If the links are internal, there's no harm in them being followed unless you're opening up the crawl to expose tons of duplicate content that isn't canonicalised.
noindex is often used with nofollow, but sometimes this is simply due to a misunderstanding of what impact they each have.
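To make the distinction between the two options concrete, these are the corresponding meta tags (illustrative markup only, placed in the page's head):

```html
<!-- Option 1: drop the page from the index, but let Google follow
     its links and pass equity through them -->
<meta name="robots" content="noindex">

<!-- Option 2: drop the page AND stop Google following its links,
     so no equity is passed through them -->
<meta name="robots" content="noindex, nofollow">
```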
George
-
Hello,
Thanks for your response. I have learned more, which is great!
My question is: should I add noindex only to that page, or noindex, nofollow?
Thanks!
-
Yes, it's the worst possible scenario: they basically get trapped in the SERPs. Google won't crawl them again until you allow crawling, then set noindex (to remove them from the SERPs), and then add noindex, nofollow back on to keep them out of the SERPs and to stop Google following any links on them.
Configuring URL parameters, again, is just a directive about crawling and doesn't affect indexing status, to the best of my knowledge.
In my experience, noindex is bulletproof but nofollow / robots.txt is very often misunderstood and can lead to a lot of problems as a result. Some SEOs think they can be clever in crafting the flow of PageRank through a site. The unsurprising reality is that Google just does what it wants.
George
-
Hi George,
Thanks for this, it's very interesting... the URLs do appear in search results but their descriptions are blocked(!)
Did you try configuring URL parameters in WMT as a solution?
-
Hi Rafal,
The key part of that statement is "we might still find and index information about disallowed URLs...". If you read the next sentence it says: "As a result, the URL address and, potentially, other publicly available information such as anchor text in links to the site can still appear in Google search results".
If you look at moz.com/robots.txt you'll see an entry for:
Disallow: /pages/search_results*
But if you search this on Google:
site:moz.com/pages/search_results
You'll find there are 20 results in the index.
I used to agree with you, until I found out the hard way that if Google finds a link, regardless of whether it's blocked in robots.txt, it can put it in the index, and it will remain there until you lift the crawl restriction and noindex it, or remove it from the index using Webmaster Tools.
George
-
George,
I went to check with Google to make sure I am correct and I am!
"While Google won't crawl or index the content blocked by robots.txt, we might still find and index information about disallowed URLs from other places on the web." Source: https://support.google.com/webmasters/answer/6062608?hl=en
Yes, he can fix these problems on-page, but disallowing it in robots.txt will work fine too!
-
Just adding this to robots.txt will not stop the pages being indexed:
Disallow: /*login?
It just means Google won't crawl the links on that page.
I would do one of the following:
1. Add noindex to the page. PR will still be passed to the page but they will no longer appear in SERPs.
2. Add a canonical on the page to: "www.exemple.com/user/login"
You're never going to try and get these pages to rank, so although it's worth fixing I wouldn't lose too much sleep on the impact of having duplicate content on registration pages (unless there are hundreds of them!).
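For example, the head of the login page could carry either of the two fixes (hypothetical markup; the canonical target assumes the plain login URL is the preferred version):

```html
<!-- Option 1: keep the parameterised login pages out of the index -->
<meta name="robots" content="noindex">

<!-- Option 2: point all ?destination=... variants at one canonical URL -->
<link rel="canonical" href="http://www.exemple.com/user/login">
```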
Regards,
George
-
In GWT: Crawl => URL Parameters => Configure URL Parameters => Add Parameter
Make sure you know what you are doing as it's easy to mess up and have BIG issues.
-
Add this line to your robots.txt to prevent Google from indexing these pages:
Disallow: /*login?
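Note that the `*` here follows Google's robots.txt pattern rules, not regular expressions: `*` matches any run of characters, `$` anchors the end of the URL, and the pattern is matched against the start of the URL path. A rough sketch of that matching logic (my own illustrative helper, not part of any robots.txt library):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Approximate Google's robots.txt pattern matching (illustrative only):
    '*' matches any run of characters, '$' anchors the end of the URL,
    and the pattern is matched from the start of the URL path."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The rule above would match the parameterised login URLs...
assert robots_pattern_matches("/*login?", "/user/login?destination=node/125%23comment-form")
# ...but not unrelated paths.
assert not robots_pattern_matches("/*login?", "/user/profile")
```

Remember, though, that per the rest of this thread a Disallow rule only blocks crawling; it does not keep already-discovered URLs out of the index.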