Using Rel Nofollow on Duplicate Pages
-
Hi there,
I have a rather large site that has duplicate content on many pages due to how it's being spidered by Google. I was hoping I could set the internal links to these pages as "nofollow."
My question: I have hundreds of other sites with backlinks to these duplicate-content pages. Will it affect me negatively if I tell Google not to index the duplicated pages?
-
Nofollow is no longer a way to block indexing via internal links. It works for external links, but not internal ones.
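For context, a minimal sketch of what a nofollowed internal link looks like (the URL is a placeholder):

    <a href="http://www.example.com/duplicate-page" rel="nofollow">Duplicate page</a>

Google treats nofollow as a hint not to pass PageRank through that particular link; it won't keep the target page out of the index if the page is discoverable through other links.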
Chad's answer about canonicals is a good one. Depending on how and why the duplicate pages are created or required (for example, sort orders, or page 2, page 3, page 4 of the same content), you also have a couple of other options:
- Managing URL parameters in Google Webmaster Tools: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
- Google's rel="next" and rel="prev" tags (see the sketch below): http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
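As a rough sketch of the pagination tags (the example.com URLs are placeholders), page 2 of a three-page series would carry both tags in its <head>:

    <link rel="prev" href="http://www.example.com/category?page=1">
    <link rel="next" href="http://www.example.com/category?page=3">

The first page of the series carries only rel="next" and the last page only rel="prev", which lets Google treat the set as one sequence.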
-
Cheers - great answer, thanks!
-
I found this page on SEOmoz that answers the link juice question:
"Will this pass 100% of the link juice from a given page to another? More or less than a 301 redirect does now? Note that Google's official representative from the web spam team, Matt Cutts, said today that it passes link juice akin to a 301 redirect but also noted (when SEOmoz's ownGillian Muessig asked specifically) that "it loses no more juice than a 301," which suggests that there is some fractional loss when either of these are applied."
-
Add a rel="canonical" tag to the top of any pages that you feel have the same content somewhere else on the website.
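A minimal sketch of that tag, assuming http://www.example.com/original-page is the preferred version of the content (the URL is a placeholder):

    <link rel="canonical" href="http://www.example.com/original-page">

Placed in the <head> of each duplicate, this points Google at the preferred URL so the duplicates consolidate onto it rather than compete with it.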
As far as the backlinks go, I am not sure how that will pan out. In theory they should still flow, but I'll have to double-check to make sure. Duplicate content is a big issue, though. I deal with it on a national level and it is a total nightmare, especially when the site is very large and dynamically produced by a CMS that is not SEO-friendly.
Related Questions
-
Is it good practice to use hreflang on pages that have canonicals?
I have a page in English that has both English and Spanish translations on it. It is pulled in from a page generated on another site, and I am not able to adjust the CSS to display only one language. Until I can fix this, I have made the English page the canonical for both. Do I still want to use hreflang for the English and Spanish pages? And what if I do not have a Spanish page at all? I assume (from what I've read) I should not have an hreflang on the English page. Is this correct? Thank you in advance.
Technical SEO | RoxBrock
-
Duplicate Page Titles For Paginated Topics In Blog
Hello, I've just run a site audit and it has come up with a duplicate title tag issue for the topics section of our blog. For example, it is flagging that the following have the same page title: https://blog.companyname.com/topic/topic-name and https://blog.companyname.com/topic/topic-name/page/2. How significant is this as an SEO issue, and what are the ways we can go about fixing it? I look forward to any suggestions and guidance. Thanks, John
Technical SEO | SEOCT
-
Duplicate Content Issues on Product Pages
Hi guys, just keen to gauge your opinion on a quandary that has been bugging me for a while now. I work on an ecommerce website that sells around 20,000 products. A lot of the product SKUs are exactly the same in terms of how they work and what they offer the customer; often only one variable changes. For example, a product may be available in 200 different sizes and 2 colours (therefore 400 SKUs available to purchase). These SKUs have been uploaded to the website as individual entries so that the customer can purchase them, with the only differences between the listings likely to be key signifiers such as colour, size, price, part number, etc. Moz has flagged these pages up as duplicate content. Now, I have worked on websites long enough to know that duplicate content is never good from an SEO perspective, but I am struggling to work out an effective way to display such a large number of almost identical products without falling foul of the duplicate content issue. If you wouldn't mind sharing any ideas or approaches you have taken, that would be great!
Technical SEO | DHS_SH
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed. But later down the road, if I want to utilize that URL again, can I just remove the 410 status code, put new content on that page, and have it indexed again?
Technical SEO | WebServiceConsulting.com
-
How do I fix an issue regarding near-duplicate pages associated with city or local pages on a website?
I am working on an e-commerce website where we have added 300+ pages to target different local cities in the USA. We have added quite different paragraphs on 100+ pages to remove the internal duplicate-content issue and save our website from a Panda penalty; you can visit the following pages to see this. We have also added unique paragraphs on a few pages. But I have big concerns about the other elements on these pages, like the Banner Gallery, Front Banner, Tool, and a few other attributes which are common to all pages apart from the 4-to-5-sentence paragraph. I compiled an XML sitemap with all the local pages and submitted it to Google Webmaster Tools on 1st June 2013, but I can see only 1 page indexed by Google in Google Webmaster Tools. http://www.bannerbuzz.com/local http://www.bannerbuzz.com/local/US/Alabama/Vinyl-Banners http://www.bannerbuzz.com/local/MO/Kansas-City/Vinyl-Banners and so on... Can anyone suggest the best solution for this?
Technical SEO | CommercePundit
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about its subtopic. So I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each subtopic, or have one page that I would pass parameters to, with the page built on the fly in code-behind? The drawback to the one page with parameters is that the URL would be static, but the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix ASPX with HTML because my project can't be that large. Anyone here ever have this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
Double problem: mobile-friendly site and shopping cart page duplication
I have a website that has two issues related to SEO: 1) the main website (product.com) is not mobile-friendly, and 2) I have a shopping cart site (buymyproduct.com) using Magento that basically duplicates the product pages that exist on the main marketing website. Users click "buy now" on a product page and are sent to the checkout at buymyproduct.com. The company cannot overhaul the product.com website right away, and our shopping site (buymyproduct.com) uses a responsive theme and works well on iPhone and iPad, so I am thinking of making buymyproduct.com the mobile-friendly version of our website by using a sniffer on product.com and forwarding users to the mobile-friendly version. If I add canonical references from the shopping cart product pages and articles back to the associated product.com pages, will this lessen the blow of any SEO issues? What other factors am I missing or need to consider? Complicated and painful. Maybe doing nothing right now is best. Thanks for any feedback.
Technical SEO | Timmmmy
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do or don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with utilizing a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
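For reference, a rough sketch of the meta robots option mentioned above (this is the standard tag; noindex keeps the page out of the index while follow lets its links still be crawled):

    <meta name="robots" content="noindex, follow">

Because this goes in each page's <head>, usually via the blog template rather than the root directory, it is the workable option when root access is unavailable.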
Technical SEO | ICM