Need help with some duplicate content.
-
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was.
It looks like my category pages and their paginated pages are being flagged as duplicate content. For instance, when I run the report I see things like this:
http://noahsdad.com/resources/
http://noahsdad.com/resources/page/2/
http://noahsdad.com/therapy/page/2/
I'm assuming it's just the categories that are being duplicated, since the paginated URLs only show up on the report at the end of a category.
What is the best way to correct this? I don't use tags at all on my blog; I use categories instead. I also use the Yoast SEO plugin and have checked the box that disables tags. However, it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed."
There is also a box that allows you to disable categories, but the description above makes it sound like I shouldn't block both tags and categories.
Any ideas what I should do?
Thanks.
-
I didn't mention "prev" and "next" because they are already implemented in the head tag. Would you add them directly to the links as well? Also, I think Google is the only search engine that supports them at the moment.
-
Gianluca is correct. rel="prev"/rel="next" would work here, but I thought it would be too confusing; I did not know there were plugins that can do this for you. Also, this would make page one rank for all the content, which may confuse users when they don't find the content they searched for on that page. So technically it would work, but for the user I don't know if it is the right solution; it works best for one article spread over many pages.
-
The correct answer to your kind of issue, which is related to pagination, is this one: use the rel="prev" and rel="next" tags. These are the tags Google suggests using to specify that a set of pages is paginated, hence it will generally just show the first page in its results. Check these links:
http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html
http://googlewebmastercentral.blogspot.com.es/2012/03/video-about-pagination-with-relnext-and.html
http://www.seomoz.org/q/need-help-with-some-duplicate-content
There are several WordPress plugins that implement this.
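As a rough sketch (the page 3 URL below is just an assumption to illustrate a middle page in the series), the head of each paginated category page would carry link tags along these lines:

<!-- on http://noahsdad.com/resources/ (first page: next only) -->
<link rel="next" href="http://noahsdad.com/resources/page/2/" />

<!-- on http://noahsdad.com/resources/page/2/ (prev and next, assuming a page 3 exists) -->
<link rel="prev" href="http://noahsdad.com/resources/" />
<link rel="next" href="http://noahsdad.com/resources/page/3/" />

One of those plugins would normally generate these for you rather than you editing the theme templates by hand.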
-
Yes, I have about 60 404's and 403's I'm trying to correct...
Thanks for the feedback by the way.
-
I've never used WordPress, but does this help?
http://www.malcolmcoles.co.uk/blog/avoid-duplicate-meta-descriptions-in-pages-2-and-higher-of-the-wordpress-loop/
It's strange that it's possible to add the page number to the canonical tag but not to the title tag, I think.
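To illustrate what I mean by page numbers in the canonical tag (markup sketched from one of the URLs in your report, not copied from your site), page 2 would reference itself rather than page 1:

<link rel="canonical" href="http://noahsdad.com/resources/page/2/" />

so each paginated page keeps its own canonical while the title stays identical, which is the mismatch that seems strange.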
-
It looks like you're doing a good job; you even have unique text content for each video on the pages, so I can't see why they're being flagged as duplicates. Is this in the SEOmoz software? That takes into account the whole structure of the page rather than just the content. Like Alan says, add the page number to the title tag if possible, though I'd add it at the beginning of the tag; it just helps show the search engines that page 1 is the most important.
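As an example of what I mean (the exact site title is just a guess based on your domain), page 2 of the resources category could use something like:

<title>Page 2 - Resources | Noah's Dad</title>

with the page number at the front and the plain title kept for page 1.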
P.S. this is still a good article a couple of years later: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
-
That's why I said if it is difficult then I would not worry.
I would not noindex them.
If you had unique titles you might rank a bit better, but you are not going to get punished if they don't. If you noindex them, though, you are punishing yourself: not only do noindexed pages not appear in search results, but any link pointing to them is wasting link juice.
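For reference, noindexing one of those pages (a generic example, not the exact output of any particular plugin) would mean a robots meta tag like this in its head:

<meta name="robots" content="noindex,follow" />

The "follow" value still lets the links on the page be crawled, but the page itself drops out of the results and, as noted above, any links pointing at it are wasted.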
-
I'm not sure how you would give the author pages different titles on a WordPress-powered site...
Should I check some of the noindex settings within the plugin?
-
OK, then yes, try to give them unique page titles, even just adding "page 2" on the end. If this is difficult to do, then I would not worry too much about it.
-
On my reports they show up as duplicate page titles...
-
Maybe I am not understanding you, but these pages don't appear to be duplicates to me.