Need help with some duplicate content.
-
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was.
It looks like my category pages and their paginated versions are showing duplicate content. For instance, when I run the report I see things like this:
http://noahsdad.com/resources/
http://noahsdad.com/resources/page/2/
http://noahsdad.com/therapy/page/2/
I'm assuming it's just the category pages being duplicated, since the page numbers only show up on the report at the end of a category URL.
What is the best way to correct this? I don't use tags at all on my blog, using categories instead. I also use the Yoast SEO plugin. I have a check mark in the box that disables tags. However, it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed."
There is also a box that allows you to disable categories, but the description above makes it seem like I shouldn't block both tags and categories.
Any ideas what I should do?
Thanks.
-
I didn't mention "prev" and "next" as they are already implemented in the head tag. Would you add them directly to the links as well? Also, I think Google is the only search engine that supports them at the moment.
-
Gianluca is correct. rel="prev"/"next" would work here, but I thought it would be too confusing; I didn't know there were plugins that can do this for you. Also, this would make page one rank for all the content, which may confuse users when they don't find the content they searched for on that page. So technically it would work, but for the user I don't know if it is the right solution; it works best for one article split over many pages.
-
The correct answer to your kind of issue, which is related to pagination, is this one: use the rel="prev" and rel="next" tags. These are the tags Google suggests using to specify that a set of pages is paginated, hence it will just consider the first one. Check these links:
http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html
http://googlewebmastercentral.blogspot.com.es/2012/03/video-about-pagination-with-relnext-and.html
http://www.seomoz.org/q/need-help-with-some-duplicate-content
There are several WordPress plugins that implement this solution.
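If you'd rather wire it up yourself instead of using a plugin, here is a minimal sketch of how a theme or small plugin might emit those tags on paginated archives. The hooks and template functions are standard WordPress APIs; the function name is just for illustration:

```php
<?php
// Minimal sketch: emit rel="prev"/"next" link tags in the <head>
// for paginated archive pages. nd_paginated_rel_links() is a
// hypothetical name; the hooks and functions are core WordPress APIs.
add_action( 'wp_head', 'nd_paginated_rel_links' );

function nd_paginated_rel_links() {
    if ( ! is_archive() && ! is_home() ) {
        return; // only archive/blog listings are paginated here
    }

    global $wp_query;
    $paged = max( 1, (int) get_query_var( 'paged' ) );
    $max   = (int) $wp_query->max_num_pages;

    if ( $paged > 1 ) {
        echo '<link rel="prev" href="' . esc_url( get_pagenum_link( $paged - 1 ) ) . '">' . "\n";
    }
    if ( $paged < $max ) {
        echo '<link rel="next" href="' . esc_url( get_pagenum_link( $paged + 1 ) ) . '">' . "\n";
    }
}
```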
-
Yes, I have about 60 404's and 403's I'm trying to correct...
Thanks for the feedback by the way.
-
I've never used WordPress, but does this help?
http://www.malcolmcoles.co.uk/blog/avoid-duplicate-meta-descriptions-in-pages-2-and-higher-of-the-wordpress-loop/
It's strange that it's possible to add the page number to the canonical URL but not to the title tag, I think.
-
It looks like you're doing a good job - you even have unique text content for each video on the pages, so I can't see why they're flagging as duplicates. Is this in the SEOmoz software? That takes into account the whole structure of the page rather than just the content. Like Alan says, add the page number to the title tag if possible, though I'd add it at the beginning of the tag - it just helps show the search engines that page 1 is the most important.
P.S. this is still a good article a couple of years later: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
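If your theme builds its titles through wp_title(), a small filter along these lines would do it. This is only a sketch: 'wp_title' is the standard core filter, but the function name is made up, and if Yoast is generating your titles you would hook its own title filter instead.

```php
<?php
// Minimal sketch: prefix paginated archive titles with the page
// number so page 2+ stops sharing a title with page 1.
// nd_page_number_in_title() is a hypothetical name; 'wp_title'
// is the standard core filter.
add_filter( 'wp_title', 'nd_page_number_in_title', 10, 2 );

function nd_page_number_in_title( $title, $sep ) {
    $paged = (int) get_query_var( 'paged' );
    if ( $paged > 1 ) {
        // Put the page number first, per the advice above.
        $title = sprintf( 'Page %d %s %s', $paged, $sep, $title );
    }
    return $title;
}
```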
-
That's why I said if it is difficult, then I would not worry.
I would not noindex them.
If you had unique titles, you might rank a bit better; you are not going to get punished if they don't. But if you noindex, you are punishing yourself:
not only do noindexed pages not appear in search results, but any link pointing to them is wasting link juice.
-
I'm not sure how you would give the author pages different titles on a WordPress-powered site...
Should I check some of the noindex settings within the plugin?
-
OK, then yes, try to give them unique page titles - even just adding "page 2" on the end. If this is difficult to do, then I would not worry too much about it.
-
On my reports they show up as duplicate page titles...
-
Maybe I am not understanding you, but these pages don't appear to be duplicates to me.