Need help with some duplicate content.
-
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was.
It looks like my category and page numbers are showing duplicate content. For instance when I run the report I see things like this:
http://noahsdad.com/resources/
http://noahsdad.com/resources/page/2/
http://noahsdad.com/therapy/page/2/
I'm assuming that is just the categories that are being duplicated, since the page numbers only show on the report at the end of a category.
What is the best way to correct this? I don't use tags at all on my blog, using categories instead. I also use the Yoast SEO plugin. I have a check mark in the box that disables tags. However, it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed."
There is a box that allows you to disable categories also, but the description above makes it seem like I don't want to block both tags and categories.
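(For reference, that Yoast checkbox doesn't do anything exotic: it just outputs a robots meta tag in the head of the tag archive pages. Illustrative only, but the output looks roughly like this:)

```html
<!-- Added by Yoast to tag archive pages when "noindex" is enabled for tags -->
<meta name="robots" content="noindex, follow">
```

The `noindex` part keeps the archive out of the index, while `follow` still lets crawlers pass through the links on it.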
Any ideas what I should do?
Thanks.
-
I didn't mention "prev" and "next" as they are already implemented in the head tag; would you add them directly to the links as well? Also, I think Google is the only search engine that supports them at the moment.
-
Gianluca is correct. rel="prev"/"next" would work here, but I thought this would be too confusing; I did not know there were plugins that can do this for you. Also, this would make page one rank for all the content, which may confuse users when they don't find the content they searched for on that page. So technically it would work, but for the user I don't know if it is the right solution; it works best for one article split over many pages.
-
The correct answer to your kind of issue, which is related to pagination, is this one: use the rel="prev" and rel="next" tags. These are the tags Google suggests using to specify that a set of pages is paginated, so it will treat just the first one as the main result. Check these links:
http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html
http://googlewebmastercentral.blogspot.com.es/2012/03/video-about-pagination-with-relnext-and.html
http://www.seomoz.org/q/need-help-with-some-duplicate-content
There are several WordPress plugins that implement this solution.
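To make that concrete, here is what the head of a middle page in the series would contain, using the archive URLs from the question as an illustration:

```html
<!-- In the <head> of http://noahsdad.com/resources/page/2/ -->
<link rel="prev" href="http://noahsdad.com/resources/" />
<link rel="next" href="http://noahsdad.com/resources/page/3/" />
```

The first page of the series carries only a rel="next" link, and the last page only a rel="prev" link, so the tags chain the whole archive together.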
-
Yes, I have about 60 404's and 403's I'm trying to correct...
Thanks for the feedback by the way.
-
I've never used Wordpress but does this help?
http://www.malcolmcoles.co.uk/blog/avoid-duplicate-meta-descriptions-in-pages-2-and-higher-of-the-wordpress-loop/
It's strange how it's possible to add canonical page numbers, but not add the same thing to the title tag, I think.
-
It looks like you're doing a good job; you even have unique text content for each video on the pages, so I can't see why they're being flagged as duplicates. Is this in the SEOmoz software? That takes into account the whole structure of the page rather than just the content. Like Alan says, add the page number to the title tag if possible, though I'd add it at the beginning of the tag; it helps show the search engines that page 1 is the most important.
P.S. this is still a good article a couple of years later: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
-
That's why I said if it is difficult then I would not worry.
I would not noindex them.
If you had unique titles you might rank a bit better, but you are not going to get punished if they don't have them. If you noindex them, though, you are punishing yourself:
not only do noindexed pages not appear in search results, but any link pointing to them wastes link juice.
-
I'm not sure how you would give the author pages different titles on a WordPress-powered site...
Should I check some of the noindex settings within the plugin?
-
OK, then yes, try to give them unique page titles, even add "page 2" on the end. If this is difficult to do, then I would not worry too much about it.
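If you're comfortable editing the theme, one way to do this in WordPress is a small filter in functions.php. This is only a sketch, and it assumes a WordPress version recent enough to have the `document_title_parts` filter; test it on a staging copy first:

```php
<?php
// Sketch for a theme's functions.php: append the page number to the
// <title> on page 2 and beyond of paginated archives, so each page
// in the series gets a unique title.
add_filter( 'document_title_parts', function ( $parts ) {
    $paged = (int) get_query_var( 'paged' );
    if ( $paged >= 2 ) {
        $parts['title'] .= sprintf( ' - Page %d', $paged );
    }
    return $parts;
} );
```

With this in place, page 2 of a category archive would render a title like "Resources - Page 2" instead of repeating page 1's title. If you're already using Yoast, check first whether its title templates can do this for you (they have a page-number variable), which avoids touching theme code at all.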
-
On my reports they show up as duplicate page titles...
-
Maybe I am not understanding you, but these pages don't appear to be duplicates to me.