Need help with some duplicate content.
-
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was.
It looks like my category and page numbers are showing duplicate content. For instance when I run the report I see things like this:
http://noahsdad.com/resources/
http://noahsdad.com/resources/page/2/
http://noahsdad.com/therapy/page/2/
I'm assuming that is just the categories that are being duplicated, since the page numbers only show on the report at the end of a category.
What is the best way to correct this? I don't use tags at all on my blog, using categories instead. I also use the Yoast SEO plugin. I have a check mark in the box that disables tags. However, it says, "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed."
There is a box that allows you to disable categories also, but the description above makes it seem like I don't want to block both tags and categories.
Any ideas what I should do?
Thanks.
-
I didn't mention "prev" and "next" as they are already implemented in the head tag, would you add them directly to the links as well? Also, I think Google is the only search engine that supports them at the moment.
-
Gianluca is correct. rel="prev"/"next" would work here, but I thought it would be too confusing; I did not know there were plugins that can do this for you. Also, this makes page one rank for all the content, which may confuse users when they don't find the content they searched for on that page. So technically it would work, but for the user I don't know if it is the right solution; it works best for one article spread over many pages.
-
The correct answer to your kind of issue, which is related to pagination, is this one: use the rel="prev" and rel="next" tags. These are the tags Google suggests using to specify that a set of pages is paginated, so it will consider just the first one. Check these links:
http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html
http://googlewebmastercentral.blogspot.com.es/2012/03/video-about-pagination-with-relnext-and.html
http://www.seomoz.org/q/need-help-with-some-duplicate-content
There are several plugins for WordPress that implement this solution.
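To make the mechanics concrete, here is a minimal sketch of what those head tags look like for a paginated archive. This is illustrative Python, not the actual plugin code: a WordPress plugin would emit these tags via wp_head, and the /page/N/ URL pattern is taken from the example URLs in the question.

```python
def pagination_link_tags(base_url, page, total_pages):
    """Build the rel="prev"/rel="next" <link> tags for one page of a
    paginated archive (illustrative sketch, not WordPress plugin code)."""
    def page_url(n):
        # Page 1 is the bare archive URL; later pages get /page/N/.
        return base_url if n == 1 else f"{base_url}page/{n}/"

    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return tags

# Page 2 of 3 links both backward and forward; page 1 only forward.
print(pagination_link_tags("http://noahsdad.com/resources/", 2, 3))
```

The plugins mentioned below generate exactly this kind of markup in the document head, so you normally never write it by hand.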
-
Yes, I have about 60 404's and 403's I'm trying to correct...
Thanks for the feedback by the way.
-
I've never used Wordpress but does this help?
http://www.malcolmcoles.co.uk/blog/avoid-duplicate-meta-descriptions-in-pages-2-and-higher-of-the-wordpress-loop/
It's strange how it's possible to add canonical page numbers but not add the same thing to the title tag, I think.
-
You look like you're doing a good job, you even have unique text content for each video on the pages, so I can't see why they're flagging as duplicates. Is this in the SEOmoz software? That takes into account the whole structure of the page rather than just the content. Like Alan says, add the page number to the title tag if possible, though I'd add it at the beginning of the tag - it just helps show the search engines that page 1 is the most important.
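The title suggestion above is easy to sketch. This is a hypothetical helper, not WordPress code (there you would filter the document title instead); the sample title string is made up for illustration.

```python
def paginated_title(base_title, page):
    """Give each page of a paginated archive a unique <title> by
    prefixing the page number, as suggested above (hypothetical helper)."""
    if page <= 1:
        return base_title  # page 1 keeps the clean, unprefixed title
    return f"Page {page} - {base_title}"

print(paginated_title("Resources | Noah's Dad", 2))
# Prints: Page 2 - Resources | Noah's Dad
```

Putting "Page N" at the front keeps every archive page's title unique while leaving page 1, the page you want to rank, untouched.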
P.S. this is still a good article a couple of years later: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
-
That's why I said if it is difficult then I would not worry.
I would not noindex them.
If you had unique titles, you might rank a bit better, but you are not going to get punished if they don't. If you noindex them, though, you are punishing yourself.
Not only do noindexed pages not appear in search results, but any link pointing to them is wasting link juice.
-
I'm not sure how you would give the author pages different titles on a Wordpress powered site...
Should I check some of the no index settings within the plugin?
-
OK, then yes, try to give them unique page titles, even adding "page 2" on the end. If this is difficult to do, then I would not worry too much about it.
-
On my reports they show up as duplicate page titles...
-
Maybe I am not understanding you, but these pages don't appear to be duplicates to me.