Need help with some duplicate content.
-
I have some duplicate content issues on my blog I'm trying to fix. I've read lots of different opinions online about the best way to correct it, but they all contradict each other. I was hoping I could ask this community and see what the consensus was.
It looks like my category and page numbers are showing duplicate content. For instance when I run the report I see things like this:
http://noahsdad.com/resources/
http://noahsdad.com/resources/page/2/
http://noahsdad.com/therapy/page/2/
I'm assuming that is just the categories that are being duplicated, since the page numbers only show on the report at the end of a category.
What is the best way to correct this? I don't use tags at all on my blog; I use categories instead. I also use the Yoast SEO plugin, and I have a check mark in the box that disables tags. However, it says: "If you're using categories as your only way of structure on your site, you would probably be better off when you prevent your tags from being indexed."
There is a box that allows you to disable categories also, but the description above makes it seem like I don't want to block both tags and categories.
Any ideas what I should do?
Thanks.
-
I didn't mention "prev" and "next" as they are already implemented in the head tag, would you add them directly to the links as well? Also, I think Google is the only search engine that supports them at the moment.
-
Gianluca is correct. rel="prev"/"next" would work here, but I thought it would be too confusing; I didn't know there were plugins that can do this for you. Also, this makes page one rank for all of the content, which may confuse users when they don't find the content they searched for on that page. So technically it would work, but I'm not sure it's the right solution for the user; it works best for a single article split over many pages.
-
The correct answer to your kind of issue, which is related to pagination, is this one: use the rel="prev" and rel="next" tags. These are the tags Google suggests using in order to specify that a set of pages is paginated, so it will just consider the first one. Check these links: http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html http://googlewebmastercentral.blogspot.com.es/2012/03/video-about-pagination-with-relnext-and.html http://www.seomoz.org/q/need-help-with-some-duplicate-content There are several plugins for WordPress that implement this solution.
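For anyone finding this later, here's a minimal sketch of what that markup looks like (the URLs follow the noahsdad.com pattern from the question; a plugin's actual output may differ slightly):

```html
<!-- In the <head> of http://noahsdad.com/resources/page/2/ -->
<link rel="prev" href="http://noahsdad.com/resources/" />
<link rel="next" href="http://noahsdad.com/resources/page/3/" />
```

The first page of the series gets only rel="next", and the last page gets only rel="prev".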
-
Yes, I have about 60 404's and 403's I'm trying to correct...
Thanks for the feedback by the way.
-
I've never used WordPress, but does this help?
http://www.malcolmcoles.co.uk/blog/avoid-duplicate-meta-descriptions-in-pages-2-and-higher-of-the-wordpress-loop/
It's strange that it's possible to add canonical page numbers but not to add the same thing to the title tag, I think.
-
It looks like you're doing a good job; you even have unique text content for each video on the pages, so I can't see why they're being flagged as duplicates. Is this in the SEOmoz software? That takes into account the whole structure of the page rather than just the content. Like Alan says, add the page number to the title tag if possible, though I'd add it at the beginning of the tag; it just helps show the search engines that page 1 is the most important.
P.S. this is still a good article a couple of years later: http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
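To illustrate that title-tag suggestion (the titles here are made up, not taken from the actual site), paginated titles could look something like:

```html
<!-- Page 1 keeps the normal title -->
<title>Resources | Noah's Dad</title>
<!-- Deeper pages add the page number near the front -->
<title>Page 2 - Resources | Noah's Dad</title>
```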
-
That's why I said if it's difficult, then I wouldn't worry.
I would not no-index them.
If you had unique titles, you might rank a bit better, but you're not going to be penalized if they don't. If you no-index them, though, you're punishing yourself:
not only do no-indexed pages not appear in search results, but any link pointing to them is wasting link juice.
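For context, "no-indexing" here means adding a robots meta tag to the paginated category pages, which Yoast can do. It would look something like this (shown only for reference, since the advice above is not to use it):

```html
<meta name="robots" content="noindex, follow" />
```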
-
I'm not sure how you would give the author pages different titles on a WordPress-powered site...
Should I check some of the no index settings within the plugin?
-
OK, then yes, try to give them unique page titles, even just adding "page 2" on the end. If this is difficult to do, then I wouldn't worry too much about it.
-
On my reports they show up as duplicate page titles...
-
Maybe I am not understanding you, but these pages don't appear to be duplicates to me.