Duplicate Titles for Large Lists
-
Our blog (www.cowleyweb.com/blog) recently got topic categories so we can get more use out of our older posts. Otherwise, users would only see what's new and never look back (our posts are organized by the month they were published), and all that hard work would eventually go to waste.
So we came up with a few topics (e.g. social media, internet marketing) and added those as tags on our posts. Now users can click a topic and get a results page on our blog listing all the previously published posts related to that topic. Sounds great.
But it's hurting our SEO crawl report. If a topic list runs beyond one page of results, the second and subsequent pages get dinged as "duplicate title" because they all share the same title (e.g. "Social Media"). How can I fix this?
I'm not the web designer, but something tells me some sort of title tag that says "Page 2" or similar would do the trick. We use Drupal, which is good for customization.
I assume tons of bloggers and websites have dealt with this problem.
Please help. Want to give the web guy some solutions.
Thank you.
-
I have never used it myself, but try implementing the Smart Paging module together with Tokens: http://drupal.org/project/smart_paging
-
I would suggest going into Analytics, segmenting by organic search traffic, and seeing whether anyone has landed on those pages from search results in the last 2-3 months. If Google is not returning them in search results and they are not bringing traffic, it's usually best to clean pages out of the index that don't need to be there.
If you don't want to noindex them, you can add "Page 2", "Page 3", etc. to the title tags to eliminate the duplicate title errors in the crawl report.
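For your developer's reference, here is a minimal sketch of one way to do that in a Drupal 7 theme. The theme name "mytheme" and the " - Page N" suffix wording are placeholders, and a contributed module such as Metatag may offer a configuration-only route to the same result.

<?php
/**
 * Implements hook_preprocess_html() in the active theme's template.php.
 * Appends a page number to the <title> of paginated listing pages so
 * page 2, 3, ... of a tag or category list no longer share one title.
 */
function mytheme_preprocess_html(&$variables) {
  // Drupal's pager passes the current page in the "page" query parameter,
  // zero-based, so ?page=1 is the second page of results.
  if (isset($_GET['page']) && is_numeric($_GET['page'])) {
    $page_number = (int) $_GET['page'] + 1;
    if ($page_number > 1) {
      $variables['head_title'] .= ' - Page ' . $page_number;
    }
  }
}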
-Dan
-
I talked to our web designer. He said he's nervous about noindexing, as if Google will get suspicious and it will hurt more than help. I don't know what to say to that.
-
Hey There
Sounds like you are all set. I just want to add that the type of page you're referring to (page/2, etc.) is a "subpage", and you'll want to look into noindexing those as well, in addition to tag and category archives. That should also fix the errors you're seeing in the Moz report.
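Here's a rough sketch of what that could look like in a small custom Drupal 7 module; the module name "seo_noindex" is a placeholder, and the Metatag module is a configuration-based alternative your developer may prefer.

<?php
/**
 * Implements hook_init() in a placeholder custom module ("seo_noindex").
 * Marks page 2 and beyond of any paginated listing as noindex,follow so
 * crawlers drop the subpages from the index but still follow their links.
 */
function seo_noindex_init() {
  if (isset($_GET['page']) && (int) $_GET['page'] > 0) {
    drupal_add_html_head(array(
      '#tag' => 'meta',
      '#attributes' => array(
        'name' => 'robots',
        'content' => 'noindex, follow',
      ),
    ), 'seo_noindex_robots');
  }
}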
-Dan
-
Thanks guys! That's awesome. Forwarding to our developer.
-
It's usually suggested that tag archives and category archives be set to noindex, which will help alleviate this issue.
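Building on the placeholder module sketched above, the same hook could be extended so that standard Drupal taxonomy term (tag/category) listings are covered as well; the taxonomy/term/% path is an assumption that holds for a default Drupal 7 setup.

<?php
/**
 * Extends the placeholder seo_noindex_init() sketch above so taxonomy term
 * (tag/category) listing pages get noindex,follow as well, not only the
 * paginated subpages.
 */
function seo_noindex_init() {
  $is_subpage = isset($_GET['page']) && (int) $_GET['page'] > 0;
  // Default Drupal tag/category listings live at taxonomy/term/%.
  $is_term_page = arg(0) == 'taxonomy' && arg(1) == 'term' && is_numeric(arg(2));
  if ($is_subpage || $is_term_page) {
    drupal_add_html_head(array(
      '#tag' => 'meta',
      '#attributes' => array('name' => 'robots', 'content' => 'noindex, follow'),
    ), 'seo_noindex_robots');
  }
}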
-
This sounds like a Drupal/CMS issue. If Drupal doesn't have a fix natively, I'm sure a third-party developer or contributed module has a solution.
Related Questions
-
Fixing my site's problem with duplicate page content
My site has a problem with duplicate page content. SEOmoz is telling me 725 pages' worth. I have looked a lot into the 301 redirect and the rel=canonical tag, and I have a few questions. First of all, I'm not sure which one I should use in this case. I have read that the 301 redirect is the most popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it change automatically within the redirect when I plug in the old URL and the new one? Also, do I need to go to each page that SEOmoz flags as a duplicate and create a redirect for that page? One thing I am very confused about is that some of these listed duplicates are actually different pages on my site. Does this just mean the URLs are too similar to each other and therefore need the redirect to fix them? Then, on the other hand, I have a login page that says it has 50 duplicates. Would this be a case where I would use the canonical tag, putting it into each duplicate so that the search engine knew to go to the original file? Sorry for all of the questions. Thank you for any responses.
-
Should I not use hyphens in web page titles? Is there a Google penalty for hyphens?
All the page titles on my site have hyphens between the words, like this: http://texas.com/texas-plumbers.html. I have seen tests where hyphenated domain names ranked lower than non-hyphenated domain names. Does this mean my pages are being penalized for hyphens, or does the penalty apply only to the domain? If I create new pages, should I avoid hyphens in the page titles when there are two or more words in the title? If I changed all my page titles to eliminate the hyphens, I would lose all my rankings, correct? My site is 12 years old, and if I changed all these titles I'm guessing each page would be thrown into the Google sandbox for several months. Is this true? Thanks, mozzers!
-
Parallax, SEO, and Duplicate Content
We are working on a project that uses parallax to provide a great experience to the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize for, and multiple pages with the parallax function built in. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function. Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the right section based on that hashbang. www.example.com/About < This is its own page. www.example.com/about/#/history < This is a subpage that you scroll to on the About page. We are trying to decide the best method for optimizing each subpage, but my current concern is that because each subpage is really part of the primary page, will all those URLs be seen as duplicate content? Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them in the sitemap, but I don't want Google to think I'm being disingenuous by providing links that don't exist solely for the purpose of SEO, though truthfully all of the content exists and is available to the user. I know a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
-
Pointless copy on product list pages makes me feel compromised...
When working on ecommerce websites, we insist that product list pages need at least 250 words of copy optimized for our keyword phrase, let's say "17 inch bike frames". So we have some crappy copy written that goes something like this: "We have a great 17 inch bike frame for you, whatever your requirement. Take a look at the frames below .... blah blah blah totally pointless text blah blah blah........." This text is of no use to the user, as the page is merely a means of getting them to a suitable product page. However, the copy is pretty essential if we want to rank well for "17 inch bike frames", and not having copy on product list pages could land us in hot water with Panda, especially if we have lots of them on a site using the same page template with no copy on them. Does anyone else feel uneasy about adding this crappy text to pages? It's only there for search engines, and that is something Google says we shouldn't do, but I know for sure they're not going to rank me as well if I don't have it. I'd be interested to hear other people's opinions on this. It's always annoyed me. Does anyone have any good tips for making this type of copy on product list pages less forced and crappy?
-
How does SEOmoz calculate duplicate content?
First of all, I have too much duplicate stuff on my website and am cleaning it up. But when I look at GWMC, the amount of duplicate stuff is a lot less than in SEOmoz. Can someone explain the difference to me? Thanks, Leonie.
-
Getting a lot more duplicate content warnings than I expected.
I run WordPress on many of my sites, and a site crawl has found MANY duplicate content pages on the latest domain I started a campaign for. I expected to see quite a lot on the tag pages that only had one post, but even tag pages with multiple posts, and author and category pages with many posts, are showing as duplicate content. Is it normal for a WordPress site to have so many duplicate content warnings from the taxonomy pages? I have the option to bulk noindex, follow the category and tag pages, but should I do it? I get some traffic directly to the tag pages, so removing those pages from search results would dent the site's traffic a little (generally high-bounce-rate, low-engagement traffic anyway), but could removing the apparent duplicate content actually improve the article pages themselves? Or does anyone have any WordPress-specific advice for making the pages not duplicate content? I've toyed with the idea of just displaying excerpts, but that means creating manual excerpts for four years' worth of posts, some covering subject matter I have no personal knowledge of, so other suggestions are welcome.
-
Hi everybody. I have a large site that is made up of the main site and a large support site. The support site has a lot of overlapping content and similar titles. Would it be beneficial to separate the two? Thank you. All answers appreciated.
-
Page Titles or Search-Friendly URLs?
We are currently auditing our website as part of our SEO strategy. One item which has come up is the importance of search-friendly URLs versus search-friendly page titles. Do URLs or page titles carry more relevance than the other in search engines? Obviously the ideal would be to have both to maximize search impact, but does either carry more importance? Thanks