Duplicate content
-
The report shows duplicate content for a category page that has more than one page.
How can we avoid this, since I cannot create different meta content for the second page of the category?
http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html
http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html?page=2
thanks,
Madlena
-
I'm not seeing that Google is currently indexing either of these pages, so they may be too deep or duplicated in other ways. Pagination is a tough issue, but in general pages 2+ have little or no search value (and, post-Panda, can actually harm you).
I would strongly recommend NOT pointing a canonical tag at page 1 - Google generally advises against this. You can use rel="prev"/"next", although it's a bit tough to implement and isn't honored by Bing. Generally, I'd advise one of two things:
(1) META NOINDEX, FOLLOW pages 2, 3, etc. - they really have no SEO value.
(2) If you have a View All page, link to it and rel-canonical to view all. This seems to be accepted by Google, but then the larger page will rank.
Generally, I find (1) easier and pretty effective.
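For what it's worth, option (1) is just a robots meta tag in the <head> of pages 2, 3, etc. - a minimal sketch (your platform may already have a setting that adds this for you):
<meta name="robots" content="noindex, follow">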
Sorry - just saw Nakul's comment, and didn't realize you already have canonical tags in place. While it's not the preferred solution, since it's already there and seems to be keeping these pages out of the index, I'd probably leave it alone. It doesn't look like Google is indexing these pages at all right now, though, which you may need to explore in more depth.
-
I see that you already have canonical tags in place. Whether I'm on either of the 2 URLs you posted, or on http://www.geographics.com/?cPath=183_66_327_387&custom_perpage=48 or http://www.geographics.com/?cPath=183_66_327_387&custom_perpage=24, they all point to the same canonical page. That will help you avoid any possible duplicate content penalty, because you are passing a directive to Google telling them what the correct URL is, so they only rank the canonical URL in the SERPs. IMO, you are good. You can, however, take it to the next level if needed by implementing rel="prev"/"next" and testing it to see if that helps.
-
I am not entirely sure if this will prevent the duplicate content issue, but you could try setting up rel="prev"/"next" on the paginated pages to make it explicit that they are paginated content, and then change the rel="canonical" on the individual pages to point to themselves instead of the index page.
If it's the rel="canonical" causing the confusion, that should help.
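A sketch of what the <head> of page 2 might then contain, assuming the category URLs from the question (the ?page=3 URL is hypothetical - point rel="next" at whatever the following page actually is, and omit it on the last page):
<link rel="canonical" href="http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html?page=2" />
<link rel="prev" href="http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html" />
<link rel="next" href="http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html?page=3" />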
-
You will need to use the rel="canonical" link tag in your index.html file.
In the <head> section of your index.html file, include a canonical link tag pointing at the preferred version of the page.
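For example, something like this (a sketch using the page-1 category URL from the question as the canonical target):
<link rel="canonical" href="http://www.geographics.com/2-Cool-Colors-Poster-Board-14x22/c183_66_327_387/index.html" />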
It will solve your problem. Good luck!
Related Questions
-
How to deal with this duplicate content
Hello, our websites offer prayer times in the US and UK. The problem is that we have nearby towns where the prayer times are the same, so the pages (e.g. https://prayer-times.us/prayer-times-lake-michigan-12258-en and https://prayer-times.us/prayer-times-lake-12147-en) are duplicates. Same issue for this page: https://prayer-time.uk/prayer-times-wallsend-411-en. How can we solve this problem?
On-Page Optimization
-
Unique Pages with Thin Content vs. One Page with Lots of Content
Is there anyone who can give me a definitive answer on which of the following situations is preferable from an SEO standpoint for the services section of a website?
1. Many unique and targeted service pages with the primary keyword in the URL, title tag and H1 - but with the tradeoff of having thin content on the page (i.e. 100 words of content or less).
2. One large service page listing all services in the content. The primary keyword for the URL, title tag and H1 would be something like "(company name) services" and each service would be in an H2 title. In this case, there is lots of content on the page.
Yes, the ideal situation would be to beef up content for each unique page, but we have found that this isn't always an option based on the amount of time a client has dedicated to a project.
On-Page Optimization
-
Duplicate content on events site
I have an event website, and for every day an event runs the event has a page. For example: the Oktoberfest in Germany takes 16 days, so my site would have 16 (almost) identical pages about the Oktoberfest (same text, address, photos, contact info). The only difference between the pages is the date mentioned on the page. I use rich snippets. How does Google treat my pages, and what is the best practice?
On-Page Optimization
-
Acquired Old, Bad Content Site That Ranks Great. Redirect to Content on My Site?
Hello. My company acquired another website. This website is very old and the content within is decent at best, but it still manages to rank very well for valuable phrases. Currently, we're leaving the entire site active on its own for its brand, but I'd like to at least redirect some of the content back to our main website. I can't justify spending the time to create improved content on that site and not our main site, though. What would be the best practice here?
1. Cross-domain canonical - and build the new content on our main website?
2. 301 redirect the old article to the new location containing the better article.
3. Leave the content where it is - you won't be able to transfer the ranking across domains.
Thanks for your input.
On-Page Optimization
-
Duplicate Content - Category Pages 2+
I have my WordPress SEO settings set to deindex everything past page 1 of each category. However, Google Webmaster Tools is telling me I have 210 pages with duplicate title tags. My site tanked last weekend and I don't know if it was Google Panda or what. I have been getting some fantastic backlinks, and it seems like they just decided to disregard all of them, as I am completely off the SERPs. Is this duplicate content a contributing factor? How can I get Google to deindex my category pages past page 1? (I do need the first page to index, as that does bring me organic traffic.) Thanks.
On-Page Optimization
-
What is the best way to manage industry required duplicate Important Safety Information (ISI) content on every page of a site?
Hello SEOmozzer! I have recently joined a large pharmaceutical marketing company as their head SEO guru, and I've encountered a duplicate-content-related issue here that I'd like some help on. Because there is so much red tape in the pharmaceutical industry, there are A LOT of limitations on website content, medication and drug claims, etc. Because of this, it is required to have Important Safety Information (ISI) clearly stated on every page of the client's website (including the homepage). The information is generally pretty lengthy, and in some cases is longer than the non-ISI content on the page. Here is an example: http://www.xifaxan.com/ - all content under the ISI header is required on each page. My questions are:
How will this duplicated content on each page affect our on-page optimization scores in the eyes of search engines?
Is Google seeing this simply as duplicated content on every page, or are they "smart" enough to understand that because it is a drug website, this is industry standard (and required)?
Aside from creating more meaty, non-ISI content for the site, are there any other suggestions you have for handling this potentially harmful SEO situation?
And in case you were going to suggest it, we cannot simply use an image of the content, as it may not be visible to all internet users. We've already looked into that 😉 Thanks in advance! Dylan
On-Page Optimization
-
Crawl Diagnostics - Duplicate Content and Duplicate Page Title Errors
I am getting a lot of duplicate content and duplicate page title errors from my crawl analysis. I'm using Volusion, and it looks like the photo gallery is causing the duplicate content errors. Both are sitting at 231, which shows I have done something wrong...
Example URLs:
Duplicate Page Content: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601
Duplicate Page Title: http://www.racquetsource.com/PhotoGallery.asp?ProductCode=001.KA601
Would anyone know how to properly disallow this? Would it be as simple as a robots.txt entry, or something a little more involved within Volusion? Any help is appreciated. Cheers, Geoff B. (a.k.a. newbie)
On-Page Optimization
-
Cross Domain Duplicate Content
Hi, my client has a series of websites: one main website and several mini websites. Articles are created and published daily and weekly; one will go on the main website and the others on one, two, or three of the mini sites. To combat duplication, I only ever allow one article to be indexed (I apply noindex to the articles that I don't want indexed by Google, so if 3 sites have the same article, 2 sites will have the noindex tag added to the head). I am not completely sure if this is OK, or whether there are any negative effects, apart from the articles tagged as noindex not being indexed. Are there any obvious issues? I am aware of the canonical link rel tag, and know that this can be used on the same domain, but can it be used cross-domain, in place of the noindex tag? If so, is it exactly the same in structure as the 'same domain' canonical link rel tag? Thanks, Matt
On-Page Optimization