Best practice to solve this duplicate page content issue?
-
I just got Seomoz Pro (it's awesome!), and when I did a campaign for my website I discovered that I have a big issue with duplicate page content (as well as titles).
The Crawl Diagnostics Summary told me I have 196 crawl errors (out of a total of 362 pages crawled on my site), and as many as 160 of these were duplicate page content. That sounds like a big problem to me; correct me if I'm wrong (I'm very new to SEO).
So our website is an ecommerce store that sells greeting cards. The unique part about our platform is that we let customers customize the cards.
Let me walk you through each step a customer takes so you fully understand:

1) They find a card they like and visit the product page of that card (just like on any ecommerce store).

2) They then decide they want to buy it. There is no "Add to cart" button; instead they click a "Customize the card" button.

3) This takes them to a step-by-step process of customizing the card. They change the name on the front of the greeting card so it says, for example, "Happy Birthday Katy!", and then add a personal text on the inside of the card.

4) They then add a delivery address and choose when it should be delivered. After that they proceed to checkout and it's all done.
This is my website (it's in Swedish): loveday.se - the link takes you to a product page, so you can click the green button and see what I mean by the customization pages. Hopefully that helps even though it's in Swedish.
My issue starts at the customization part of the site (step 3 above), as I can see from the permalinks in the diagnostics I got.
This step-by-step process looks exactly the same for every card in the store: same call-to-action headline, same descriptive text, etc. The only difference is the JPEG file with the unique greeting card design. So, what is your take on this? Let me know if I was unclear about something.
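To illustrate why these pages register as duplicates: once the markup is stripped away, the one thing that differs between them (the card JPEG) disappears entirely, and the remaining text is identical. A minimal sketch with made-up page snippets (the HTML below is hypothetical, not the site's actual markup):

```python
import re

def page_fingerprint(html: str) -> str:
    """Reduce a page to the text a crawler would compare: strip tags,
    collapse whitespace, lowercase. Image file names live inside tags,
    so two cards that differ only by JPEG end up identical."""
    text = re.sub(r"<[^>]+>", " ", html)           # drop all markup
    return re.sub(r"\s+", " ", text).strip().lower()

# Two hypothetical customization pages: same copy, different card image.
page_a = '<h1>Customize your card</h1><img src="/img/168.jpg"><p>Add a name</p>'
page_b = '<h1>Customize your card</h1><img src="/img/145.jpg"><p>Add a name</p>'

print(page_fingerprint(page_a) == page_fingerprint(page_b))  # True: duplicates
```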
Any help or advice is greatly appreciated.
-
Ahh, I see! Thanks a lot. Really appreciate it.
I also found, from reading one of evolvingSEO's blog posts, that I could check my Google Webmaster Tools account for any duplicate content reports to see if Google had found any.
There were no reports on this, so I guess it could be Roger crawling pages that Google doesn't? But I can see from viewing my source code that the code snippet you suggested I add isn't there.
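For anyone else verifying this, checking a page's source by eye is error-prone; a small script can parse the HTML and report whether a robots meta tag with "noindex" is actually present. A sketch using only the standard library (feed it the HTML you fetched for each customization URL):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content attribute of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "robots":
            self.robots.append(d.get("content", ""))

def has_noindex(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in content.lower() for content in finder.robots)

print(has_noindex('<head><meta name="robots" content="noindex,follow" /></head>'))  # True
print(has_noindex('<head><title>Card</title></head>'))                              # False
```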
I will get back when I know if it's been solved or not for sure!
Thanks again.
-
I see what you mean. Here's what you do for these particular pages.
Since these have no real value as search engine landing pages (they're basically all the same), Google won't want to send people to them. Seems reasonable, right?
But because your site has a whole lot of these, Google may also decide that loveday.se as a whole is serving content with a high percentage of non-useful pages. That's an indicator of an overall low-quality site, and it really started to become an issue with the first "Panda" update. So, for each of these particular pages, you want to add a tag to your HEAD section:
<meta name="robots" content="noindex,follow" />

We tell Google "noindex" because we don't want these pages in their index (really, they don't want them either, so everyone is happy); they're terrible landing pages for a search engine. We tell Google "follow" because the pages these link to are still of value, and we want Googlebot to continue crawling and crediting the internal links on your site.
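Server-side, the tag only needs to go out on the customization URLs, which all share one path prefix (taken from the sample URLs in this thread). A minimal sketch of that decision; the helper name and the default "index,follow" tag are assumptions about how your templates are wired, not a known part of your platform:

```python
# Paths that host the step-by-step customization flow; everything
# else on the site stays indexable.
NOINDEX_PREFIXES = ("/personifering/",)

def robots_meta_for(path: str) -> str:
    """Return the robots meta tag to emit in <head> for a URL path."""
    if path.startswith(NOINDEX_PREFIXES):
        return '<meta name="robots" content="noindex,follow" />'
    return '<meta name="robots" content="index,follow" />'

print(robots_meta_for("/personifering/168/julkortshang"))
# <meta name="robots" content="noindex,follow" />
print(robots_meta_for("/kort/168/julkortshang"))
# <meta name="robots" content="index,follow" />
```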
-
When looking at this link: http://www.loveday.se/personifering/1/utan-facebook
I get these sample URLs (It says it's a total of 50 duplicate URLs):
http://www.loveday.se/personifering/168/julkortshang
http://www.loveday.se/personifering/145/far-motherfucker
http://www.loveday.se/personifering/123/prispokal
http://www.loveday.se/personifering/136/gravitation
http://www.loveday.se/personifering/63/fing-love-you
I'd say that out of all 160 duplicate content pages, 99.9% of them share the link path http://www.loveday.se/personifering/... which is the customization page.
-
Could you provide a few sample URLs that SEOmoz Pro claims contain duplicate content? It should show you them if you click on the error, then click on the individual links.