Duplicate Content Brainstorming
-
Hi,
New here in the SEO world. Excellent resources here. We have an ecommerce website that sells presentation templates. Today our templates come in three flavours: for PowerPoint, for Keynote, and for both (called Presentation Templates). So we've ended up with three URLs with similar content: same screenshots, similar descriptions.
Example:
https://www.improvepresentation.com/keynote-templates/social-media-keynote-template
https://www.improvepresentation.com/powerpoint-templates/social-media-powerpoint-template
https://www.improvepresentation.com/presentation-templates/social-media-presentation-template
I know what you're thinking. Why not make one page per template and offer three download options, right? But what about
https://www.improvepresentation.com/powerpoint-templates/
https://www.improvepresentation.com/keynote-templates/
These are powerful URLs, in my opinion, considering that the strongest keyword in our field is "powerpoint templates".
How would you solve this "problem"? Or maybe there is no problem at all.
-
Thanks for your in-depth explanation. Siteliner shows:
Duplicate Content: 24%
Common Content: 58%
Unique Content: 19%
We will probably opt for the first strategy.
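For anyone wondering how overlap figures like Siteliner's are derived, here is a rough sketch using only Python's standard library. The page texts below are made up for illustration; real tools compare rendered page content across the whole site, so this is only the core idea, not Siteliner's actual method.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return the fraction of matching text between two page bodies (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical product descriptions for the same template on two URLs.
powerpoint_page = "Social Media template with 50 editable slides and an icon set."
keynote_page = "Social Media template with 50 editable slides and an icon set."

# Identical bodies score 100% -- exactly the situation described above.
print(f"{similarity(powerpoint_page, keynote_page):.0%}")  # prints "100%"
```

If the only difference between two pages is a one-paragraph description, the ratio stays close to 1.0, which is why search engines may still treat the pages as duplicates.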
-
Hello,
This is a very common challenge in ecommerce.
You have a couple of options to consider.
1. Keep the existing format but add more unique content to each page based on the type. For example, on the PowerPoint template page you could add at least a paragraph about PowerPoint and how this template works inside the program, plus a video of you downloading it, opening it, and modifying it with some test data.
2. As you pointed out, one page with three different download options. You can even use this option to up-sell a bundle that includes all three.
As it is now, Google is going to pick one of those pages to index and essentially ignore the others as duplicate content. You may actually have further issues throughout the site due to the sheer volume of similar content on each page: the bulk of each page's content is exactly the same, barring the item's description. You may need to evaluate how you present all the other info on these pages (how it works, what others are saying, what's in this template, what's included, etc.), since all of it leads to many pages with a lot of the exact same content. If your other pages display all this info as well, a one-paragraph description may not be enough to make each page unique to a search engine.
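Rather than letting Google pick a version on its own, you can tell it which one you prefer with the canonical link element. A minimal sketch, assuming the PowerPoint URL is chosen as the preferred version (which URL to prefer is a business decision; the PowerPoint one is used here only because it targets the strongest keyword):

```html
<!-- Placed in the <head> of the Keynote and Presentation variants,
     this points search engines at the preferred PowerPoint URL.
     The variants remain usable by visitors; only ranking signals
     are consolidated onto the canonical page. -->
<link rel="canonical"
      href="https://www.improvepresentation.com/powerpoint-templates/social-media-powerpoint-template" />
```

Note that this consolidates the product pages without touching the /powerpoint-templates/ and /keynote-templates/ category pages, which can keep ranking on their own.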
You can review your site's duplicate content with Siteliner.
Again, this is a big challenge for most ecommerce sites, especially those with a lot of products. For smaller sites with fewer products, however, it is important to focus on each one of those products and write unique, compelling, informative content that sets you apart from the larger sites.
Hope this helps,
Don