Query-based site; duplicate content; SEO juice flow.
-
Hi guys,
We're planning to start a SaaS-based service where we'll be selling different skins. Let's say WordPress themes, though it's not exactly that. Say we have a URL at site.com/ and we'd like to direct all SEO juice to the main landing page /best-wp-themes/, but then have that juice flow on to our additional pages:
/best-wp-themes/?id=Mozify
/best-wp-themes/?id=Fiximoz
/best-wp-themes/?id=Mozicom
Challenges:
1. Our content would be formatted like this:
a. Same content - features
b. Same content - price
c. Different content - each theme will have its own set of features / design specs.
d. Same content - testimonials.
How would we go about not being penalised by search engines for the duplicate content, but still have the /?id=whatever pages indexed with proper content?
2. How do we go about making sure SEO juice flows to the /?id pages too? Basically it's the same thing with different skins.
Thanks for the help!
-
No problem. And there are ways around presenting the same content differently. It's hard to be specific without knowing what the product is, but it's something I've had to solve for various clients, whether for software or physical products.
-
Dan,
Thanks for taking the time to address this question - now it's all very clear. The problem wasn't that we don't want to invest the time, but rather that there are only so many ways you can present the same overall thing. It's clear now that there's no quick-fire solution for this. Thanks again for the help.
Andy
-
So you want to rank at the top of search results and get people to pay for your products, but you can't invest the time to write some decent content for 20 pages?
You don't get penalised for duplicate content as such - what generally happens is that all but one result gets filtered out, or the site is seen as low quality and none of the pages rank at all. You can canonicalise all the pages to one main page, which will have a better chance of ranking, and that tells Google you know it's duplicate content. But you'll only ever have that canonical version ranking - if that's preferable to investing in unique content, either by writing it yourself or outsourcing it to someone who can, then that's the route to go.
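To illustrate the canonical option, a sketch using the example URLs from the question (assuming HTTPS on site.com): each query-parameter page would carry a tag like this in its head,

  <link rel="canonical" href="https://site.com/best-wp-themes/" />

so /best-wp-themes/?id=Mozify, /best-wp-themes/?id=Fiximoz and the rest all point back to the main landing page. Bear in mind the trade-off described above: pages canonicalised to the landing page won't normally be indexed or rank in their own right.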
-
Hi Dan,
Thanks for the answer, however it's still not entirely clear. What exactly do you mean by "beef up unique features"? To try and "rewrite" them in a different way for each product? That may not be feasible with over 20 different products to present. So is there any way of telling Google that we "know" it's duplicate content, so we don't get penalised for it?
Thanks for everything else - it cleared things up big time for us.
-
Basically, beef up the unique set of features and design specs as much as possible. You could also hopefully build up unique testimonials pretty quickly, or at least use different testimonials on different products, so you're not duplicating them across every product page.
In terms of 'SEO juice', why not just link to the individual pages as appropriate, as much as possible? That will still lift the overall homepage in the same way you're suggesting, but without attempting 'PageRank sculpting', which has become less and less effective over the years. Also try to keep links and navigation on the homepage that don't point to your product pages to a minimum. It's fine to nofollow links like a privacy policy or terms and conditions (see the example below), but anything beyond that gets into behaviour that risks a penalty.
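As a quick sketch of that, nofollow is just an attribute on the link itself (the paths here are hypothetical):

  <a href="/privacy-policy" rel="nofollow">Privacy Policy</a>
  <a href="/terms" rel="nofollow">Terms and Conditions</a>
  <a href="/best-wp-themes/?id=Mozify">Mozify</a>

The first two housekeeping links pass nothing on; the product link stays a normal, followed link so it still receives equity.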