Duplicate Content for Multiple Instances of the Same Product?
-
Hi again! We're set to launch a new inventory-based site for a chain of car dealers with various locations across the Midwest. Here's our issue:
The different branches have overlap in the products that they sell, and each branch is adamant that their inventory comes up uniquely in site search.
We don't want the site to get penalized for duplicate content; however, we don't want to implement a link rel=canonical because each product should carry the same weight in search.
We've talked about having a base URL for each product description and canonicalizing every branch's instance of the inventory to that main product page, but that doesn't really fit the site structure.
Do you have any tips on how to ensure that these products (same description, new product from manufacturer) won't be penalized as duplicate content?
-
Yeah, no argument there. I worry about it from an SEO standpoint, but sometimes there really isn't a lot you can do from a business standpoint. I think it's occasionally worth a little fight, though - sometimes, when all the dealers want to have their cake and eat it, too, they all suffer (at least, post-Panda). Admittedly, that's a long, difficult argument, and you have to decide if it's worth the price.
-
okunen,
When you say "overlap in the products that they sell," do they have two identical franchises (e.g., two Toyota stores on opposite sides of the same city), or are they wanting to share pre-owned inventory across multiple sites?
-
Dr. Pete, I think we are on the same page. The reason I say don't worry too much about duplicate content when it comes to dealer inventory is that, for the most part, it's out of your control in the automotive industry. Dealers want their inventory on as many sites as they can, so it becomes virtually impossible to control.
-
I have to disagree with Mike a bit - this is the kind of situation that can cause problems, and I think the duplication across the industry actually makes it even more likely. Yes, the big players can get away with it, and Google understands the dynamic to some degree, but if you have a new site or smaller brand, you could greatly weaken your ranking ability. You especially have to be careful out of the gate, IMO, when your authority is weak.
To be fair, I'm assuming you're a small to mid-sized player and not a major brand, so if that's an incorrect assumption, let me know.
There aren't many have-your-cake-and-eat-it-too approaches to duplicate content in 2013. If you use rel=canonical, NOINDEX, etc., then some version of the page won't be eligible for ranking. If you don't, then the pages could dilute each other or even harm the ranking of the overall site. Each product won't "carry the same weight in search" - if you don't pick, Google will, and your internal site architecture and inbound link structure are always going to weight some pages more highly than others. Personally, I think it's better to choose than to have the choice made for you (which is usually what happens).
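For reference, the two tags mentioned above look like this (the URL is a placeholder - you'd point it at whichever version you pick as the preferred one):

```html
<!-- On a duplicate branch page, declaring the preferred version: -->
<link rel="canonical" href="https://www.example.com/inventory/2013-toyota-camry" />

<!-- Or, to keep a duplicate page out of the index entirely: -->
<meta name="robots" content="noindex, follow" />
```

Either way, only one version of the page remains eligible to rank - which is exactly the trade-off you're wrestling with.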
I'd also wonder if this structure is really that great for users - people don't want to happen across nine versions of the same page that differ only by branch. The branch is your construct, not theirs, and it's important to view this from the visitor's perspective.
Unfortunately, I don't understand the business/site well enough to give you a great alternative. Is there a way to create a unified product URL/page, but still give the branch credit when a visitor hits the product via their sub-site? For example, you could cookie the visitor and then show the branch's template (logo, info, etc.) at the top of the page, but still keep one default URL that Google would see. As long as new visitors to the site also see that default, it's not a problem.
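A rough, framework-agnostic sketch of that cookie-plus-single-URL idea (the branch names, base URL, and helper functions here are all hypothetical - in practice this logic would live in your CMS or server framework):

```python
# Sketch: one canonical URL per product, with a cookie recording
# which branch sub-site the visitor arrived from.

BRANCHES = {
    "springfield": "Springfield Motors",
    "dayton": "Dayton Auto Group",
}

CANONICAL_BASE = "https://www.example.com/inventory"

def branch_entry(branch_id: str, product_slug: str) -> tuple:
    """A visitor clicks a product on a branch sub-site: send them to the
    single canonical product URL and set a cookie noting the branch."""
    url = f"{CANONICAL_BASE}/{product_slug}"
    cookies = {"branch": branch_id}
    return url, cookies

def product_page_header(cookies: dict) -> str:
    """On the canonical product page, show the visitor's branch template
    if we have a cookie; new visitors (and Googlebot) see the default."""
    branch_name = BRANCHES.get(cookies.get("branch", ""))
    return branch_name if branch_name else "All Locations"
```

The key point is that there's only one URL per product for Google to see; the branch cookie changes presentation only, never the URL.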
-
Don't worry so much about the duplicate content. Those same cars are probably on 5 to 10 other sites anyway (e.g., cars.com, Autotrader, etc.). Search engines understand the automotive industry dynamic pretty well.
Focus on location, content, and authority. By content, I mean getting them to add unique descriptions to each vehicle and, if possible, unique text on the inventory search pages - then, of course, relevant blog posts.
Depending on how much control you have over the inventory SEO, you should be able to make the meta titles/descriptions unique between the different dealers, too.
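One simple way to template unique meta titles/descriptions per dealer from the inventory feed (the field names and wording here are made up; real feeds and templates will differ):

```python
# Build per-dealer meta tags from inventory fields so that two dealers
# listing the same vehicle still produce distinct titles/descriptions.

def meta_title(dealer: str, year: int, make: str, model: str, city: str) -> str:
    return f"{year} {make} {model} for Sale in {city} | {dealer}"

def meta_description(dealer: str, year: int, make: str, model: str, city: str) -> str:
    return (
        f"Shop this {year} {make} {model} at {dealer} in {city}. "
        "View photos and pricing, or schedule a test drive today."
    )
```

Because the dealer name and city appear in both tags, each branch's listing differentiates itself even when the manufacturer's product description is identical.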