Duplicate Content for Multiple Instances of the Same Product?
-
Hi again! We're set to launch a new inventory-based site for a chain of car dealers with various locations across the Midwest. Here's our issue:
The different branches have overlap in the products that they sell, and each branch is adamant that their inventory comes up uniquely in site search.
We don't want the site to get penalized for duplicate content; however, we don't want to implement a link rel=canonical because each product should carry the same weight in search.
We've talked about having a basic URL for these product descriptions, and each instance of the inventory would be canonicalized to this main product, but it doesn't really make sense for the site structure to do this.
Do you have any tips on how to ensure that these products (same description, new product from manufacturer) won't be penalized as duplicate content?
-
Yeah, no argument there. I worry about it from an SEO standpoint, but sometimes there really isn't a lot you can do, from a business standpoint. I think it's occasionally worth a little fight, though - sometimes, when all the dealers want to have their cake and eat it, too, they all suffer (at least, post-Panda). Admittedly, that's a long, difficult argument, and you have to decide if it's worth the price.
-
okunen,
When you say "overlap in the products that they sell", do they have two identical franchises, e.g. two Toyota stores on opposite sides of the same city, or do they want to share pre-owned inventory across multiple sites?
-
Dr. Pete, I think we are on the same page. The reason I say not to worry too much about duplicate content when it comes to dealer inventory is that, for the most part, it is out of your control in the automotive industry. Dealers want their inventory on as many sites as they can, so it becomes virtually impossible to control.
-
I have to disagree with Mike a bit - this is the kind of situation that can cause problems, and I think the duplication across the industry actually makes it even more likely. Yes, the big players can get away with it, and Google understands the dynamic to some degree, but if you have a new site or smaller brand, you could greatly weaken your ranking ability. You especially have to be careful out of the gate, IMO, when your authority is weak.
To be fair, I'm assuming you're a small to mid-sized player and not a major brand, so if that's an incorrect assumption, let me know.
There aren't many have-your-cake-and-eat-it-too approaches to duplicate content in 2013. If you use rel=canonical, NOINDEX, etc., then some version of the page won't be eligible for ranking. If you don't, then the pages could dilute each other or even harm the ranking of the overall site. Each product won't "carry the same weight in search" - if you don't pick, Google will, and your internal site architecture and inbound link structure are always going to weight some pages more highly than others. Personally, I think it's better to choose than to have the choice made for you (which is usually what happens).
I'd also wonder if this structure is really that great for users - people don't want to happen across nine versions of the same page that differ only by branch. The branch is your construct, not theirs, and it's important to view this from the visitor's perspective.
Unfortunately, I don't understand the business/site well enough to give you a great alternative. Is there a way to create a unified product URL/page but still give the branch credit when a visitor hits the product via their sub-site? For example, you could cookie the visitor and then show the branch's template (logo, info, etc.) at the top of the page, but still keep one default URL that Google would see. As long as new visitors to the site also see that default, it's not a problem.
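That cookie-plus-default-template idea can be sketched in a few lines. Everything here (branch names, the product, the page copy) is made up for illustration; the point is that crawlers and cookie-less visitors always see one default page at a single indexable URL:

```python
# Minimal sketch of "one default URL, branch-aware template". All branch
# and vehicle data below is hypothetical.

BRANCHES = {
    "springfield": {"name": "Springfield Motors", "phone": "555-0101"},
    "peoria": {"name": "Peoria Auto Group", "phone": "555-0102"},
}

CANONICAL_URL = "https://example.com/products/sedan-xl"  # the one indexable URL

def render_product_header(cookies):
    """Return the page header for the canonical product URL.

    `cookies` is a dict of the visitor's cookies. A returning visitor who
    entered through a branch sub-site carries a `branch` cookie; everyone
    else (including Googlebot) gets the neutral default template.
    """
    branch = BRANCHES.get(cookies.get("branch", ""))
    if branch is None:
        return "Dealer Group | 2013 Sedan XL"
    return f"{branch['name']} ({branch['phone']}) | 2013 Sedan XL"

# Same URL, different masthead depending on how the visitor arrived:
print(render_product_header({}))                    # crawler / new visitor
print(render_product_header({"branch": "peoria"}))  # came via the Peoria sub-site
```

The key property for SEO is that the default output is what any cookie-less request (i.e., Googlebot) receives, so there is only one version of the page for search engines to index.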
-
Don't worry so much about the duplicate content. Those same cars are probably on 5 to 10 other sites anyway, e.g. cars.com, Autotrader, etc. The search engines understand the automotive industry dynamic pretty well.
Focus on location, content, and authority. By content, I mean getting them to add unique descriptions to each vehicle and, if possible, unique text on the inventory search pages. Then, of course, relevant blog posts.
Depending on how much control you have over the inventory SEO, you should be able to make the meta titles/descriptions unique between the different dealers, too.
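As a rough sketch of that templating approach (the dealer and vehicle fields are invented for illustration), each branch's listing of the same vehicle can get its own meta title and description by folding dealer-specific details into the template:

```python
# Hypothetical sketch: generate per-dealer meta tags for the same vehicle
# so no two branches emit identical titles/descriptions.

def build_meta(vehicle, dealer):
    title = (
        f"{vehicle['year']} {vehicle['make']} {vehicle['model']} "
        f"for sale at {dealer['name']} in {dealer['city']}"
    )
    description = (
        f"Shop the {vehicle['year']} {vehicle['make']} {vehicle['model']} "
        f"at {dealer['name']}, your {dealer['city']} dealer. "
        f"Call {dealer['phone']} to schedule a test drive."
    )
    return {"title": title, "description": description}

camry = {"year": 2013, "make": "Toyota", "model": "Camry"}
meta = build_meta(camry, {"name": "Midtown Toyota", "city": "Des Moines", "phone": "555-0100"})
meta2 = build_meta(camry, {"name": "Westside Toyota", "city": "Omaha", "phone": "555-0200"})
print(meta["title"])   # same car, but each dealer's page gets distinct tags
print(meta2["title"])
```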
Related Questions
-
Duplicate Content
Hello guys, after fixing the rel tag on similar pages on the site, I thought the duplicate content issues were resolved. I checked HTML Improvements in GWT and, instead of going down as I expected, the count went up. The duplicate issues affect identical product pages which differ from each other by just one detail, say length or colour. Since the duplicate is the meta description, I could write different meta tags, and I did for some products, but it still had no effect and they are still showing as duplicates. What could the problem be? Cheers
Technical SEO | PremioOscar0 -
Content and url duplication?
One of the campaign tools flags one of my client's sites as having lots of duplicates. This is true in the sense that the content is sort of boilerplate, with the wording changed for each country. The same goes for the URLs: they differ only in that a couple of words have changed. So it's not a case of a CMS or server issue, as SEOmoz advises, and it doesn't need 301s!

The thing is, in this niche (freight, transport operators, shipping) I can see many other sites doing the same thing, and those sites have lots of similar pages ranking very well. In fact, one site has over 300 keywords ranked on pages 1-2, but it is a large site with a 12-year-old domain, which clearly helps. Of course, having every page's content unique is important; still, I suppose this is better than copy-and-paste from other sites, so it's unique in that sense. I'm hoping to convince the site owner to change the content over time for every country - a long process.

My biggest problem with understanding duplication issues is that every tabloid or broadsheet media website would be canned by Google, since they quite often scrape Reuters or re-publish standard press releases as newsworthy content. So I have great doubt that there is a penalty for it. You only have to look to see media sites' duplication everywhere, every day, yet they get ranked. I just think Google doesn't rank the worst cases of spammy duplication (they still get indexed, though, I notice).

So, considering that this business niche is full of similarly replicated content that ranks well, is this duplicate flag such a great worry? Many businesses sell the same service in many locations, and it's virtually impossible to rewrite the services in a dozen or so different ways.
Technical SEO | xtopher660 -
Affiliate urls and duplicate content
Hi, What is the best way to run an affiliate program without the affiliate URLs on your site showing up as duplicate content?
Technical SEO | Memoz0 -
Duplicate content due to csref
Hi, When I go through my pages, I can see that a lot of my csref codes result in duplicate content when SEOmoz runs its analysis. Of course I get important knowledge through my csref codes, but I'm quite uncertain how much they affect my SEO results. Does anyone have any insights into this? Should I be more cautious with csref codes, or don't they create problems big enough for me to worry about?
Technical SEO | Petersen110 -
Canonical usage and duplicate content
Hi, We have a lot of pages about areas like "Mallorca" (domain.com/Spain/Mallorca), with tabbed pages like "excursions" (domain.com/Spain/Mallorca/excursions) and "car rental" (domain.com/Spain/Mallorca/car-rental), etc. The text on, e.g., the "car rental" page is very similar for Mallorca and Rhodos, and SEOmoz marks these as duplicate content. This happens on "car rental", "map", "weather", etc., which don't have a lot of text but do have images and embedded Google Maps. Could I use rel=next/prev/canonical to gather the information from the tabbed pages? That could show Google that the Rhodos map page is related to Rhodos and not Mallorca. Is that all wrong, and/or is there a better way to do this? Thanks, Alsvik
Technical SEO | alsvik0 -
API for testing duplicate content
Does anyone know a service, API, or PHP lib to compare two (or more) pages and return their similarity (level-3 shingles)? An API would be greatly preferred.
Technical SEO | Sebes0 -
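For a quick local check while hunting for an API, level-3 shingling is small enough to sketch in plain Python: split each page's text into word trigrams ("shingles") and take the Jaccard similarity of the two sets.

```python
# Minimal level-3 shingle similarity: Jaccard overlap of word trigrams.
import re

def shingles(text, k=3):
    """Return the set of k-word shingles (word k-grams) in `text`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=3):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page_a = "the quick brown fox jumps over the lazy dog"
page_b = "the quick brown fox leaps over the lazy dog"
print(round(similarity(page_a, page_b), 2))  # → 0.4
```

For whole sites you'd normally hash the shingles and sample (min-hashing) rather than compare full sets, but the idea is the same.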
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
Hi, and thanks in advance. I have a Jomsocial site with 1,000 users. It is highly customized, and as a result of the customization some of the pages have five or more different URLs pointing to the same page. Google has indexed 16,000 links already, and the crawling report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following:

"Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools."

Here is an example of the links:

http://anxietysocialnet.com/profile/edit-profile/salocharly
http://anxietysocialnet.com/salocharly/profile
http://anxietysocialnet.com/profile/preferences/salocharly
http://anxietysocialnet.com/profile/salocharly
http://anxietysocialnet.com/profile/privacy/salocharly
http://anxietysocialnet.com/profile/edit-details/salocharly
http://anxietysocialnet.com/profile/change-profile-picture/salocharly

So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | Salocharly0 -
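For what it's worth, the rel=canonical route Google recommends can be sketched against the URL patterns in that question. The choice of /&lt;user&gt;/profile as the canonical form is an assumption; whichever variant you pick, each variant page would emit the same link element:

```python
# Hedged sketch: map every profile-URL variant to one chosen canonical URL
# and emit the matching <link rel="canonical"> element. The canonical form
# (/<user>/profile) is an assumed choice, not from the original site's code.
from urllib.parse import urlparse

def canonical_profile_url(url):
    """Map any profile-page variant to the canonical /<user>/profile form."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    # Variants look like /profile/<action>/<user> or /<user>/profile.
    user = parts[-1] if parts[0] == "profile" else parts[0]
    return f"http://anxietysocialnet.com/{user}/profile"

def canonical_link_tag(url):
    return f'<link rel="canonical" href="{canonical_profile_url(url)}" />'

variants = [
    "http://anxietysocialnet.com/profile/edit-profile/salocharly",
    "http://anxietysocialnet.com/salocharly/profile",
    "http://anxietysocialnet.com/profile/privacy/salocharly",
]
# All three variants declare the same canonical:
for v in variants:
    print(canonical_link_tag(v))
```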
Root domain not resolving to www. Duplicate content?
Hi, I'm working with a domain that stays on the root domain if the www is not included, but if the www is included, it stays with the www. Like this: example.com or www.example.com. Of course, they are identical and both go to the same IP. Do search engines consider that to be duplicate content? Thanks, Michael
Technical SEO | HardyIntl
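The usual fix is a site-wide 301 redirect from one hostname to the other. As an illustration only (not tied to any particular web server, and assuming www is the preferred version), the redirect logic looks like this in plain Python:

```python
# Sketch of hostname canonicalization: 301 any request on the non-preferred
# hostname to the same path/query on the preferred one.
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.example.com"  # assumption: www is the chosen version

def redirect_target(url):
    """Return (status, location): 301 to the preferred host, or 200 if already there."""
    parts = urlsplit(url)
    if parts.netloc == PREFERRED_HOST:
        return (200, None)  # already on the preferred hostname
    fixed = parts._replace(netloc=PREFERRED_HOST)
    return (301, urlunsplit(fixed))

print(redirect_target("http://example.com/inventory?page=2"))
# → (301, 'http://www.example.com/inventory?page=2')
print(redirect_target("http://www.example.com/inventory"))
# → (200, None)
```

In practice you'd configure this in the web server, but the behavior is the same: one hostname survives, so search engines only ever see one copy of each page.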