Duplicate content for vehicle inventory.
-
Hey all,
In the automotive industry...
When uploading vehicle inventory to a website, I'm concerned about duplicate content issues.
For example, one vehicle is uploaded to the main manufacturer's website, then again to the actual dealership's website, then again to Craigslist, and sometimes even to a group site. The information is all the same: description, notes, car details, and images.
What would you all recommend for alleviating duplicate content issues? Should I be using rel=canonical back to the manufacturer's website (a minimal example of the tag is sketched just below)?
Once the vehicle is sold, all pages disappear.
Thanks so much for any advice.
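For reference, since the question mentions it: a cross-domain rel=canonical is just a link element in the head of the duplicate page, pointing at the URL whose signals you want consolidated. Below is a minimal, hypothetical sketch that builds that markup; the manufacturer domain, URL pattern, helper name, and VIN are made-up placeholders, not the actual inventory setup.

```python
# Hedged sketch: generate the <head> markup for the dealership's copy of a vehicle
# listing, pointing rel=canonical at the manufacturer's copy of the same vehicle.
# The domain, URL pattern, VIN, and helper name are hypothetical placeholders.

def dealer_listing_head(vin: str, page_title: str) -> str:
    # Assumed URL pattern on the manufacturer's site -- replace with the real one.
    canonical_url = f"https://www.example-manufacturer.com/inventory/{vin}"
    return (
        "<head>\n"
        f"  <title>{page_title}</title>\n"
        f'  <link rel="canonical" href="{canonical_url}">\n'
        "</head>"
    )

if __name__ == "__main__":
    # Example VIN and title are illustrative only.
    print(dealer_listing_head("1FTEX1EP0GKD12345", "2016 Ford F-150 XLT | Example Dealership"))
```

Note that the tag goes on the duplicate copy, while the href points at the version you want indexed; whether that should be the manufacturer's page or the dealership's own page is exactly the judgment call being asked about here.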
-
The best thing to do is to create assets on these product listings: add unique content to each page to minimize the duplication and create a "mix" that is specific to these listings.
A great implementation of this is Zillow. Check out the home listings on Zillow and see the Zestimates and other information that Zillow provides to the user.
That mix helps their listings rank higher because they differentiate themselves from others by adding more "information" from the data they are collecting.
Hope it helps!
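To make that "mix" concrete, here is a rough Python sketch of what enriching a syndicated listing with dealer-specific data could look like. The field names, helper, and example figures are hypothetical placeholders rather than a real inventory schema; the point is simply that the dealership page ends up carrying text and data the manufacturer feed does not.

```python
# Hedged sketch: merge a syndicated (duplicate) vehicle description with
# dealer-specific data so each page has unique content. Field names and the
# example data are hypothetical placeholders, not a real inventory schema.

def enrich_listing(base: dict, dealer_extras: dict) -> dict:
    """Combine the shared manufacturer feed with unique, dealer-only details."""
    listing = dict(base)           # start from the syndicated content
    listing.update(dealer_extras)  # layer on the unique "mix"
    # Blend both into a page description so the text is not a verbatim copy of the feed.
    listing["page_description"] = (
        f"{base['description']} "
        f"Local notes: {dealer_extras['local_notes']} "
        f"Days on our lot: {dealer_extras['days_on_lot']}. "
        f"Similar vehicles nearby average ${dealer_extras['local_avg_price']:,}."
    )
    return listing


if __name__ == "__main__":
    feed_item = {
        "vin": "1FTEX1EP0GKD12345",
        "description": "2016 Ford F-150 XLT, one owner, clean history.",
    }
    extras = {
        "local_notes": "New tires fitted by our service department in March.",
        "days_on_lot": 12,
        "local_avg_price": 24500,
    }
    print(enrich_listing(feed_item, extras)["page_description"])
```

The same idea scales to whatever unique data the dealership already collects: days on lot, reconditioning notes, local price comparisons, and so on.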
Related Questions
-
Duplicate content issue
Hi, a client of ours currently has one URL (https://aalst.mobilepoint.be/) and wants to create a second one with exactly the same content (https://deinze.mobilepoint.be/). Will that mean Google punishes the second one because of duplicate content? What are the recommendations?
Technical SEO | conversal0 -
Decline in traffic and duplicate content in different domains
Hi, 6 months ago my customer purchased their US supplier and moved the supplier's website to their e-commerce platform. When moving to the new platform they copied the product descriptions from their own site to the supplier's site, so now both sites have the same content on the product pages. Since then they have experienced a decrease in traffic of about 80%. They didn't implement canonical tags or hreflang. My customer's domain format is https://www.xxx.biz and the supplier's domain is https://www.zzz.com. The latter targets the US, and when someone from outside the US wants to purchase a product they get a message that they need to move to the first website, www.xxx.biz. Both sites are in English. The old version of www.zzz.com, before the shift to the new platform, contained different product descriptions, and BTW, that old website version is still live and indexed under a subdomain of www.zzz.com. My question is: what's the best thing to do in this case so that the rankings return to higher positions and they get back their traffic? Thanks!
Technical SEO | digital19740 -
Intentional Duplicate Content - Great UX, Bad for Ranking?
I'll try to keep this as clear and high level as possible. Thank you in advance for any and all help! We're managing a healthcare practice which specializes in neurosurgical treatments. As the practice is rather large, the doctors have several "specialties" in which they focus, i.e. back surgery, facial surgery, brain surgery, etc. They have a main website (examplepractice.com) which holds ALL of their content on each condition and treatment that they deal with. So, if someone enters their main homepage they will see conditions and treatments for all the specialties categorized together. However, linked within the main site are "mini-sites" for each specialty (same domain, same site) (examplepractice.com/brain-surgery), but with a different navigation menu to give the illusion of a "separate website". These mini-sites are then tailored from a creative, content and UX perspective to THAT specific group of treatments and conditions. Now, anyone who enters this mini-site will find information pertaining only to that specialty. The mini-sites are NOT set up as folders, but rather just a system of URLs that we have mapped out to each page. We set up the pages this way to maintain an exclusive feel for the site. Instead of someone drilling into a specific condition and having the menu change, we created the copies. But, because of how this is set up, we now have duplicate content for each treatment and condition child page (one on the main site, one on the mini-site). My question (finally) is: will this cause a problem in the future? Are we essentially splitting the "juice" between these two pages? Are we making it easier for our competitors to outrank us? We know this layout makes sense from the perspective of a user, but we're unclear how to move forward from a search perspective. Any tips?
Technical SEO | frankmassanova1 -
Does adding a noindex tag reduce duplicate content?
I've been working under the assumption for some time that if I have two (or more) pages which are very similar, I can add a noindex tag to the pages I don't need and that will reduce duplicate content. As far as I know this removes the tagged pages from Google's index and stops any potential issues with duplicate content. It's the second part of that assumption that I'm now questioning. Despite pages having the noindex tag, they continue to appear in Google Search Console as duplicate content, soft 404, etc. That is, new pages are appearing regularly that I know to have the noindex tag. My thoughts on this so far are that Google can still crawl these pages (although it won't index them), so it shows them in GSC due to a crude issue-flagging process. I mainly want to know: a) Is the actual Google algorithm sophisticated enough to ignore these pages even though GSC doesn't? b) How do I explain this to a client? (A small verification sketch follows this question.)
Technical SEO | ChrisJFoster0 -
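For anyone following along with the question above: the behaviour described is expected, because noindex only keeps a crawled page out of the index; it does not stop crawling, and Search Console can still report the URL. A hedged sketch for double-checking that the directive is actually in place (useful when explaining the situation to a client) is below; the URL is a placeholder and the requests/BeautifulSoup toolchain is an assumption, not a requirement.

```python
# Hedged sketch: confirm a page really carries a noindex directive, either in a
# <meta name="robots"> tag or in an X-Robots-Tag response header.
# The URL is a placeholder; requests and beautifulsoup4 must be installed.
import requests
from bs4 import BeautifulSoup


def has_noindex(url: str) -> bool:
    resp = requests.get(url, timeout=10)
    # Header-level directive (common for PDFs and other non-HTML responses).
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta robots tag in the HTML.
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    return bool(meta and "noindex" in meta.get("content", "").lower())


if __name__ == "__main__":
    print(has_noindex("https://www.example.com/near-duplicate-page/"))  # placeholder URL
```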
Who gets punished for duplicate content?
What happens if two domains have duplicate content? Do both domains get punished for it, or just one? If so, which one?
Technical SEO | Tobii-Dynavox0 -
174 Duplicate Content Errors
How do I go about fixing these errors? They are all related to my tags. Thank you in advance for any help! Lisa
Technical SEO | lisarein0 -
How to get rid of duplicate content
I have duplicate content that looks like http://deceptionbytes.com/component/mailto/?tmpl=component&link=932fea0640143bf08fe157d3570792a56dcc1284 - however, I have 50 of these, all with different numbers on the end. Does this affect search engine optimization, and how can I disallow this in my robots.txt file? (A quick check of such a rule is sketched after this question.)
Technical SEO | Mishelm1 -
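A hedged sketch related to the robots.txt question above: the rule shown is an assumption about the right pattern for these /component/mailto/ URLs (a Joomla-style email component), not a verified fix, and the snippet simply tests that the rule would block one of the example URLs. Keep in mind that Disallow stops crawling rather than guaranteeing removal from the index.

```python
# Hedged sketch: test that a robots.txt Disallow rule would block the duplicate
# /component/mailto/ URLs described above. The rule itself is an assumption about
# the right pattern, not a verified fix for this site.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /component/mailto/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

test_url = (
    "http://deceptionbytes.com/component/mailto/"
    "?tmpl=component&link=932fea0640143bf08fe157d3570792a56dcc1284"
)
print(parser.can_fetch("*", test_url))  # expected: False, i.e. the URL is blocked
```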
What to do about similar content getting penalized as duplicate?
We have hundreds of pages that are getting categorized as duplicate content because they are so similar. However, they are different content. The background is that they are names, and when you click on each name it has its own URL. What should we do? We can't use canonical tags on any of the pages because they are different names. Thank you!
Technical SEO | bonnierSEO0