What's the best practice for handling duplicate content of product descriptions with a drop-shipper?
-
We write our own product descriptions for the merchandise we sell on our website. However, we also work with drop-shippers, and some of them simply take our content and post it on their sites (same photos, exact ad copy, etc.). I'm concerned that we'll lose the value of our content because Google will consider it duplicated.
We don't want the value of our content undermined... What's the best practice for avoiding any problems with Google?
Thanks,
Adam
-
I totally agree, but you should be able to have another set written with great quality. The big drop-shippers always rewrite manufacturer descriptions because of this issue.
You need to decide whether the gains outweigh the costs.
-
Having two sets of ad copy does effectively solve the Google issue, but it creates two non-Google issues, both of which are potentially costly. For example:
-
I have to write new copy for them, which costs time and money, and even then they may still not use it, which creates enforcement issues.
-
If it's substantially different copy (and possibly inferior, because let's face it, it's hard to write two sets of good, compelling copy for the same item), then it may not convert as well, which means they sell less... and we sell less.
I'm not saying you can't solve my original problem with this method. I'm just saying that there are some very real costs to take into consideration.
-
Go with David's method, or a hybrid. Present them usable text and ask that they put that on their sites, and if they won't, then ask that they use canonical or noindex directives.
-
You could have them add a rel=canonical tag pointing back to your page. But drop-shippers want your content so they can rank; they will not want to use it.
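For context, the rel=canonical being suggested is just one link tag in the head of the drop-shipper's copied page, pointing back at your original. A minimal sketch with placeholder URLs:

<!-- On the drop-shipper's copy of the product page (URLs are hypothetical) -->
<head>
  <link rel="canonical" href="https://www.yourstore.example/products/blue-widget" />
</head>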
-
Hey guys, thanks for all the fast responses!
I thought I remembered reading something about a technical method for demonstrating to Google that your version of content is the original version. Is there a way to do that?
And yeah, we could ask them to change their behavior (or require it), but there are costs to both and I'm wondering if there's a more effective solution (such as the possibly mythical one above).
-
Penalization isn't the only thing you need to worry about; it's a drop-shipper that is stronger than you outranking you.
-
The best way is to give your drop-shippers a feed with one set of descriptions while your site uses another set (people will still copy, but much less).
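As a rough illustration of that kind of partner feed (column names and products here are made up), the export simply carries its own description column, separate from the copy published on your own site:

sku,title,dropshipper_description
BW-100,Blue Widget,"Compact blue widget with a brushed aluminum finish."
BW-200,Blue Widget XL,"Larger blue widget built for heavier loads."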
-
Are these drop-shippers people who have to abide by your agreements in order to continue doing business with you? Would it hurt your business to create a requirement that they either create unique content or add the noindex code to their pages to prevent Google from indexing the dupe (see the sketch below)?
Do most of your drop-shippers get their traffic via organic search? Or are they using other advertising sources?
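For reference, the noindex requirement mentioned above boils down to one meta tag in the head of the drop-shipper's copied pages (sketch only):

<!-- Added to the <head> of each copied product page -->
<meta name="robots" content="noindex, follow" />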
-
There is really nothing you can do about the fact that someone else is copying your description.
The only thing I can initially come up with is asking your drop-shippers not to copy your descriptions.
However, the duplicated content might not really negatively affect your SEO. Google understands e-commerce, and a lot of the time products on e-commerce sites are very similar and they do not get penalized. Another thing is that you originally created the description, and Google does index according to freshness. As long as you are indexed first with the description, I don't see how Google can penalize you.
Related Questions
-
Duplicate 'meta title' issue (AMP & non-AMP pages)
How do we fix a duplicate meta title issue on AMP and non-AMP pages (example.com and example.com/amp)? We have set the 'meta title' on the desktop version and we don't want to change the title for the AMP page, as we have more than 10K pages on the website. As per the SEMrush tool: "ABOUT THIS ISSUE: It is a bad idea to duplicate your title tag content in your first-level header. If your page's <title> and <h1> tags match, the latter may appear over-optimized to search engines. Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page. HOW TO FIX IT: Try to create different content for your <title> and <h1> tags." This is what they are recommending. For the above issue we have asked our team to create unique meta and post titles for the desktop version, but what about the AMP page? Please help!
On-Page Optimization | 21centuryweb
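For the AMP question above, a minimal sketch of the standard pairing, assuming the URLs in the question (title and heading text are made-up placeholders): the desktop page points at its AMP version with rel=amphtml, the AMP page points back with rel=canonical, and the <h1> is worded differently from the <title>.

<!-- example.com (desktop version), hypothetical markup -->
<head>
  <title>Acme Widget | Example.com</title>
  <link rel="amphtml" href="https://example.com/amp" />
</head>
<body>
  <h1>Acme Widget: lightweight aluminum design</h1>
</body>

<!-- example.com/amp (AMP version) points back to the canonical desktop URL -->
<head>
  <title>Acme Widget | Example.com</title>
  <link rel="canonical" href="https://example.com/" />
</head>
-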
How to fix duplicate content for homepage and index.html
Hello, I know this probably gets asked quite a lot, but I haven't found a recent post about this in 2018 on Moz Q&A, so I thought I would check in and see what the best route/solution for this issue might be. I'm always really worried about making any (potentially bad/wrong) changes to the site, as it's my livelihood, so I'm hoping someone can point me in the right direction. Moz, SEMrush and several other SEO tools are all reporting that I have duplicate content for my homepage and index.html (the same identical page). According to Moz, my homepage (without index.html) has PA 29 and index.html has PA 15. They are both showing status 200. I read that you can either do a 301 redirect or add rel=canonical. I currently have a 301 set up for my http to https redirect and don't have any rel=canonical added to the site/page. What is the best and safest way to get rid of the duplicate content and merge my non-index and index.html homepages together these days? I read that both 301s and canonicals pass on link juice, but I don't know what the best route for me is given what I said above. Thank you for reading, any input is greatly appreciated!
On-Page Optimization | dreservices
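For the index.html question above, a minimal sketch of the 301 option, assuming an Apache server with mod_rewrite (adjust for your actual host); the condition keeps the rule from looping when the server itself serves index.html for the root URL:

# .htaccess sketch: 301 any direct request for /index.html to /
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+index\.html [NC]
RewriteRule ^index\.html$ / [R=301,L]
-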
Auto-loading articles - best practices
Hi all! In the past months I've seen more and more websites doing 'auto-loading articles on scroll'. Can you tell me whether it's okay for SEO, and what the best practices for this are? Thanks!
On-Page Optimization | JohnPalmer
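A rough sketch of the usual pattern for auto-loading articles, assuming a paginated URL scheme like /articles?page=N (the markup and function name are hypothetical): keep ordinary crawlable links to each page of results, and update the address bar as new chunks load.

<!-- Plain, crawlable pagination link alongside the infinite scroll -->
<a href="/articles?page=2">Older articles</a>

<script>
  // Hypothetical hook called after a new chunk is appended on scroll:
  // reflect the paginated URL so each scroll position stays shareable and indexable.
  function onChunkLoaded(pageNumber) {
    history.pushState({ page: pageNumber }, "", "/articles?page=" + pageNumber);
  }
</script>
-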
Putting content behind 'view more' buttons
Hi, I can't find an up-to-date answer to this, so I was wondering what people's thoughts are. Does putting content behind 'view more' CSS buttons affect how Google sees and ranks the data? The content isn't put behind 'view more' to trick Google. In actual fact, if you view the source, the data is all there together, but it's done so that products appear higher up the page. Does anyone have insight into this? Thanks in advance
On-Page Optimization | Andy-Halliday
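For the 'view more' question above, a sketch of the setup being described (IDs, class names and text are placeholders): the full text is present in the initial HTML, and the button only toggles its visibility.

<div id="product-description" class="collapsed">
  The complete description is served in the page source, even while visually truncated...
</div>
<button onclick="document.getElementById('product-description').classList.remove('collapsed')">
  View more
</button>
<style>
  .collapsed { max-height: 120px; overflow: hidden; }
</style>
-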
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is, {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path: www.mydomain.com/{area} and www.mydomain.com/{state}.
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes).
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are so many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that okay? Thanks for any help!
On-Page Optimization | JohnHuynh
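For reference, the robots.txt being proposed would sit at the root of each secondary site and look roughly like this; note that robots.txt blocks crawling, but it does not by itself guarantee the URLs stay out of the index.

# robots.txt at the root of each secondary site (sketch)
User-agent: *
Disallow: /
-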
Duplicate meta descriptions
Hi all, I'm using Yoast's SEO plugin, and when I run an On-Page report card here on SEOmoz it says there are 2 description tags. I've been trying to fix this but can't (I'm new!). Anyone have any ideas on this? Thanks, Elaine
On-Page Optimization | elaineryan
-
Best Practice for Deleting Pages
What is the best SEO practice for deleting pages? We have a section on our website with employee bios, and when an employee leaves we need to remove their page. How should we do this?
On-Page Optimization | Trupanion
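A sketch of the two common options for retired bio pages, assuming Apache and hypothetical paths: 301 the old URL to a relevant page (e.g. the team overview) when one exists, or return a 410 so crawlers know the page is gone for good.

# .htaccess sketch (paths are hypothetical)
# Option 1: redirect the retired bio to the team overview page
Redirect 301 /team/jane-doe /team/

# Option 2: mark the page as permanently gone (HTTP 410)
Redirect gone /team/john-smith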