Duplicate product description ranking problems (off-site duplicate content)
-
We do business in a niche category outside the English-language market. We have 2-3 main competitors who use the same product information as we do, so they all have the same duplicate product descriptions that we have. We and one of those competitors have the highest-authority domains in this market; their link profile may be 10-20% better (counting linking domains and total links).
The problem is that they rank much better for product-name queries than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products and some old ones go into the Supplemental Results, behind the notice: "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included."
Unique text for every product isn't an option. When we have written unique content for a product, it seems to rank much better. So our question is: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text?
How important is indexation time? Does getting indexed first give a big advantage? We have thought of using more RSS/ping services to get indexed faster (both sites receive product information at almost the same time). It seems our competitor gets into the index quicker than we do.
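If faster indexation is the goal, one low-effort option has been the simple sitemap "ping" that search engines have historically accepted after a sitemap changes. A minimal sketch, assuming the classic Google and Bing ping endpoints (verify their current availability against the engines' own documentation; the sitemap URL below is illustrative):

```python
# Build sitemap "ping" URLs to request after the sitemap is regenerated.
# The endpoints are assumptions based on historical search-engine behavior.
from urllib.parse import quote

def sitemap_ping_urls(sitemap_url):
    """Return the ping URLs to fetch once the sitemap has been updated."""
    encoded = quote(sitemap_url, safe="")  # URL-encode the whole sitemap URL
    return [
        f"https://www.google.com/ping?sitemap={encoded}",
        f"https://www.bing.com/ping?sitemap={encoded}",
    ]

for url in sitemap_ping_urls("https://www.example.com/sitemap.xml"):
    print(url)  # fetch each with urllib.request.urlopen(url) to send the ping
```

Pinging only tells crawlers the sitemap changed; it doesn't guarantee faster ranking, just an earlier crawl opportunity.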
Also, are farm pages helpful for getting some quick low-value links to new products? We have planned to set up 2-3 domains with a few links pointing at these new products, to get a small advantage right after launch, while the products don't yet have any external links.
The sitemap works, and our new products are shown on the front pages (products that still mostly don't rank well and go into the Supplemental Results). Some new products have #1 or top-3 rankings, but that's only maybe a third of the ones that should rank in the top 3.
We have also noticed that when we get products indexed quickly (for example via Fetch as Google), they get good top-3 results at first, and then some drop out of the rankings (into the Supplemental Results).
-
There's no easy answer, I'm afraid, and if an answer looks too easy, I'd stay away from it. Building low-quality links might help in the short term, but it's too high-risk in the long term. Plus, if you're combining it with duplicate content, you've got multiple quality issues in play (at least in Google's eyes - I'm not making a judgment call about reusing product descriptions, which is very common).
You say that unique text is proven to have worked, and yet it isn't an option. Why? If it's a matter of time/cost, I'd strongly consider not only the long-term ROI but the possibility of investing selectively. For example, you don't have to write unique text for every product you sell (or re-sell) - you could pick the top 10% of products (which may account for 90% of sales) and start with those. Even the top 1% would be a start. Small investments in the right places could yield large returns here.
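That "top 10% of products may account for 90% of sales" logic is easy to turn into a concrete list. A minimal sketch, assuming you can export per-product revenue (product names and figures below are invented for illustration):

```python
# Hypothetical sketch: rank products by revenue and find the smallest set
# that covers a target share of total sales, so unique copy can be written
# for those pages first.

def top_sellers(sales, coverage=0.90):
    """Return products (best first) whose cumulative revenue reaches
    `coverage` of total sales."""
    total = sum(sales.values())
    picked, running = [], 0.0
    for name, revenue in sorted(sales.items(), key=lambda kv: -kv[1]):
        picked.append(name)
        running += revenue
        if running / total >= coverage:
            break
    return picked

sales = {"widget-a": 5000, "widget-b": 3000, "widget-c": 1500,
         "widget-d": 300, "widget-e": 200}
print(top_sellers(sales))  # the short list worth unique descriptions first
```

Running this on real sales data usually shows the rewrite queue is far shorter than the full catalogue.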
The other option that people don't like to hear but really is powerful is to consider more carefully focusing your link equity on a smaller number of products. The more products you list, the more duplicates you have, and some of those products are probably very poor sellers or have very poor profit margins. What if you focused your site architecture on 25% of the total products? You'd focus your authority more and each page would be stronger, relative to your competitors.
One easy win is to make sure you're not dealing with any internal duplicate content (product options pages, search filters, etc.). If you're compounding external duplication with internal duplication, it's only going to make all of your problems worse. The internal duplication is much easier to solve.
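A quick way to spot that internal duplication is to group crawled URLs that differ only by query string - product options and search filters are the usual culprits. A minimal sketch (the shop URLs are illustrative; clusters it finds are candidates for rel="canonical" or parameter handling):

```python
# Group URLs that share a host and path but differ only in query string.
# Each cluster of 2+ URLs is a likely internal-duplicate-content candidate.
from collections import defaultdict
from urllib.parse import urlsplit

def duplicate_clusters(urls):
    """Map (host, path) -> list of URLs when more than one URL shares it."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        groups[(parts.netloc, parts.path)].append(url)
    return {key: dupes for key, dupes in groups.items() if len(dupes) > 1}

crawl = [
    "https://shop.example/product/123",
    "https://shop.example/product/123?color=red",
    "https://shop.example/product/123?sort=price",
    "https://shop.example/product/456",
]
for (host, path), dupes in duplicate_clusters(crawl).items():
    print(path, len(dupes))
```

Feed it a full crawl export and the clusters give you a worklist: each one should resolve to a single canonical URL.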
-
Thank you for your answer. Comparing DA and PA, ours are a little better (48 vs. 49 DA), and our front page's PA is also higher. That said, Open Site Explorer data (DA and PA) isn't really reliable for an international market like ours; Ahrefs captures link profiles better here. Still, with such a small difference in backlinks, it's a bit strange that they get so much better results.
It's a small international market, so customer reviews aren't an option - nobody gives them here. We already have a review feature, but nobody submits anything.
So my main question is: what factors does Google look at when ranking identical duplicate products? We know they count DA, PA, etc., and, as I understand it, who gets indexed first. Does anybody know what else?
-
The reason your competitor is ranking better could be the value of their DA and their PA. Without looking specifically it would be hard to say. Google isn't going to show two pages that are exactly the same, which is why they say similar pages have been omitted.
I would not suggest using a link farm. This can only bring you disaster in the long run.
Have you thought about getting customer reviews on the page? Using a program that puts customer reviews on the page, so that they appear in the source code, is a good way to start taking duplicate content out of the equation. You should also put some focus into building quality links. It isn't the quantity of links you have, but the quality.