Duplicate product description ranking problems (off-site duplicate content)
-
We do business in a niche category and not in an English-language market. We have 2-3 main competitors who use the same product information as we do, so they all have the same duplicate product descriptions. We and one of these competitors have the domains with the highest authority in this market; they maybe have a 10-20% better link profile (counting linking domains and total links).
The problem is that they rank much better for product names than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products and some old ones go to the Supplemental Results and are hidden behind "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included."
Unique text for products isn't an option. When we have written unique content for a product, it seems to rank much better. So our question is: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text?
How important is indexation time? Will getting indexed first give a big advantage? We have thought about using more RSS/ping services to get faster indexation (both sites receive product information at almost the same time). It seems our competitor gets into the index more quickly than we do.
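To illustrate the idea, this is roughly what we are considering - a minimal Python sketch that pings the search engines' sitemap ping endpoints after the product sitemap updates. The sitemap URL is a placeholder, and the ping endpoints are an assumption (they have been changed or deprecated over time, so they would need checking):

```python
import urllib.parse
import urllib.request

# Hypothetical sitemap URL for the product feed -- replace with the real one.
SITEMAP_URL = "https://www.example.com/sitemap-products.xml"

# Classic sitemap "ping" endpoints for Google and Bing. Treat these as an
# assumption to verify -- they have been deprecated/changed over time.
PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap={}",
    "https://www.bing.com/ping?sitemap={}",
]

def ping_sitemaps(sitemap_url: str) -> None:
    encoded = urllib.parse.quote_plus(sitemap_url)
    for endpoint in PING_ENDPOINTS:
        url = endpoint.format(encoded)
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"{url} -> HTTP {response.status}")
        except Exception as exc:  # network errors, 4xx/5xx responses, etc.
            print(f"{url} -> failed: {exc}")

if __name__ == "__main__":
    ping_sitemaps(SITEMAP_URL)
```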
Also, are farm pages helpful for getting some quick, low-value links for new products? We have planned to set up 2-3 domains with a few links pointing to these new products, to get a small advantage right after the products launch and don't yet have any external links.
Our sitemap works and our new products are shown on the front page (products that still mostly don't rank well and go to the Supplemental Results). Some new products have #1 or top-3 rankings, but these are maybe only 1/3 of the products that should be ranking in the top 3.
We have also noticed that when we get products indexed quickly (for example with Fetch as Google), they initially get good top-3 rankings, and then some of them drop out of the rankings (into the Supplemental Results).
-
There's no easy answer, I'm afraid, and if an answer looks too easy, I'd stay away from it. Building low-quality links might help in the short term, but it's too high-risk in the long term. Plus, if you're combining it with duplicate content, you've got multiple quality issues in play (at least, in Google's eyes - I'm not making a judgment call about using product descriptions, which is very common).
You say that unique text is proven to have worked, and yet it isn't an option. Why? If it's a matter of time/cost, I'd strongly consider not only the long-term ROI but the possibility of investing selectively. For example, you don't have to write unique text for every product you sell (or re-sell) - you could pick the top 10% of products (which may account for 90% of sales) and start with those. Even the top 1% would be a start. Small investments in the right places could yield large returns here.
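If it helps to make that concrete, here's a rough Python sketch of the prioritisation - it assumes you can export sales data to a CSV with hypothetical `sku` and `revenue` columns, and it picks the smallest set of products covering a target share of revenue. Those would be the first candidates for unique descriptions:

```python
import csv

def top_products_by_revenue(csv_path: str, coverage: float = 0.9) -> list[str]:
    """Return the SKUs that together account for `coverage` of total revenue."""
    with open(csv_path, newline="") as f:
        rows = [(r["sku"], float(r["revenue"])) for r in csv.DictReader(f)]

    rows.sort(key=lambda r: r[1], reverse=True)  # best sellers first
    total = sum(rev for _, rev in rows) or 1.0

    selected, running = [], 0.0
    for sku, revenue in rows:
        selected.append(sku)
        running += revenue
        if running / total >= coverage:
            break
    return selected

if __name__ == "__main__":
    # e.g. products.csv with columns: sku,revenue
    priority = top_products_by_revenue("products.csv", coverage=0.9)
    print(f"Write unique descriptions for these {len(priority)} products first:")
    print("\n".join(priority))
```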
The other option that people don't like to hear but really is powerful is to consider more carefully focusing your link equity on a smaller number of products. The more products you list, the more duplicates you have, and some of those products are probably very poor sellers or have very poor profit margins. What if you focused your site architecture on 25% of the total products? You'd focus your authority more and each page would be stronger, relative to your competitors.
One easy win is to make sure you're not dealing with any internal duplicate content (product options pages, search filters, etc.). If you're compounding external duplication with internal duplication, it's only going to make all of your problems worse. The internal duplication is much easier to solve.
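As a starting point for that internal audit, something like the following Python sketch will flag URLs that are missing a canonical tag or that canonicalise elsewhere. It assumes the `requests` and `beautifulsoup4` libraries, and the URL list is a placeholder - in practice you'd feed it parameterised filter/option URLs exported from a crawler or server logs:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

# Hypothetical sample of internal URLs to audit.
URLS = [
    "https://www.example.com/product/123",
    "https://www.example.com/product/123?color=red",
    "https://www.example.com/search?filter=new",
]

def audit_canonicals(urls):
    for url in urls:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException as exc:
            print(f"{url} -> fetch failed: {exc}")
            continue
        soup = BeautifulSoup(html, "html.parser")
        tag = soup.find("link", rel="canonical")
        canonical = tag.get("href") if tag else None
        if canonical is None:
            print(f"{url} -> MISSING canonical")
        elif canonical != url:
            print(f"{url} -> canonicalised to {canonical}")
        else:
            print(f"{url} -> self-canonical (OK)")

if __name__ == "__main__":
    audit_canonicals(URLS)
```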
-
Thank you for your answer. When comparing DA and PA, ours are a little bit better (48 vs 49 DA), and our front page PA is also better. But Open Site Explorer data (DA and PA) isn't really reliable for an international market like ours; Ahrefs gives a more complete picture of link profiles here. Still, since the difference in backlinks is so small, it's a little strange that they can get so much better results.
It's a small international market, so customer reviews aren't an option; nobody gives them here. We already offer the possibility to leave reviews, but nobody submits any.
So my main question is: what factors does Google look at when ranking the same duplicate products? We know that they count DA, PA... and, as I understand it, who gets indexed first. Does anybody know what else?
-
The reason your competitor is ranking better could be the value of their DA and their PA. Without looking specifically it would be hard to say. Google isn't going to show two pages that are exactly the same, which is why they say similar pages have been omitted.
I would not suggest using a link farm. This can only bring you disaster in the long run.
Have you thought about getting customer reviews on the page? Using a program that puts customer reviews on the page, so that they are visible in the source code, is a good way to start taking the duplicate content out of the equation. You should also put some focus into building quality links. It isn't the quantity of links you have that matters, but the quality.
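As a rough illustration of what I mean by reviews being visible in the source code, here's a minimal sketch (assuming a Python/Flask stack and a hypothetical in-memory review store) that renders reviews server-side into the product page HTML, instead of loading them with a JavaScript widget that crawlers may not see:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Hypothetical review data -- in practice this would come from your database
# or review provider, fetched server-side rather than injected by a JS widget.
REVIEWS = {
    "sku-123": [
        {"author": "A. Customer", "rating": 5, "text": "Works exactly as described."},
        {"author": "B. Buyer", "rating": 4, "text": "Good value, fast delivery."},
    ]
}

PRODUCT_TEMPLATE = """
<h1>{{ sku }}</h1>
<section id="reviews">
  {% for r in reviews %}
    <article>
      <strong>{{ r.author }}</strong> rated it {{ r.rating }}/5
      <p>{{ r.text }}</p>
    </article>
  {% endfor %}
</section>
"""

@app.route("/product/<sku>")
def product(sku):
    # Reviews are rendered into the HTML response itself, so the unique text
    # appears in the page source without any client-side JavaScript.
    return render_template_string(PRODUCT_TEMPLATE, sku=sku, reviews=REVIEWS.get(sku, []))

if __name__ == "__main__":
    app.run(debug=True)
```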