Duplicate product description ranking problems (off-site duplicate content)
-
We do business in a niche category, outside the English-language market. We have 2-3 main competitors who use the same product information as us, so they all have the same duplicate product descriptions we do. We and one of those competitors have the highest-authority domains in this market; they have maybe a 10-20% better link profile (counting linking domains and total links).
The problem is that they rank much better for product names than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products, and some old ones, go into the Supplemental Results and are shown under "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included."
Unique text for products isn't an option. When we have written unique content for a product, it seems to rank far better. So our question is: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text?
How important is indexation time? Does getting indexed first give a big advantage? We have thought about using more RSS/Bing services to get indexed faster (both sites receive product information at almost the same time). It seems our competitor gets into the index quicker than we do.
Also, are farm pages helpful for getting some quick low-value links for new products? We have planned to create 2-3 domains with a few links pointing to these new products, to get a small advantage right after launch, while the products still have no external links.
Our sitemap works, and our new products are shown on the front pages (products that still mostly don't rank well and go into the Supplemental Results). Some new products do have #1 or top-3 rankings, but that's only maybe 1/3 of the products that should rank top-3.
We have also noticed that when we get products indexed quickly (for example via Fetch as Google), they get good top-3 results at first, and then some drop out of the rankings (into the Supplemental Results).
-
There's no easy answer, I'm afraid, and if an answer looks too easy, I'd stay away from it. Building low-quality links might help in the short term, but it's too high-risk in the long term. Plus, if you combine it with duplicate content, you've got multiple quality issues in play (at least in Google's eyes - I'm not making a judgment call about using stock product descriptions, which is very common).
You say that unique text is proven to have worked, and yet it isn't an option. Why? If it's a matter of time/cost, I'd strongly consider not only the long-term ROI but the possibility of investing selectively. For example, you don't have to write unique text for every product you sell (or re-sell) - you could pick the top 10% of products (which may account for 90% of sales) and start with those. Even the top 1% would be a start. Small investments in the right places could yield large returns here.
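To illustrate that selective-investment idea, here's a minimal sketch of picking the smallest set of products that covers a target share of sales, which would be the first candidates for unique descriptions. The product names and sales figures are entirely made up for illustration:

```python
# Pick the smallest set of products whose sales cover a target share of
# total revenue - rewrite descriptions for those first.
# Product names and sales figures are hypothetical examples.

def top_products_by_share(sales, target=0.90):
    """Return best sellers (highest sales first) until `target` share
    of total sales is covered."""
    total = sum(sales.values())
    picked, covered = [], 0.0
    for name, amount in sorted(sales.items(), key=lambda kv: kv[1], reverse=True):
        picked.append(name)
        covered += amount / total
        if covered >= target:
            break
    return picked

sales = {"widget-a": 9000, "widget-b": 500, "widget-c": 300, "widget-d": 200}
print(top_products_by_share(sales))  # -> ['widget-a']
```

With a skewed catalogue like the one above, a single product covers 90% of sales, which is exactly the "small investment, large return" situation described.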
The other option that people don't like to hear but really is powerful is to consider more carefully focusing your link equity on a smaller number of products. The more products you list, the more duplicates you have, and some of those products are probably very poor sellers or have very poor profit margins. What if you focused your site architecture on 25% of the total products? You'd focus your authority more and each page would be stronger, relative to your competitors.
One easy win is to make sure you're not dealing with any internal duplicate content (product options pages, search filters, etc.). If you're compounding external duplication with internal duplication, it's only going to make all of your problems worse. The internal duplication is much easier to solve.
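One quick way to spot that kind of internal duplication is to take a crawl of your URLs and group the ones that share a path but differ only in their query string (filters, sort orders, product options) - each group is a candidate for canonicalization. A small sketch, with hypothetical URLs and parameter names:

```python
# Group URLs that differ only in query string - likely internal
# duplicates created by search filters or product options.
# The crawl list and parameter names are hypothetical examples.
from collections import defaultdict
from urllib.parse import urlsplit

def duplicate_groups(urls):
    """Group URLs by scheme+host+path; groups with more than one URL
    are probable duplicate-content clusters."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        groups[(parts.scheme, parts.netloc, parts.path)].append(url)
    return {key: v for key, v in groups.items() if len(v) > 1}

crawl = [
    "https://example.com/product/123",
    "https://example.com/product/123?color=red",
    "https://example.com/product/123?sort=price",
    "https://example.com/product/456",
]
for variants in duplicate_groups(crawl).values():
    print(variants)  # each group should resolve to one canonical URL
```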
-
Thank you for your answer. Comparing DA and PA, ours are a little better, 48 vs 49 (DA), and our front page's PA is higher too. That said, Open Site Explorer data (DA and PA) isn't very reliable for international markets like ours; Ahrefs captures link profiles better here. But given such a small difference in backlinks, it's a bit strange that they can get so much better results.
It's a small international market, so customer reviews aren't an option - nobody gives them here. We already have a review feature, but nobody submits anything.
So my main question is: what factors does Google look at when ranking the same duplicate products? We know they count DA, PA, etc., and, as I understand it, who gets indexed first. Does anybody know what else?
-
The reason your competitor is ranking better could be the value of their DA and PA. Without looking specifically, it would be hard to say. Google isn't going to show two pages that are exactly the same, which is why it says similar pages have been omitted.
I would not suggest using a link farm. This can only bring you disaster in the long run.
Have you thought about getting customer reviews on the page? Using a program that puts customer reviews on the page, so they are visible in the source code, is a good way to start taking the duplicate content out of the equation. You should also put some focus into building quality links. It isn't the quantity of links that matters, but the quality.