Duplicate product description ranking problems (off-site duplicate content)
-
We do business in a niche category, not in the English-language market. We have 2-3 main competitors who use the same product information as us, so they all have the same duplicate product descriptions we do. We and one competitor have the domains with the highest authority in this market. They may have a 10-20% better link profile (counting linking domains and total links).
The problem is that they rank much better for product-name keywords than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products and some old ones go to the Supplemental Results and are shown under "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included."
Unique text for products isn't an option, even though when we have written unique content for a product, it seems to rank far better. So our question is: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text?
How important is indexation time? Does getting indexed first give a big advantage? We have thought about using more RSS/ping services to get indexed faster (both sites receive the product information at almost the same time). It seems our competitor gets into the index quicker than we do.
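As a sketch of how indexation can be nudged along: at the time, the major search engines exposed simple sitemap "ping" endpoints that could be hit whenever new product URLs were added to the sitemap. The endpoint patterns and the sitemap URL below are illustrative assumptions, not a guaranteed current API (Google has since deprecated its ping endpoint):

```python
# Sketch: notify search engines that a sitemap has fresh product URLs.
# The endpoint bases below follow the historical Google/Bing ping pattern;
# treat them as illustrative rather than a supported, current API.
from urllib.parse import urlencode

def build_ping_url(engine_base: str, sitemap_url: str) -> str:
    """Build a sitemap-ping URL for a given search engine endpoint."""
    return engine_base + "?" + urlencode({"sitemap": sitemap_url})

# Hypothetical sitemap location for illustration only.
SITEMAP = "https://www.example.com/sitemap-products.xml"

for base in ("https://www.google.com/ping", "https://www.bing.com/ping"):
    # In practice you would issue an HTTP GET to each URL after a sitemap update.
    print(build_ping_url(base, SITEMAP))
```

Pinging only helps discovery speed; it does nothing for the duplicate-content problem itself.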
Also, are link-farm pages helpful for getting some quick, low-value links for new products? We have planned to make 2-3 domains with a few links pointing to these new products, to get a little advantage right after the products launch, before they have any external links.
The sitemap works and our new products are shown on front pages (products that still mostly don't rank well and go to the Supplemental Results). Some new products have #1 or top-3 rankings, but those are only maybe a third of the ones that should rank in the top 3.
We have also noticed that when we get products indexed quickly (for example via Fetch as Google), they initially get good top-3 results, and then some drop out of the rankings (into the Supplemental Results).
-
There's no easy answer, I'm afraid, and if an answer looks too easy, I'd stay away from it. Building low-quality links might help in the short term, but it's too high-risk in the long term. Plus, if you're combining it with duplicate content, you've got multiple quality issues in play (at least in Google's eyes - I'm not making a judgment call about reusing product descriptions, which is very common).
You say that unique text is proven to have worked, and yet it isn't an option. Why? If it's a matter of time/cost, I'd strongly consider not only the long-term ROI but the possibility of investing selectively. For example, you don't have to write unique text for every product you sell (or re-sell) - you could pick the top 10% of products (which may account for 90% of sales) and start with those. Even the top 1% would be a start. Small investments in the right places could yield large returns here.
The other option that people don't like to hear, but that really is powerful, is to focus your link equity more carefully on a smaller number of products. The more products you list, the more duplicates you have, and some of those products are probably very poor sellers or have very poor profit margins. What if you focused your site architecture on 25% of the total products? You'd concentrate your authority, and each page would be stronger relative to your competitors'.
One easy win is to make sure you're not dealing with any internal duplicate content (product options pages, search filters, etc.). If you're compounding external duplication with internal duplication, it's only going to make all of your problems worse. The internal duplication is much easier to solve.
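A minimal sketch of the usual fix for that internal duplication: point parameterised or filtered product URLs at the clean product URL with rel=canonical (the URLs below are hypothetical):

```html
<!-- On a hypothetical filtered URL such as /product-123?color=red&sort=price,
     declare the clean product page as the canonical version in the <head>: -->
<link rel="canonical" href="https://www.example.com/product-123" />
```

That way filter and sort variants consolidate their signals onto one product page instead of competing with it.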
-
Thank you for your answer. Comparing DA and PA, ours are a little bit better, 48 vs 49 (DA), and our front page PA is also better. But Open Site Explorer data (DA and PA) isn't really good for international markets like ours; Ahrefs captures link profiles better here. Still, with such a small difference in backlinks, it's a little strange that they can get so much better results.
It's a small international market, so customer reviews aren't an option; nobody gives them here. We already have a reviews feature, but nobody submits any.
So my main question is: what factors does Google look at when ranking the same duplicate products? We know they count DA, PA, and so on, and as I understand it, also who gets indexed first. Does anybody know what else?
-
The reason your competitor is ranking better could be the value of their DA and their PA. Without looking specifically it would be hard to say. Google isn't going to show two pages that are exactly the same, which is why they say similar pages have been omitted.
I would not suggest using a link farm. This can only bring you disaster in the long run.
Have you thought about getting customer reviews on the page? Using a program that puts customer reviews on the page, so that they are visible in the source code, is a good way to start taking the duplicate content out of the equation. You should also put some focus into building quality links. It isn't the quantity of links you have, but the quality.
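To illustrate the "visible in the source code" point, a hedged sketch: a crawlable review rendered in the HTML (not injected by a third-party iframe or script) can also carry schema.org markup. All names and values below are hypothetical:

```html
<!-- Hypothetical product page snippet: a customer review rendered
     server-side, marked up with schema.org microdata so crawlers
     see the unique review text in the page source. -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="review" itemscope itemtype="https://schema.org/Review">
    <span itemprop="author">A. Customer</span>:
    <span itemprop="reviewBody">Works exactly as described.</span>
    <div itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
      <meta itemprop="ratingValue" content="5" />
    </div>
  </div>
</div>
```

Each review adds unique, on-topic text to an otherwise duplicated page, which is the real lever here; the markup is a bonus for rich snippets.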