How do you SEO a website that is being held back by duplicate content?
-
We have over 20 websites that sell property, each targeted to a different country. People advertise on them to sell their property. The websites are not reaching page 1 for the terms we want, probably because of duplication issues. If we compare one country's website with another on www.duplicatecontent.net, the overlap is nearly 70%, so we are trying to understand why. If someone wants to sell a property in Spain, we create an advert for them, but rather than putting it on the back-end of the Spain website, it goes on a separate website that covers all countries. We have tried adding nofollow tags so that the country-specific website gets acknowledged as the original, but the rankings for key terms will not rise and the duplication remains at nearly 70%. Can anyone suggest the best way forward?
-
You are mixing up terminology.
<noindex>- a robots meta directive; it applies to an entire webpage, not to individual links.</noindex>
<nofollow>- applies to individual links, via the rel="nofollow" attribute.</nofollow>
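For clarity, this is what the two directives look like in markup (the URL below is a placeholder, not one of the poster's sites):

```html
<!-- noindex: a robots meta tag in the page <head>; tells search
     engines not to index this entire page -->
<meta name="robots" content="noindex">

<!-- nofollow: an attribute on an individual link; tells search
     engines not to pass ranking signals through this one link -->
<a href="https://example.com/listing" rel="nofollow">Property listing</a>
```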
-
How about creating RSS feeds on the Turkey site and adding them to all the other sites on your farm (Russia etc.)?
This would give you more links, and the content would exist only once, since the RSS feed usually links back to the original page.
Regards,
Jim Cetin
-
Sorry for being unclear. We are only interested in attracting people from the UK and Ireland to sell their property in, for example, Spain, and we already have geographical targeting set up, so that is not the issue. The issue is that the Spain website comes out as 70% duplicate content against the Turkey website. An advert for Property Turkey is put on the back-end of propertyanywhere.com and then made live on that website as well as on the Property Turkey website, plus a Russian and a Chinese website. Our tech team have put nofollow on the "Property Anywhere" website as well as on the Russian and Chinese websites, so that the Turkey Property website is treated as the original content and its natural listings are helped. However, the natural listings won't go up, and the Turkey website still comes up as 70% duplicate content of, for example, the Spain website. Very messy, I know, which is our problem.
-
I'm a little confused about what you are trying to accomplish, but I'll give it a shot.
If you have a website that you want to rank ONLY in Spain, go to GWT > Site configuration > Settings > Geographic target > Spain. "IF" you do this, nothing on this site will show anywhere except in http://www.google.es/. "IF" you are trying to sell property in Spain to an American, then don't do this, because your listings will never appear in Google.com.
If you are having a duplicate content problem, then add a rel=canonical tag using a plugin in your CMS. http://www.mattcutts.com/blog/rel-canonical-html-head/
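A rel=canonical tag is a single line in the duplicate page's head, pointing search engines at the version you want treated as the original; Google also honours it cross-domain, which matches the multi-site setup described above. A sketch with a placeholder URL:

```html
<!-- Placed in the <head> of each duplicate page (e.g. the copy on
     propertyanywhere.com or the Russian/Chinese sites), pointing at
     the page that should be treated as the original.
     The URL here is hypothetical. -->
<link rel="canonical" href="https://www.example-property-turkey.com/listing/123">
```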
Related Questions
-
Duplicate and thin content - advanced..
Hi Guys, Two issues to sort out. So we have a website that lists products and has many pages for: a) The list pages - which list all the products for that area.
On-Page Optimization | | nick-name123
b) The detailed pages - which, when clicked into from the list page, show the specific product in full. On the list page we perhaps have half the description written down; when clicked into, you see the full description.
If you search in Google for a phrase on the detailed page, you will see results for that specific page, including 'multiple' list pages it appears on. For example, let's say we are promoting 'trees' which are situated in Manhattan, and we are also promoting trees in Brooklyn; there is a crossover. So a tree listed in Manhattan will also be listed in Brooklyn, as it's close by (not from America, so don't laugh if I have areas muddled).
We then have quite a few pages with the same content as a result. I read a post a while back from the mighty Cutts who said not to worry about duplicates unless they're spammy, but what is good for one person is spammy to another. Does anyone have any ideas as to whether this is a genuine problem, and how you would solve it? Also, we know we have a lot of thin content on the site, but we don't know how to identify it. It's a large site, so it needs something automated (I think). Thanks in advance, Nick
Duplicate Content
I have a question about duplicate content. (auto generated text).
On-Page Optimization | | affigroup
Will Google consider page 1 and page 2 as duplicate content?
Page 1: You will find all the Amazon coupon codes and Amazon discount codes currently available listed below, if Amazon doesn't currently have any coupons available you may want to check for Amazon deals or find related coupon codes or promotional codes for similar online stores selling the same products as Amazon.
We always have the latest coupon codes for Amazon which are updated daily, so if you can't find any Amazon coupons here then you won't find them anywhere else.
Shop online today at Amazon, and take advantage of the coupon codes that Amazon currently has on offer, these coupon codes, offer codes, and promo codes for Amazon may never be available again.
Page 2: You will find all the Target coupon codes and Target discount codes currently available listed below, if Target doesn't currently have any coupons available you may want to check for Target deals or find related coupon codes or promotional codes for similar online stores selling the same products as Target.
We always have the latest coupon codes for Target which are updated daily, so if you can't find any Target coupons here then you won't find them anywhere else.
Shop online today at Target, and take advantage of the coupon codes that Target currently has on offer, these coupon codes, offer codes, and promo codes for Target may never be available again.
SEO category or specific SEO page?
To rank in google.bg for a key phrase like "seo optimization for web", which do you think is better: to point most of the backlinks with anchor text "seo optimization for web" at a category page containing many SEO articles, or at a single page from that category?
On-Page Optimization | | vladokan
Creating Duplicate Content on Shopping Sites
I have a client with an eCommerce site that is interested in adding their products to shopping sites. If we use the same information that is on the site currently, will we run into duplicate content issues when those same products & descriptions are published on shopping sites? Is it best practice to rewrite the product title and descriptions for shopping sites to avoid duplicate content issues?
On-Page Optimization | | mj775
Duplicate content on video pages
Hi guys, We have a video section on our site containing about 50 videos, grouped by category/difficulty. On each video page there's pretty much nothing except the embedded player, a sentence or two describing the video, and a list of related video links. All of those pages appear as duplicate content by category. What should we do here? How long should a description be for those pages to appear unique to crawlers? Thanks!
On-Page Optimization | | lgrozeva
How would you deal with Blog TAGS & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example, the link http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category are the same as for every other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page, using the tag and/or category and post date in each Meta field to make it more "unique". But my question is: in the REAL WORLD, will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, Bing, etc. are constantly tweaking their algorithms, as of now having duplicate content primarily means that this content won't get indexed, and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions, we could 'no-follow' these links (for tag and category pages) or just not use them within our blogs. I am confused about this. Any insight others have about this, and recommendations on what action you would take, are greatly appreciated.
On-Page Optimization | | RoyMcClean
What is considered to be great quality content for e-commerce websites?
Hi there, At the moment I am building an e-commerce website selling a bunch of iPhone accessories. I am aware that great quality content is one of the most important factors for a site to stand out in the list of search results, especially when it comes to Google's latest Panda update. I was wondering what is considered to be great quality content for e-commerce websites? It would be great if you could share some ideas about how to build/create fantastic content for e-commerce websites. Many thanks!
On-Page Optimization | | PHDAustralia68
Cross Domain Duplicate Content
Hi, My client has a series of websites: one main website and several mini websites. Articles are created and published daily and weekly; one will go on the main website and the others on one, two, or three of the mini sites. To combat duplication, I only ever allow one article to be indexed (I apply noindex to the articles that I don't want indexed by Google, so if 3 sites have the same article, 2 sites will have a noindex tag added to the head). I am not completely sure if this is OK, and whether there are any negative effects, apart from the articles tagged as noindex not being indexed. Are there any obvious issues? I am aware of the canonical link rel tag, and know that it can be used on the same domain, but can it be used cross-domain, in place of the noindex tag? If so, is it exactly the same in structure as the 'same domain' canonical link rel tag? Thanks, Matt
On-Page Optimization | | mattys