How do you SEO a website that is being held back by duplicate content?
-
We have over 20 websites that sell property, each targeted to a different country. People advertise their property for sale on them. The websites are not reaching page 1 for the terms we want, most likely because of duplication issues. If we compare one country website with another on www.duplicatecontent.net, the overlap is nearly 70%, so we are trying to understand why. If someone wanted to sell a property in Spain, we would create an advert for them, but rather than putting it on the back-end of the Spain website, it goes onto a separate website that covers all countries. We have tried adding nofollow tags so that the country-specific website gets credit as the original source, but the rankings for key terms will not rise and the duplication remains at nearly 70%. Can anyone suggest the best way forward?
-
You are mixing up terminology.
noindex – a meta robots directive; it applies to an entire webpage.
nofollow – a rel attribute; it applies to individual links.
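For reference, here is a minimal sketch of what each one looks like in practice (the URL is a hypothetical placeholder):

```html
<!-- noindex: a page-level meta robots directive placed in the <head>;
     it keeps the whole page out of the search index -->
<meta name="robots" content="noindex">

<!-- nofollow: a link-level rel attribute; it tells crawlers not to
     pass credit through this one specific link -->
<a href="http://www.example-property-site.com/listing" rel="nofollow">View listing</a>
```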
-
How about creating RSS feeds on the Turkey site and adding them to all the other sites in your network (Russia, etc.)?
This would give you more links, and the content would exist only once, as the RSS feed usually refers back to the original link.
Regards,
Jim Cetin
-
Sorry for being unclear. We are only interested in attracting people from the UK and Ireland to sell their property in, for example, Spain, and we already have the geographical targeting set up. That is not the issue. The issue is that the Spain website is showing 70% duplicate content with the Turkey website. An advert for a Turkish property is entered in the back-end of propertyanywhere.com and then made live on that website as well as on the Property Turkey website, a Russian website, and a Chinese website. Our tech team have put a nofollow on the "Property Anywhere" website as well as the Russian and Chinese websites, so that the Property Turkey website is treated as the original content and its natural listings benefit. However, the natural listings won't go up, and the Turkey website still shows as 70% duplicate content of, for example, the Spain website. Very messy, I know, which is our problem.
-
I'm a little confused about what you are trying to accomplish, but I'll give it a shot.
If you have a website that you want to rank ONLY in Spain, go to GWT > Site configuration > Settings > Geographic target > Spain. IF you do this, nothing on this site will show anywhere except in http://www.google.es/. IF you are trying to sell property in Spain to an American, then don't do this, because your listings will never appear in Google.com.
If you are having a duplicate content problem, then fix it with rel=canonical, using a plugin in your CMS: http://www.mattcutts.com/blog/rel-canonical-html-head/
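In a setup like the one described, a cross-domain canonical is a sketch of how the duplicate copies could point back at the page meant to be the original; the URLs below are hypothetical placeholders, not the poster's actual domains:

```html
<!-- Placed in the <head> of each duplicate copy of a listing
     (e.g. on propertyanywhere.com and the Russian and Chinese sites),
     pointing at the country site meant to rank as the original. -->
<head>
  <link rel="canonical" href="http://www.property-turkey-example.com/listings/villa-123">
</head>
```

Unlike nofollow, which only affects individual links, the canonical tag tells search engines which URL should be treated as the original version of the page itself.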
Related Questions
-
To avoid the duplicate content issue I have created new urls for that specific site I am posting to and redirecting that url to the original on my site. Is this the right way to do it?
I am trying to avoid the duplicate content issue by creating new urls and redirecting them to the original url. Is this the proper way of going about it?
On-Page Optimization | | yagobi210 -
Suggestions on dealing with duplicate content?
What are the best ways to protect / deal with duplicate content? I've added an example scenario, Nike Trainer model 1 – has an overview page that also links to a sub-page about cushioning, one about Gore-Tex and one about breathability. Nike Trainer model 2,3,4,5 – have an overview page that also links to sub-pages page about cushioning , Gore-Tex and breathability. In each of the sub-pages the URL is a child of the parent so a distinct page from each other e.g. /nike-trainer/model-1/gore-tex /nike-trainer/model-2/gore-tex. There is some differences in material composition, some different images and of course the product name is referred multiple times. This makes the page in the region of 80% unique. Any suggestions welcome about the above example or any other ways you guys know of dealing with duplicate content.
On-Page Optimization | | punchseo0 -
Would a free PDF download diminish SEO benefits of HTML content?
Hello, I am doing SEO for a company that, as a sideline business, sells four books written by the principals; the content is directly relevant to the company's primary business focus. Book sales are a tiny fraction of our overall revenue, and we don't expect that to change, although we will continue to sell the books. In addition to selling them, we have decided to convert the books to HTML and post them for free on our website (laid out by chapter and section). The hope is that this will result in goodwill, links, traffic, and ultimately improved search rankings. My question: Would offering free PDF downloads of the books (in addition to posting the HTML content) diminish the SEO benefits of the HTML content? If we don't offer the PDF option, people would have to visit our site to read the content (unless they bought a hard copy). If visitors were able to download a free PDF, they wouldn't need to return to our site to read it. If our corporate clients (nearly all of our clients are corporations) could download a PDF, they could then post it on an intranet instead of posting a link to our site. In general, do you think a visitor would be less likely to link to our site if he or she were able to download the PDF? Or would the appeal of the PDF option make it more likely that people would visit and link to the site? Also, if we offer the PDF option, are there any SEO issues related to duplicate content? Finally, if we did offer the free PDF download, would you recommend that we ask for an email address before giving the PDF? Thank you very much!
On-Page Optimization | | nyc-seo0 -
Help! A couple of basic questions on dup. content, pagination and tumblr blogs.
Hi, and many thanks in advance for any assistance. According to our GWMT we currently have over a thousand duplicated title tags and meta descriptions. These stem from tabs that we have located beneath the body copy, which when you click on them display offers or itineraries (we're a travel company). So the URLs change to having "?st=Offer" or "?st=Itinerary" at the end, and are considered to be duplicating the original page's title and meta des. Sometimes the original page is also paginated, and shows the same duplication errors. What would be the best way to ensure we're not duplicating anything? Also, we have a tumblr blog, where there's single page displaying all the blog content, but also links to each blog on a separate individual page. We would like to keep the individual pages as we can optimise to target specific keywords, but want to avoid any duplication issues again. Any advice would be greatly appreciated.
On-Page Optimization | | LV70 -
Meta Data definition for multiple pages. Potential duplicate content risk?
Hi all, One of our clients needs to redefine their meta title and description tags. They publish very similar information almost every day, so the structure they propose is the following: Structure 1: Type of Analysis + periodicity + date + brand name Examples 1: Monthly Market Analysis, 1/5/2012 - Brand Name Weekly Technical Analysis, 7/5/2012 - Brand Name Structure 2: Company Name + investment recommendation + periodicity Example 2: Iberdrola + investment recommendation (this text doesn't vary) + 2T12 (which means 2012, 2nd trimester) Regarding meta description they want to follow a similar approach, replicating every time the same info with a slight variation for each publication. I'm afraid this may cause a duplicate content problem because of the resemblance of every "Market Analysis" done or every "Investment recommendation" done in the future. My initial suggestion for them is to define specific and unique meta data for each page, but this is not possible for them given the time it takes to do it for every page. Finally, I asked them to specify the date in each meta title of content published, in order to add something different each time and avoid a duplicate content penalty. Will this be enough to avoid duplicate content issues? Thanks in advance for your help folks! Alex
On-Page Optimization | | elisainteractive0 -
Duplicate content issue in SEOmoz campaign.
Hi, We are running a campaign for a website in SEOmoz. We get a duplicate content warning: http://www.oursite.com and http://www.oursite.com/ are being seen as 2 different urls. The only difference between the two urls is the trailing slash at the end of the second one. Why is this happening? I was aware of www vs non-www but never heard of an issue related to the slash. Thanks for your help!
On-Page Optimization | | gerardoH1 -
Duplicate content on homepage?
Hi I have just created a new campaign and it states that I have duplicate page content which would affect search rankings. Basically it is counting my site www.mydomain.com and www.mydomain.com/index.php as two separate pages. How can I make it so that only www.mydomain.com is visible, reducing the duplicate content issue? Many Thanks
On-Page Optimization | | idv0 -
Duplicate content Issue
I'm getting a report of duplicate title and content on: http://www.website.com/ http://www.website.com/index.php Of course, they're the same page, but does this need to be corrected somehow? Thanks!
On-Page Optimization | | dbaxa-2613380