SEO with duplicate content for 3 geographies
-
The client would like us to do SEO for these 3 sites:
http://www.solutionselectrical.com.au
http://www.calculatecablesizes.co.uk/
The sites have to be targeted at the US, Australia, and the UK respectively. All the above sites
have identical content. Will Google penalise the sites?
Shall we change the content completely? How do we approach this issue?
-
So, shall I accept the project considering I am safe at SEOmoz?
That is entirely your decision. I would not recommend depending on SEOmoz for assistance. The Q&A is a fantastic resource for asking an occasional question, but some questions go unanswered and the quality of answers can vary.
You are being paid for your expertise on a subject. Only accept the job if you are confident you can offer a solid benefit to the client. I was very candid with my first clients about my experience. I offered to work hard, work extra and work for less money but I did not ever hide my lack of experience. You may wish to do the same.
Is there any tool to know the different terms used by Australian and UK people?
None that I am aware of. I would recommend locating someone from each country.
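That said, a rough first pass can be scripted by hand. The sketch below is a minimal illustration with a tiny, hand-picked word list (an assumption for demonstration, not an authoritative dictionary); it flags US spellings that would need localising for the UK and Australian sites:

```python
# Minimal sketch: flag US spellings that should be localised on UK/AU pages.
# The map below is a small assumed sample, not a complete dictionary.
US_TO_UK_AU = {
    "penalize": "penalise",
    "color": "colour",
    "organization": "organisation",
    "center": "centre",
}

def flag_us_spellings(text):
    """Return (us_word, suggested_replacement) pairs found in the text."""
    words = text.lower().split()
    return [(w, US_TO_UK_AU[w]) for w in words if w in US_TO_UK_AU]

print(flag_us_spellings("Google will not penalize the color scheme"))
```

A native reviewer from each country would still be needed to catch vocabulary differences (not just spellings) and punctuation attached to words, which this naive split-based check misses.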
-
So, shall I accept the project considering I am safe at SEOmoz? Is there any tool to know the different terms used by Australian and UK people, as the content needs to be tailored to each geography?
-
Please clarify this.
Atul, the clarification is the five bullet points immediately following that statement.
What does it signify?
It signifies the language used on the page.
Is this necessary?
It depends on what you mean by necessary.
It is a step towards solid SEO. Most solid SEO involves multiple layers. The idea is it would require multiple failures to cause a problem. I would recommend this step on any site which targets multiple languages or countries.
One could argue it is unnecessary because the proper setting in Google WMT alone should resolve the matter. But then again, the same setting would need to be made for any search engines for which you wish the site to rank.
What is the language code for the UK and Australia?
Alex offered a good response to this question.
-
http://en.wikipedia.org/wiki/Language_localisation#Language_tags_and_codes
Australia isn't listed there, but it's en-AU. It's necessary if you want to help Google recognise that the sites are targeted to different countries; as Ryan mentions, language and spellings differ slightly in various English-speaking countries.
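As a concrete illustration, those codes could be declared either in the `html` element's `lang` attribute or in a content-language meta tag. The snippet below is a sketch, not markup taken from the client's sites:

```html
<!-- Australian site -->
<html lang="en-AU">
  <head>
    <meta http-equiv="content-language" content="en-AU">
  </head>
</html>
<!-- The UK site would use en-GB and the US site en-US in the same way. -->
```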
-
Since all three sites are in the same language, be sure each site is properly directed to its respective country.
Please clarify this.
Use the proper language code meta tag, such as en-US for the .com.
Is this necessary? What does it signify? What is the language code for the UK and Australia?
-
The sites have to be targeted at the US, Australia, and the UK respectively. All the above sites have identical content. Will Google penalize the sites?
No. Google does not penalize sites for duplicate content if each site targets a different country.
Since all three sites are in the same language, be sure each site is properly directed to its respective country. A few steps to take:
-
Use the proper language code meta tag, such as en-US for the .com.
-
You can set the country targets in Google WMT.
-
Use the proper form of English for each country. For example, US English should show "penalize" where UK and Australian English would show the same word as "penalise" (I think).
-
Use the proper currency and measurement systems for each country.
-
Use the appropriate cultural references for each site.
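The first two steps above can be sketched as markup. A related mechanism, hreflang link annotations, lets Google pair up identical-language pages targeted at different countries; the example.com domains below are placeholders for illustration, not the client's actual URLs:

```html
<!-- On each page, cross-reference the equivalent page on all three sites. -->
<link rel="alternate" hreflang="en-US" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/" />
```

These annotations complement, rather than replace, the country-target setting in Google WMT.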
-