SEO with duplicate content for 3 geographies
-
The client would like us to do SEO for these 3 sites:
http://www.solutionselectrical.com.au
http://www.calculatecablesizes.co.uk/
The sites have to be targeted at the US, Australia, and UK respectively. All the above sites
have identical content. Will Google penalise the sites?
Shall we change the content completely? How do we approach this issue?
-
So, shall I accept the project considering I am safe at SEOmoz!
That is entirely your decision. I would not recommend depending on SEOmoz for assistance. The Q&A is a fantastic resource for asking an occasional question, but some questions go unanswered and the quality of answers can vary.
You are being paid for your expertise on a subject. Only accept the job if you are confident you can offer a solid benefit to the client. I was very candid with my first clients about my experience. I offered to work hard, work extra and work for less money but I did not ever hide my lack of experience. You may wish to do the same.
Any tool to know the different terms used by Australian and UK people?
None that I am aware of. I would recommend locating someone from each country.
-
So, shall I accept the project considering I am safe at SEOmoz! Any tool to know the different terms used by Australian and UK people? As the content needs to be tailored to each geography.
-
Please clarify this.
Atul, the clarification is the 5 bullet points immediately following that statement.
What does it signify?
It signifies the language used on the page.
Is this necessary?
It depends what you mean by necessary.
It is a step towards solid SEO. Most solid SEO involves multiple layers. The idea is it would require multiple failures to cause a problem. I would recommend this step on any site which targets multiple languages or countries.
One could argue it is unnecessary because the proper setting in Google WMT alone should resolve the matter. But then again, the same setting would need to be made for any search engines for which you wish the site to rank.
What is the language code for UK and Australia?
Alex offered a good response to this question.
-
http://en.wikipedia.org/wiki/Language_localisation#Language_tags_and_codes
Australia isn't listed there but it's en-AU. It's necessary if you want to help Google recognise the sites are targeted to different countries, as Ryan mentions language and spellings differ slightly in various English-speaking countries.
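As an illustration of those language tags in practice, here is a minimal sketch (in Python, for brevity) of the rel="alternate" hreflang annotations Google documents for sites that serve the same content to different countries. The US URL below is a hypothetical placeholder, since only the .com.au and .co.uk addresses appear in the question:

```python
# Sketch: hreflang alternate links for three country-targeted sites
# that share the same English content. BCP 47 codes: en-US, en-GB, en-AU.
SITES = {
    "en-AU": "http://www.solutionselectrical.com.au",
    "en-GB": "http://www.calculatecablesizes.co.uk",
    "en-US": "http://www.example.com",  # hypothetical; the US URL is not given above
}

def hreflang_tags(sites):
    """Build the <link rel="alternate"> tags each variant's <head> would carry."""
    return [
        '<link rel="alternate" hreflang="%s" href="%s/" />' % (code, url)
        for code, url in sorted(sites.items())
    ]

for tag in hreflang_tags(SITES):
    print(tag)
```

Each of the three sites would carry the full set of tags, so Google can connect the variants and serve the right one per country.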
-
Since all three sites are in the same language, be sure each site is properly directed to its respective country.
Please clarify this.
Use the proper language code meta tag, such as en-US for the .com.
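As a minimal sketch of what that looks like, assuming the conventional http-equiv form of the tag and standard BCP 47 casing (en-US rather than EN-US):

```python
# Sketch: the content-language meta tag each country-targeted site declares.
def content_language_tag(code):
    """Return the meta tag for a BCP 47 language-region code, e.g. en-US."""
    return '<meta http-equiv="content-language" content="%s">' % code

# .com -> US, .co.uk -> UK, .com.au -> Australia
for code in ("en-US", "en-GB", "en-AU"):
    print(content_language_tag(code))
```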
Is this necessary? What does it signify? What is the language code for UK and Australia?
-
The sites have to be targeted at the US, Australia, and UK respectively. All the above sites have identical content. Will Google penalize the sites?
No. Google does not penalize sites for duplicate content if each site targets a different country.
Since all three sites are in the same language, be sure each site is properly directed to their respective countries. A few steps to take:
-
Use the proper language code meta tag, such as en-US for the .com.
-
You can set the country targets in Google WMT.
-
Use the proper form of English for each country. For example, US English should show "penalize" where UK and Australian English would show the same word as "penalise" (I think).
-
Use the proper currency and measurement systems for each country.
-
Use the appropriate cultural references for each site.
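The last three steps can be sketched together; the spelling map, currency codes, and units below are illustrative assumptions rather than a complete localisation table:

```python
# Sketch: per-country localisation of spelling, currency, and units.
# All mappings here are illustrative examples, not exhaustive.
LOCALES = {
    "US": {"spelling": {"penalise": "penalize", "localisation": "localization"},
           "currency": "USD", "distance_unit": "miles"},
    "UK": {"spelling": {}, "currency": "GBP", "distance_unit": "miles"},
    "AU": {"spelling": {}, "currency": "AUD", "distance_unit": "kilometres"},
}

def localise(text, country):
    """Rewrite spellings in `text` to the target country's English."""
    for src, dst in LOCALES[country]["spelling"].items():
        text = text.replace(src, dst)
    return text

print(localise("Google will not penalise the sites.", "US"))
# -> Google will not penalize the sites.
```

In practice each country's copy would be reviewed by a native speaker, as suggested earlier in the thread, rather than rewritten mechanically.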