SEO with duplicate content for 3 geographies
-
The client would like us to do SEO for these three sites:
http://www.solutionselectrical.com.au
http://www.calculatecablesizes.co.uk/
The sites have to be targeted to the US, Australia, and UK respectively. All the above sites have identical content. Will Google penalise the sites?
Should we change the content completely? How do we approach this issue?
-
So, shall I accept the project, considering I am safe at SEOmoz?
That is entirely your decision. I would not recommend depending on SEOmoz for assistance. The Q&A is a fantastic resource for asking an occasional question, but some questions go unanswered and the quality of answers can vary.
You are being paid for your expertise on a subject. Only accept the job if you are confident you can offer a solid benefit to the client. I was very candid with my first clients about my experience. I offered to work hard, work extra and work for less money but I did not ever hide my lack of experience. You may wish to do the same.
Is there any tool to learn the different terms used by Australian and UK people?
None that I am aware of. I would recommend locating someone from each country.
-
So, shall I accept the project, considering I am safe at SEOmoz? Is there any tool to learn the different terms used by Australian and UK people? The content needs to be tailored to each geography.
-
Please clarify this.
Atul, the clarification is the five bullet points immediately following that statement.
what does it signify ?
It signifies the language used on the page.
Is this necessary ?
It depends what you mean by necessary.
It is a step towards solid SEO. Most solid SEO involves multiple layers. The idea is it would require multiple failures to cause a problem. I would recommend this step on any site which targets multiple languages or countries.
One could argue it is unnecessary because the proper setting in Google WMT alone should resolve the matter. But then again, the same setting would need to be made for any search engines for which you wish the site to rank.
What is the language code for the UK and Australia?
Alex offered a good response to this question.
-
http://en.wikipedia.org/wiki/Language_localisation#Language_tags_and_codes
Australia isn't listed there, but it's en-AU. It's necessary if you want to help Google recognise that the sites are targeted to different countries; as Ryan mentions, language and spellings differ slightly in various English-speaking countries.
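To make the codes concrete, here is a minimal sketch of where a language/region tag such as en-AU might be declared on the Australian site. The exact placement depends on the site's templates; the `lang` attribute on the `html` element and the Content-Language meta tag are two common spots:

```html
<!-- Sketch for the Australian site: declare the regional English variant.
     The UK site would use en-GB and the US site en-US instead. -->
<html lang="en-AU">
<head>
  <meta http-equiv="Content-Language" content="en-AU">
</head>
```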
-
Since all three sites are in the same language, be sure each site is properly directed to its respective country.
Please clarify this.
Use the proper language code meta tag, such as EN-US for the .com.
Is this necessary? What does it signify? What is the language code for the UK and Australia?
-
The sites have to be targeted to the US, Australia, and UK respectively. All the above sites have identical content. Will Google penalize the sites?
No. Google does not penalize sites for duplicate content if each site targets a different country.
Since all three sites are in the same language, be sure each site is properly directed to its respective country. A few steps to take:
-
Use the proper language code meta tag, such as EN-US for the .com.
-
You can set the country targets in Google WMT.
-
Use the proper form of English for each country. For example, US English should show "penalize" where UK and Australian English would show the same word as "penalise" (I think).
-
Use the proper currency and measurement systems for each country.
-
Use the appropriate cultural references for each site.
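A further, optional layer on top of the steps above is Google's rel="alternate" hreflang link elements, which let the equivalent pages on the three sites point at one another. A sketch, assuming the US site lives on a .com domain; the thread does not name that URL, so example.com stands in for it here:

```html
<!-- Placed in the <head> of the equivalent page on each of the three sites.
     www.example.com is a stand-in for the unnamed US domain. -->
<link rel="alternate" hreflang="en-US" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-AU" href="http://www.solutionselectrical.com.au/" />
<link rel="alternate" hreflang="en-GB" href="http://www.calculatecablesizes.co.uk/" />
```

Like the country setting in Google WMT, this only affects Google (and any other engines that support the annotation), so it complements rather than replaces the on-page language tags.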