SEO with duplicate content for 3 geographies
-
The client would like us to do SEO for these 3 sites:
http://www.solutionselectrical.com.au
http://www.calculatecablesizes.co.uk/
The sites have to be targeted to the US, Australia, and UK respectively. All the above sites
have identical content. Will Google penalise the sites?
Shall we change the content completely? How do we approach this issue?
-
So, shall I accept the project considering I am safe at SEOmoz!
That is entirely your decision. I would not recommend depending on SEOmoz for assistance. The Q&A is a fantastic resource for asking an occasional question, but some questions go unanswered and the quality of answers can vary.
You are being paid for your expertise on a subject. Only accept the job if you are confident you can offer a solid benefit to the client. I was very candid with my first clients about my experience. I offered to work hard, work extra and work for less money but I did not ever hide my lack of experience. You may wish to do the same.
Is there any tool to know the different terms used by Australian and UK people?
None that I am aware of. I would recommend locating someone from each country.
-
So, shall I accept the project considering I am safe at SEOmoz! Is there any tool to know the different terms used by Australian and UK people? The content needs to be tailored to each geography.
-
Please clarify this.
Atul, the clarification is the 5 bullet points immediately following that statement.
What does it signify?
It signifies the language used on the page.
Is this necessary?
It depends what you mean by necessary.
It is a step towards solid SEO. Most solid SEO involves multiple layers; the idea is that it would require multiple failures to cause a problem. I would recommend this step on any site which targets multiple languages or countries.
One could argue it is unnecessary because the proper setting in Google WMT alone should resolve the matter. But then again, the same setting would need to be made for any other search engine in which you wish the site to rank.
What is the language code for the UK and Australia?
Alex offered a good response to this question.
-
http://en.wikipedia.org/wiki/Language_localisation#Language_tags_and_codes
Australia isn't listed there, but it's en-AU. It's necessary if you want to help Google recognise that the sites are targeted to different countries; as Ryan mentions, language and spellings differ slightly in various English-speaking countries.
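To make the tags concrete, here is a minimal sketch of how each site's opening html tag might declare its language variant. The domain-to-country pairing follows the question (the third, US-targeted domain is not named there), and note the UK tag is en-GB rather than en-UK:

```html
<!-- Australian site (www.solutionselectrical.com.au) -->
<html lang="en-AU">

<!-- UK site (www.calculatecablesizes.co.uk) -->
<html lang="en-GB">

<!-- US-targeted site (domain not given in the question) -->
<html lang="en-US">
```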
-
Since all three sites are in the same language, be sure each site is properly directed to its respective country.
Please clarify this.
Use the proper language code meta tag, such as en-US for the .com.
Is this necessary? What does it signify? What is the language code for the UK and Australia?
-
The sites have to be targeted to the US, Australia, and UK respectively. All the above sites have identical content. Will Google penalize the sites?
No. Google does not penalize sites for duplicate content if each site targets a different country.
Since all three sites are in the same language, be sure each site is properly directed to its respective country. A few steps to take:
-
Use the proper language code meta tag, such as en-US for the .com (see the sketch after this list).
-
You can set the country targets in Google WMT.
-
Use the proper form of English for each country. For example, US English should show "penalize", where UK and Australian English would spell the same word "penalise" (I think).
-
Use the proper currency and measurement systems for each country.
-
Use the appropriate cultural references for each site.
-
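Pulling the meta tag, spelling, and currency steps together, here is a hypothetical sketch of what the head and a fragment of body copy might look like for the Australian site. The title, prices, and wording are invented for illustration, not taken from the actual site, and the content-language meta tag is just one common way to declare the page language:

```html
<!-- Hypothetical markup for the Australian site (www.solutionselectrical.com.au) -->
<html lang="en-AU">
<head>
  <!-- Language code meta tag: en-AU here; en-GB for the UK site, en-US for the US site -->
  <meta http-equiv="content-language" content="en-AU">
  <!-- Invented title for illustration -->
  <title>Cable Size Calculator | Solutions Electrical Australia</title>
</head>
<body>
  <!-- Australian spelling ("specialise"), currency (A$) and metric units -->
  <p>We specialise in sizing electrical cables for Australian installations.</p>
  <p>Typical call-out fee: A$120. Cable runs are quoted per metre.</p>
</body>
</html>
```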