Duplicate Content: Almost the same site on different domains
-
Hi,
I own a couple of casting websites, and I'm currently launching "local" copies of them all over the world.
When I launch my website in a new country, the content is basically always the same, except that the language sometimes changes from country to country.
The domains will vary, so the site name would be site.es for Spain, site.sg for Singapore, site.dk for Denmark, and so on.
The websites will also feature different jobs (castings) and different profiles on the search pages, BUT the more static pages have the same content (About Us, The Concept, FAQ, Create User, and so on).
So my questions are:
- Is this something that is bad for Google SEO?
- The sites are currently NOT linking to each other with language flags or anything. Should I do this, basically to tell Google that the business behind all these sites is somewhat big?
- Is there a way to inform Google that these sites should NOT be treated as duplicate content? (A canonical tag won't do, since I want the "same" content to be listed on the local Google sites.)
Hope there are some experts here who can help.
/Kasper
-
Thanks a lot. I won't change anything then.
-
It sounds like you have a few options. Since the content is different, and NEEDS to be different, in each country (except for the standard About Us pages, etc.), you need to geo-target, and you are on the right track. Using ccTLDs automatically tells Google and Bing that you are geo-targeting. All good there.
You should not need hreflang. Each site's main content is different, not just translated, so you're fine not marking that up.
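(Just for completeness, in case two of your ccTLDs ever do end up sharing a language: hreflang annotations go in each page's <head> and look roughly like the sketch below. The domains are the example names from the question; the language codes and the .com x-default fallback are my assumptions, not something from the thread.)

```html
<!-- Sketch only: hreflang alternates for language/country versions.
     Each version lists every alternate, including itself. -->
<link rel="alternate" hreflang="es-es" href="https://site.es/" />
<link rel="alternate" hreflang="en-sg" href="https://site.sg/" />
<link rel="alternate" hreflang="da-dk" href="https://site.dk/" />
<!-- Fallback for users whose language/region isn't listed (assumed .com) -->
<link rel="alternate" hreflang="x-default" href="https://site.com/" />
```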
For the shared general content, I would recommend a canonical tag pointing to the original content. Those pages won't be very useful to you in the SERPs anyway; they're mostly branded content, so you shouldn't worry about them not appearing for each country. You could instead keep a separate copy of each page on each ccTLD, but they won't perform well. Again, these are not really important pages, so don't fret too much.
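(Concretely, a cross-domain canonical on one of those shared static pages would look something like this. The URLs are illustrative; using site.com as the "original" is my assumption, since the thread doesn't say which site came first.)

```html
<!-- In the <head> of https://site.es/about-us, https://site.dk/about-us, etc.,
     pointing Google at the original copy of the shared page -->
<link rel="canonical" href="https://site.com/about-us" />
```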
-
Google has made it clear, time and time again, that if a web page is in a different language (i.e., it's translated), then it's not considered duplicate content. So we recommend translating it into the appropriate language; it will (should) do just fine in Google and won't have any duplicate content issues.
If, however, more than one site has the same content in the same language, such as two English sites targeting different countries, then your content will need to be unique. If it's not unique, we recommend using the canonical tag to specify which version Google should use. The canonical tag should be a last resort, though, as unique content is going to be best.
Related Questions
-
Changing server location nearest to visitors? I am confused about the content part.
Hi there, we are currently hosted in Singapore, and the target audience is the US. John Mueller said to keep the URL, content, and CMS the same, but I am confused about the content part. I have been tweaking the content for a month now, and I changed content on my site a day ago. Is it bad if I change the server the next day? What should be done?
Algorithm Updates | | maria-cooper90 -
Does an EAT score on my YMYL site impact my rankings?
I've read some conflicting information on YMYL and EAT. If the Google Quality Raters are out there reviewing YMYL pages and scoring them on EAT, does that site's score have an impact on that page's/site's ranking?
Algorithm Updates | | BFMichael0 -
New Domain, Subdomain or Subfolder
Hi All, I am working with a bank that would like to rank as many parts of the company site as possible for the company name. This includes the home page, a page on careers, and a page on company reviews. The question is: is it better to structure the careers and reviews content on a subfolder, subdomain, or new domain?
Using subfolders to retain the equity of the root domain:
americanbank.com
americanbank.com/careers
americanbank.com/reviews
Or using subdomains (you lose some of the main domain equity, and it is counter to the Moz research):
americanbank.com
careers.americanbank.com
reviews.americanbank.com
Or setting up new domains, to overcome Google's bias against ranking the same root domain multiple times in the top 7 to 10 results when displaying results for a company name:
americanbank.com
http://americanbankcareers.com
http://americanbankreviews.com
Thanks for your perspective.
Algorithm Updates | | BetterAnalytics0 -
Duplicate Domain Listings Gone?
I'm noticing in several of the SERPs I track this morning that domains which formerly had multiple pages listed on pages 1-3 for the same keyword are now reduced to one listing per domain. I'm hoping that this is a permanent and widespread change, as it is a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...? An example of what I mean by "duplicate domain listings" (in case my wording is confusing): search term "Product Item", pages ranking: domain-one.com/product-item.html, domain-one.com/product-item-benefits.html, etc.
Algorithm Updates | | jesse-landry1 -
Google Site Links question
Are Google site links only ever shown for the top result? Or is it possible, for certain queries, for the site in position #2 or #3 to have site links while the #1 position doesn't? If there are any guides, tips, or write-ups regarding site links and their behavior and optimization, please share! Thanks.
Algorithm Updates | | IrvCo_Interactive0 -
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has a very poor link legacy, stretching back over five years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done; however, after months and months of waiting, nothing has happened. If anything, after the recent Penguin update, results have been further affected.
A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this, and not many people appear to agree on whether this will work.
Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, to remove all legacy and all association with the spam-ridden .com. My main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first query.
This could then cause duplicate content, with Google knowing that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow meta tag, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content.
So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful!
Thank you, Denver
Algorithm Updates | | ProdoDigital0 -
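(The deindexation setup described in that question, a robots.txt block plus a noindex meta tag, has a well-known gotcha worth flagging: if robots.txt disallows the whole site, crawlers can never fetch the pages to see the noindex tag. The sketch below is illustrative only, not from the thread.)

```html
<!-- Sketch: the noindex/nofollow meta tag, placed in each page's <head>
     on the old .com. Leave crawling ALLOWED in robots.txt until the pages
     have actually dropped out of the index; a blanket "Disallow: /" would
     prevent Googlebot from ever seeing this tag. -->
<meta name="robots" content="noindex, nofollow">
```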
Microsites for Local Search / Location Based sites?
Referring to the webinar on SEOmoz about local search presented by Nifty Marketing (http://www.seomoz.org/webinars/be-where-local-is-going): I have a question my client asked us regarding why we broke their locations out into microsites rather than just using subfolders.
Here are the details. The client has one main website in real estate. They have 5 branches; each branch covers about a 50-mile radius, and each also covers a specialized niche in its area. When we created the main site, we incorporated the full list of listings on it. We then created a microsite for each branch, which has a page of listings (the same as the main site) but includes a canonical link back to the main site.
The reason we created a microsite for each branch is that the searches for each branch are very specific to its location, and we felt that having only a subfolder would take away from the relevancy of the site and its location. Now the location sites rank on the first page for their very competitive, location-based searches.
The client, as we encourage, has had recommendations from others saying this is hurting them, not helping them. My question is this: how can this hurt them when the microsites include a home page specific to the location; a contact page optimized with location-specific information (maps, text, directions, NAP, call to action, etc.); a page listing area information about communities/events/etc.; a page of the location's agents; and, of course, real estate listings (with a canonical back to the main site)?
Am I misunderstanding? I understood that if the main site could support the separation of a section into a microsite, this would help local search. Local search is the bread and butter of this client's conversions. AND if you tell me we should go back to having subfolders for each location, won't that seriously hurt our already excellent rankings? The client sees significant visitors from the placement of the location URLs. THANKS!
Algorithm Updates | | gXeSEO
Darlene1 -
Why is site dropping in rank after we update it?
One of our sites - supereyes.com - appears to drop in rank after we update it. The client notified us of this today, and I've verified that it did indeed drop in Google: four spots since last week. He says this happens every time we make changes to the site, but then a week later it goes back up and is usually higher than it was before. I have not verified this, but I'm very worried it may not rise again.
In the past week, we've posted a new blog entry to their site and changed some of the content: specifically, we added their locations to the header, added a contact page, and put two testimonials in their sidebar. We've also had someone submitting their site to directories and local business sites like Angie's List and so forth. About 16 new backlinks have been established in the past 2-3 weeks.
Also, I should note that traffic is higher than it's ever been, but the client doesn't look at traffic; they only look at their Google results. Can anyone offer any insight into what's going on here, and whether I need to be worried the site won't rise again in the rankings?
Algorithm Updates | | aloley0