Regional and Global Site
-
We have numerous versions of what is basically the same site, each targeting a different country, such as the United States, the United Kingdom and South Africa.
These websites use country-code TLDs to designate the region, for example .co.uk and .co.za. I believe this is sufficient (with a little help from Google Webmaster Tools) to convince the search engines which site is for which region.
My question is: how do we tell the search engines to send traffic from regions other than those above to our global site, which has a .com TLD?
For example, we don't have a Brazilian site, so how do we drive traffic from Brazil to our global .com site?
Many thanks,
Jason
-
Hi Jason,
If you use the unique ccTLDs and the hreflang (rel="alternate") tag, this duplication will be fine. The tag was introduced in late 2011 and tells Google: "just because this content is the same on an Australian site, a British site and an American site, that's okay - it has been done on purpose." You can also use it to point to direct translations, e.g. "this Spanish content is the same as this English content over here, but one is meant for the UK and one for Argentina." Lastly, you can also use the tag as markup to say "this is French content meant for Canada, and this English content over here is also meant for Canada."
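For illustration, here is a minimal sketch of what that markup might look like in the head of each of those regional homepages (example.com.au, example.co.uk and example.com are purely hypothetical placeholder domains):

```html
<!-- The same block of annotations goes on all three pages,
     each page referencing itself and its alternates. -->
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
```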
More information about the tag is available here and here.
Cheers,
Jane
-
That last comment kind of worried me. Each site has a separate domain, but is the content all the same? You'd basically be competing with each other and even with your global site, not to mention suffering from possible duplicate content issues. I'm not sure what kind of approach you've taken, but I can't think of too many reasons one would want to host the same site on multiple domains.
But to answer your question, yes. More tips here: https://support.google.com/webmasters/answer/62399?hl=en
Good luck Jason. I'm not going to sleep well because of what I read here tonight, but if you're fulfilling your business goals with this approach I'll just have to trust that you know what you're doing.
-
Hi Kevin,
Thanks for your quick and helpful reply.
So if I understand you correctly, not specifically targeting a region in Webmaster Tools and using a region-agnostic TLD such as .com tells Google that this is our global site. Leaving our global site un-targeted and the regional sites targeted would be the most effective way to ensure non-regional traffic is driven to our global site.
As our sites are not multilingual, maybe the Brazilian example was not the best; I was referring to English-based queries.
With regard to your last point: no, each site is a separate domain.
Thanks again for your help and advice.
Many thanks,
Jason
-
Hey Jason,
You're basically telling Google to do just that when you set the geographic target for the .com site to unlisted and set each of your other ccTLDs to target a specific region. If you don't have language-specific content for Brazil in Portuguese or Spanish, then ranking for commonly searched terms in that locale will be challenging.
If you care to post a link to your site, we can all give you better advice. One question I have, after thinking about your question for a minute, is: do you have these ccTLDs all pointing to the same site? Is your content translated into different languages? Unless there is a strong rationale for separate ccTLDs, e.g. sales tracking, conversions, etc., I'd say you're probably hurting your rankings by having so many URLs all pointing to the same page. This spreads link juice out between all the pages instead of concentrating it on a single URL.
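If it helps, here is a hedged sketch of how the un-targeted global .com could be folded into the same hreflang markup so that it catches searchers in regions without a dedicated site, such as Brazil. The domains are hypothetical placeholders, and the x-default value shown is a later addition Google made to the hreflang annotations for exactly this "all other regions" case, so it isn't something discussed above:

```html
<!-- Regional pages are declared for their specific regions... -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-za" href="http://www.example.co.za/" />
<!-- ...while the un-targeted .com is declared as generic English and as
     the default for any language/region not listed above (e.g. Brazil). -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```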