What's the best international URL strategy for my non-profit?
Hi, I have a non-profit organization that advocates for mental health education and treatment. We are considering creating regional chapters of the non-profit in specific countries - France, UK, Russia, etc. What's the best long-term foundation for global organic growth? Should we simply internationalize our content (.org/uk/)? Or create a custom site for each ccTLD (.org.uk, etc.)?
Since it's an educational site, the content for each country would not be particularly unique, apart from:
- Language (regional English nuance for the UK and Australia, or other languages altogether)
- Expert videos and potentially supporting articles (e.g., hosting videos and a supporting article for a UK doctor versus a US doctor)
- Offering some regional context when it comes to treatment options, or navigating school, work, etc.
Any thoughts would be much appreciated!
Thanks!
Aaron
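For reference, the subdirectory option (.org/uk/) is usually paired with hreflang annotations so each regional page points at all of its alternates. A minimal sketch, assuming a single .org domain and hypothetical example.org URLs:

    <!-- In the <head> of every regional version of a page: -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.org/us/treatment-guide/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.org/uk/treatment-guide/" />
    <link rel="alternate" hreflang="en-au" href="https://www.example.org/au/treatment-guide/" />
    <link rel="alternate" hreflang="fr-fr" href="https://www.example.org/fr/treatment-guide/" />
    <link rel="alternate" hreflang="ru-ru" href="https://www.example.org/ru/treatment-guide/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.org/treatment-guide/" />

Each regional version carries the same full set, including a self-reference, and x-default covers visitors who match none of the listed regions.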
Related Questions
What are SEO best practices for Java Language Redirections?
We would like to get some insight on the best practice for setting up canonical URLs in the scenario below. CMS used: Liferay (we believe it runs on Java). At this stage the URL structure cannot be changed to best-practice paths (/en/ and /ar/). Currently the language redirection works like this:

English: https://www.website.com/page1?AF_language=en
Arabic: https://www.website.com/page1?AF_language=ar

Depending on how you entered the website last time, the root URL will show English or Arabic content without the suffix: https://www.website.com/page1

All three URLs are being indexed by Google, which is causing duplication and confusion. Our idea is to keep two main URLs, each with a self-referencing canonical:

https://www.website.com/page1?AF_language=en, with the canonical set to https://www.website.com/page1?AF_language=en
https://www.website.com/page1?AF_language=ar, with the canonical set to https://www.website.com/page1?AF_language=ar

However, how would you handle the root page, which does not have a specific language attached? If we need to make a choice, we would go with Arabic, as it is mainly the Arabic pages that are indexed on Google under the root URL; this way we would (hopefully) retain those rankings.

Question: has anybody had to deal with a similar situation? What would you do, and why? Thanks for all your input.
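A minimal sketch of those head tags, assuming they can be injected into Liferay's page templates (the hreflang pair is an addition beyond what the question specifies, but commonly accompanies the canonicals):

    <!-- On https://www.website.com/page1?AF_language=en -->
    <link rel="canonical" href="https://www.website.com/page1?AF_language=en" />
    <link rel="alternate" hreflang="en" href="https://www.website.com/page1?AF_language=en" />
    <link rel="alternate" hreflang="ar" href="https://www.website.com/page1?AF_language=ar" />

    <!-- On the bare root https://www.website.com/page1, one option per the
         question is to point the canonical at the Arabic version: -->
    <link rel="canonical" href="https://www.website.com/page1?AF_language=ar" />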
Remove URLs from App
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide (and an associated dedicated URL schema), e.g. https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview

Mistake - The app, in its entirety, was indexed by Google in July 2018, which basically resulted in duplicate-content penalties because the unique on-page content wasn't readable.

Partial solution - We noindexed all app pages until we were able to implement a pre-render (HTML-readable) solution with associated dynamic meta data for the Overview page in each market. We are now selectively reindexing only the free Overview pages that have unique data (with a nofollow on all other page links), but we want to keep a noindex on all other pages because the data is not uniquely readable before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.

Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? E.g., a cached result like https://screencast.com/t/xPLR78IbOEao leads to a live URL such as https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue, which has limited value to the user. (Note that Google's cached SERPs also show an old URL structure, which we have since 301ed, because we also updated the page structure in October.) Those pages are currently noindexed and will remain so for the foreseeable future.

Our sitemap and robots.txt file are up to date, but the old Search Console only offers temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on the links from the Overview page, so a bot can reach them, see the noindex on them, and remove them from the SERPs? Thanks for your help and advice!
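For context, the let-the-bot-see-the-noindex route in that last paragraph boils down to one of two directives, and both only work if Googlebot can actually crawl the URL (i.e., it isn't blocked in robots.txt). A minimal sketch:

    <!-- In the <head> of each page that should drop out of the index: -->
    <meta name="robots" content="noindex" />

    <!-- Or, as an HTTP response header (handy for pre-rendered responses): -->
    <!-- X-Robots-Tag: noindex -->

Once the crawler re-fetches a page and sees the directive, the URL is dropped from the index on Google's own schedule.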
Looking to create a "best practice" doc on location pages. Anyone know of a useful resource?
I'm working for a few regional brands and would like to create a best-practice doc for the structure of a location page. Has anyone seen anything recent regarding a structure for local, regional, and national pages? Thanks all, Kevin
Content Strategy – Blog Channel Questions
We are currently blogging at a high volume to hit keywords for our 1,500 locations across the country. We are trying to make sure we rank well near each location, and we have been using our blog to create content for that reason. With recent changes at Google, I am seeing that it is more about content topics than hitting all variations of your keywords and including state- and city-specific terms. We are now asking ourselves if the blog channel portion of our content strategy is incorrect. Below are some of our main questions; any input backed by experience would be helpful.

1. Can it hurt us to blog at a high volume (4 posts per day) in an effort to include all of our keywords and attach them to state- and city-specific keywords (e.g., "keyword one" with "keyword one city" and "keyword one different city")?

2. Is it more valuable to blog only a couple of times per month with deeper content, or more times per month with thinner content but more keyword involvement?

3. Our customers are required by the government to use our type of product, and we are one of the vendors that provide this service. Because of this, our customers may not care at all about anything we would blog about. Do we blog for them, or do we blog for the keywords and try to reach partners and others who would read the content, hoping that it also ranks us high when our potential customers search?

4. Is there an advantage or disadvantage to having multiple blog authors, or does it not matter?

Big questions for sure, but if you have insight on any one of them, please share, and maybe we can answer them all with a group effort. Thanks to all of you who are taking the time to read this and contribute.
Using geolocation for dynamic content - what's the best practice for SEO?
Hello. We sell a product globally, but I want to use different keywords to describe the product based on location. For this example, let's say that in the USA the product is a "bathrobe" and in Canada it's a "housecoat" (same product, just a different name). This means I want to show "bathrobe" content in the USA (lots of global searches) and "housecoat" content in Canada (fewer searches).

I know I can show the content using a geolocation plugin (I also found a caching plugin which gets around the issue of people seeing cached versions), using JavaScript, or using HTML5. But I want a solution that lets someone in Canada searching for "bathrobe" find our site through Google too; I want to rank for "bathrobe" in BOTH the USA and Canada. I have read articles which say Google can read the dynamic content in JavaScript, as well as via the geolocation plugin; however, the plugins suggest Google crawls the content based on location too, and I don't know about JavaScript.

Another option is having two separate pages (one for "bathrobe" and one for "housecoat") and using geolocation only for the main menu (if Canadians find the bathrobe page through search, they will still see it). Splitting the traffic this way may have an SEO impact, though.

Any suggestions or recommendations on what to do? What do other websites do? I'm a bit stuck. Thank you so much! Laura

P.S. I don't think we have enough traffic to justify subdomains or subdirectories.
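One common way to wire up the two-separate-pages option is to cross-annotate the regional variants with hreflang so Google can prefer the right page per country. A minimal sketch with hypothetical URLs; both pages carry the identical pair:

    <!-- In the <head> of /bathrobe/ AND of /housecoat/: -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/bathrobe/" />
    <link rel="alternate" hreflang="en-ca" href="https://www.example.com/housecoat/" />

Note that hreflang targets by country, not by the word searched, so by itself it won't make the housecoat page rank for "bathrobe" in Canada; that usually depends on the page actually mentioning both terms.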
Which is the best, ".xx" or ".com.xx" in general and for SEO?
Hi, I'm working for a digital marketing agency and we have traffic from different countries. We are planning to make a different website for each country. What is the best SEO practice when choosing between ".xx" and ".com.xx" for Spain, Mexico, Chile, Colombia, and Peru? I think that the ccTLD alone is always better; for example, ".es" rather than ".com.es".
How best to clean up doorway pages: 301 them or noindex, follow?
Hi Mozzers, I have what are classed as doorway pages on my website. These have historically been location-specific landing pages for some of our categories, but from speaking to a number of different webmasters, the general consensus is that they are not within Google's guidelines, so I will be getting punished for having them. My options are:

1. 301 the pages back to their original category pages. This will conserve some link juice to pass back to the respective category page.

2. Set them to noindex, follow. I'm not sure what will happen here with regard to link value etc.

What would be best? Some of the pages do currently rank "fairly well" for some of the locations, so I am getting traffic from them, but I also know I will be taking an algorithmic penalty for having them, so how best do I clean these up? Also, by cleaning up the site structure, would I see any benefit here, or will I have to wait for a new Panda update/refresh? I thought the Panda refresh won't use a new dataset. Thanks, Pete
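For reference, option 2 is a one-line head tag on each doorway page (option 1 would instead be a server-side redirect rule, e.g. in .htaccess). A minimal sketch:

    <!-- Keeps the page crawlable and its links followed, but out of the index: -->
    <meta name="robots" content="noindex, follow" />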
International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
I have a client that has an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/

Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US "coming soon" page is being properly indexed.

So, I would like to know the best way to implement a geolocation redirection without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies, etc.). I have seen the great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me exactly the best method for redirection, except at about 10:20 where it tells me what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some give slightly different examples.

I need US visitors to see the US coming-soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects, and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help. Appreciate your answers. Cheers, Lincoln
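A common alternative to a hard IP redirect, sketched here with a hypothetical /api/geo endpoint: serve every URL to every visitor, bots included, and use a small script to suggest the better regional site rather than force it.

    <div id="geo-banner" hidden>
      It looks like you're in the US - <a href="https://domain.com/">visit the US site</a>?
    </div>
    <script>
      // Hypothetical endpoint returning, e.g., {"country": "US"}
      fetch('/api/geo')
        .then(function (res) { return res.json(); })
        .then(function (geo) {
          // This snippet assumes it runs on the Australian site (au.domain.com)
          if (geo.country !== 'AU') {
            document.getElementById('geo-banner').hidden = false;
          }
        });
    </script>

Because nothing is forced, Googlebot (which typically crawls from US IPs) can still fetch and index the Australian pages directly.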