Move into new markets
-
We are going to move into a new market: Malaysia. Our current domain is https://www.babyment.com. What would you suggest as the domain for the new market: https://my.babyment.com, or something else? Also, in Google Search Console we currently have our target country set to Singapore; what should I change when we move into the new market? Thanks
-
Hi,
With generic top-level domains (gTLDs) such as .com, Google uses geotargeting. In Google Search Console, you can set the international targeting option to 'Unlisted'. This means Google will determine what country the user is in and decide whether the site should be shown in that country.
However, you would then need to specify the language and region within the URL structure or in the sitemap.
For example, if https://www.babyment.com were set to 'Unlisted' in Google Search Console and someone searched from Malaysia, you would need to specify the country and language so Google understands that the page is intended for people in Malaysia.
In my opinion, the best URL to have in this situation would be https://www.babyment.com/en/my, where 'en' indicates the site is in English and 'my' indicates it targets Malaysia.
Alternatively, you can specify the language and country of the page within its source code using hreflang="en-my".
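As a rough sketch (the /en/my and /en/sg paths here follow the URL structure suggested above and are illustrative), the hreflang annotations could look like this in the head of each page:

```html
<!-- Hypothetical hreflang annotations for babyment.com, assuming the
     /en/my and /en/sg URL structure suggested above. Every version of
     the page lists every alternate, including itself. -->
<link rel="alternate" hreflang="en-sg" href="https://www.babyment.com/en/sg" />
<link rel="alternate" hreflang="en-my" href="https://www.babyment.com/en/my" />
<link rel="alternate" hreflang="x-default" href="https://www.babyment.com/" />
```

Note that hreflang annotations must be reciprocal: each alternate version needs to link back to all the others, or Google may ignore the tags.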
-
What's your website built with?
Can you use WordPress multisite (or something similar if not)?
You could then configure it as mywebsite.com/my/
and also have mywebsite.com/sg/,
with your .com serving as a generic international catch-all (excluding Singapore and Malaysia).
All you'd need to do is get your hreflang tags set up right, and you can copy all your content over.
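If you'd rather not put the tags in every page's head, the same annotations can go in the sitemap instead. A minimal sketch, assuming the hypothetical mywebsite.com/my/ and /sg/ structure from this answer:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://mywebsite.com/my/</loc>
    <xhtml:link rel="alternate" hreflang="en-my" href="https://mywebsite.com/my/"/>
    <xhtml:link rel="alternate" hreflang="en-sg" href="https://mywebsite.com/sg/"/>
    <xhtml:link rel="alternate" hreflang="x-default" href="https://mywebsite.com/"/>
  </url>
  <!-- repeat the same set of alternates under the /sg/ and root <url> entries -->
</urlset>
```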
Related Questions
-
Proper Use and Interpretation of new Query/Page report
When I'm in WMT/Search Console, I start by looking at all of the data unfiltered. Then I select a query; let's say it's a top query for starters, and I filter my results by that query (exact match). With the filter on, I flip over to Pages and get about a dozen results. When I look at this list, I get the normal variety of output: impressions, clicks, CTR, and average position.

One thing that seems a bit odd to me is that the average positions for the URLs displayed are all about the same, ranging from, say, 1.0 to 1.3. Does this mean that Google is displaying the dozen or so URLs to different people, generally in the 1st or 2nd position? Does this mean that my dozen or so pages are all competing with each other for the same query?

On one hand, if all of my dozen pages displayed in the SERP at the same time most of the time, I would see this as a good thing, in that I would be 'owning' the SERP for my particular query. On the other hand, I'm concerned that the keyword I'm trying to optimize a particular page for is being partially distributed to less optimized pages. The main target page is shown the most (good) and it has about a 15x better CTR (also good), but all together, the other 11 pages are taking in around 40% of impressions and get a far lower CTR (bad).

Am I interpreting this data correctly? Is WMT showing me which pages a particular query sends traffic to? Is there any way to extract the keywords that a particular page receives? When I reset my query and instead start by selecting a specific page (exact match) and then select queries, is that showing me the search queries that drove traffic to that page? Is there a 'best practices' process to target a keyword to a specific page so that it gets more than the 60% of impressions I'm seeing now? Obviously I don't want to use a canonical, because each keyword goes to many different pages and each page receives a different mix of keywords.
I would think there would be a different technique when your page's average position is off page 1.
On-Page Optimization | ExploreConsulting
-
Moving/deleting blog
I have a website with a WordPress blog attached to it which we have not updated since 2013; however, it has a lot of posts going back to 2010. I am moving my website to Weebly, so I can't move my blog over. I could recreate all the posts and set up redirects to the new URLs, but the posts are old and outdated, so it doesn't really make sense aside from SEO. I need advice: would I take an SEO hit if I start over with a new blog and leave the old posts behind? Thanks!
On-Page Optimization | bhsiao
-
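If the redirect route were taken, the rules would live on whatever server still answers for the old URLs. A minimal sketch for an Apache host (the paths and destination domain here are hypothetical, and the exact syntax depends on the server and the new URL structure):

```apache
# Hypothetical .htaccess rules: send each recreated post to its new home.
Redirect 301 /blog/2010/05/some-old-post/ https://www.example.com/blog/some-old-post

# Or, if the posts won't be recreated, point the whole retired blog
# at the most relevant surviving page in one rule.
RedirectMatch 301 ^/blog/.*$ https://www.example.com/blog/
```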
New google serps page design
Hi, I know the title length displayed is now based on pixels rather than characters, but I still thought it was safe to have titles up to 70 characters long before they are truncated. I see that on the newly designed Google SERP pages, titles that showed in full on the old design (without truncation) are now being truncated: the same title displays fine (in full) on the old-design SERPs but is truncated on the new design. Has anyone else noticed this? Cheers, Dan
On-Page Optimization | Dan-Lawrence
-
New Articles and Posts - what key word to focus on?
I have a few pages on my site focused on keywords such as 'office design Birmingham': the contact page and a tag page, http://www.businessinteriors.co.uk/tag/office-design-birmingham/. Now, I recently published an article about a big new office design in Birmingham for a company, and I naturally tagged it 'office design Birmingham', put it in the category for Birmingham office news, and then also used 'office design Birmingham' at key strategic SEO points. The result is that this article now seems to rank higher than my office design Birmingham pages! My question is this: how should I optimise posts? Let's say I put 3 or 4 posts on my website/blog about an 'office design Birmingham'. I don't want to rank for 'HSBC office design Birmingham'; I want the article to lend weight to my office design Birmingham credentials, so I focus on 'office design Birmingham'? I don't really want my posts to rank very high, though; I want them to help my key pages 'float'. I'm very confused about how to optimise my posts. If I do it too well, they outrank the 'old' pages that I actually want people to visit! Thanks for any pointers!
On-Page Optimization | bizint
-
Did anyone's rankings drop massively last weekend? Is this a new Google update?
Hi all, My site's rankings took a battering in the past week. From what I have read, I know that Google has updated its algorithm, and supposedly it only affects less than 1% of queries, but I was very surprised to have fallen into that list. Just wondered: am I the only site to have been hammered this past week, or is it more a case that the new algorithm means I need to be more savvy with our SEO? Just posting a quick general consensus question. Thanks, Sarah
On-Page Optimization | SarahCollins
-
New CMS system - 100,000 old URLs - use robots.txt to block?
Hello. My website has recently switched to a new CMS. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing, 'older' URLs to their new counterparts. However, according to the Google Webmaster Tools 'Not Found' report, there are literally over 100,000 additional URLs out there that it's trying to find. My question is: is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
On-Page Optimization | Blenny
-
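For the robots.txt approach asked about above, a minimal sketch might look like the following. The directory names are hypothetical stand-ins for the retired CMS paths; note that blocking crawling only stops Google from re-requesting those URLs, and it won't remove URLs that are already indexed.

```text
# Hypothetical robots.txt: stop crawlers from requesting retired CMS directories.
User-agent: *
Disallow: /old-cms/
Disallow: /legacy-site/
```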
New sitelinks - can we control the number?
A quick question on Google's new sitelink format. When searching for our brand name (Confetti), Google returns 8 sitelinks for our site. When searching for our domain (confetti.co.uk), Google returns the maximum number of 12 sitelinks. Is there a quick way (in Webmaster Tools, for example) to increase the number of sitelinks for our brand name to 12? Thanks,
On-Page Optimization | Confetti_Wedding
-
Creating New Pages Versus Improving Existing Pages
What are some things to consider or things to evaluate when deciding whether you should focus resources on creating new pages (to cover more related topics) versus improving existing pages (adding more useful information, etc.)?
On-Page Optimization | SparkplugDigital