How to do country-specific indexing?
-
We are a business that operates in Southeast Asian countries, with medical professionals listed in Thailand, the Philippines, and Indonesia.
When I check Google Philippines, I can see pages from all our other countries indexed, but no Philippines pages. The Philippines is where we launched most recently. How can I tell Google Philippines to give more priority to our Philippines pages over those from other countries?
Can someone help?
-
Using hreflang can be useful in any case, especially for brand searches (i.e. the name of the brand as the search term), because there you can see your local-country site outranked by your most powerful one despite the geotargeting you have set up in GWT.
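To make that concrete, hreflang annotations in the page head might look like the sketch below. The domain and URL paths are hypothetical, and the language-region codes (e.g. en-PH) would need to match your actual setup.

```html
<!-- Hypothetical country versions of the same doctor-listing page. -->
<link rel="alternate" hreflang="en-PH" href="https://example.com/ph/doctors/" />
<link rel="alternate" hreflang="th-TH" href="https://example.com/th/doctors/" />
<link rel="alternate" hreflang="id-ID" href="https://example.com/id/doctors/" />
<!-- x-default covers users who match none of the listed locales -->
<link rel="alternate" hreflang="x-default" href="https://example.com/doctors/" />
```

Note that each country version must carry the full set of annotations, including a reference to itself; non-reciprocal hreflang tags are ignored.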
-
Thanks Tom. This was really helpful. I will try to set up WMT for each country as you suggested.
-
Thanks. But the content in each country is different.
-
I would also suggest that you add alternate language (hreflang) tags, both in the HTML code and in the XML sitemap.
This helps avoid any duplicate-content issues that might arise from having separate country websites.
The instructions for alternate language tags can be found here.
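As a sketch of the sitemap variant (the URLs are hypothetical), each URL entry lists every alternate via the xhtml:link extension:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/ph/doctors/</loc>
    <!-- every country version is listed, including the page itself -->
    <xhtml:link rel="alternate" hreflang="en-PH" href="https://example.com/ph/doctors/"/>
    <xhtml:link rel="alternate" hreflang="th-TH" href="https://example.com/th/doctors/"/>
    <xhtml:link rel="alternate" hreflang="id-ID" href="https://example.com/id/doctors/"/>
  </url>
  <!-- repeat a <url> block with the same alternates for the th and id pages -->
</urlset>
```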
Thanks,
Sajeet
-
I would set up a new profile for your Philippines pages in Google Webmaster Tools.
Whether your site uses a country subdomain or a subdirectory structure (ph.example.com or example.com/ph/) doesn't matter; you can set up a new profile for either in WMT. Go to WMT, add a new site, and enter the URL of the Philippines section of your site.
Once this is done, you can instruct Google that the pages in that subdomain/directory are targeting the Philippines. In the International Targeting section, choose the Country tab, check the Geographic target checkbox, and select your country target.
Hope this helps.
Related Questions
-
New Subdomain & Best Way To Index
We have an ecommerce site, we'll say at https://example.com. We have created a series of brand-new landing pages, mainly for PPC and social, at https://sub.example.com, but would also like these to get indexed. They are built on Unbounce, so there is an easy option to simply uncheck the box that says "block page from search engines"; however, I am trying to speed up this process and also do it the best/correct way. I've read a lot about how we should build landing pages in a subdirectory, but one of the main issues we are dealing with is long page load time on https://example.com, so I wanted a kind of fresh start. I was thinking a potential solution to index these quickly/correctly was to make a redirect such as https://example.com/forward-1 -> https://sub.example.com/forward-1 and then submit https://example.com/forward-1 to Search Console, but I am not sure if that will even work. Another possible solution was to link to some of the subdomain pages from the root domain, say right on the pages or in the navigation. Also, will I definitely be hurt by 'starting over' with a new website, even though MozBar shows my subdomain https://sub.example.com with the same domain authority (DA) as the root domain https://example.com? Recommendations and steps to be taken are welcome!
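If you do try the redirect idea above, a 301 in the root domain's .htaccess is one way to sketch it (this assumes Apache with mod_rewrite; the paths are the hypothetical ones from the question):

```apache
# Hypothetical sketch: permanently redirect a root-domain path
# to its counterpart on the landing-page subdomain.
RewriteEngine On
RewriteRule ^forward-1$ https://sub.example.com/forward-1 [R=301,L]
```

Linking to the subdomain pages from the root domain's navigation, as also suggested above, is generally the more reliable way to get them discovered and crawled.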
Intermediate & Advanced SEO | Markbwc0 -
Why is a canonicalized URL still in index?
Hi Mozers, We recently canonicalized a few thousand URLs but when I search for these pages using the site: operator I can see that they are all still in Google's index. Why is that? Is it reasonable to expect that they would be taken out of the index? Or should we only expect that they won't rank as high as the canonical URLs? Thanks!
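For context, rel=canonical is a hint rather than a directive, so canonicalized URLs can linger in the index for quite a while. A sketch of the tag on a duplicate page (the URLs are hypothetical):

```html
<!-- Placed on https://example.com/widgets?sort=price to point
     at the preferred version of the page. -->
<link rel="canonical" href="https://example.com/widgets" />
```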
Intermediate & Advanced SEO | yaelslater0 -
Why does Moz recommend subdomains for language-specific websites?
In Moz's domain recommendations, they recommend subdirectories instead of subdomains (which agrees with my experience), but make an exception for language-specific websites: Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website). Why are language-specific websites excepted from this advice? Why are subdomains preferable for language-specific websites? Google's advice says subdirectories are fine for language-specific websites, and GSC allows geographic settings at the subdirectory level (which may or may not even be needed, since language-specific sites may not be geographic-specific), so I'm unsure why Moz would suggest using subdirectories in this case.
Intermediate & Advanced SEO | AdamThompson0 -
Index process multi language website for different countries
We are in charge of a website with 7 languages for 16 countries. There are only slight content differences between countries (google.de | google.co.uk). The website is set up with the correct language and country annotations, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE. All unwanted annotations are blocked by robots.txt, and the hreflang alternate tags are also set. The objective is to make the website visible in local search engines, so we have submitted an overview sitemap linked to a sitemap per country. The sitemaps were submitted quite a while ago, but Google has indexed only 10% of the content. We are looking for suggestions to speed up the indexing process.
Intermediate & Advanced SEO | imsi0 -
Removing index.php
I have a question for the community about whether or not this is a good idea. I currently have a Joomla site that displays www.domain.com/index.php in all the URLs, with the exception of the home page. I have read that it's better not to have index.php showing in the URL at all. Does it really matter if I have index.php in my URLs? I've read that it is a bad practice. I am thinking about installing the sh404SEF component on my site and removing the index.php. However, I rank pretty high for the keywords I want in Google, Bing, and Yahoo, and all of the URLs that show up in the searches have index.php as part of the URL. Has anyone ever used sh404SEF to remove the index.php, and how did you avoid losing your search engine links? I don't want an existing search result showing www.domain.com/index.php/sales and it not linking to the correct page, which would now be www.domain.com/sales. I guess I could insert the proper redirects in the htaccess file, but I was hoping to avoid having every page of my site in the htaccess file for redirecting. Any help or advice appreciated.
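On the redirect side, a single pattern rule in .htaccess can cover every old index.php URL, so there is no need to list each page individually. A rough sketch, assuming Apache with mod_rewrite:

```apache
# Hypothetical sketch: 301-redirect /index.php/anything to /anything,
# e.g. /index.php/sales -> /sales.
RewriteEngine On
RewriteRule ^index\.php/(.*)$ /$1 [R=301,L]
# Send a bare /index.php request to the homepage as well
RewriteRule ^index\.php$ / [R=301,L]
```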
Intermediate & Advanced SEO | MedGroupMedia0 -
My website is not indexing
Hello experts, When I search site:http://www.louisvuittonhandbagss.com or just enter http://www.louisvuittonhandbagss.com on Google, I am not getting my website. I have done the following steps: 1. I have submitted sitemaps, and all of them have been indexed. 2. I have used the GWT Fetch as Google feature. 3. I have submitted my website to top social bookmarking websites and to some classified sites as well. Please help.
Intermediate & Advanced SEO | aschauhan5210 -
PR Dilution and Number of Pages Indexed
Hi Mozzers, My client is really pushing for me to get thousands, if not millions, of pages indexed through the use of long-tail keywords. I know that I can probably get quite a few of them into Google, but will this dilute the PR on my site? These pages would be worthwhile in that if anyone actually visits them, there is a solid chance they will convert to a lead due to the nature of the long-tail keywords. My suggestion is to run all the keywords for these thousands of pages through AdWords to check the number of queries and only create pages for the ones that actually receive searches. What do you guys think? I know that the content needs to have value and can't be scraped/low-quality, and pulling these pages out of my butt won't end well, but I need solid evidence to make a case either for or against it to my client.
Intermediate & Advanced SEO | Travis-W0 -
Problem of indexing
Hello, sorry, I'm French and my English is not necessarily correct. I have a problem with indexing in Google: only the home page is indexed (http://bit.ly/yKP4nD). I have been looking for several days but I do not understand why. I have checked the following: the robots.txt file is OK; the sitemap, although it is in ASP, is valid with Google; there is no spam and no hidden text; I made a reconsideration request via Google Webmaster Tools and there are no penalties; and we do not have noindex. So I'm stuck, and I'd like your opinion. Thank you very much. A.
Intermediate & Advanced SEO | android_lyon0