Local Google vs. default Google search
-
Hello Moz community,
I have a question: what is the difference between a local version of Google vs. the default Google in regards to search results?
I have a Mexican site that I'm trying to rank in www.google.com.mx, but my rankings are actually better if I check my keywords on www.google.com
The domain is a .mx site, so wouldn't it make more sense that this page would rank higher on google.com.mx instead of the default Google site, which in theory would mean a "broader" scope? Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one?
Thanks for your valuable input!
-
So glad to be of help, Eduardo.
-
Wow, thanks for this answer, Miriam. This definitely helps me understand how this works much better. I was very confused about why I was ranking better in "global" searches; now it makes more sense.
-
Hi Eduardo,
I just wanted to add a note to this good conversation. For local businesses, Google will show IP-based results if it believes the search query has local intent. So, the factors governing actual local results (for things like restaurants, plumbers, hotels, etc.) are heavily influenced by the searcher's location, or by the searcher adding a geographic keyword to the query.
For example, searching from my location in the US, if I search for 'hotels acapulco', Google shows me its local carousel of hotel results for this city in Mexico, regardless of the fact that I'm not located in Mexico. Google 'gets' that I want to see hotels in Mexico. But if I search for 'history of the Mexican flag', Google shows me general, organic results including things like Wikipedia. So, whether one gets local results or organic ones appears to be based on the language of the searcher's query + Google's concept of the user's intent. Usually, they get this right, but there are some instances in which you may get local results when you didn't really want them. For example, I might search for something really generic like 'shoes', and then Google isn't totally sure about my intent, so it shows me both local and national results.
-
It's not that it is impossible, but the results you see may not reflect what others see.
With cookies set on your computer, and with IP and language playing such a strong role, it will be difficult to see accurate results without forcing it with something like gl=US.
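To illustrate the idea of forcing a country or language: Google accepts `gl` (country), `hl` (interface language), and `pws=0` (reduce personalization) as search URL parameters, though how Google honors them can change over time. A minimal sketch of building such a URL:

```python
from urllib.parse import urlencode

def google_search_url(query, country=None, language=None, personalization=True):
    """Build a Google search URL with optional country (gl) and
    language (hl) overrides. gl/hl/pws are real Google URL parameters,
    but their exact behavior isn't guaranteed; treat this as an
    illustration only."""
    params = {"q": query}
    if country:
        params["gl"] = country       # two-letter country code, e.g. "mx"
    if language:
        params["hl"] = language      # interface language, e.g. "es"
    if not personalization:
        params["pws"] = "0"          # ask Google not to personalize
    return "https://www.google.com/search?" + urlencode(params)

# e.g. check Mexican results from anywhere:
print(google_search_url("hotels acapulco", country="mx",
                        language="es", personalization=False))
```

This only shapes the request; cookies and IP-based signals can still influence what you see.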
Do you have a local listing for your address? That will help a lot, as will marking your site up with structured data such as schema.org rich snippets.
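For reference, schema.org local-business markup can be added as a JSON-LD block in the page head. A minimal sketch (all business details below are placeholders, not a real listing):

```html
<!-- Hypothetical LocalBusiness markup for a Mexican site;
     replace every value with your real business details. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Ejemplo Hotel",
  "url": "https://www.example.com.mx/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "Av. Costera 123",
    "addressLocality": "Acapulco",
    "addressRegion": "Guerrero",
    "addressCountry": "MX"
  }
}
</script>
```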
-
Thank you Gary. What you've mentioned makes sense to me.
So, pretty much getting to google.com is almost impossible now, as Google will redirect users to their local Google sites based on their IP addresses. Therefore, it makes more sense to rank in local versions of Google, at least for this site, which is not intended to be global.
I'm thinking more links from .mx sites could be a possible solution.
Thanks a lot!
-
I have a question: what is the difference between a local version of Google vs. the default Google in regards to search results?
There are many factors. The results on google.com can be a real mix of international and local listings, and the location of a site's backlinks can push Google to rank it in a particular region.
Let's say you have a .mx site about holidays in Mexico, but it has lots of links from UK websites; then you could rank well on google.co.uk. That would be even more likely if the domain were a .com, as Google gives more universal weight to those types of TLDs.
The domain is a .mx site, so wouldn't it make more sense that this page would rank higher on google.com.mx instead of the default Google site
Not if the above is true. Also, if your content has less of a local feel, you might rank better among a group of similar sites that are broader in nature.
Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one?
There are a lot of factors involved: your IP address and Google's ability to determine where you are, the language you use, and even the spelling of words (in the UK vs. the US, it could be 'colour' vs. 'color').
There are lots of answers to your questions, but those are just a few to give you an idea of the 200+ algorithm factors involved in ranking in such circumstances.