How does Google Crawl Multi-Regional Sites?
-
I've been reading up on this on Webmaster Tools but just wanted to see if anyone could explain it a bit better.
I have a website going live soon which will be set up to redirect to a localised URL based on the visitor's IP address: NZ IP ranges will go to .co.nz, Australian IP addresses to .com.au, and US or other non-specified IP addresses to the .com address.
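The redirect logic being described might look like the following minimal sketch. The country lookup is assumed to come from a real GeoIP database (e.g. MaxMind) upstream of this function, and the domain names are illustrative stand-ins:

```python
# Hedged sketch of the IP-based redirect described above. The caller is
# assumed to have already resolved the visitor's IP to a country code
# via a GeoIP database; domains are illustrative.
CCTLD_FOR_COUNTRY = {
    "NZ": "https://www.example.co.nz",
    "AU": "https://www.example.com.au",
}
DEFAULT_DOMAIN = "https://www.example.com"  # US and all unlisted countries

def redirect_target(country_code: str, path: str = "/") -> str:
    """Return the localised URL a visitor from `country_code` is sent to."""
    base = CCTLD_FOR_COUNTRY.get(country_code, DEFAULT_DOMAIN)
    return base + path
```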
There is a single CMS installation for the website.
Does this impact the way in which Google is able to search the site? Will all domains be crawled or just one?
Any help would be great - thanks!
-
Setting aside all other issues, I am assuming you are using a PHP redirect to accomplish this and that you have three sites: .co.nz, .com.au, and .com, with the .com using the geo-targeting preference for the US.
Given that you have three sites, each will be indexed individually through all the normal ways a site gets indexed: links to site A will bring Googlebot, a sitemap for site B will do the same, and so on. If you are using some type of cross-domain canonical, or have any redirects from one site to another for any reason, those will be followed across as well.
All domains will be crawled independently of one another, so you need to earn links to each.
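When the three ccTLDs serve largely the same content from one CMS, hreflang annotations are the standard way to tell Google which version targets which region. A minimal sketch generating the reciprocal tags that every regional version of a page would carry (domains follow the question's setup; the language-region codes and paths are illustrative):

```python
# Hedged sketch: build reciprocal hreflang <link> tags for the three
# regional versions of one page. Each version should carry the full set.
ALTERNATES = {
    "en-nz": "https://www.example.co.nz",
    "en-au": "https://www.example.com.au",
    "en-us": "https://www.example.com",
}

def hreflang_tags(path: str) -> list[str]:
    """Return the hreflang link tags for every regional copy of `path`."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in sorted(ALTERNATES.items())
    ]
```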
Best,
Related Questions
-
URL dynamic structure issue for a new global site to which I will redirect multiple well-performing sites.
Dear all, We are working on a new platform called https://www.piktalent.com, where basically we aim to redirect many smaller sites we have with quite a lot of SEO traffic related to internships. Our previous sites include www.spain-internship.com, www.europe-internship.com and other similar ones (around 9 in total). Our idea is to smoothly redirect the sites to this new platform bit by bit; it is a custom-made site in Python and Node, much more scalable, with plans to develop an app and eventually become a bigger platform.

For the new site, we decided to create 3 areas for the main content: piktalent.com/opportunities (all the vacancies), piktalent.com/internships and piktalent.com/jobs, so we can categorise the different types of pages, with all the vacancies sitting under /opportunities.

The problem comes when the site generates the different static landing pages and dynamic searches. We have static landing pages such as www.piktalent.com/internships/madrid, but the site also dynamically generates www.piktalent.com/opportunities?search=madrid. Most searches will generate that type of URL, not following the structure of domain name / type of vacancy / city / name of the vacancy.

I have been thinking of 2 potential solutions: either applying canonicals, or marking the parameter in Webmaster Tools as noindex. What do you think is the right approach? I am worried about potential duplicate content and conflicts between the static content and the dynamic one. My CTO insists that the dynamic URLs have to stay like that, but I am not 100% sure. Can someone provide input on this? Is there a way to block the dynamically generated URLs? Has anyone had a similar experience? Regards,
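The first of the two options weighed above can be sketched as a lookup: each dynamic search URL that duplicates a static landing page gets a rel=canonical pointing at that page. The URL patterns come from the question itself; the set of cities with static pages and the mapping logic are hypothetical:

```python
# Hedged sketch: map a dynamic search URL to the static landing page it
# duplicates, so the page can emit a rel=canonical tag. The city list
# is an assumption for illustration.
from urllib.parse import urlparse, parse_qs

STATIC_LANDINGS = {"madrid", "barcelona"}  # cities with static pages (assumed)

def canonical_for(url: str):
    """Return the canonical URL for a dynamic search, or None if there
    is no matching static landing page."""
    parsed = urlparse(url)
    if parsed.path != "/opportunities":
        return None
    term = parse_qs(parsed.query).get("search", [""])[0].lower()
    if term in STATIC_LANDINGS:
        return f"https://www.piktalent.com/internships/{term}"
    return None
```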
Technical SEO | Jose_jimenez
-
Multi language and multi target eCommerce site in EU
Hi, before I raised this question I read some other topics relevant to it; however, every topic had slight differences between the various scenarios. I run a webshop on a .eu domain (notebookscreen.eu) which currently runs in 3 languages. We use GeoIP to determine the user's location for the following reasons: language, currency, and shipping fee. The site runs very slowly during tests, and most of the testers (including Moz) fail it since it has too many 302 redirects. We are rebuilding this part to fix the redirects and need some advice on what is best to optimise for multiple countries.

As said in the title, this is a shop mainly targeting EU countries, and next to the .eu domain we have 10 other country-level TLD registrations. Currently we do subfolder style. Would it be better to do a subdomain per country, or a separate TLD for every country? The current option is best for backlinks; I don't think the second has any gains. Having dedicated TLDs can help local SERPs in every country, but we would need a lot of link building, and if someone starts on the .eu page, a 3xx redirect to the designated country is still needed.

Different sites do it differently. Some don't care (Apple), some stay on one page and give you local currency and shipping rates (eBay), and some move you to a different TLD (Amazon). Is there any better way to determine someone's location other than GeoIP?
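On the closing question about alternatives to GeoIP: for the language decision specifically, the browser's Accept-Language header is one commonly used signal that needs no IP lookup at all, though it indicates language preference rather than physical location (so it cannot replace GeoIP for shipping fees). A hedged sketch of parsing it, with an illustrative set of supported languages:

```python
# Hedged sketch: choose a supported language from an Accept-Language
# header. The supported set is illustrative; malformed quality values
# are treated as q=0 rather than rejected.
def best_language(header: str, supported=("en", "de", "hu")) -> str:
    """Return the highest-quality supported language, else the default."""
    candidates = []
    for part in header.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, q = piece.partition(";q=")
        try:
            quality = float(q) if q else 1.0
        except ValueError:
            quality = 0.0
        # Reduce region variants like "de-DE" to their base language.
        candidates.append((quality, lang.split("-")[0].lower()))
    for _, lang in sorted(candidates, reverse=True):
        if lang in supported:
            return lang
    return supported[0]
```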
Technical SEO | kukacwap
-
Does Google distinguish between core content and accessory, 3rd party widgets when considering how slow or fast a site is?
Our site's Facebook Plugin is really slowing page speed down. As far as users are concerned, the page loads fast enough and they can already start interacting with the page before the last sidebar widget has loaded. But the FB widget is really slow to load and is dragging the performance down in Google Analytics Page Speed for example. Any thoughts on whether this should be an SEO concern, and whether Google differentiates between different elements of the page when deciding whether a page is a bad user experience? Thanks!
Technical SEO | etruvian
-
Page Indexing increase when I request Google Site Link demote
Hi there, Has anyone seen an increase in page crawling in Google Webmaster Tools after requesting a sitelink demotion? I did this around the 23rd of March; the next day page crawling started to rise and rise, showing a very visible spike in activity, and to this day it is still relatively high. From memory I asked about this in the SEOmoz Q&A a couple of years ago and was told that page crawl activity is a good thing - ok, fine, no argument.

However, at nearly the same time I noticed that my primary keyword rank for my home page dropped to around the 4th page on Google US, and it has stayed there since March. The exact same query on Google UK (using the SEOmoz Rank Checker) has remained in the same position (around 11th) - it has barely moved. I requested an undemote in GWT for this page link and the page crawling started to drop, but not to the level before March 23rd. The rank situation for this keyword has not changed and the content on our website has not changed, but something has come adrift with our US ranks. Using Open Site Explorer, not one competitor listed has a higher domain authority than our site - page authority, domain links, you name it - yet they sit there on the first page.

Sorry, the above is a little bit of frustration. This question is not impulsive; I have sat for weeks analysing causes and effects but cannot see why this disparity between the two countries' ranks is happening when it has never happened for this length of time before. Ironically, we are still number one in the United States for a keyword phrase which I moved away from over a month ago and do not refer to at all on our index page! Bizarre. Granted, sitelink demotion may have no correlation with the keyword ranking impact, but looking at the activities carried out on the site and the timing of the page crawling, it is the only sizable factor I can identify as a possible cause.

Oh, and the SEOmoz On-Page Optimization Tool reports that the home page gets an 'A' for this keyword term. This week I have commented out the canonical tag in the index page header for the moment to see if it has any effect - why? Because that was another (if minor) change I employed to get the site to an 'A' grade with the tool. Any ideas or help appreciated as to what could be causing the rank differences. One final note: the North American ranks were initially high, circa 11-12th, but then dropped away to the 4th page, while the UK rankings saw no impact. And one last thing: the US rank is my statistical outlier; using Google Analytics I have an average rank position of about 3 across all countries where our company appears for this term - include the US and it pushes the average to 8th/9th. Thanks, David
Technical SEO | David-E-Carey
-
Javascript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through a simple script, and when the bounce rate goes way down, rankings supposedly increase dramatically (an interesting study, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates what gets reported to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then you sit back and watch the dollars fly in. 🙂
Technical SEO | BenRWoodard
-
I think google thinks i have two sites when i only have one
Hi, I am a bit puzzled. I just used http://www.opensiteexplorer.org/anchors?site=in2town.co.uk to check my anchor text and forgot to put in the www., and the information that came up was totally different from when I put the www. in. It shows a few links for in2town.co.uk, but when I put in www.in2town.co.uk it gives me different information. Is this a problem, and if so, how do I solve it?
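The usual fix when link data splits between the www and non-www hostnames is a sitewide 301 redirect to one canonical version. A minimal .htaccess sketch for an Apache server, assuming www is the preferred hostname (the actual server setup is not stated in the question):

```apache
# Hedged sketch: 301-redirect every bare-domain request to the www
# hostname so links and indexing consolidate on a single version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^in2town\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.in2town.co.uk/$1 [R=301,L]
```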
Technical SEO | ClaireH-184886
-
Pros & Cons of deindexing a site prior to launch of a new site on the same domain.
If you were launching a new website to completely replace an older existing site on the same domain, would there be any value in temporarily deindexing the old site prior to launching the new one? Both have roughly 3,000 pages; the new site will launch on the same domain but with a completely new URL structure, much better optimized for the web. Many high-ranking pages will be 301 redirected to the corresponding new pages. The hypothesis is that this would eliminate a mix of old and new pages sharing space in the SERPs, and that the crawlers would be more likely to index more of the new site initially. I don't believe this is a great strategy; on the other hand, I see some merit to the arguments for it.
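Whatever is decided about deindexing, the 301 mapping mentioned above is the part that preserves existing rankings through the relaunch. A minimal sketch of the lookup-table approach (the old and new paths are hypothetical):

```python
# Hedged sketch: consult an old-to-new URL map at request time. Mapped
# paths return a permanent redirect; unmapped old paths return None so
# the server can serve a 404 rather than a misleading soft redirect.
REDIRECT_MAP = {
    "/old-products.html": "/products/",
    "/about-us.html": "/about/",
}

def lookup_redirect(path: str):
    """Return (status_code, target) for mapped paths, else None."""
    target = REDIRECT_MAP.get(path)
    return (301, target) if target else None
```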
Technical SEO | medtouch
-
Google not visiting my site
Hi, my site www.in2town.co.uk, a lifestyle magazine, has undergone a major refit. I am still working on it but it should be ready by the end of this week or sooner. One problem I have is that Google is not visiting the site. I took a huge gamble redoing the site: even though before the refit I was getting a few thousand visitors a day, I wanted to make the site better because I was getting Google Webmaster errors. But now it seems Google is not visiting the site. For example, I am using sh404sef and have enabled friendly URLs, and the home page has its name and meta tags set, but Google is not showing a name for the site. It also has not visited the site since October 13th. Can anyone advise how to encourage Google to visit the site, please?
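One concrete step that tends to help after a rebuild like this is submitting a fresh XML sitemap through Webmaster Tools so Google has an up-to-date list of the new URLs. A minimal sketch of generating one (the URLs passed in are illustrative):

```python
# Hedged sketch: build a minimal XML sitemap for resubmission in
# Google Webmaster Tools after a site rebuild.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML listing each URL in `urls`."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)
```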
Technical SEO | ClaireH-184886