Does server location matter?
-
Hi guys,
-
A friend's website is hosted in Germany (it shows a German IP in Flagfox), but it is a UK-based local business that only serves customers within a small radius covering three medium-sized UK towns (they sell heavy construction materials for collection only). Should I advise him to change his hosting location to the UK? Will this help him rank better for regional keyword searches & Google Places?
-
He has some 'followed' links from UK sites (over 6 months old) that are not being picked up by Majestic, OSE or Webmaster Tools - is this likely to be connected to the server location?
Thanks in advance for any help!
-
-
The main two signals Google looks at are the TLD and the IP address, so hosting in another country can be a problem. But as Stephen has said, you can use GWMT to indicate which country you want to rank in; unfortunately, this only helps with Google.
See Matt Cutts.
-
- There's no real need to change location. Google is aware that sites are often hosted outside of their country, and there are plenty of other, stronger signals that he can easily give.
I assume he has a .co.uk or a .com?
- Set his location in Webmaster Tools to the UK
- Make sure he has a UK address on the site - in the footer or on the contact us page, etc.
- Set the lang in the HTML to en-GB (see the sketch after this list)
- Make sure there is UK-related content on the page
- If those links are not showing up, they are likely from very weak domains that wouldn't help him even if they did show up. Go and get decent-quality links!
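To make the markup items above concrete, here is a minimal sketch of a page carrying those signals - the domain, title, and address are all invented for illustration:

<!DOCTYPE html>
<html lang="en-GB">
<head>
  <title>Heavy Construction Materials | Exampletown, UK</title>
  <!-- hreflang reinforces the English/UK targeting; the URL is a placeholder -->
  <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
</head>
<body>
  <!-- A visible UK address (footer or contact page) is one of the stronger
       geographic signals listed above; this address is made up -->
  <footer>
    Example Building Supplies Ltd, 12 High Street, Exampletown, AB1 2CD, UK
  </footer>
</body>
</html>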
Related Questions
-
Does page speed matter for Google ranking?
We are not sure whether page speed matters for Google ranking. I have been working on the keyword "flower delivery in Bangalore" for the last few months, and I have seen websites on Google's first page that have low page speed but still rank, so I am really worried about my page, which also has low page speed. Will my Bangalore page rank on Google's first page if its speed is low? Kindly suggest more tips on the ranking factors that really work in 2020. One more thing: does domain authority really matter this year? I have also seen websites with low domain authority ranking on Google's first page. Home page: Flowerportal. Bangalore page: https://flowerportal.in/flower-delivery/bangalore/. Focus keywords: flower delivery in Bangalore, send flowers to Bangalore.
Technical SEO | vidi34231
-
What are the benefits of using .bank as a TLD versus the local TLD?
One of my clients is a leading bank in my country and wants to switch from the local TLD to .bank. Does anyone have experience with the benefits or drawbacks of such a switch? Is it really a trusted domain with all the security benefits they promise, or just another low-level TLD with a chance of being attacked even more often than before?
Technical SEO | Kirowski0
-
Should I nofollow Geo-located links on a site?
I run various sites that use geo-location to place related links in navigation menus on a page. For example, if you land on the home page, we will see that you are in Florida and then, in one of the content boxes on the page, show job listings that this site has in Florida. We also give the option to search for other jobs or use other navigation options. The idea is to try to help the user along the best we can, but ..... What are people's opinions here on whether these links should be nofollowed, given that Googlebot will always see links to places in California etc. - wherever Googlebot is crawling from? Would this then be confusing, as we are a site focused on the entire US and not just California? Thanks!
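For reference, a nofollowed geo-targeted link in such a menu would look something like this (the URL is invented):

<!-- rel="nofollow" asks search engines not to pass link equity through this link -->
<a href="https://www.example.com/jobs/florida" rel="nofollow">Jobs in Florida</a>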
Technical SEO | CleverPhD0
-
Best way to fix a whole bunch of 500 server errors that Google has indexed?
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not. In any case, there are now thousands of these pages in their index that error out. If I wanted to simply remove them all from the index, which is my best option: disallow all 1,000 or so pages in robots.txt? Put a meta noindex in the head of each of those pages? Rel canonical to a relevant page? Redirect to a relevant page? Wait for Google to just figure it out and remove them naturally? Submit each URL to the GWT removal tool? Something else? Thanks a lot for the help...
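As a sketch of the meta noindex option mentioned above, each of the dead pages would serve:

<head>
  <!-- Asks search engines to drop the page from the index. Note that the page
       must remain crawlable (i.e. not disallowed in robots.txt) for Google to
       see this tag, so combining it with a robots.txt block is counterproductive. -->
  <meta name="robots" content="noindex">
</head>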
Technical SEO | jim_shook0
-
Location-Based Content / Googlebot
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page - much the way that Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When a user does a search for relevant keywords, in the SERPs they are being sent to the location pages that the bots appear to be coming in from. If we turn off the auto geo, we think that Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also - we appear to have some odd location/destination pages ranking high in the SERPs. In other words, locations that don't appear to be from one of Google's data centers. No idea why this might be happening. Suggestions?
Technical SEO | Allstar0
-
Delete 301 redirected pages from server after redirect is in place?
Should I remove the redirected old pages from my site after the redirects are in place? Google is hating the redirects and we have tanked. I did over 50 redirects this week, consolidating content and making one great page out of 3-10 pages with very little content per page. But the old pages are still visible to Google's bot. Also, I have not put a rel canonical pointing to itself on the new pages. Is that necessary? Thanks! Jean
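For reference, a self-referencing canonical tag on one of the new consolidated pages would look like this (the URL is a placeholder):

<head>
  <!-- Declares this page's own URL as the preferred version to index -->
  <link rel="canonical" href="https://www.example.com/consolidated-page/" />
</head>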
Technical SEO | JeanYates0
-
Local SEO best practices for multiple locations
When dealing with local search for a business with multiple locations, I've always created an individual page for each location. Aside from the address and business name being in there, I also like to make sure the title tag and other important markup features the state/city/suburb, or, in the case of hyper-local, hyper-competitive markets, information more specific than that. It's worked very well so far. But, the one thing you can always count on with Local is that the game keeps changing. So I'd like to hear what you think... How do you deal with multiple locations these days? Has Google (and others, of course) advanced far enough to not mess things up if you put multiple locations on the same page? (Do I hear snickers? Be nice now) How does Schema.org fit in to your tactics in this area, if at all? Cheers (Edit: dear SEOmoz, stop eating my line breaks)
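On the Schema.org point, a minimal LocalBusiness block for an individual location page might look like the sketch below - every detail in it is invented for illustration:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co - Springfield",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
</script>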
Technical SEO | BedeFahey0
-
Using a third party server to host site elements
Hi guys - I have a client who has recently been experiencing a great deal more traffic to their site. As a result, their web development agency has given them a server upgrade to cope with the new demand. One thing they have also done is put all website scripts, CSS files, images, and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server now just handles the page requests - all other elements are then grabbed from the Amazon S3 server. So basically, this means any HTML content and web pages are still hosted through my client's domain - but all other content is accessible through an Amazon S3 server URL. I'm wondering what SEO implications this will have for my client's domain? While all pages and HTML content are still accessible through their domain name, each page is of course now making many server calls to the Amazon S3 server through external URLs (s3.amazonaws.com). I imagine this will mean any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile - because that actual content is not physically part of their domain anymore. However, what I am more concerned about is whether all of these external server calls are going to have a negative effect on the web pages' value overall. Should I be advising my client to ensure all site elements are hosted on their own server, and therefore all elements are accessible through their domain? Hope this makes sense (I'm not the best at explaining things!)
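To illustrate the setup being described, the pages would now reference their assets on external URLs along these lines (the bucket name is invented):

<!-- Page HTML still served from the client's domain; static assets pulled
     from the third-party host -->
<link rel="stylesheet" href="https://s3.amazonaws.com/example-bucket/css/main.css" />
<script src="https://s3.amazonaws.com/example-bucket/js/site.js"></script>
<img src="https://s3.amazonaws.com/example-bucket/images/logo.png" alt="Logo" />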
Technical SEO | zealmedia0