Does Server Location have anything to do with Search Results?
-
Good Morning Everyone...
Does having a site hosted in Europe have any effect on Search Engine results in the US?
Thanks
-
Hi all...
Actually, it is not the physical presence of a site in a targeted country that matters, but its IP address. That is why you can choose between hosting the site in the targeted country or buying an IP address assigned to that country and operating via proxy caching.
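To see what signal a host actually sends, you can resolve the domain to the IP address that geo-IP databases classify. A minimal Python sketch (the hostname is a placeholder; a real check would feed the result to a geo-IP database such as a local MaxMind copy):

```python
import socket

def server_ip(hostname):
    """Resolve a hostname to the IP address search engines see.

    It is this IP -- not the physical rack -- that geo-IP databases
    classify by country. Feed the result to a geo-IP lookup to check
    which country the address is registered to.
    """
    return socket.gethostbyname(hostname)

print(server_ip("localhost"))  # swap in your own domain
```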
That said, partly due to the rise of cloud hosting, Google no longer considers it a main factor, as it says here:
Server location (through the IP address of the server). The server location is often physically near your users and can be a signal about your site’s intended audience. Some websites use distributed content delivery networks (CDNs) or are hosted in a country with better webserver infrastructure, so it is not a definitive signal.
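Because server location is only a weak signal, intended-audience targeting is usually declared explicitly instead. One common approach is hreflang annotations in the page head, alongside Search Console's international targeting setting (URLs below are illustrative):

```html
<!-- Illustrative: tells Google which regional variant to show -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```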
-
Hi, I'm based in the UK. On some advice we moved our hosting from Chicago to Manchester (UK) and saw a bit of a leap in rankings. I think it's due to the locality of data centres, though we were doing a bunch of other SEO work at the time. Hope this helps
-
Thanks... That video is from 2009; does anyone know if this is still relevant?
-
Hi, I posted this question recently and the consensus was that it does still seem to have an impact. Also check out this video with Matt Cutts: http://www.youtube.com/watch?v=hXt23AXlJJU
Related Questions
-
Australian search - ZERO visibility and stumped
Fair warning, this is going to be long, but necessary to explain the situation and what has been done. I will take ANY suggestions, even if I have tried them already. We have a sister site in Australia, targeting Australian traffic. I have inherited what seems to be an incredible rat's nest. I've fixed over two dozen issues, but still haven't seemed to address the root cause. NOTE: Core landing pages have weak keyword targeting. I don't expect much here until I fix this. The main issues I'm trying to resolve first are the unusual US-based targeting and the inability of the homepage to rank for anything. The site is www[dot]castleford[dot]com[dot]au. Here's the rundown on what's going on:
Problems:
- The site ranks for four times as many keywords in the US as it does in Australia.
- The site ranks on the first page for a grand total of 5 AU keywords.
- The homepage, while technically optimized on-page for "content marketing agency", and with content through MarketMuse, has historically ranked between 60-100, despite a fairly strong DA and fairly weak competitors, based on Ahrefs and Moz keyword difficulty. Oddly, the ranking has jumped to 5-7 for three-day spurts over the past year.
- Infrequent indexing of the homepage (used to be every 2-3 weeks; I've gotten that down to 1 week).
Sequence of events:
- November 2017 - they made some changes to their URLs, some on the blog and some on the top-nav landing pages. Redirects seem okay.
- November 2017 - substantial number of lost referring domains; not many seem to be quality.
- January 2018 - total number of AU ranking keywords more than halved.
- May/June 2018 - added a followed sitewide link from an external site that they created: 20k inbound links with the same anchor text pointing to the homepage. The site has a total of 24k inbound links.
- July-Sep 2018 - total number of US ranking keywords halved.
- November 10 - I walked into this mess.
What's been done:
- Improved site load speed by over 150% (load time was around 20 seconds).
- Created a sitemap (100-entry batching) and submitted it to GSC.
- Improved the MarketMuse score for the homepage.
- Changed the language from "en-US" to "en-AU".
- Fetch and render - content is all crawlable and indexed properly.
- Changed the site architecture for top-nav core landing pages to establish a clear hierarchy.
- Created all versions in GSC: www and non-www, http and https.
- Site crawl - normal amount of 404s, nothing stands out as substantial.
- http to https redirect okay.
- Robots.txt updated and okay.
- Checked GSC international targeting; confirmed AU.
- No manual links penalty.
I'm clearly stumped and could use some insights. Thanks to everyone in advance, if you can find time.
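The 100-entry sitemap batching mentioned above can be sketched as follows. This is an illustrative Python version, not the poster's actual tooling; note the sitemap protocol itself allows up to 50,000 URLs per file, so 100 is just the chosen batch size:

```python
from xml.etree import ElementTree as ET

def sitemap_batches(urls, batch_size=100):
    """Split a URL list into sitemap XML documents, batch_size URLs each."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    docs = []
    for i in range(0, len(urls), batch_size):
        urlset = ET.Element("urlset", xmlns=ns)
        for url in urls[i:i + batch_size]:
            # Each <url> entry needs at least a <loc> child.
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
        docs.append(ET.tostring(urlset, encoding="unicode"))
    return docs

# 250 URLs -> 3 sitemap documents (100 + 100 + 50 entries)
batches = sitemap_batches([f"https://example.com.au/page-{n}" for n in range(250)])
print(len(batches))
```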
Technical SEO | Brafton-Marketing
-
Does an Apostrophe affect searches?
Does Google differentiate between keyphrase structures such as Mens Sunglasses and Men's Sunglasses? I.e., does the inclusion/exclusion of an apostrophe make any difference when optimising your main keyword/phrase for a page? Keyword Explorer appears to give different results - i.e. no data for Men's Sunglasses, but data appears for Mens Sunglasses. So if I optimise my page to include the apostrophe, will it hurt the potential success of that page? Thanks 🙂 Bob
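For keyword research, a common workaround is to track both variants and fold apostrophes when comparing them. A small illustrative Python helper (the function name is made up):

```python
import unicodedata

def fold_apostrophes(keyword):
    """Collapse apostrophe variants so "Men's", "Men\u2019s" and "Mens"
    fall into one keyword bucket for rank/volume tracking."""
    kw = unicodedata.normalize("NFKC", keyword).lower()
    # Drop both the curly (U+2019) and straight apostrophe.
    return kw.replace("\u2019", "").replace("'", "")

print(fold_apostrophes("Men's Sunglasses"))  # mens sunglasses
```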
Technical SEO | SushiUK
-
404 or rel="canonical" for empty search results?
We have search on our site, using the URL, so we might have: example.com/location-1/service-1, or example.com/location-2/service-2. Since we're a directory we want these pages to rank. Sometimes, there are no search results for a particular location/service combo, and when that happens we show an advanced search form that lets the user choose another location, or expand the search area, or otherwise help themselves. However, that search form still appears at the URL example.com/location/service - so there are several location/service combos on our website that show that particular form, leading to duplicate content issues. We may have search results to display on these pages in the future, so we want to keep them around, and would like Google to look at them and even index them if that happens, so what's the best option here? Should we rel="canonical" the page to the example.com/search (where the search form usually resides)? Should we serve the search form page with an HTTP 404 header? Something else? I look forward to the discussion.
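One way to frame the trade-off in code: serve the advanced-search form to users either way, but vary the HTTP status (and optionally a canonical) by whether the combo has results. A hedged Python sketch, with invented helper names, not the site's actual stack:

```python
def search_page_response(results, search_form_url="/search"):
    """Decide status + canonical for a /location/service directory page.

    - With results: 200 and no override canonical -- index normally.
    - Empty: one option is a 404 status (the page still renders the
      advanced-search form for users, but crawlers drop it from the
      index until results exist); the alternative is 200 plus a
      canonical to the generic search form, which consolidates the
      empty combos instead of removing them.
    """
    if results:
        return 200, None       # real listings: index as-is
    return 404, None           # or: return 200, search_form_url

status, canonical = search_page_response([])
print(status)  # 404
```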
Technical SEO | 4RS_John
-
Reusing content owned by the client on websites for other locations?
Hello All! Newbie here, so I'm working through some of my questions 🙂 I have two major questions regarding duplicate content: Say a medical hospital has 4 locations and chooses to create 4 separate websites. Each website would have the same design but different NAP, contact info, etc. Essentially, we'd be looking at creating their own branded template. My questions: 1.) If the hospitals all offer similar services, with roughly the same nav, does it make sense to have multiple websites? I figure this makes the most sense in terms of optimizing for their differing locations. 2.) If the hospital owns the content on the first site, I'm assuming it is still necessary to rewrite it for the other properties? Or is it possible to differentiate between duplication of owned content and other instances of content duplication? Everyone has been fantastic here so far, looking forward to some feedback!
Technical SEO | kbaltzell
-
Hi can anyone let me know which is the better server
hi, i am trying to find out which is the better dedicated server and would like your opinion. the first one is Dell PowerEdge 😄 Intel Xeon E3-1220L, 2.2GHz Dual-Core
Technical SEO | | ClaireH-184886
4GB DDR3 RAM
2 x 500GB SATA HDD
Linux/Windows
10000GB Monthly Transfer
Up to 2 IP Addresses
LSI Raid Card and the second one is, Intel Atom 330 1MB L2 Cache 1.6GH 500GBStorage
4GBRAM
10TBBandwidth if you can please let me know the difference and which one is better for speed and for memory for a large site. many thanks0 -
Location Based Content / Googlebot
Our website has local content specialized to specific cities and states. The URL structure of this content is as follows: www.root.com/seattle, www.root.com/washington. When a user comes to a page, we auto-detect their IP and send them directly to the relevant location-based page, much the way that Yelp does. Unfortunately, what appears to be occurring is that Google comes in to our site from one of its data centers, such as San Jose, and is being routed to the San Jose page. When users search for relevant keywords, the SERPs send them to the location pages the bots appear to be coming in from. If we turn off the auto geo-detection, we think Google might crawl our site better, but users would then be shown less relevant content on landing. What's the win/win situation here? Also, we appear to have some odd location/destination pages ranking high in the SERPs - in other words, locations that don't appear to correspond to one of Google's data centers. No idea why this might be happening. Suggestions?
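A common pattern here, sketched below with invented names, is to geo-route humans but never auto-redirect crawlers, letting Googlebot reach every /location URL through internal links instead. Note that varying the *content* by user agent risks being treated as cloaking, so the page served should be the same either way; only the automatic redirect is suppressed:

```python
def landing_path(user_agent, detected_region):
    """Pick a landing path: geo-route humans, never auto-redirect bots."""
    if "Googlebot" in user_agent or "bingbot" in user_agent.lower():
        # Crawlers start at the root and follow a full index of
        # location links, so every regional page gets crawled --
        # not just the one nearest the crawl data center.
        return "/"
    return f"/{detected_region.lower()}" if detected_region else "/"

print(landing_path("Mozilla/5.0 (compatible; Googlebot/2.1)", "Seattle"))  # /
print(landing_path("Mozilla/5.0", "Seattle"))                              # /seattle
```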
Technical SEO | Allstar
-
Can changing a host provider impact search rankings?
I was wondering if changing my host provider would impact my search rankings on the major search engines?
Technical SEO | bronxpad
-
Why was my homepage kicked out from results, but not my internal pages?
My domain's homepage had been ranking in 1st position for a specific term for about 8 months. Our domain got hacked, and it took just one day to make the website right again. A week later our homepage didn't appear anymore in Google results; it isn't even indexed. However, the rest of our internal pages are still indexed and ranking as usual. How can I make my homepage appear again in the results? Is there a way to speed up this process? Will it be in the same position as before, or will it carry some sort of penalty for the hacking?
Technical SEO | HerbalTechnologies