Is it better to have hosting that specializes in performance or have the host closer to you physically?
-
I am looking to change to a new hosting company.
I am debating between a company that specializes in WordPress hosting and performance but is located far from my users, and a local company that might not be as good from a performance/speed point of view.
Which do you think I should go with?
My users are near Europe and the WordPress host that I am considering is in the US.
-
Improving your speed will help more from an SEO perspective, as Google counts it towards the 'visitor experience'. Server location makes little difference these days, especially as many businesses use content caching services such as Cloudflare, which could be based anywhere.
-
Hi Jill,
I'll always take speed on a server in another country over a local server with no speed advantage. Mostly because, and this is just my take on it, the effect a local server has on your rankings is so small that you will probably never see it in your rankings or your organic search traffic. Improving for speed, on the other hand, will hopefully decrease the load time for your users and also decrease the time it takes Google to download your pages.
Besides this, there are other ways to make sure Google is right about the country your site is targeting. Think about: hreflang tags, more local signals such as links or company profiles, and of course geotargeting within Google Webmaster Tools.
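For anyone unfamiliar with them, hreflang annotations are just link elements in the page head. A minimal sketch (URLs hypothetical) telling Google which audience each version of a page targets:

```html
<!-- Hypothetical example: the same content served to two English-speaking markets -->
<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<!-- Fallback for visitors who match neither targeted region -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Each page in the set should carry the full group of annotations, including a self-reference.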
Would absolutely love to hear the arguments for somebody disagreeing on this!
Related Questions
-
Can I block HTTPS URLs using the Host directive in robots.txt?
Hello Moz Community, I recently found that Googlebot has started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our site. Instead of creating a separate robots.txt file for the HTTPS version of my website, can I use the Host directive in robots.txt to tell Googlebot which is the original version of the website? Host: http://www.example.com I was wondering if this method will work and signal to Googlebot that the HTTPS URLs are a mirror of this website. Thanks for all of the great responses! Regards, Ramendra
Technical SEO | | TJC.co.uk
-
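As a side note on the robots.txt question above: the directive it describes would be a one-line sketch like the following. Worth hedging, though: Host is a directive that Yandex historically supported, and Google has never documented support for it, so a 301 redirect or rel=canonical is the usual way to consolidate HTTP/HTTPS.

```
# Hypothetical robots.txt - the Host line is a Yandex-era directive,
# not part of Google's documented robots.txt handling
User-agent: *
Host: http://www.example.com
```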
Which URL structure is better?
Quick question - I have a real estate site focused on "apartments", but "apartments" is not part of my company name. That being said, which of the following URL structures should I use? http://website.com/city/neighborhood/property-name OR http://website.com/city-apartments/neighborhood/property-name
Technical SEO | | ChaseH
-
Does the geographic location of hosting affect SEO?
Can anyone confirm whether it's the geographic location of the web hosting or the domain hosting that can affect SEO? I have a client who has their domain hosting and website hosting in Australia; however, they have a .co.nz domain and their target market is in New Zealand. Thank you
Technical SEO | | summer300
-
How to best keep client hosting separate but manageable?
For those of you with a number of client accounts for which you do hosting, how do you keep them manageable but separate? Let's assume you have both public and private clients and don't want someone to do a reverse IP/server lookup and be able to identify everyone you work with. Additionally clients can be working in the US/UK/EU and want localised hosting. I'm looking for a large shared hosting provider (with some potentially dedicated options) who will let me manage accounts on multiple physical servers in a variety of geolocations from a single billing account and preferably a single admin panel as well. Once client contracts end I also need the ability to let them take over the hosting in a break-away account and to be able to add their own billing details. I'm looking for a solution a bit more upmarket than something like SEOhosting from Hostgator (which doesn't allow me to specify geolocation territories anyway), potentially with an account manager to help me sort out the individual requirements. Does anybody have any ideas of providers or what I should be searching for to get what I want?
Technical SEO | | I3SEO
-
Host sitemaps on S3?
Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files across them. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well. I could dynamically mirror the content of the files in S3 through my app, but that would be a little more resource-intensive than just serving the static files from a common place. Any ideas? Thanks!
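As a sketch of the setup described above (bucket and file names hypothetical): the sitemaps protocol allows a robots.txt Sitemap directive to point at a full URL on another host, so an S3-hosted sitemap index could be referenced like this:

```
# robots.txt served by the app servers; the sitemap index itself lives in S3
Sitemap: https://example-sitemaps.s3.amazonaws.com/sitemap-index.xml
```

Referencing the sitemap from robots.txt is also what establishes the cross-host submission as legitimate under the sitemaps.org protocol.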
Technical SEO | | tanlup
-
Can you recommend a Web Developer who specializes in SEO?
We are an e-commerce site, http://www.ccisolutions.com, running on an obscure Web store coded for us by a small company called Assist, located in Utah. We believe we have numerous problems with our code that are negatively impacting our SEO. One such problem, the current meta refresh on our homepage, is in the process of being fixed (thanks to Jenn Lopez at SEOmoz for helping me convince management it was important enough to pay the $ for the fix!). However, I believe there could be numerous other issues. I am the SEO strategist, but I am not a coder beyond basic HTML and CSS. Can anyone recommend a highly qualified Web developer who's strong in SEO whom we might hire to audit our code, including recommendations on how to fix anything that might be discovered as a problem?
Technical SEO | | danatanseo
-
Sudden drop in Google with our top performing keywords
Hi, I'm writing about the sudden drop in our keyword rankings for our site www.activitybreaks.com. The keywords that have seen a significant drop are: "activity holidays", which was 8th and is now 16th, and "adventure holidays", which was 15th and is now 71st. We have been listed on the first page for a number of keywords, but these have suddenly dropped in the last couple of days. We did receive a notice on 19th May from Google stating that they had detected unnatural links. So we spent a couple of weeks getting the links removed and re-submitted the site on 11th June. When I go into Google Webmaster Tools there is no reply from Google as yet, and the links are still showing even though we know they have been removed. We also noticed in the last couple of days that we had a duplicate home page, but this has now been removed. Should we re-submit our site to Google for reconsideration, or wait until they get back to us? Is there anything else we can do to fix this situation? Let me know if you have any ideas! Anything is appreciated, thanks. Naomi
Technical SEO | | activitybreaks5
-
How to find original URLs after a hosting company added canonical URLs, URL rewrites and duplicate content
We recently changed hosting companies for our ecommerce website. The hosting company added some functionality such that duplicate content and/or mirrored pages appear in the search engines. To fix this problem, the hosting company created both canonical URLs and URL rewrites. Now, we have page A (the original page with all the link juice) and page B (the new page with no link juice or SEO value). Both pages have the same content, with different URLs. I understand that a canonical URL is the way to tell the search engines which page is the preferred page in cases of duplicate content and mirrored pages. I also understand that canonical URLs tell the search engine that page B is a copy of page A, but page A is the preferred page to index. The problem we now face is that the hosting company made page A a copy of page B, rather than the other way around. But page A is the original page with the SEO value and link juice, while page B is the new page with no value. As a result, the search engines are now prioritizing the newly created page over the original one. I believe the solution is to reverse this and make page B (the new page) a copy of page A (the original page). Then I would simply need to set the original URL as the canonical URL for the duplicate pages. The problem is, with all the rewrites and changes in functionality, I no longer know which URLs have the backlinks that created this SEO value. I figure if I can find the backlinks to the original page, then I can find the original web address of the original pages. My question is: how can I search for backlinks on the web in such a way that I can figure out the URL that all of these backlinks point to, in order to make that URL the canonical URL for all the new, duplicate pages?
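Once the original URL (page A) is identified from its backlinks, the fix the question is driving at is a canonical tag in the head of each duplicate. A minimal sketch with hypothetical URLs:

```html
<!-- Placed in the <head> of duplicate page B, pointing at original page A -->
<link rel="canonical" href="http://www.example.com/original-page-a" />
```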
Technical SEO | | CABLES