To host within my city or not?
-
I'm opening a web development company in Vancouver and I'm stuck between getting hosting in Montreal or Vancouver. Montreal is cheaper, but my company is in Vancouver. I already have a .ca domain name, so I'm on top of that aspect.
From what I understand, it would help for SEO purposes if my company website sat on a local Vancouver IP. So my question is: should I go for the Vancouver hosting and pay more, or stick with the Montreal hosting?
-
The load speed of the entire site is much more important. That, together with the IP location, can help your ranking to some extent, but overall it's a small signal.
My advice is to focus on site speed (related to your question), and I strongly believe that since both Vancouver and Montreal are in Canada, there won't be a difference as far as the geolocation of your host is concerned.
More than that, if your site is fast, your ranking won't be affected even if you host with a US provider. Since both of your options are within Canada, it's best, in my opinion, to go with Montreal, as there is no difference in SEO benefit between the two options you've mentioned.
Hope it helps!
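If you want to compare the speed difference yourself, here's a quick sketch, assuming you have a test page on each host and curl installed (both hostnames below are placeholders, not real hosts). Run it from a machine near your target market and compare time-to-first-byte:
# compare TTFB and total load time of the same page on each candidate host:
curl -o /dev/null -s -w 'TTFB: %{time_starttransfer}s total: %{time_total}s\n' http://montreal-test.example.com/
curl -o /dev/null -s -w 'TTFB: %{time_starttransfer}s total: %{time_total}s\n' http://vancouver-test.example.com/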
-
I'll stick with Montreal.
Both IPs are Canadian. I'd only take it into consideration if the IPs were in different countries, e.g. Russia and Canada.
-
Related Questions
-
Will a robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi - I have a client who had thousands of dynamic PHP pages indexed by Google that shouldn't have been. He has since blocked these PHP pages via a robots.txt disallow. Unfortunately, many of those PHP pages were linked to multiple times by high-quality sites (instead of the static URLs) before he put up the disallow. If we create 301 redirects for some of these PHP URLs that are still showing high-value backlinks, and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the PHP pages? Thanks very much, V
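For illustration, a minimal sketch of the interplay (the paths are made-up placeholders, and Apache's mod_alias is assumed). Google only discovers a 301 by requesting the old URL, so the disallow has to be lifted for any path whose redirect you want it to follow:
# robots.txt - an Allow rule (or removing the Disallow) lets Googlebot
# re-crawl the old dynamic URL and see its redirect:
User-agent: Googlebot
Allow: /catalog/item.php
Disallow: /*.php$
# .htaccess - permanent redirect from the old dynamic URL to the static one:
Redirect 301 /catalog/item.php http://www.example.com/catalog/item/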
Technical SEO | Voodak0
-
Changing a city of operation and the best way to inform Google
We have a business operating out of three cities (A, B, and C), with A being the primary address; the business provides its services in B and C as well. The business has decided to shut shop in C and instead add D as another city. Currently the URLs are like:
www.domainname.com/A/products
www.domainname.com/B/products
www.domainname.com/C/products
Please help us understand the best way to inform Google that city C is non-operational now. Do we need to do redirects, and if yes, should we redirect to the home page? Or can we just remove the city C URLs in Webmaster Tools and inform Google that way?
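If redirects are the route taken, a minimal sketch (assuming Apache with mod_alias; "/C/" is the retired city from the question, and "/A/" stands in for whichever live section is closest in topic) would map each old URL to its nearest equivalent rather than dumping everything on the home page, since blanket redirects to the home page tend to be treated as soft 404s:
# .htaccess - map every URL under the retired city C to the matching
# path under city A (placeholder target; pick the most relevant page):
RedirectMatch 301 ^/C/(.*)$ /A/$1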
Technical SEO | deep_irvin0
-
How to best keep client hosting separate but manageable?
For those of you with a number of client accounts for which you do hosting, how do you keep them manageable but separate? Let's assume you have both public and private clients and don't want someone to do a reverse IP/server lookup and be able to identify everyone you work with. Additionally, clients can be working in the US/UK/EU and want localised hosting. I'm looking for a large shared hosting provider (with some potential dedicated options) that will let me manage accounts on multiple physical servers in a variety of geolocations from a single billing account, and preferably a single admin panel as well. Once client contracts end, I also need the ability to let clients take over the hosting in a break-away account and add their own billing details. I'm looking for a solution a bit more upmarket than something like SEO hosting from HostGator (which doesn't allow me to specify geolocation territories anyway), potentially with an account manager to help me sort out the individual requirements. Does anybody have any ideas of providers, or what I should be searching for to get what I want?
Technical SEO | I3SEO0
-
Is there anywhere I can find a list of the best UK dedicated hosting companies?
Hi, I am after a list of UK hosting companies that can offer dedicated hosting. I have been looking for months now for a UK hosting company. I need the following, or something similar: Intel Xeon E3-1220, 4 x 3.1GHz TB, 8MB cache; 500GB storage; 8GB RAM; 10TB bandwidth. If anyone can help with this, that would be great.
Technical SEO | ClaireH-1848860
-
Source code structure: position of content within the <body> tag
Within the <body> section of the source code of a site I work on, there are a number of distinct sections. The first one, appearing first in the source code, contains the code for the primary site navigation tabs and links. The second contains the keyword-rich page content. My question is this: if I could fix the layout so that the page still displayed visually the same way it does now, would it be advantageous to put the keyword-rich content section at the top of the <body>, above the navigation? I want the search engines to be able to reach the keyword-rich content faster when they crawl pages on the site; however, I don't want to implement this fix if it won't have any appreciable benefit, nor if it will harm the search engines' access to my primary navigation links. Does anyone have experience of this working, or thoughts on whether it will make a difference? Thanks,
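As a rough illustration of the idea (the IDs and offsets below are hypothetical, not from the site in question), content-first source order with the nav repositioned by CSS might look like this:
<!-- content comes first in the source; the nav follows it -->
<body>
  <div id="content">Keyword-rich page content goes here...</div>
  <div id="nav">Primary navigation tabs and links...</div>
</body>
/* CSS paints the nav back at the top of the rendered page */
#nav { position: absolute; top: 0; left: 0; width: 100%; }
#content { margin-top: 80px; } /* leave room for the nav bar */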
Technical SEO | Tinhat0
-
Implementing Schema within Existing CSS tags
In implementing Schema with a site using CSS and containing existing tags, I want to be sure that we are (#1) using the tags effectively when used within a product detail template and (#2) not actually harming ourselves by telling Google that all products are named or described by the SS tag and not the actual product name or description (which obviously could be disastrous). An example of what we are looking at implementing is the following: Old: <ss:value source="$product.name"></ss:value> New: <ss:value source="$product.name"></ss:value> Old: <ss:value source="$product.description">New: <ss:value source="$product.description"></ss:value> Basically, is Schema at the point where the SS tag will be replaced (in the eyes of the search engines) with the actual text and not the tag itself?
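For what it's worth, a hypothetical sketch of how schema.org microdata could sit around the template tags (the itemprop names come from schema.org's Product type; the wrapper markup is an assumption, not your actual template). Assuming the ss:value tags are resolved server-side, crawlers only ever see the rendered product name and description inside the itemprop spans, never the tag itself, which is what concern #2 hinges on:
<!-- the microdata lives in the surrounding HTML attributes, so search -->
<!-- engines read the rendered product text, not the ss:value tag -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name"><ss:value source="$product.name"></ss:value></span>
  <span itemprop="description"><ss:value source="$product.description"></ss:value></span>
</div>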
Technical SEO | TechMama0
-
Trying to reduce pages crawled to within 10K limit via robots.txt
Our site has far too many pages for our 10K-page PRO account, most of which are not SEO-worthy. In fact, only about 2,000 pages qualify for SEO value. Limitations of the store software only permit me to use robots.txt to sculpt the rogerbot site crawl. However, I am having trouble getting this to work. Our biggest problem is the 35K individual product pages and the related shopping cart links (at least another 35K); these aren't needed, as they duplicate the SEO-worthy content in the product category pages. The signature of a product page is that it is contained within a folder ending in -p. So I made the following addition to robots.txt:
User-agent: rogerbot
Disallow: /-p/
However, the latest crawl results show the 10K limit is still being exceeded. I went to Crawl Diagnostics and clicked on Export Latest Crawl to CSV. To my dismay, I saw the report was overflowing with product page links, e.g. www.aspenfasteners.com/3-Star-tm-Bulbing-Type-Blind-Rivets-Anodized-p/rv006-316x039354-coan.htm. The value for the column "Search Engine blocked by robots.txt" is FALSE; does this mean blocked for all search engines? Then it's correct. If it means blocked for rogerbot, then it shouldn't even be in the report, as the report seems to only contain 10K pages. Any thoughts or hints on attaining my goal would REALLY be appreciated; I've been trying for weeks now. Honestly, virtual beers for everyone! Carlo
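One thing worth double-checking in that snippet: robots.txt paths match from the start of the URL path, so Disallow: /-p/ only blocks a folder literally named "-p" at the site root, while the product folders here merely end in -p. A wildcard pattern, assuming rogerbot honours the * wildcard the way Googlebot does, may be what was intended:
User-agent: rogerbot
# block any folder whose name ends in -p, at any depth:
Disallow: /*-p/
Technical SEO | AspenFasteners
-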
Fastest Hosting Company
Who has the fastest hosting company? Which major provider has the fastest service for page load times? I'm looking for affordability, like GoDaddy.
Technical SEO | bozzie3110