Targeting City via Web Server
-
Here's a question I can't seem to find an answer to.
Does web hosting within the targeted city make a difference in the engines?
For example, a site targeting the Denver area, with web hosting in Denver. Will this boost the rankings, or is geo-targeting limited to countries?
Thanks!
-
Thanks, that's what I assumed (re: city targeting)
-
Agree with Adam. A lot of people in the UK use US web hosting and, from what I have seen, it makes no difference to rankings.
-
I've never seen any evidence or indication that it matters what city your web hosting is in.
Related Questions
-
Why do Google's search results display my home page instead of my target page?
Technical SEO | | h.hedayati6712365410 -
How to Remove Web Cache snapshot page & other language SEO Title in Google Search Engine?
Hi... Please tell me how to remove the web cache link given below. I have changed my SEO title, but the cached version hasn't updated... Are there any other methods that don't require Webmaster Tools? (See the sketch below.)
Technical SEO | | Thilak_geo040 -
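For the cache question above, the usual lever is a robots meta tag rather than Webmaster Tools: a noarchive directive asks search engines not to keep or show a cached copy of the page, and the cached snapshot (and the old title) generally drops out the next time the URL is recrawled. A minimal sketch, assuming you can edit the page's head section:

<!-- ask search engines not to keep or show a cached copy of this page -->
<meta name="robots" content="noarchive">

The updated SEO title should likewise be picked up on the next recrawl; there is no way to force an immediate refresh without a removal/reindex request in the search engine's own tools.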
Cities in Footer
Good afternoon! For SEO I put all of the cities and states my customer serves (over 40) in the footer. Will this help or hurt SEO? Also, if it does hurt, is it better to create a page of cities we serve and write some content around the different communities? Thank you!
Technical SEO | | EmSt0 -
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX, then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind: 1) detect the user agent and, if the visitor is a spider, show the old-style pagination instead of the AJAX version; 2) make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated! (See the sketch below this question.)
Technical SEO | | paul.younghusband0 -
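One common way to square auto-loading with crawlability is progressive enhancement rather than user-agent detection (serving spiders different markup by UA edges toward cloaking): keep ordinary paginated category URLs with a plain "next page" link in the HTML, and let JavaScript fetch and append the following page as the visitor scrolls. A minimal TypeScript sketch, where the ".product-grid" container, the "a.next-page" link, and the paginated URLs are all assumptions about the site's markup:

// Progressive-enhancement sketch (assumed selectors and markup): the server
// still renders normal paginated category pages with a plain "next page" link
// that crawlers can follow; JS-enabled visitors get auto-loading on top.
let loading = false;

async function loadNextPage(): Promise<void> {
  const nextLink = document.querySelector<HTMLAnchorElement>('a.next-page');
  const grid = document.querySelector<HTMLElement>('.product-grid');
  if (!nextLink || !grid || loading) return;

  loading = true;
  const url = nextLink.href;
  const html = await (await fetch(url)).text();
  const doc = new DOMParser().parseFromString(html, 'text/html');

  // Append the fetched page's products to the current grid.
  doc.querySelectorAll('.product-grid > *').forEach((item) => grid.appendChild(item));

  // Reflect the loaded page in the address bar so the state is linkable.
  history.pushState({ page: url }, '', url);

  // Re-point the "next" link at the following page, or drop it at the end.
  const newNext = doc.querySelector<HTMLAnchorElement>('a.next-page');
  if (newNext) {
    nextLink.href = newNext.href;
  } else {
    nextLink.remove();
  }
  loading = false;
}

// Load the next page once the visitor nears the bottom of the document.
window.addEventListener('scroll', () => {
  if (window.innerHeight + window.scrollY >= document.body.offsetHeight - 600) {
    void loadNextPage();
  }
});

Because the paginated URLs still exist and are linked, spiders can keep crawling the categories page by page, and the daily sitemap becomes a supplement rather than a substitute for crawlability.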
Targeting US search traffic
Hello, I've noticed the site I'm working on gets about 30-40% of its Google organic search traffic from the US and the rest comes from around the world. All the site's customers are in the US, so the thought is to focus on getting more traffic from the US. I know Google Webmaster Tools has a geo-targeting mechanism for the site in question, but what I don't want to do is turn that on and then have traffic from non-US sources go away; I suppose that's not so bad if traffic from the US bumps up accordingly. Do you have any experience in this area? thanks -Mike
Technical SEO | | mattmainpath0 -
Linux Server recognizing ASP Pages (301)
Greetings, I am in the process of 301-redirecting a group of pages on our site. We are thinking about switching from ASP to a Linux server. Question 1 - Would anyone have any information on creating a server "spoof" so that the Linux server will recognize .asp pages for a 301? Question 2 - How will this style of 301 affect SEO rankings? (See the sketch below.) Thanks, Tony
Technical SEO | | Tonyd230 -
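For the ASP-to-Linux question, a hedged sketch: on a typical Apache-based Linux host the server doesn't need to actually process .asp files; it only needs to recognise the old .asp URLs and issue 301s, which mod_rewrite can do at the .htaccess level. The paths below are hypothetical placeholders, assuming Apache with mod_rewrite enabled:

# Hypothetical .htaccess sketch (Apache, mod_rewrite enabled).
RewriteEngine On

# Redirect a specific old ASP URL to its new home.
RewriteRule ^old-category/old-page\.asp$ /new-category/new-page/ [R=301,L]

# Catch-all: send any remaining .asp URL to the same path without the extension
# (only sensible if the new URLs mirror the old structure).
RewriteRule ^(.*)\.asp$ /$1/ [R=301,L]

On the ranking side, a clean one-to-one set of 301s generally passes most of the old pages' link equity; the bigger risk is redirecting everything to the home page instead of to equivalent pages.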
Sub-domains for keyword targeting? (specific example question)
Hey everyone, I have a question I believe is interesting and may help others as well. Our competitor makes heavy use of sub-domains (over 100-200 of them) to rank in the search engines... and is doing quite well. What's strange, however, is that all of these sub-domains are just archives -- they're 100% duplicate content! An example can be seen here where they just have a bunch of relevant posts archived with excerpts. How is this ranking so well? Many of them are top 5 for keywords in the 100k+ range. In fact their #1 source of traffic is SEO for many of the pages. As an added question: is this effective if you were to actually have a quality/non-duplicate page? Thanks! Loving this community.
Technical SEO | | naturalsociety0 -
Trying to reduce pages crawled to within 10K limit via robots.txt
Our site has far too many pages for our 10K-page PRO account, most of which are not SEO worthy. In fact, only about 2,000 pages qualify for SEO value. Limitations of the store software only permit me to use robots.txt to sculpt the rogerbot site crawl. However, I am having trouble getting this to work. Our biggest problem is the 35K individual product pages and the related shopping cart links (at least another 35K); these aren't needed, as they duplicate the SEO-worthy content in the product category pages. The signature of a product page is that it is contained within a folder ending in -p. So I made the following addition to robots.txt:

User-agent: rogerbot
Disallow: /-p/

However, the latest crawl results show the 10K limit is still being exceeded. I went to Crawl Diagnostics and clicked on Export Latest Crawl to CSV. To my dismay, I saw the report was overflowing with product page links, e.g. www.aspenfasteners.com/3-Star-tm-Bulbing-Type-Blind-Rivets-Anodized-p/rv006-316x039354-coan.htm. The value for the column "Search Engine blocked by robots.txt" = FALSE; does this mean blocked for all search engines? Then it's correct. If it means blocked for rogerbot, then it shouldn't even be in the report, as the report seems to only contain 10K pages. Any thoughts or hints on attaining my goal would REALLY be appreciated, I've been trying for weeks now (see the sketch below). Honestly - virtual beers for everyone! Carlo
Technical SEO | | AspenFasteners0
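On the robots.txt question above, the likely culprit is that Disallow rules are matched as prefixes of the URL path, so "Disallow: /-p/" only blocks URLs whose path literally starts with "/-p/". Because the "-p" sits at the end of a longer folder name (e.g. /3-Star-tm-...-Anodized-p/), the rule never matches. A wildcard pattern should do it - Google supports "*" in robots.txt, and my understanding (worth verifying in Moz's rogerbot documentation) is that rogerbot does too. A minimal sketch:

User-agent: rogerbot
# Block any URL whose path contains a folder ending in "-p"
Disallow: /*-p/

One caveat: this also blocks any other folder that happens to end in -p, so it's worth spot-checking a few category URLs against the pattern before relying on it.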