Targeting by IP Address... SEO Issues?
-
I'm setting up a site to display a different site header graphic depending on which U.S. state the visitor's IP address is coming from. In theory we may end up doing 50 different images, although we'll probably start with 4 or 5 and the other states will get a "default".
How will the search engines treat this? If it's just an image change and the text on the page stays the same, will it affect anything? Any best-practice advice out there?
thanks!
-
Jim, I agree with Jason and don't think you have anything to worry about. Where you run into trouble is when you specifically try to show Google (or any other search engine) something different. For example, if Googlebot were crawling from a Kansas IP address, you'd want to show it the Kansas image just like you show everyone else. Don't complicate things with a rule like "show Kansas.jpg to all Kansas IPs, but always show Googlebot the default.jpg file"; that would be cloaking. What you are doing is simple IP delivery, and it should be safe.
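To make that concrete, here is a rough sketch of plain IP delivery, assuming a Flask app and MaxMind's geoip2 library with a local GeoLite2 City database (the database path, the image filenames, and the state-to-image mapping are illustrative placeholders, not anything from this thread):

```python
from flask import Flask, render_template, request
import geoip2.database
import geoip2.errors

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # hypothetical database path

STATE_IMAGES = {
    "KS": "headers/kansas.jpg",
    "TX": "headers/texas.jpg",
    # ...the remaining states fall through to the default below
}

def header_image_for(ip):
    """Pick the header graphic for the visitor's state, or the default."""
    try:
        state = reader.city(ip).subdivisions.most_specific.iso_code
    except (geoip2.errors.AddressNotFoundError, ValueError):
        state = None
    return STATE_IMAGES.get(state, "headers/default.jpg")

@app.route("/")
def home():
    # Same logic for every request: no user-agent checks anywhere, so
    # Googlebot crawling from a Kansas IP sees exactly what a Kansas
    # visitor sees. (Behind a proxy or load balancer you would read the
    # forwarded client IP instead of remote_addr.)
    return render_template("home.html", header_image=header_image_for(request.remote_addr))
```

The image choice is driven only by the IP lookup; the user agent never enters into it, which is exactly the distinction between IP delivery and cloaking.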
-
That variable image shouldn't cause you any SEO trouble. You should consider changing the image's metadata (alt text, etc.) along with the image.
If you want to be extra safe and avoid any caching problems with your geolocation feature, use an HTTP Vary header with "*" on the pages that carry the geolocation personalization.
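To make those two suggestions concrete, here is a rough sketch assuming a Flask-style setup (the app, the header_image_for() stand-in, and the alt-text strings are illustrative placeholders, not anything from this thread):

```python
from flask import Flask, make_response, render_template, request

app = Flask(__name__)

STATE_ALT_TEXT = {
    "headers/kansas.jpg": "Acme Widgets, serving Kansas",  # placeholder copy
    "headers/texas.jpg": "Acme Widgets, serving Texas",
}

def header_image_for(ip):
    # Stand-in for the GeoIP lookup sketched in the earlier answer.
    return "headers/default.jpg"

@app.route("/")
def home():
    image = header_image_for(request.remote_addr)
    # Keep the alt text in step with whichever state image is served.
    alt_text = STATE_ALT_TEXT.get(image, "Acme Widgets")
    response = make_response(
        render_template("home.html", header_image=image, header_alt=alt_text)
    )
    # "Vary: *" flags the response as personalized so shared caches don't
    # reuse one state's page for visitors from another state.
    response.headers["Vary"] = "*"
    return response
```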
-Jason "Retailgeek" Goldberg
Related Questions
-
Is having the same title tag on a blog listing page and blog date archives an SEO issue?
Hi there, Can anyone answer whether having duplicate title tags on the blog listing page (e.g. https://blog.companyname.com/) and the blog date archive pages (e.g. https://blog.companyname.com/archive/2017/10) is an issue? If so, why is it an issue, and what are the best practices for dealing with it? Thanks! John
Technical SEO | SEOCT1
-
Technical SEO - Where to begin?
Hi all, I'm looking to learn more about technical SEO. My background was digital marketing/PR, where I learned the importance of links, anchor text, page speed, improving UX signals, SSL, and utilising things like Google My Business. However, I find I am chasing my tail when it comes to things like understanding JS/CSS, log file analysis, etc. I've tried reading so many articles on the subjects and I just find it so damn confusing. AngularJS/BackboneJS, fetching and rendering, URL parameters... etc. I know from my own experiments that JS pages struggle to rank: I've created two very similar pages, one without JS and one with JS (which had far more links), and the non-JS page ranked far higher. So, I suppose I'm asking for some help with how to begin learning this stuff. I find the articles on Moz, Search Engine Land etc. to be a bit confusing... maybe I'm not technically minded enough! Cheers, Rhys
Technical SEO | SwanseaMedicine0
-
SEO for subdomains
I've recently started to work on a website that has previously been targeting subdomain pages for its SEO and has some OK rankings. To better explain, let me give an example... A site is called domainname.com and has subdomains that are targeted for SEO (i.e. pageone.domainname.com, pagetwo.domainname.com, pagethree.domainname.com). The site is going through a redevelopment and can reorganise its pages to other URLs. What would be the best way to approach this situation for SEO? Ideally, I'm tempted to recommend that new targeted pages be created (domainname.com/pageone, domainname.com/pagetwo, domainname.com/pagethree, etc.) and to perform 301 redirects from the old pages. Does a subdomain page structure (e.g. pageone.domainname.com) have any negative effects on SEO? Also, is there a good way to track rankings? I find that a lot of rank checkers don't pick up subdomains. Any tips on the best approach to take here would be appreciated. Hope I've made sense!
Technical SEO | Gavo0
-
Home page canonical issues
Hi, I've noticed I can access/view a client's site's home page using the following URL variations:
http://example.com/
http://example.com/index.html
http://www.example.com/
http://www.example.com/index.html
There's been no preference set in Google WMT, but Google has indexed and features this URL: http://example.com/. However, just to complicate matters, the vast majority of external links point to the 'www' version. Obviously I would like to tidy this up and have asked the client's web development company if they can place 301 redirects on the domains we no longer want to work. I received this reply, but I'm not sure whether it takes care of the duplicate issue: "Understand what you're saying, but this shouldn't be an issue regarding SEO. Essentially all the domains listed are linking to the same index.html page hosted at 1 location." My question is: do I need to place 301 redirects on the domains we don't want to work? And do I stick with the 'non-www' version Google has indexed and try to change the external links so they point to it, or go with the 'www' version and set that as the preferred domain in Google WMT? My technical knowledge in this area is limited, so any help would be most appreciated. Regards, Simon.
Technical SEO | simon-145328
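A rough sketch of how the 301 side of this could look, assuming a Flask front end and assuming the 'www' version is chosen as the canonical host (the example.com names are the placeholders from the question, and the handler is illustrative, not a recommendation of Flask over a server-level redirect):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.example.com"  # assumes the 'www' version is chosen

@app.before_request
def force_canonical_host():
    # Permanently redirect the bare domain (or any other alias) to the
    # preferred host, keeping the requested path and query string.
    if request.host != CANONICAL_HOST:
        target = f"{request.scheme}://{CANONICAL_HOST}{request.full_path.rstrip('?')}"
        return redirect(target, code=301)

@app.route("/")
@app.route("/index.html")
def home():
    # /index.html could likewise be 301-redirected to "/" so only one
    # home page URL stays in play.
    return "Home page"
```

In practice the same 301s are usually set at the web server or CDN level; the point is simply that every unwanted variation should answer with a permanent redirect to the one preferred URL.
-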
SEO for Interspire Relic
Hi All, Does anyone know of optimization best practices for the now largely defunct Interspire Web Publisher? Specifically, I'm looking for a canonical plugin or workaround to try and get rid of a few duplicate content issues (most importantly root vs. index.php). I'd like to just redo the site with a CMS that has better support... unfortunately, client budget constraints are a little tight at the moment. Thanks!
Technical SEO | G2W0
-
Location targeting with no physical location
If you have no physical premises (i.e. you operate online) but you only serve clients in a specific area, what is best practice for targeting a local area? I know Google Places can be used if you have premises, and that a .co.uk domain / hosting server location makes a difference, but beyond that...? Thanks!
Technical SEO | underscorelive1
-
Crawl issue
Hi, I have a problem with crawl stats. Crawls only return 3k pages, while my site has 27k pages indexed (mostly duplicate content pages). Why such a low number of pages crawled? Any help is more than welcome. Dario. PS: I have more campaigns in place; might that be the reason?
Technical SEO | Mrlocicero0
-
Do IP Changes Affect SEO Metrics?
Okay, so yesterday I asked a question about setting up custom error pages in IIS 6.0 to properly do a 24-hour 503 Service Temporarily Unavailable. It got no answer (not to the despair of the community, as the question has no simple or easy answer 🙂). So, after a night of dreaming about solutions 🙂, I realized that we have the ability to just clone the site... Basically, it would become a redundant server or mirror site for 24 hours. With all that being said, the question is: what SEO pitfalls might I encounter from this, if any? I suspect none, as load balancing and redundancy are a fact of life in the web world, especially since it will be a max of 24 hours of downtime for maintenance.
Technical SEO | Jinx146780
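A rough sketch of the 503-plus-Retry-After idea the question alludes to, written as a Flask handler purely for illustration rather than as IIS 6.0 custom error pages (the maintenance flag and response copy are assumptions):

```python
from flask import Flask, make_response

app = Flask(__name__)

MAINTENANCE_MODE = True  # would be toggled by deploy tooling for the 24-hour window

@app.before_request
def maintenance_gate():
    if MAINTENANCE_MODE:
        response = make_response("Down for scheduled maintenance.", 503)
        # Ask crawlers to come back after the window instead of treating
        # the pages as gone for good.
        response.headers["Retry-After"] = str(24 * 60 * 60)  # 24 hours, in seconds
        return response

@app.route("/")
def home():
    return "Normal page content"
```

Whether the site serves 503s or fails over to a mirrored server, the aim is the same: crawlers should see the downtime as temporary.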