Using IP Detection to Filter Directory Listings without Killing Your SEO?
-
I have a client who maintains a directory of surgeons across the United States (approx. 2,000 members at present), and wishes to use IP detection to dynamically filter their surgeon directory to a sub-set that is relevant to the geography of the visitor. At the same time, however, we want the pages in the surgeon directory to rank nationally for terms like "[insert specialty] surgeons". Any tips/best practices for implementing an IP detection solution without shooting yourself in the foot from an SEO perspective? Is it even possible?
Thanks!
Jeremy
-
Just to be sure - you want to present something like this:
IP Address = New York
Visitor is seeing a page domain.com/results_for_new_york
with canonical domain.com/results_generic
This might work, but it's not really a correct use of a canonical URL (which is intended for duplicate content, and that's not really the case here), so I'm not sure Google is going to respect the canonical in this situation (a canonical is a hint, not a directive).
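To make the setup under discussion concrete, here is a minimal sketch of the proposed arrangement: every geo-filtered variant declares the single generic directory page as its canonical. The URLs and the region parameter are placeholders invented for illustration, not from any real site.

```python
# Hypothetical sketch of the proposed setup: regional directory pages
# (e.g. /results_for_new_york) all declaring the generic national page
# as canonical. URLs are placeholders.

GENERIC_URL = "https://domain.com/results_generic"

def canonical_tag(region: str) -> str:
    """Return the canonical link tag for a regional directory page.

    No matter which regional variant is served, the canonical stays
    fixed on the generic page. This is exactly why Google may ignore
    it: the regional and generic pages are not duplicates, and a
    canonical is only a hint.
    """
    return f'<link rel="canonical" href="{GENERIC_URL}">'

print(canonical_tag("new_york"))
```

The point of the sketch is that the tag is identical for every region, which signals "these pages are duplicates" to Google even though they are not.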
Personally wouldn't do it this way.
Dirk
-
Thanks, Dirk. That's a great solution. If my client is disinclined to introduce another click in the visitors' path, can you think of a solution that would still utilize a dynamically-presented selection of directory listings (based on IP), backed up with a canonicalized version of the complete directory?
Cheers,
Jeremy
-
I would give the choice to the user rather than forcing them to a certain page. You could use the IP detection to trigger a message like "We noticed that you are from New York. Would you like to visit the New York page or rather view the generic page?". You could consider storing the choice in a cookie, so that on subsequent visits the visitor always goes to the New York pages rather than the generic ones.
This solution is user friendly and doesn't have an impact on SEO. If you use IP detection to force users to a certain page, there is always a risk (depending on the implementation) that Google only indexes the Californian pages (as Googlebot mainly crawls from Californian IPs).
Dirk
Related Questions
-
Proper SEO structure for Franchise/ Franchisee websites
Hi Neighbors, Franchise website design and development can be difficult, there's no doubt about it. I had to find the right balance between a unique and unified brand identity and a localized experience that accurately reflects the individual franchisees and their efforts. Due to its many benefits, I have structured the URLs to read domain.com/location (domain.com = root domain; /location = subfolder location page). I have also built a customized CMS (e.g. Drupal) and have given each location access to manage their location page (subfolder). To accommodate local SEO optimization, franchisees have complete control over optimizing their location page (subfolder): title tags, meta descriptions, alt tags, etc. Will any local optimization performed in the subfolder (location page) be stifled because it was not done on the root domain but in the subfolder?
Local Website Optimization | Jeffvertus1 -
Checking subdomains/ site structure of a website for International SEO
Dear Moz community, I am looking into two websites for a friend and we want to understand the following: What is the site structure as per the language subfolders? E.g. currently it is .com/en/ or .com/ru/ or .com/zh/. Using the crawl report, each page has an en or other-language version. I take it this means that we have to create copy, meta titles and descriptions for each of the languages, even if the page is the same but in a different language? To avoid duplication of content, would you suggest canonical tags be put in place? To check hreflang markup, I couldn't find anything in the code, which makes me think a script is automatically translating this? This is the first time I have started to look at international SEO and I want to understand what to look for in an audit of existing sites. Thank you,
Local Website Optimization | TAT1000 -
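For the /en/, /ru/, /zh/ subfolder structure described above, a hedged sketch of what reciprocal hreflang tags might look like. The domain, path, and x-default choice are placeholders; real deployments should use full language-region codes where relevant. Note that hreflang alternates, not canonical tags, are the usual tool here, because the translations are alternates rather than duplicates.

```python
# Sketch of reciprocal hreflang link tags for a language-subfolder
# site. LANGS, the domain, and the x-default choice (English) are
# assumptions for illustration.

LANGS = ["en", "ru", "zh"]

def hreflang_tags(domain: str, path: str) -> list[str]:
    """Build the hreflang link tags for one page across all languages.

    Every language version must carry the same full set of tags,
    listing every alternate (including itself) plus an x-default
    fallback for unmatched locales.
    """
    tags = [f'<link rel="alternate" hreflang="{lang}" '
            f'href="https://{domain}/{lang}{path}">' for lang in LANGS]
    tags.append(f'<link rel="alternate" hreflang="x-default" '
                f'href="https://{domain}/en{path}">')
    return tags

for tag in hreflang_tags("example.com", "/pricing"):
    print(tag)
```

An audit would then check that each language version emits this same set and that every tag is reciprocated by the page it points to.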
Can I use Schema zip code markup that includes multiple zip codes but no actual address?
The company doesn't have physical locations but offers services in multiple cities and states across the US. We want to develop a better hyperlocal SEO strategy and implement schema, but the only address information available is zip codes and the names of cities and states. Can we omit the actual street address in the markup but add multiple zip codes?
Local Website Optimization | hristina-m0 -
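As one possible shape for the markup asked about above, here is a sketch of schema.org `Service` JSON-LD that describes a service area via `areaServed` entries carrying postal codes, with no street address at all. The business name, codes, and the choice of `Place`/`PostalAddress` nesting are illustrative assumptions, not a validated recommendation.

```python
# Sketch: JSON-LD for a service-area business with no street address.
# Names and zip codes are placeholders.
import json

def service_area_jsonld(name: str, zip_codes: list[str]) -> str:
    """Build JSON-LD for a service with areaServed postal codes only."""
    data = {
        "@context": "https://schema.org",
        "@type": "Service",
        "name": name,
        "provider": {"@type": "Organization", "name": name},
        # areaServed takes Places; here each Place carries only a
        # postal code, standing in for the missing street address.
        "areaServed": [
            {"@type": "Place",
             "address": {"@type": "PostalAddress",
                         "postalCode": z,
                         "addressCountry": "US"}}
            for z in zip_codes
        ],
    }
    return json.dumps(data, indent=2)

print(service_area_jsonld("Acme Home Services", ["10001", "10002", "07302"]))
```

Whatever shape is chosen, it is worth running the output through a structured-data validator, since service-area markup without an address is exactly the kind of edge case validators catch.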
Ideas on creating location based service pages for SEO value while not worrying about local SEO?
Hello and thanks for reading! We have a bit of a rare issue: we are a nationwide distributor but have a local side that handles all tristate-area requests. The sales that happen via the local side basically don't impact the online side, so we're trying not to focus on local SEO in our own area but, in a sense, on local SEO for areas further abroad. We want to try location-based service pages, but not for every state: 5 states at most, and inside those pages we would target 2 to 3 big cities each. Is this a waste of time to even think about, or is this something that can be done with a careful touch?
Local Website Optimization | Deacyde0 -
Schema training/resources for local SEO?
I am currently in the process of applying schema for dozens of clients (many are large retailers). Although I am not a developer, I do know the basics of schema markup & structured data. I work with a development team and I'm trying to provide them with schema application best practices. Obviously there are many good articles/blog posts out there about schema. However, I'm looking for a more substantial training course, webinar or resource website about schema application. Does anybody have any good recommendations?
Local Website Optimization | RosemaryB0 -
Had SEO Firm tell me to Start Over - pros and cons help please
Hi, So I have quotes of $1,250 to $2,500 a month to run my website, SEO-wise. What I am told is they will do all Facebook postings, 4 blog posts each month, some citations, and site optimization. Those amounts do seem like a lot. Yet I was also told to start all over. Basically I was told that because of some bad backlinks, of which only a few remain, you can never recover from an algorithmic penalty, and that with a disavow, it's like telling Google "penalize me, please." So the plan was this: $3,000 for a new site and a new domain, and then it has no penalties and I will be ranking in no time. The problem is I am branded. My domain and business name is Bernese Of The Rockies. People know us and we are very respected. So if we create a new site like example.com, I do not want to mislead people. Or, if there is a penalty, use it as, say, a landing page or site where I send people to my main site for more info, that type of thing. Just looking for your input: is this a common issue, where if you have a non-manual (algorithmic) penalty you must restart? Thank you so much for your thoughts and suggestions.
Local Website Optimization | Berner0 -
Do more page links work against a Google SEO ranking when there is only 1 url that other sites will link to?
Say I have a coupon site in a major city and assume there are 20 main location regions (suburb cities) in that city. Assume that all external links to my site will be to only the home page, www.site.com. Assume also that my website business has no physical location. Which scenario is better?
1. One home page that serves up dynamic results based on the user cookie location, but mentions all 20 locations in the content. Google indexes 1 page only, and all external links are to it.
2. One home page that redirects to the user region (one of 20 pages), and therefore will have 20 pages, one for each region, each optimized for that region. Google indexes 20 pages and there will be internal links to the other 19 pages, BUT all external links are still only to the main home page.
Thanks.
Local Website Optimization | couponguy0 -
One location performing worse than the rest despite no major difference in SEO strategy
Hi all, I'm flummoxed. I'm dealing with a business that has 15 or so offices in three cities, and one city is performing horribly (this includes every office therein). The other two cities have shown consistently stellar results with massive traffic increases month over month for the past year; the city in question dropped unexpectedly in June and hasn't ever recovered. We didn't perform any major website changes during or immediately prior to that time period, and the website in general hasn't been negatively affected by Hummingbird. All locations for the business are optimized in the exact same way and according to best practices; there's no significant difference in the number of local listings, reviews, G+ fans, social signals, etc across locations. All meta data and content is optimized, NAPs are all consistent, we've built links wherever we can: the SEO for every location has been by-the-books. We've run a competitor audit in this particular city that included pulling our top competitors and exploring their domain authority, meta data, on-page keyword grade for the term we're trying to rank for, number and type of inbound links, social signals, and more; and we didn't spot any patterns or any websites that were significantly outperforming us in any area (besides actual rankings). It's frustrating because the client is expecting a fix for this city and I can't find anything that needs to be fixed! Have any multi-local SEOs out there run into a similar problem? What did you do about it?
Local Website Optimization | ApogeeResults0