Using differing calls to action based on IP address
-
Hi,
We have an issue with a particular channel on a lead-generation site: our sales staff require different lead quality in different parts of the country. In saturated markets they need a stricter lead-qualification process than staff in more challenging markets.
To combat the problem I am toying with the idea of serving very slightly different content based on IP address. The main changes would be to the calls to action and the lead-qualification process.
We would plan to have a "standard" version of the site for when the IP location cannot be detected. URLs on this version would be the rel="canonical" targets for the location-specific pages.
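To make the idea concrete, here's a minimal sketch of that fallback-plus-canonical approach. Everything here is hypothetical (the IPs, URLs, and CTA wording are placeholders), and the IP-to-region lookup is stubbed out as a plain dict; a real site would query a GeoIP database at that step:

```python
from typing import Optional

# Hypothetical CTA variants per market type -- placeholder wording.
CTA_VARIANTS = {
    "saturated": "Complete our full qualification form to speak to sales.",
    "challenging": "Leave your phone number and we'll call you back.",
}
STANDARD_CTA = "Contact our sales team to learn more."

# Stub for the geolocation step: map an IP to a market type.
# A real implementation would query a GeoIP database here.
REGION_BY_IP = {
    "203.0.113.10": "saturated",    # e.g. Los Angeles / London
    "198.51.100.7": "challenging",  # e.g. mid-west / Scotland
}

def render_page(ip: str) -> dict:
    """Pick the CTA for this visitor; unknown IPs get the standard page."""
    region: Optional[str] = REGION_BY_IP.get(ip)
    cta = CTA_VARIANTS.get(region, STANDARD_CTA)
    return {
        "cta": cta,
        # Every variant declares the one standard URL as canonical.
        "canonical": "https://www.example.com/leads/",
    }
```

So a visitor in a saturated market sees the stricter qualification CTA, while any visitor whose location can't be resolved gets the standard page that every variant points its canonical at.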
Is there a way to do this without creating duplicate content, cloaking or other such issues on the site?
Any advice, theories or case studies would be greatly appreciated.
-
Hi Gareth,
Thanks for the Googlebot info. I'm still concerned about cloaking, but I'll look into it in more detail.
Thanks again
-
Why don't you just block the other pages, so only your standard canonical page gets indexed? You can then vary the content by IP, but remember that Googlebot always crawls from the US, so make sure it sees the standard version.
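For what it's worth, the fail-safe described above could be sketched like this (hypothetical Python, names invented for illustration; note that user-agent sniffing alone isn't real verification, since anyone can fake a user-agent string):

```python
from typing import Optional

def is_googlebot(user_agent: str) -> bool:
    # Naive check for illustration only; production code should also
    # verify the crawler with a reverse-DNS lookup on its IP.
    return "Googlebot" in user_agent

def choose_version(user_agent: str, region: Optional[str]) -> str:
    # Crawlers and visitors with an unresolved location both get the
    # standard, indexable version; everyone else gets their regional one.
    if is_googlebot(user_agent) or region is None:
        return "standard"
    return region
```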
-
Hi Highland,
Thanks for responding. The offering is the same in each location; it's just that the way we process the leads differs. For example, in Los Angeles / London we get lots of leads, almost too many for the regional sales staff to handle, so they try to pre-qualify them. In the mid-west / Scotland the sales staff have to nurture the leads, so they are happy with just a phone number.
For another stream on the site we are targeting local segments but this particular element has a national approach.
Thanks for your input.
-
I would create your content with distinct URLs and point the canonical to the generic page if you feel it is too duplicative.
For example, make the following URLs and use a geolocation-based redirect:
domain.com/houston
domain.com/los-angeles
domain.com/midwest
domain.com
If I were you, I would make all of these distinct pages with unique content rather than making them canonical. This way you can optimize for lots of local searches in addition to the general ones. Canonical is more for people who already have duplicate content; it sounds like you want to make new content. Played right, you could go after lots of local traffic that might be untapped.
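A rough sketch of that geolocation redirect (the region names and URLs are just the examples above; the detection step is assumed to happen upstream, and in production this would sit in middleware issuing a 302):

```python
from typing import Optional

# Each region gets its own distinct, optimizable URL.
LOCAL_URLS = {
    "houston": "https://domain.com/houston",
    "los-angeles": "https://domain.com/los-angeles",
    "midwest": "https://domain.com/midwest",
}
GENERIC_URL = "https://domain.com/"

def redirect_target(region: Optional[str]) -> str:
    """Return the URL to redirect a visitor to; unknown or undetected
    regions fall back to the generic page."""
    return LOCAL_URLS.get(region or "", GENERIC_URL)
```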