Personalization software and SEO
-
Hi guys,
I'm testing personalization software on our website, basically changing the "location" text depending on the user's IP.
I can see in my software that when Googlebot comes to our site, the personalization software triggers an action changing the location-based text to "California". Could this make Google think that our website targets only users in California and thereby hurt our rankings in other locations nationwide?
I'd appreciate your opinions.
-
So Mr King, would it be reasonable to say that personalizing all locations but California would keep us out of trouble?
Thanks Mike!
-
Thanks for your insights Dirk.
-
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have links on the page going to the other locations, like /sandiego, so Google can access and index all of these pages. This is not possible in the scenario you mentioned.
Not sure how old the article from Unbounce is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google. It will help your Californian branch, but hurt all the others.
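Just to make that concrete, here is a rough sketch of that kind of client-side replacement (the /api/geo endpoint and the #location-text element are made up for the example). Notice there is still only one URL, and what it shows depends on the visitor's IP:

```typescript
// Rough sketch only - the /api/geo endpoint and the #location-text element
// are hypothetical stand-ins for whatever the personalization tool uses.
async function personalizeLocation(): Promise<void> {
  try {
    // Hypothetical endpoint that maps the visitor's IP to a city name.
    const response = await fetch("/api/geo");
    const { city } = (await response.json()) as { city: string };

    const el = document.getElementById("location-text");
    if (el) {
      el.textContent = `Now serving customers in ${city}`;
    }
  } catch {
    // On failure, leave the default (non-personalized) text in place.
  }
}

void personalizeLocation();
```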
rgds,
Dirk
-
This is great Dirk - thanks so much for your insight as always!
-
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is country level. As the question was about locations "nationwide", I would conclude based on this article that, at this point in time, Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's risky business, as the article doesn't indicate whether these "local" bots visit sites with the same frequency and depth as the normal ones, and it doesn't clearly indicate which country IPs are used.
It's a different story for languages, because you can indicate in the HTTP header that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - there you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP header that the content varies based on IP address.
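To illustrate what I mean by "indicating it in the HTTP header", here is a minimal sketch in Express (the render helpers are made up for the example):

```typescript
import express from "express";

const app = express();

// Dynamic serving for mobile: same URL, different HTML per user agent.
// The Vary: User-Agent header tells crawlers the response changes with the UA.
app.get("/", (req, res) => {
  res.set("Vary", "User-Agent");
  const isMobile = /Mobile|Android|iPhone/i.test(req.get("User-Agent") ?? "");
  res.send(isMobile ? renderMobileHome() : renderDesktopHome());
});

// Language-dependent content: Content-Language plus Vary: Accept-Language.
app.get("/about", (req, res) => {
  const lang = req.acceptsLanguages("en", "es") || "en";
  res.set("Vary", "Accept-Language");
  res.set("Content-Language", lang);
  res.send(renderAbout(lang));
});

// There is no equivalent header to say "this content varies by client IP",
// which is exactly the problem with IP-based personalization on a single URL.

// Hypothetical render helpers, just so the sketch is self-contained.
function renderMobileHome(): string { return "<html>mobile home</html>"; }
function renderDesktopHome(): string { return "<html>desktop home</html>"; }
function renderAbout(lang: string): string { return `<html>about (${lang})</html>`; }

app.listen(3000);
```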
rgds,
Dirk
-
Hi both,
Thanks a lot for your ideas and suggestions. No doubt it's a tough subject. I don't really understand Google's position on this: on the one hand they want you to provide a better user experience (which can be done through personalization), and on the other hand they don't seem to provide reasonable solutions to the potential SEO drawbacks.
Dirk, referencing this line of yours - "What you could do is automatically redirect the user to the appropriate page based on IP, but still have the other pages accessible via normal links" - don't you think that if the user is directly redirected to the location-based page, then Googlebot coming from California (as an example) will also be redirected to it and conclude that the website is targeting California?
I read something at Unbounce regarding dynamic text replacement that caught my attention: http://documentation.unbounce.com/hc/en-us/articles/203661004-Dynamic-Text-Replacement-pro-
They say “It's always been possible with Unbounce to do text replacement using a little bit of JavaScript, but unfortunately the bots and crawlers that Google (and other ad networks) use to validate your landing page quality don't read Javascript.”
If it's true that the bots cannot read JavaScript, maybe using JavaScript for small personalization actions, such as changing the location-based text, could be the solution. I wonder whether this follows Google's guidelines or not.
Again, I'd appreciate your answers; I'll go through all the links and information and keep investigating. I really need to find some technically supported facts.
Thanks again, Ana
-
Hi Dirk
Thanks for the corrections and examples here. I appreciate it and learned something new here myself.
Out of curiosity, what do you make of the following: https://support.google.com/webmasters/answer/6144055?hl=en
After reading your explanation, and Google's suggestion in bold and red there, I understand the importance of your recommendation. I was just wondering what your thoughts are on this particular article.
Thanks so much again and well done!
-
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other IP location Googlebot crawls from).
As an example: you manage a chain of Chinese restaurants spread around the country and you have the domain mychineserestaurant.com.
If a user accesses the site from New York, they will see the address, picture, phone number, etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content for your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show the information based on the IP of the bot.
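To make it concrete, the problematic setup looks roughly like this (just a sketch, with a made-up IP lookup):

```typescript
import express from "express";

const app = express();

// Hypothetical stand-in for the geo database the personalization software uses.
function lookupCityByIp(ip: string): string {
  return "Los Angeles"; // stub - in reality this would depend on the IP
}

// A single URL whose content depends entirely on the visitor's IP.
// Googlebot, crawling from Californian IPs, will only ever see the
// Los Angeles version - every other branch stays invisible to it.
app.get("/", (req, res) => {
  const city = lookupCityByIp(req.ip ?? "");
  res.send(`<h1>My Chinese Restaurant - ${city}</h1>`);
});

app.listen(3000);
```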
The Groupon example given by Patrick is not exactly the same - they personalise the homepage based on your IP, but if you search for "Groupon New York" you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on IP, but still have the other pages accessible via normal links. In the example above, accessing the site from New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice coming from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
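Roughly what that looks like in practice - again just a sketch, with a made-up IP lookup that returns a URL slug:

```typescript
import express from "express";

const app = express();

const locations = ["newyork", "sandiego", "losangeles"]; // hypothetical branches

// Hypothetical stand-in for an IP-to-location lookup.
function lookupLocationSlugByIp(ip: string): string {
  return "newyork"; // stub
}

// The homepage only redirects the visitor to their nearest branch...
app.get("/", (req, res) => {
  res.redirect(302, `/${lookupLocationSlugByIp(req.ip ?? "")}`);
});

// ...while every branch keeps its own permanent, crawlable URL and links to
// all the others, so Googlebot can reach every location no matter which IP
// it crawls from.
app.get("/:location", (req, res) => {
  if (!locations.includes(req.params.location)) {
    res.status(404).send("Not found");
    return;
  }
  const links = locations
    .map((loc) => `<a href="/${loc}">${loc}</a>`)
    .join(" | ");
  res.send(`<h1>My Chinese Restaurant - ${req.params.location}</h1><p>${links}</p>`);
});

app.listen(3000);
```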
If the personalisation is only minor (for example, only the local address on the homepage) and you already have targeted pages for each location, it should not really be a problem.
To be honest, this is more my own opinion than something supported by hard facts.
Hope this helps,
Dirk
-
Hi there
Don't worry about this, it shouldn't be an issue. One thing you can do is target your website in Webmaster Tools if you're looking to target specific regions.
Amazon and Groupon have personalization happening on their sites as well, but that doesn't affect their rankings.
I would also take a look at:
SEO in the Personalization Age
Let me know if this helps at all!