Personalization software and SEO
-
Hi guys,
I'm testing personalization software on our website, basically changing the "location" text depending on the user's IP.
I can see in my software that when Googlebot comes to our site, the personalization software triggers an action changing the location-based text to "California". Can this make Google think that our website targets only users in California, and thereby hurt our rankings in other locations nationwide?
I'd appreciate your opinions.
-
So Mr King, would it be reasonable to say that personalizing all locations but California would keep us out of trouble?
Thanks Mike!
-
Thanks for your insights Dirk.
-
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have links on the page going to the other locations like /sandiego - so Google can access all these pages & index them. This is not possible in the scenario you mentioned.
Not sure how old the article from Unbounce is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google. It will help your Californian branch, but hurt all the others.
rgds,
Dirk
-
This is great Dirk - thanks so much for your insight as always!
-
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is the country. As the question was about locations "nationwide", I would conclude from this article that, at this point in time, Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's a risky business - the article doesn't indicate whether these "local" bots visit sites with the same frequency & depth as the normal ones, and it doesn't clearly state which country IPs are used.
It's a different story for languages, because you can indicate in the HTTP header that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - here you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP header that the content varies based on IP address.
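To make the contrast concrete, here is a minimal sketch of how dynamic serving declares its variation via the standard Vary header (Express in TypeScript; the isMobileUA helper is hypothetical, just for illustration). Note the comment at the end - there is no equivalent declaration for IP-based variation:

```typescript
import express from "express";

const app = express();

// Hypothetical helper for illustration; a real site would use a proper
// user-agent parsing library.
function isMobileUA(userAgent: string): boolean {
  return /Mobi|Android/i.test(userAgent);
}

app.get("/", (req, res) => {
  // The standard Vary header tells caches and crawlers that this URL's
  // response depends on the User-Agent - this is what dynamic serving
  // for mobile relies on.
  res.set("Vary", "User-Agent");
  const ua = req.get("User-Agent") ?? "";
  res.send(isMobileUA(ua) ? "<p>Mobile homepage</p>" : "<p>Desktop homepage</p>");
});

// Note: there is no equivalent standard header to declare "this response
// varies by client IP", which is exactly the gap discussed above.

app.listen(3000);
```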
rgds,
Dirk
-
Hi both,
Thanks a lot for your ideas and suggestions. No doubt it's a tough subject. I don't really understand Google's position on this: on one hand they want you to provide a better user experience (which can be done through personalization), and on the other hand they don't seem to provide reasonable solutions to the potential SEO drawbacks.
Dirk, referencing this line of yours, "What you could do is automatically redirect the user to the appropriate page based on ip - but still have the other pages accessible via normal links": don't you think that if the user is directly redirected to the location-based page, then a Googlebot coming from California (as an example) will also be redirected to it, and Google will then understand that the website is targeting California?
I read something at Unbounce regarding dynamic text replacement that caught my attention: http://documentation.unbounce.com/hc/en-us/articles/203661004-Dynamic-Text-Replacement-pro-
They say “It's always been possible with Unbounce to do text replacement using a little bit of JavaScript, but unfortunately the bots and crawlers that Google (and other ad networks) use to validate your landing page quality don't read Javascript.”
If it's true that the bots cannot read JavaScript, then using JavaScript for small personalization actions, such as changing the location-based text, may be the solution. I wonder whether this follows Google's guidelines or not.
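Just to sketch what I mean by a small client-side swap (the /api/geo endpoint below is hypothetical - any IP-to-location service would do):

```typescript
// Hypothetical response shape from a geolocation endpoint.
interface GeoResponse {
  city: string;
}

async function personalizeLocationText(): Promise<void> {
  const fallback = "your area";
  try {
    // "/api/geo" is a hypothetical IP-to-location endpoint.
    const res = await fetch("/api/geo");
    const geo: GeoResponse = await res.json();
    // Swap the placeholder text after the page loads - a crawler that
    // doesn't execute JavaScript would only ever see the neutral fallback.
    document.querySelectorAll("[data-location-text]").forEach((el) => {
      el.textContent = geo.city || fallback;
    });
  } catch {
    // Leave the neutral fallback text in place if geolocation fails.
  }
}

document.addEventListener("DOMContentLoaded", () => {
  void personalizeLocationText();
});
```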
Again, I'd appreciate your answers; I'll go through all the links and information and keep investigating. I really need to find some technically supported facts.
Thanks again, Ana
-
Hi Dirk,
Thanks for the corrections and examples here. I appreciate it and learned something new here myself.
Out of curiosity, what do you make of the following: https://support.google.com/webmasters/answer/6144055?hl=en
After reading your explanation, and Google's suggestion in bold and red there, I understand the importance of your recommendation. I was just wondering about your thoughts on that particular article and what you make of it.
Thanks so much again and well done!
-
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other IP locations Googlebot happens to crawl from).
As an example: you manage a chain of Chinese restaurants spread around the country, and you have the domain mychineserestaurant.com.
If a user accesses the site from New York, they will see the address, picture, phone number, etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content for your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show the information based on the bot's IP.
The example of Groupon given by Patrick is not exactly the same - they personalise the homepage based on your IP, but if you search for Groupon New York you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on ip - but still have the other pages accessible via normal links. In the example above, accessing the site from New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice coming from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
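A rough sketch of that pattern (Express in TypeScript - the lookupCityFromIP function is a stand-in for whatever geo-IP lookup you'd actually use):

```typescript
import express from "express";

const app = express();

// Stand-in for a real geo-IP database lookup (e.g. a MaxMind-style service).
function lookupCityFromIP(ip: string): string | null {
  return null; // hypothetical: unknown location by default
}

const LOCATIONS = ["newyork", "sandiego", "losangeles"];

app.get("/", (req, res) => {
  const city = lookupCityFromIP(req.ip ?? "");
  if (city && LOCATIONS.includes(city)) {
    // Temporary redirect to the visitor's local page...
    res.redirect(302, `/${city}`);
    return;
  }
  // ...while every location page stays reachable via plain crawlable links.
  const links = LOCATIONS.map((l) => `<a href="/${l}">${l}</a>`).join(" | ");
  res.send(`<p>Choose your restaurant:</p>${links}`);
});

// Each store gets its own URL, per the Matt Cutts advice above.
app.get("/:location", (req, res) => {
  res.send(`<h1>mychineserestaurant.com - ${req.params.location}</h1>`);
});

app.listen(3000);
```

The redirect is temporary (302), signalling it depends on the visitor, while the plain links ensure every location page remains discoverable and indexable.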
If the personalisation is only minor (for example, only the local address on the homepage) and you already have targeted pages for each location, it shouldn't really be a problem.
To be honest - it's rather my own opinion than something which is supported by hard facts.
Hope this helps,
Dirk
-
Hi there
Don't worry about this - it shouldn't be an issue. One thing you can do is set a geographic target for your website in Webmaster Tools if you're looking to target specific regions.
Amazon and Groupon have personalization happening on their sites as well - but that doesn't affect their rankings.
I would also take a look at:
SEO in the Personalization Age
Let me know if this helps at all!