Personalization software and SEO
-
Hi guys,
I'm just testing personalization software on our website, basically changing the "location" text depending on the user's IP.
I can see in my software that when Googlebot comes to our site, the personalization software triggers an action changing the location-based text to "California". Could this make Google understand that our website targets only users in California, and thereby hurt our rankings in other locations nationwide?
I'd appreciate your opinions.
-
So Mr King, would it be reasonable to say that personalizing all locations but California would keep us out of trouble?
Thanks Mike!
-
Thanks for your insights Dirk.
-
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have links on the page going to the other locations like /sandiego, so Google can access all these pages and index them. This is not possible in the scenario you mentioned.
Not sure how old the Unbounce article is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google. It will help your Californian branch, but hurt all the others.
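For illustration, this is roughly all that client-side replacement amounts to (a minimal sketch - the /api/geo endpoint and its response shape are something I made up, not part of your software):

```typescript
// Minimal sketch of client-side "dynamic text replacement".
// The /api/geo endpoint is a hypothetical IP-to-city lookup.
async function personalizeLocationText(): Promise<void> {
  // The server sees the visitor's IP and returns a city name.
  const response = await fetch("/api/geo");
  const { city } = (await response.json()) as { city: string };

  // Swap the placeholder text in place. Note the URL never changes,
  // so any crawler that renders this page still sees exactly one
  // version: the one matching the crawler's own IP.
  document.querySelectorAll("[data-location-text]").forEach((el) => {
    el.textContent = `Visit our ${city} store`;
  });
}

void personalizeLocationText();
```

Whether or not the bot executes this script, there is still only one URL, so only one version of the content can ever be indexed.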
rgds,
Dirk
-
This is great Dirk - thanks so much for your insight as always!
-
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is the country. Since the question was about locations "nationwide", I would conclude based on this article that at this point in time Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's a risky business - the article doesn't indicate whether these "local" bots visit sites with the same frequency and depth as the normal ones, and it doesn't clearly indicate which country IPs are used.
It's a different story for languages, because you can indicate in the HTTP headers that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - there you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP headers that the content varies based on IP address.
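To illustrate the difference (a rough Node/Express sketch - the headers are the point here, not the exact code):

```typescript
// Rough sketch of the HTTP hints that do exist for varying content.
import express from "express";

const app = express();

const desktopHtml = "<html><body>desktop version</body></html>";
const mobileHtml = "<html><body>mobile version</body></html>";

app.get("/", (req, res) => {
  // Dynamic serving for mobile: Vary tells crawlers and caches
  // that the HTML differs per user agent.
  res.set("Vary", "User-Agent");

  // The language of the response can be flagged as well.
  res.set("Content-Language", "en");

  // There is no comparable standard header value meaning
  // "this response varies by client IP address" - which is exactly
  // the gap with IP-based personalisation on a single URL.
  const ua = req.get("User-Agent") ?? "";
  res.send(/Mobi/i.test(ua) ? mobileHtml : desktopHtml);
});

app.listen(3000);
```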
rgds,
Dirk
-
Hi both,
Thanks a lot for your ideas and suggestions. No doubt it's a tough subject. I don't really understand Google's position on this: on one hand they want you to provide a better user experience (which can be done through personalization), and on the other hand they don't seem to provide reasonable solutions to the potential SEO drawbacks.
Dirk, referencing this line of yours - "What you could do is automatically redirect the user to the appropriate page based on IP - but still have the other pages accessible via normal links" - don't you think that if the user is directly redirected to the location-based page, then Googlebot coming from California (as an example) will also be redirected to it and conclude that the website is targeting California?
I read something at Unbounce regarding dynamic text replacement that caught my attention: http://documentation.unbounce.com/hc/en-us/articles/203661004-Dynamic-Text-Replacement-pro-
They say “It's always been possible with Unbounce to do text replacement using a little bit of JavaScript, but unfortunately the bots and crawlers that Google (and other ad networks) use to validate your landing page quality don't read Javascript.”
If it's true that the bots cannot read JavaScript, then maybe using JavaScript for small personalization actions, such as changing the location-based text, is the solution. I wonder whether this follows Google's guidelines or not.
Again, I'd appreciate your answers; I'll go through all the links and information and keep investigating. I really need to find some technically supported facts.
Thanks again. Ana
-
Hi Dirk,
Thanks for the corrections and examples. I appreciate it and learned something new myself.
Out of curiosity, what do you make of the following: https://support.google.com/webmasters/answer/6144055?hl=en
After reading your explanation, and Google's suggestion in bold and red there, I understand the importance of your recommendation. I was just curious about your thoughts on this particular article.
Thanks so much again and well done!
-
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other IP location Googlebot happens to crawl from).
As an example: you manage a chain of Chinese restaurants spread around the country, and you have the domain mychineserestaurant.com.
If a user accesses the site directly from New York, they will see the address, picture, phone number, etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content for your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show the information based on the bot's IP.
The Groupon example given by Patrick is not exactly the same - they personalise the homepage based on your IP, but if you search for "Groupon New York" you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on IP - but still have the other pages accessible via normal links. In the example above, accessing the site from New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice coming from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
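To make it concrete, here is roughly what that pattern looks like (a quick sketch using Node/Express with the geoip-lite package - the library and the details are just my illustration, not a recommendation):

```typescript
// Sketch of "redirect by IP, but keep every location on its own URL".
// geoip-lite and the exact lookup fields are illustrative choices.
import express from "express";
import geoip from "geoip-lite";

const app = express();
const locations = ["newyork", "sandiego", "losangeles"];

// Home page: send the visitor to the nearest branch. A 302 keeps the
// redirect temporary rather than signalling a permanent move.
app.get("/", (req, res) => {
  const geo = geoip.lookup(req.ip ?? "");
  const city = geo?.city?.toLowerCase().replace(/\s+/g, "") ?? "";
  res.redirect(302, `/${locations.includes(city) ? city : "newyork"}`);
});

// Each branch lives at a stable URL, and every page links to all the
// others, so Googlebot can crawl and index every location.
app.get("/:location", (req, res) => {
  if (!locations.includes(req.params.location)) {
    res.sendStatus(404);
    return;
  }
  const nav = locations.map((l) => `<a href="/${l}">${l}</a>`).join(" | ");
  res.send(`<h1>Our ${req.params.location} branch</h1><nav>${nav}</nav>`);
});

app.listen(3000);
```

The key point is the second route: every location page is reachable through plain links, so the IP-based redirect on the home page never hides content from the crawler.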
If the personalisation is only minor (for example, only the local address on the homepage) and you already have targeted pages for each location, it should not really be a problem.
To be honest, this is more my own opinion than something supported by hard facts.
Hope this helps,
Dirk
-
Hi there,
Don't worry about this, it shouldn't be an issue. One thing you can do is set a geographic target for your website in Webmaster Tools if you're looking to target specific regions.
Amazon and Groupon have personalization happening on their sites as well - but that doesn't affect their rankings.
I would also take a look at:
SEO in the Personalization Age
Let me know if this helps at all!