Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with variation in a 25% range. I have a few questions:
-
How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should we have all products across all stores in the sitemap? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geo-located to only one store, will only the content belonging to that store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are no longer part of our catalogue. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
-
If you consistently read the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, such as blocking the resources that handle it (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
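As a sketch of that robots.txt option (the script path here is illustrative, not from the original post):

```text
# Block the geo/history personalization script so crawlers
# never execute the location-based content swap
User-agent: *
Disallow: /js/geo-personalization.js
```

Note this only blocks the script from being fetched; the rest of the page should still render sensibly without it.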
All of this begs the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot doesn't accept your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortment no matter which location I was in? If so, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the user has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying the user must accept cookies to get the best experience, but don't let it block any major portion of the content.
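The steps above can be sketched as a small request handler. This is a minimal sketch, assuming a Node-style server; the function and cookie names are illustrative, not from the original post:

```javascript
// Decide how to serve a request: personalize only when the visitor has
// accepted the location cookie. Crawlers and cookie-less visitors
// (e.g. Chrome Incognito) all see the same default, non-personalized site.
function resolveStore(cookies, ipLocation) {
  if (cookies && cookies.storeId) {
    // Cookie was accepted on a previous request: safe to personalize.
    return { storeId: cookies.storeId, personalize: true };
  }
  // No cookie: suggest a store from the IP lookup, but serve the
  // default content on this response.
  return { suggestedStoreId: ipLocation || null, personalize: false };
}
```

Because Googlebot never sends the cookie, it always takes the second branch and sees the same default content regardless of which data centre it crawls from.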
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website, and based on the user's geo we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could throw more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same (X) products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
-
-
"We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with variation in a 25% range. I have a few questions." - Make sure you exempt Googlebot's user-agent from your geo-based redirects, otherwise the crawling of your site will end up in a big horrible mess.
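That exemption can be sketched as a check in the redirect logic. This is a minimal sketch under the assumption of a simple user-agent test; a production setup would also verify Googlebot via reverse DNS, since user-agent strings can be spoofed:

```javascript
// Exempt known crawlers from geo-based redirects so every store URL stays
// reachable at its canonical address, no matter which data centre the
// crawl comes from.
const CRAWLER_PATTERN = /googlebot|bingbot|duckduckbot/i;

function shouldGeoRedirect(userAgent, hasStoreCookie) {
  if (CRAWLER_PATTERN.test(userAgent || '')) return false; // never redirect bots
  if (hasStoreCookie) return false; // returning visitor already has a store
  return true; // first-time human visitor: redirect to the nearest store
}
```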
-
"How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should we have all products across all stores in the sitemap? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see products available in a particular store if they go through the hierarchical journey of the website." - Any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google to index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as non-canonical), it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.
The main thing to concern yourself with is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data), otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site with such highly divergent duplication is never going to yield great results; you'll just have to be aware of that from the outset.
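Following those rules, a sitemap sketch with one canonical URL per product (the domain and SKUs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per canonical product URL: no per-store duplicates,
       and no redirecting, errored, or noindexed URLs -->
  <url><loc>https://www.example.com/products/sku-12345</loc></url>
  <url><loc>https://www.example.com/products/sku-67890</loc></url>
</urlset>
```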
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought a few years ago and that are no longer part of our catalogue. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps