Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with variation in roughly a 25% range. I have a few questions:
-
How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should the sitemap contain all products across all stores? We allow users to find any product across all stores if they search by product identifier, but they will only see the products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geo-located to only one store, will only that one store's content be indexed?
-
We also allow customers to search for older products which they might have bought a few years ago and that are no longer part of our catalogue. These products will not appear in the hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently read the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, like blocking the resources that handle the personalization (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
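For illustration, a minimal robots.txt along those lines; the script path here is hypothetical and would be whatever file actually performs the geo/cookie content swap on your site:

```
# Block only the script that personalizes content by geo/history,
# so crawlers always render the default version of each page.
User-agent: *
Disallow: /assets/js/geo-personalization.js
```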
All of this raises the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not keep cookies), would I see the same site and product assortment no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the user has a cookie, but the URL should probably never change - and the content shouldn't change by IP if they don't have a cookie.
1. Check the IP.
2. Embed their location in a cookie.
3. Set the cookie.
4. If the cookie was accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message that says you must accept cookies to get the best experience, but don't make it block any major portion of the content. (A rough sketch of this flow is below.)
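Here is a minimal sketch of that flow as an Express-style handler. Everything specific - the cookie name, the geo-lookup helper, the rendering - is an assumption for illustration, not a description of any real implementation:

```typescript
import express, { Request, Response } from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical geo-IP helper; a real one would query a geo-IP database.
function lookupNearestStore(ip: string): string | null {
  return null; // placeholder: resolve IP to the closest store's ID
}

// Hypothetical renderer: same canonical content for everyone,
// only the availability block varies once a store is known.
function renderProductPage(sku: string, store: string | null): string {
  const availability = store
    ? `Availability at ${store}`
    : "Choose a store to see local availability";
  return `<h1>Product ${sku}</h1><p>${availability}</p>`;
}

app.get("/product/:sku", (req: Request, res: Response) => {
  const store = req.cookies["preferred_store"]; // assumed cookie name

  if (!store) {
    // No cookie yet (first visit, Incognito, or Googlebot): suggest a
    // store via the cookie, but serve the same default page - no
    // redirect, no content swap based on IP alone.
    const suggested = lookupNearestStore(req.ip ?? "");
    if (suggested) {
      res.cookie("preferred_store", suggested, { maxAge: 30 * 24 * 3600 * 1000 });
    }
    return res.send(renderProductPage(req.params.sku, null));
  }

  // Cookie exists, so it was accepted: personalize availability only.
  res.send(renderProductPage(req.params.sku, store));
});

app.listen(3000);
```

The key property is that a cookie-less request (which is what Googlebot sends) always gets the same default content at the same URL, so nothing looks like cloaking.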
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Won't Google penalise a site for that? To make it clear: we have a single website, and based on the user's geo we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject - https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could throw more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same X products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps