Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo-location). All our stores carry a consistent range of products, with roughly 25% variation between stores. I have a few questions:
-
How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow is the same for bots and users, should the sitemap include all products across all stores? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only that store's content be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are not part of our catalogue any more. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
-
If you consistently detect the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, like blocking the resources that handle it (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
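To illustrate the robots.txt option, a rule along these lines would keep crawlers from fetching the personalization script (the file path here is hypothetical; use whatever path your site actually serves the script from):

```
User-agent: *
Disallow: /js/geo-personalization.js
```

Note that Google generally advises against blocking the JavaScript and CSS needed to render the page itself, so this should target only the script that performs the redirect/personalization.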
All of this begs the question though: If you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if they have a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message that says you must accept cookies to get the best experience, but don't make it block any major portion of the content.
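The steps above can be sketched as request-handling logic. This is a minimal illustration, not a definitive implementation; the store names, IP check, and function names are all hypothetical placeholders:

```python
# Sketch of cookie-gated personalization: personalize only when the
# visitor has accepted the location cookie. Crawlers and cookie-less
# visitors all see the same default page, regardless of IP.

def lookup_store_by_ip(ip: str) -> str:
    """Placeholder geo-IP lookup; a real site would query a geo database."""
    return "store-indiana" if ip.startswith("10.") else "store-default"

def handle_request(ip: str, cookies: dict) -> dict:
    """Return a response plan for one request."""
    store = cookies.get("store")
    if store:
        # Cookie exists -> safe to personalize content for that store.
        return {"personalized": True, "store": store, "set_cookie": None}
    # No cookie (Googlebot, incognito users): serve the default,
    # un-personalized page, but offer the cookie for next time.
    return {
        "personalized": False,
        "store": None,
        "set_cookie": ("store", lookup_store_by_ip(ip)),
    }

# A crawler and an incognito user from different IPs see the same page:
bot = handle_request("66.249.66.1", {})
user = handle_request("10.0.0.5", {})
assert bot["personalized"] == user["personalized"] == False
```

The key property is that the response for a cookie-less request depends only on defaults, never on the IP, which is what makes the incognito test described above come out clean.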
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Won't Google penalise a site for that? To make it clear: we have a single website, and based on the geo of the user, we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could throw more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same (X) products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only that store's content be indexed?" - No, it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when it fails to index anything properly
Could you please explain a bit more what you mean by redirect, as all products will exist on the website for a crawler to see if the canonical URL is used for crawling. Only the availability and the product's visibility through the navigation journey will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
-
-
"We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo-location). All our stores carry a consistent range of products, with roughly 25% variation between stores. I have a few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects, otherwise the crawling of your site will end up in a big horrible mess
-
"How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow is the same for bots and users, should the sitemap include all products across all stores? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see products available in a particular store if they go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google to index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as non-canonical), it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.
The main thing to concern yourself with is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data), otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site with such highly divergent duplication is never going to yield great results; you'll just have to be aware of that from the outset
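Those inclusion rules are mechanical enough to sketch as a filter over crawl data. The page records and field names below are hypothetical; the point is just that only canonical, indexable, 200-status URLs make it into the file:

```python
# Sketch: emit an XML sitemap containing only canonical, indexable,
# 200-status URLs, per the rules above.
from xml.etree import ElementTree as ET

def build_sitemap(pages: list[dict]) -> str:
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        if page["status"] != 200:              # errors/redirects: leave out
            continue
        if page.get("noindex"):                # meta/robots no-index: leave out
            continue
        if page.get("canonical") not in (None, page["url"]):
            continue                           # canonical points elsewhere
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page["url"]
    return ET.tostring(urlset, encoding="unicode")
```

Running this over a full crawl (or the CMS's page table) and regenerating the sitemap on a schedule keeps the "seamless view of indexation" property as products come and go.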
-
"Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only that store's content be indexed?" - No, it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when it fails to index anything properly
-
"We are also allowing customers to search for older products which they might have bought a few years ago and that are not part of our catalogue any more. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps