Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo-location). All our stores carry a consistent range of products, with variation in roughly a 25% range. I have a few questions.
-
How should we build a sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should we have all products across all stores in the sitemap? We allow users to find any product across all stores if they search by product identifier, but they will only see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geo-located to only one store, will only the content belonging to that one store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are not part of our catalogue any more. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently read the visitor's IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are several options, such as blocking the resources that handle the personalization (e.g. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that script in the robots.txt file is less likely to be seen as cloaking.
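For example, if the geo/cookie personalization lives in its own script, a robots.txt rule along these lines would keep crawlers from fetching it (the file path is hypothetical; substitute your own):

```
User-agent: *
Disallow: /assets/js/geo-personalization.js
```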
All of this raises the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
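One quick way to run that test (the URL is hypothetical; curl sends no cookies by default, so this roughly approximates Googlebot's first visit):

```sh
# Fetch a product page with no cookies and print the status code plus
# any redirect target; repeat from different networks/VPN locations
# to see whether the response changes by IP alone.
curl -s -o /dev/null -w "%{http_code} %{redirect_url}\n" \
  https://www.example-grocer.com/products/organic-oats-1kg
```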
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying that cookies must be accepted for the best experience, but don't let it block any major portion of the content. A rough sketch of this flow follows.
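A minimal sketch of that flow, using Flask; the cookie name, route, and store-lookup helper are hypothetical stand-ins:

```python
from flask import Flask, request, make_response

app = Flask(__name__)

def nearest_store_for_ip(ip: str) -> str:
    """Hypothetical helper: resolve an IP to the closest store ID."""
    return "store-42"  # placeholder for a real geo-IP lookup

@app.route("/product/<sku>")
def product_page(sku):
    store = request.cookies.get("preferred_store")  # step 4: read the cookie
    if store:
        # Cookie accepted and present: personalize availability/pricing.
        return f"Product {sku}, availability shown for {store}"
    # No cookie (this includes Googlebot): serve the default page at the
    # same URL -- never redirect based on IP alone.
    resp = make_response(f"Product {sku}, default (un-personalized) view")
    guess = nearest_store_for_ip(request.remote_addr)  # steps 1-2: check IP
    resp.set_cookie("preferred_store", guess)          # step 3: set cookie
    return resp
```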
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website, and based on the geo of the user, we will filter product availability. If a customer is from State A, we will show "X" products, and if a customer is from State B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same (X) products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
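To illustrate the single-canonical-URL setup being described (domain and path are hypothetical): every store serves the product at the same URL, and the page declares itself as canonical, with availability swapped in from the store cookie rather than via per-store URLs:

```html
<!-- One product page per SKU, shared by every store; the page points
     to itself as canonical. Only the availability widget changes,
     driven by the store cookie, not by a different URL. -->
<link rel="canonical" href="https://www.example-grocer.com/products/organic-oats-1kg" />
```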
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more about what you mean by a redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability and the product's visibility through the navigation journey will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps