Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with variation in the 25% range. I have a few questions:
-
How should we build the sitemap? Since a store must be selected, and the flow is the same for bots and users, should the sitemap include all products across all stores? We allow users to find any product across all stores if they search by product identifier, but they will only see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all stores, or, since it will be geo-located to only one store, will only the content belonging to that store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are no longer part of our catalogue. These products will not appear in the hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO rankings?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently detect the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are several ways to do this, such as blocking the resources that handle it (e.g. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that script in the robots.txt file is less likely to be seen as cloaking.
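For illustration, here is a minimal robots.txt sketch along those lines, assuming the personalization logic lives in a standalone script (the /js/geo-personalize.js path is hypothetical):

```
# Hypothetical example: block only the script that applies geo/history
# personalization, so crawlers receive the default, location-neutral page.
User-agent: *
Disallow: /js/geo-personalize.js
```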
All of this raises the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot doesn't accept your cookies. So if I were to browse in Incognito mode in Chrome (and thus not carry cookies), would I see the same site and product assortment no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
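For example, from the command line (a sketch: the domain and product path are placeholders, and the user-agent string is Googlebot's published desktop token):

```bash
# Fetch the same product page with no cookies, once as a plain client and
# once with Googlebot's user-agent, then diff to confirm they match.
curl -s https://example.com/products/12345 -o as-browser.html
curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  https://example.com/products/12345 -o as-googlebot.html
diff as-browser.html as-googlebot.html && echo "Same content served"
```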
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie.
1. Check the IP.
2. Embed the visitor's location in a cookie.
3. Set the cookie.
4. If the cookie is accepted, and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying the visitor must accept cookies to get the best experience, but don't make it block any major portion of the content. A rough sketch of this flow follows below.
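A minimal sketch of that flow as Express middleware, assuming the cookie-parser package; the lookupStateByIp helper and the store_state cookie name are hypothetical:

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Stand-in for a real geo-IP lookup (assumption: in practice you'd wire
// up a geo-IP database or service here).
function lookupStateByIp(ip: string): string | undefined {
  return undefined; // placeholder: unknown location
}

app.use((req, res, next) => {
  const storeState: string | undefined = req.cookies["store_state"];
  if (storeState) {
    // Step 4: the cookie was accepted and exists, so personalize.
    res.locals.storeState = storeState;
  } else {
    // Steps 1-3: check the IP and set the location cookie for next time.
    const state = lookupStateByIp(req.ip ?? "");
    if (state) {
      res.cookie("store_state", state, { maxAge: 30 * 24 * 60 * 60 * 1000 });
    }
    // No cookie on this request: cookie-less users and Googlebot both get
    // the neutral, non-personalized page at the same URL.
  }
  next();
});
```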
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Won't Google penalise a site for that? To make it clear: we have a single website, and based on the user's geo we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
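For instance, every rendering of a product page, whichever store journey it is reached through, could declare the same canonical URL (a sketch; the domain and slug are placeholders):

```html
<link rel="canonical" href="https://example.com/products/wholegrain-bread" />
```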
Here is a link to an article where John Mueller from Google comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked out excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same X products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, and only the availability of the product will differ based on geo. The sitemap will also remain the same across stores, listing the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by a redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps