Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we will geo-locate the closest store to the customer and direct them to that particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with roughly 25% variation between stores. I have a few questions:
-
How should we build the sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for bots and users, should the sitemap include all products across all stores? We allow users to find any product across all stores if they search by product identifier, but they will only be able to see the products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all stores, or, since it will be geo-located to only one store, will only that store's content be indexed?
-
We also allow customers to search for older products which they might have bought a few years ago and that are no longer part of our catalogue. These products will not appear in the hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently read the visitor's IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are several options, such as blocking the resources that handle the personalization (e.g. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that script in the robots.txt file is less likely to be seen as cloaking.
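For example, a minimal robots.txt rule along these lines would keep Googlebot from fetching the personalization script (the file path is a hypothetical placeholder - adjust it to wherever your script actually lives):

```
# Stop Googlebot from executing the geo/history personalization layer,
# so it always renders the default, non-personalized page.
User-agent: Googlebot
Disallow: /js/geo-personalize.js
```

With the script blocked, Googlebot renders every page in its default state, which is the consistent view you want it to index.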
All of this raises the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortment no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change - and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message saying the visitor must accept cookies to get the best experience, but don't make it block any major portion of the content. A rough sketch of this flow is below.
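As a sketch, that flow might look like the following Express-style middleware (TypeScript). The storeId cookie name and the lookupStoreByIp helper are hypothetical - this illustrates the pattern, not a drop-in implementation:

```typescript
import express, { Request, Response, NextFunction } from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical geo-IP helper - in practice this would query a geo-IP
// database (e.g. MaxMind). Stubbed here so the sketch is self-contained.
function lookupStoreByIp(ip: string): string | undefined {
  return undefined; // e.g. "store-madronna" for an IP in that store's area
}

app.use((req: Request, res: Response, next: NextFunction) => {
  const existing = req.cookies.storeId as string | undefined;
  if (existing) {
    // 4. Cookie was accepted and sent back: personalize this request.
    res.locals.storeId = existing;
  } else {
    // 1-3. Check the IP, embed the nearest store in a cookie, set it.
    const storeId = lookupStoreByIp(req.ip ?? "");
    if (storeId) {
      res.cookie("storeId", storeId, { maxAge: 30 * 24 * 60 * 60 * 1000 });
    }
    // No cookie on *this* request (Googlebot always lands here), so
    // render the default, non-personalized content.
    res.locals.storeId = undefined;
  }
  next();
});
```

Because Googlebot never returns the cookie, it always falls into the non-personalized branch and sees the same default content from every data centre.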
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Won't Google penalise a site for that? To make it clear: we have a single website, and based on the user's geo-location we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same X products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content per store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo-location. The sitemap will also remain the same across stores, with the canonical product URLs.
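To illustrate, every store would serve the same product URL, each page declaring itself canonical with a tag along these lines (the domain and path are hypothetical):

```html
<link rel="canonical" href="https://www.example-coop.com/products/organic-oats-1kg" />
```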
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo-location.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps