Ecommerce Site Structure -- "/our_locations" page: helpful or harmful?
-
Hello!
We are a retailer with brick and mortar stores in different cities. We have a website (ourbusiness.com), which includes
- a blog (ourbusiness.com/blog) and
- a separate ecommerce site for each store in subfolders (ourbusiness.com/Boston-store and ourbusiness.com/Atlanta-store).
- NB: We do this for non-business reasons and have no choice.
So, this is not like REI (for example) or other stores with lots of locations but one central ecommerce operation.
Most experts seem to recommend a site structure that echoes REI's, i.e.:
- a home page principally devoted to ecommerce (rei.com)
- includes an Our Locations-type page (rei.com/stores) which links to local store pages like
- (rei.com/stores/fresno)
I understand how this would help REI, since their homepage is devoted to ecommerce and they need a store locator page that doesn't compete with the shopping experience. But since we can't send people to products directly from our home page, is there any reason for us not to put the store locator function right on the home page?
That is, is there any reason in our case to prefer (A) ourbusiness.com/our_locations/Boston_store over (B) ourbusiness.com/Boston-store?
As I see it, the extra page (/our_locations/) could actually hurt, as it puts products one click further away from customers, and one link deeper for bots.
On the other hand, it may make the multi-store structure clearer to bots (and maybe people) and help us in local search.
Finally, would it make a difference if there were 10 stores vs 2?
Thanks for any thoughts!
-
Hi there!
I don't have a specific resource or study to cite for these statements, but:
- On the extra click: it's a well-known fact that each additional click a user has to make lowers the conversion rate.
- On how useful that landing hub is: that's something you'd have to test in your case. Theoretically it helps, but it's really difficult to estimate how much.
Hope it helps.
Best luck.
GR
-
Thanks for the response. I think you boiled it down to the basic questions very nicely.
If you have any research you can point me to that would help me understand just how useful the extra page might be to Googlebot, or (on the other hand) how much traffic we might lose to the extra click, I'd be most grateful for a pointer.
Thanks again!
-
Hi there,
On one hand, in my opinion, that folder is a little helpful to Googlebot, since it signals that you have several stores. So I'd go with A: ourbusiness.com/stores/location.
On the other hand, there's no need to add an extra click for users: all links can point directly to the stores rather than forcing users to choose. Also, that extra folder could double as an internal linking page (sometimes called a landing hub) where you link to all your stores. This would also help Googlebot a little.
On a third hand, there's no way to forecast how much your traffic will improve. So weigh the expected benefit against the time and effort you and your dev team would need for this change.
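To make the click-depth trade-off from the question concrete, here's a minimal sketch (domain and store names are the hypothetical ones from the question) comparing how deep each store page sits under the two structures, assuming the home page links only to its immediate children:

```python
# Hypothetical sketch: crawl/click depth of the two URL structures
# discussed above. URLs and store names are placeholders from the question.

stores = ["Boston-store", "Atlanta-store"]

# Option B: store pages linked directly from the home page (depth 1).
option_b = {f"https://ourbusiness.com/{s}": 1 for s in stores}

# Option A: an /our_locations/ hub sits between the home page and the
# stores, so each store page is one click (and one crawl hop) deeper --
# unless the home page also links straight to every store.
option_a = {"https://ourbusiness.com/our_locations/": 1}
option_a.update({f"https://ourbusiness.com/our_locations/{s}": 2 for s in stores})

for url, depth in sorted({**option_b, **option_a}.items(), key=lambda x: x[1]):
    print(f"depth {depth}: {url}")
```

The point GR makes above is that you can have both: keep the hub page for its internal-linking signal while also linking every store directly from the home page, which puts the stores back at depth 1.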
Hope it helps.
Best luck.
GR