Ecommerce Site Structure -- "/our_locations" page: helpful or harmful?
-
Hello!
We are a retailer with brick and mortar stores in different cities. We have a website (ourbusiness.com), which includes
- a blog (ourbusiness.com/blog) and
- a separate ecommerce site for each store in subfolders (ourbusiness.com/Boston-store and ourbusiness.com/Atlanta-store).
- NB: We do this for non-business reasons and have no choice.
So, this is not like REI (for example) or other stores with lots of locations but one central ecommerce operation.
Most experts seem to recommend a site structure that echoes REI's, i.e.:
- a home page principally devoted to ecommerce (rei.com)
- includes an Our Locations-type page (rei.com/stores) which links to local store pages like
- (rei.com/stores/fresno)
I understand how this would help REI, since their homepage is devoted to ecommerce and they need a store locator page that doesn't compete with the shopping experience. But since we can't send people to products directly from our home page, is there any reason for us not to put the store locator function right on the home page?
That is, is there any reason in our case to prefer (A) ourbusiness.com/our_locations/Boston_store over (B) ourbusiness.com/Boston-store?
As I see it, the extra page (/our_locations/) could actually hurt, as it puts products one click further away from customers, and one link deeper for bots.
On the other hand, it may make the multi-store structure clearer to bots (and maybe people) and help us in local search.
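To make the trade-off concrete, here is a minimal sketch of the two structures from the question. The domain and store slugs are the question's own examples; the depth numbers assume option A routes visitors through the hub page, while option B links stores straight from the home page.

```python
# Sketch of the two URL structures under discussion.
# Depths assume the shortest link path from the home page.
BASE = "https://ourbusiness.com"
STORES = ["Boston-store", "Atlanta-store"]  # illustrative slugs from the question

def store_urls(structure):
    """Return each store URL mapped to its click depth from the home page."""
    if structure == "A":
        # Home -> /our_locations/ -> /our_locations/<store>: depth 2
        return {f"{BASE}/our_locations/{s}": 2 for s in STORES}
    # Home -> /<store>: depth 1
    return {f"{BASE}/{s}": 1 for s in STORES}

print(store_urls("A"))
print(store_urls("B"))
```

Of course, nothing stops a site using structure A from also linking stores directly from the home page, in which case the user-facing depth of both options is the same and only the URL path differs.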
Finally, would it make a difference if there were 10 stores vs 2?
Thanks for any thoughts!
-
Hi there!
I don't have any resource or study to back these statements, but:
- On the extra click: it's a well-known fact that every extra click a user has to make lowers the conversion rate.
- On how useful that landing hub is: that's something to experiment with in your case. Theoretically it helps, but it's really difficult to estimate how much.
Hope it helps.
Best luck.
GR -
Thanks for the response. I think you boiled it down to the basic questions very nicely.
If you have any research you can point me to that would help me understand just how useful the extra page might be to Googlebot, or (on the other hand) how much traffic we might lose to the extra click, I'd be most grateful for a pointer.
thanks again!
-
Hi there,
On one hand, in my opinion, having that folder is a little helpful to Googlebot, since it signals that you have several stores. So I'd go with A: ourbusiness.com/our_locations/Boston_store.
On the other hand, there is no need to add an extra click for users: all links could point directly to the stores, so users don't have to choose twice. That extra folder could still be used as an internal link page (also called a landing hub) where you link all your stores, which would also help Googlebot a little.
On a third hand, there is no way to forecast how much your traffic will improve, so weigh the time and effort you and your dev team would need for this change.
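The suggestion above (keep the /our_locations/ folder as a landing hub, but link every store directly so users never face an extra choice) could be sketched roughly as follows. The store slugs and display names are hypothetical, taken from the question's examples, and the HTML is only a skeleton:

```python
# Minimal generator for an /our_locations/ landing-hub page that links
# directly to each store page. Slugs and names are illustrative only.
STORES = {
    "Boston-store": "Boston Store",
    "Atlanta-store": "Atlanta Store",
}

def hub_page_html(base="https://ourbusiness.com"):
    """Build a bare-bones hub page: one direct link per store."""
    links = "\n".join(
        f'  <li><a href="{base}/our_locations/{slug}">{name}</a></li>'
        for slug, name in STORES.items()
    )
    return f"<h1>Our Locations</h1>\n<ul>\n{links}\n</ul>"

print(hub_page_html())
```

The same direct store links could also be repeated on the home page, so the hub exists for crawlers and deep-linking without costing users a click.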
Hope it helps.
Best luck.
GR