Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What's the best possible URL structure for a local search engine?
-
Hi Mozzers,
I'm working at AskMe.com, a local search engine in India, i.e., if you're standing somewhere looking for pizza joints nearby, we detect your current location and share a list of nearby pizza outlets along with ratings, reviews, etc.
Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area of a city (here, "Saket"). The URL looks a little different if you're searching for something that is not a category (if your query is mapped to a category, we 301 redirect you to the category page). For example, it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor is it mapped to any category. We also deal in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers.
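To make the scheme above concrete, here's a toy sketch of the routing logic it implies. This is purely illustrative - the category set, query-to-category mapping, and function names are all invented, not AskMe's actual backend.

```python
# Hypothetical sketch of the routing described above.
# KNOWN_CATEGORIES and QUERY_TO_CATEGORY are made-up sample data.
KNOWN_CATEGORIES = {"pizza outlets"}
QUERY_TO_CATEGORY = {"dominos": "pizza outlets"}  # queries mapped to a category

def route(city, query, area=None):
    """Return (status, path) for a search, mirroring the URL scheme above."""
    slug = query.lower().replace(" ", "-")
    suffix = f"/in/{area}" if area else ""
    if query.lower() in KNOWN_CATEGORIES:
        # city-specific (or area-specific) category page
        return 200, f"/{city}/{slug}{suffix}"
    if query.lower() in QUERY_TO_CATEGORY:
        # query is mapped to a category: 301 to the category page
        target = QUERY_TO_CATEGORY[query.lower()].replace(" ", "-")
        return 301, f"/{city}/{target}{suffix}"
    # neither a category nor mapped to one: plain /search/ page
    return 200, f"/{city}/search/{slug}{suffix}"

print(route("delhi", "Pizza Outlets"))        # (200, '/delhi/pizza-outlets')
print(route("delhi", "pizza huts", "saket"))  # (200, '/delhi/search/pizza-huts/in/saket')
```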
Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what's the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure on the backend.
-
In regard to shorter URLs:
The goal is to find a proper balance for your needs. You want to group things into sub-groups based on proper hierarchy; however, you also don't want to go too deep if you don't have enough pages/individual listings deep down the chain.
So the Moz post you point to refers to that - at a certain point, having too many layers can be a problem. However, there is no one single correct answer.
The most important thing to be aware of and consider is your own research and evaluation process for your situation in your market.
However, as far as what you found most people search for, be aware that with location-based search, many people don't actually type in a location when they search. Yet Google DOES factor in the location when deciding what to present in results. So the location matters even though people don't always include it themselves.
Don't become completely lost in making a decision either, though - consider all the factors, make a business decision to move forward with what you come up with, and be consistent in applying that plan across the board.
What I mean in regard to URLs and Breadcrumbs:
If the URL is www.askme.com/delhi/saket/pizza/pizza-hut/ the breadcrumb should be:
Home > Delhi > Saket > Pizza > Pizza Hut
If the URL is www.askme.com/pizza-huts/saket-delhi/ the breadcrumb should be
Home > Pizza Hut > Saket-Delhi
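One way to guarantee the two always match is to derive the breadcrumb trail directly from the URL path segments. A minimal illustrative sketch (function name and formatting are assumptions, not an existing API):

```python
# Derive the breadcrumb trail from the URL path itself, so the URL
# sequence and the breadcrumb sequence can never fall out of sync.
def breadcrumb_from_path(path):
    """Turn '/delhi/saket/pizza/pizza-hut/' into 'Home > Delhi > Saket > Pizza > Pizza Hut'."""
    segments = [s for s in path.strip("/").split("/") if s]
    labels = ["Home"] + [s.replace("-", " ").title() for s in segments]
    return " > ".join(labels)

print(breadcrumb_from_path("/delhi/saket/pizza/pizza-hut/"))
# Home > Delhi > Saket > Pizza > Pizza Hut
```

In practice the same derived trail could feed both the visible breadcrumb and a BreadcrumbList rich snippet, keeping all three signals in one sequence.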
-
While thinking about the ideal URL structure, I did consider some blogs (including this one by Rand: https://moz.com/blog/15-seo-best-practices-for-structuring-urls - check point #11; attaching a screenshot as well) and websites that were doing really well with their one-level static URLs.
I actually did some keyword research on users' search patterns and Google Suggest data. Generally, our target search term ("pizza huts" in this case) comes before the geo-location; maybe people search for things differently in India. Hence, I thought of keeping the URL structure that way.
A little confused about this point, though: "URL, breadcrumb both should match the sequence. If one has one sequence, and the other has a different sequence, that confuses search algorithms." I've seen many websites doing tremendously well that aren't following these principles.
-
Proximity to root is not a valid best practice, especially in this instance.
Here's why:
More people search based on geo-location than actual business name when looking for location-based businesses. So putting "Pizza Hut" first contradicts this notion. It implies "more people look for Pizza Hut than the number of people looking for all the different businesses in this geo-location".
Also, the URL you suggest is blatant over-optimization - an attempt to stuff exact-match keywords into the URL. In reality, people use a very wide range of keyword variations, so that's another conflict that harms your overall focus.
All of the individual factors need to reinforce each other as much as is reasonable for human readability. So URL, breadcrumb both should match the sequence. If one has one sequence, and the other has a different sequence, that confuses search algorithms.
-
Thank you so much once again Sir Alan.
Well, I'm just thinking aloud here. How about putting my primary keyword in the first level instead of having this well structured URL syntax? For instance:
Here,
- The complete primary keyword (or target search string) is closer to the domain. "The closer your keywords are to the domain, the better" - I heard this somewhere. Is it still true, and does it add any additional value?
- We don't have a deep URL directory structure, and our primary keyword stays together. In the well-structured URL (the one you suggested), the target keyword is broken into multiple pieces across URL directories.
- But I'm not exposing the hierarchy/navigation flow via the URL. I hope that's okay as long as I'm handling it cleanly with breadcrumbs and rich snippets. What's your take on this?
I know there are chances of URL conflicts. For instance, if we have an area "foo" in the city "bar" vs. a city "foo bar", I'll end up having the same URL for both cases, i.e. /<search-query>-in-foo-bar. There are many such edge cases; I'm on it.
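The collision above can be demonstrated with a small sketch: in a flat "/<query>-in-<location>" slug, hyphens inside place names are indistinguishable from the separators between them. The data and parser below are invented purely to illustrate the ambiguity.

```python
# Toy data: the same flat slug matches two different places.
AREAS = {("bar", "foo")}   # (city, area): area "foo" in city "bar"
CITIES = {"foo-bar"}       # a city literally named "foo bar"

def parse_flat_slug(slug):
    """Return every (query, location) interpretation that matches a known place."""
    query, _, location = slug.partition("-in-")
    matches = []
    if location in CITIES:
        matches.append((query, "city:" + location))
    parts = location.split("-", 1)
    if len(parts) == 2 and (parts[1], parts[0]) in AREAS:
        matches.append((query, f"area:{parts[0]} in city:{parts[1]}"))
    return matches

# Two valid readings of one URL -- the structure cannot disambiguate them:
print(parse_flat_slug("pizza-huts-in-foo-bar"))
```

A hierarchical path like /bar/foo/pizza-huts/ avoids this entirely, because the directory separators carry the structure that hyphens cannot.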
-
Local pack exists, yet is far from complete or consistently helpful. Business directories thrive even in an age of local packs. It's all about finding the best way to provide value, and the internet is large enough that many players can play in the game.
-
Sorry for my ignorance here, but does google.in not show the local pack in its SERPs, with reviews and ratings?
If so, isn't the business model flawed, assuming you're going to charge companies to be listed in your directory when they can get listed as a local business on Google right now for free?
Perhaps I've overlooked something...
-
Business listing directory environments have a big challenge when it comes to URL structure / information architecture and content organization because:
- Many businesses are searched for based on geo-location
- Many of those require hyper-local referencing, while many others can be "in the general vicinity"
- Many other businesses are not as relevant to geo-location
So what is a site to do?
The best path is to recognize that as mobile becomes more and more critical to searcher needs, hyper-local optimization becomes more critical - the most important focus for SEO.
As a result, URL structure needs to reflect hyper-local first and foremost. So:
- www.askme.com/delhi/
- www.askme.com/delhi/saket/
- www.askme.com/delhi/saket/pizza/
- www.askme.com/delhi/saket/pizza/pizza-hut/
This way, if someone searches for "Pizza Hut Delhi", all of the Delhi Pizza Huts will show up, regardless of neighborhood, while anyone searching for "Pizza Hut Saket" will get more micro-locally relevant results.
And for those businesses that serve a wider geo-area, even though they too will be assigned a hyper-local final destination page, they will still be related to their broader geo-area as well. So someone searching "plumbers in Delhi" will get the right results and can then choose any of the plumbers in Delhi, regardless of what neighborhood they are in.
Note how I removed /search/ from the URL structure as well. It's an irrelevant level.
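The four-level scheme above can be sketched as a single URL builder that drops whichever trailing levels are absent, so every page from city down to business gets one canonical hierarchical path. The helper names and data are assumptions for illustration only.

```python
# Illustrative builder for the hyper-local URL hierarchy above:
# /city/ -> /city/area/ -> /city/area/category/ -> /city/area/category/business/
def slugify(text):
    return text.lower().replace(" ", "-")

def listing_url(city, area=None, category=None, business=None):
    """Build the hierarchical path, keeping leading segments up to the first missing one."""
    path = []
    for seg in (city, area, category, business):
        if seg is None:
            break
        path.append(slugify(seg))
    return "www.askme.com/" + "/".join(path) + "/"

print(listing_url("Delhi"))                                 # www.askme.com/delhi/
print(listing_url("Delhi", "Saket", "Pizza", "Pizza Hut"))  # www.askme.com/delhi/saket/pizza/pizza-hut/
```

Note there is no /search/ level anywhere in the generated paths, matching the point above.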