Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely - many posts will remain viewable - we have locked both new posts and new replies. More details here.
What's the best possible URL structure for a local search engine?
-
Hi Mozzers,
I'm working at AskMe.com, a local search engine in India - i.e. if you're standing somewhere looking for pizza joints nearby, we detect your current location and return a list of nearby pizza outlets along with their ratings, reviews, etc.
Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for city-specific category pages (here, "Delhi" is the city and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here, "Saket") within a city. The URL looks a little different if you're searching for something that is not a category (if it's merely mapped to a category, we 301 redirect you to the category page). For example, searching for Pizza Huts in Saket, Delhi gives www.askme.com/delhi/search/pizza-huts/in/saket, because "pizza huts" is neither a category nor mapped to one. We also deal in ads and deals, along with our own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers.
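The routing described above - mapped queries 301-redirect to a canonical category page, unmapped queries fall back to a /search/ URL - can be sketched roughly like this. This is a minimal illustration, not AskMe's actual code; the category map and function names are hypothetical.

```python
# Hypothetical sketch of the described routing. A query that maps to a
# known category gets a 301 to the canonical category URL; anything else
# (e.g. a brand name like "pizza-huts") stays on a /search/ URL.

CATEGORY_MAP = {
    "pizza-outlets": "pizza-outlets",
    "pizza-joints": "pizza-outlets",  # synonym mapped to the same category
}

def resolve_url(city, query, area=None):
    """Return (http_status, path) for a search query in a city/area."""
    category = CATEGORY_MAP.get(query)
    if category:
        path = f"/{city}/{category}"
        if area:
            path += f"/in/{area}"
        # mapped queries 301-redirect to the canonical category page
        return (301, path)
    # unmapped queries keep an explicit search URL
    path = f"/{city}/search/{query}"
    if area:
        path += f"/in/{area}"
    return (200, path)
```

The key property is that synonyms collapse onto one canonical URL, so link equity isn't split across query variants.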
Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what's the best possible URL structure we could have? Assume we have kick-ass developers who can support any given URL structure on the backend.
-
In regard to shorter URLs:
The goal is to find the proper balance for your needs. You want to group things into sub-groups based on a proper hierarchy, but you also don't want to go too deep if you don't have enough pages/individual listings further down the chain.
So the Moz post you point to refers to that - at a certain point, having too many layers can be a problem. However, there is no one single correct answer.
The most important thing to be aware of and consider is your own research and evaluation process for your situation in your market.
However, as far as what you found most people search for: be aware that with location-based search, many people don't actually type in a location when searching. Yet Google DOES factor in the location when deciding what to present in results. So the location matters even though people don't always include it themselves.
Don't become completely lost in making a decision either, though - consider all the factors, make a business decision to move forward with what you come up with, and be consistent in applying that plan across the board.
What I mean in regard to URLs and Breadcrumbs:
If the URL is www.askme.com/delhi/saket/pizza/pizza-hut/ the breadcrumb should be:
Home > Delhi > Saket > Pizza > Pizza Hut
If the URL is www.askme.com/pizza-huts/saket-delhi/ the breadcrumb should be:
Home > Pizza Hut > Saket-Delhi
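One way to guarantee URL and breadcrumb never fall out of sync is to derive the breadcrumb directly from the URL path. A minimal sketch (the function name and label formatting are illustrative, not a prescription):

```python
# Derive the breadcrumb trail from the URL path itself, so the two
# sequences always match by construction.

def breadcrumb_from_path(path):
    """Turn a URL path into a 'Home > ...' breadcrumb trail."""
    segments = [s for s in path.strip("/").split("/") if s]
    # "pizza-hut" -> "Pizza Hut"
    labels = [seg.replace("-", " ").title() for seg in segments]
    return " > ".join(["Home"] + labels)
```

With this approach, reordering the URL hierarchy automatically reorders the breadcrumb, which is exactly the consistency being argued for here.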
-
While thinking about the ideal URL structure, I did consider some blogs (including this one by Rand: https://moz.com/blog/15-seo-best-practices-for-structuring-urls - see point #11; attaching a screenshot as well) and websites that were doing really well with their one-level static URLs.
I actually did some keyword research on users' search patterns and Google Suggest data. Generally, our target search term ("pizza huts" in this case) comes before the geo-location - maybe people search differently in India. Hence, I thought of keeping the URL structure that way.
A little confused about this point, though: "URL, breadcrumb both should match the sequence. If one has one sequence, and the other has a different sequence, that confuses search algorithms." Because I've seen many websites doing tremendously well that don't follow these principles.
-
Proximity to root is not a valid best practice, especially in this instance.
Here's why:
More people search based on geo-location than on the actual business name when looking for location-based businesses. So putting "Pizza Hut" first contradicts this - it implies "more people look for Pizza Hut than the number of people looking for all the different businesses in this geo-location".
Also, the URL you suggest is blatant over-optimization - attempting to stuff exact-match keywords into the URL. In reality, people use a very wide range of keyword variations, so that's another conflict that undermines your overall focus.
All of the individual factors need to reinforce each other as much as is reasonable for human readability. So URL, breadcrumb both should match the sequence. If one has one sequence, and the other has a different sequence, that confuses search algorithms.
-
Thank you so much once again Sir Alan.
Well, I'm just thinking aloud here. How about putting my primary keyword in the first level instead of this well-structured URL syntax? For instance:
Here,
- The complete primary keyword (or target search string) is closer to the domain. "The closer your keywords are to the domain, the better" - I heard this somewhere. Is that still true, and does it add any additional value?
- We don't have a deep URL directory structure, and our primary keyword stays together. In the well-structured URL (the one you suggested), the target keyword is broken into multiple pieces across the URL directories.
- But I'm not exposing the hierarchy/navigation flow via the URL. I hope that's okay as long as I handle it cleanly with breadcrumbs and rich snippets. What's your take on this?
I know there are chances of URL conflicts. For instance, if we have an area "foo" in the city "bar" vs. a city named "foo bar", I'll end up with the same URL in both cases, i.e. /<search-query>-in-foo-bar. There are many such edge cases; I'm on it.
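The collision described here is easy to demonstrate: flattening the geo hierarchy into one hyphen-joined slug is ambiguous, while keeping one path segment per geo level is not. A small sketch with hypothetical helper names:

```python
# Flattened slugs are ambiguous: area "foo" in city "bar" produces the
# same URL as a city named "foo bar". Segmented paths stay distinct.

def flat_url(query, *geo_parts):
    # e.g. ("pizza", "foo", "bar") -> "/pizza-in-foo-bar"
    return "/" + query + "-in-" + "-".join(geo_parts)

def segmented_url(query, city, area=None):
    # one path segment per geo level, query last
    path = f"/{city}/"
    if area:
        path += f"{area}/"
    return path + query

# area "foo" in city "bar" vs. city "foo-bar": identical when flattened...
assert flat_url("pizza", "foo", "bar") == flat_url("pizza", "foo-bar")
# ...but unambiguous when each geo level gets its own segment
assert segmented_url("pizza", "bar", "foo") != segmented_url("pizza", "foo-bar")
```

This is one argument for the hierarchical structure: the path separator disambiguates for free, whereas a single slug needs extra bookkeeping for every such edge case.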
-
Local pack exists, yet is far from complete or consistently helpful. Business directories thrive even in an age of local packs. It's all about finding the best way to provide value, and the internet is large enough that many players can play in the game.
-
Sorry for my ignorance here, but doesn't google.in show the local pack in its SERPs, with reviews and ratings?
If so, isn't the business model flawed, assuming you're going to charge companies to be listed in your directory when they can get listed as a local business in Google right now for free?
Perhaps I've overlooked something...
-
Business listing directory environments have a big challenge when it comes to URL structure / information architecture and content organization because:
- Many businesses are searched for based on geo-location
- Many of those require hyper-local referencing, while many others can be "in the general vicinity"
- Many other businesses are not as relevant to geo-location
So what is a site to do?
The best path is to recognize that as mobile becomes more and more critical to searcher needs, hyper-local optimization becomes more critical. It becomes the most important focus for SEO.
As a result, URL structure needs to reflect hyper-local first and foremost. So:
- www.askme.com/delhi/
- www.askme.com/delhi/saket/
- www.askme.com/delhi/saket/pizza/
- www.askme.com/delhi/saket/pizza/pizza-hut/
This way, if someone searches for "Pizza Hut Delhi", all of the Delhi Pizza Huts will show up, regardless of neighborhood, while anyone searching for "Pizza Hut Saket" will get more micro-locally relevant results.
And for those businesses that serve a wider geo-area: even though they too will be assigned a hyper-local final destination page, they will still be related to their broader geo-area as well. So someone searching "plumbers in Delhi" will get the right results and can then choose any of the plumbers in Delhi, regardless of neighborhood.
Note how I removed /search/ from the URL structure as well. It's an irrelevant level.
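The hyper-local-first structure suggested above has a convenient property: every broader page (city, area, category) is simply a prefix of the listing URL. A brief sketch, with illustrative helper names:

```python
# Every listing lives at /city/area/category/business/; the city, area
# and category pages are just prefixes of that path.

def listing_url(city, area, category, business=None):
    parts = [city, area, category]
    if business:
        parts.append(business)
    return "/" + "/".join(parts) + "/"

def ancestor_urls(city, area, category, business):
    """Return the listing URL plus every broader page above it."""
    parts = [city, area, category, business]
    return ["/" + "/".join(parts[: i + 1]) + "/" for i in range(len(parts))]
```

Because the hierarchy is encoded in the path, the breadcrumb trail and the set of parent pages can both be generated from the URL alone, with no /search/ level needed.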