URL Structure Question
-
I'm starting to work with a new site whose domain name is contrived to help it with a certain kind of long-tail search.
Just for the sake of a fictional example, let's call it WhatAreTheBestRestaurantsIn.com. The idea is that people might search for "what are the best restaurants in seattle" and over time the site would make some organic search progress. Again, the domain is a fictional example, but the real one is just like it and is designed to cover cities in all states.
Here's the question: if you were targeting searches like the above and had that domain to work with, would you go with...
whatarethebestrestaurantsin.com/seattle-washington
whatarethebestrestaurantsin.com/washington/seattle
whatarethebestrestaurantsin.com/wa/seattle
whatarethebestrestaurantsin.com/what-are-the-best-restaurants-in-seattle-wa
... or what and why?
Separate question (I still need the above answered): would you rather go with a super-short (four-letter) but meaningless domain name and put the long-tail part after that?
I doubt I can win the argument for a new domain name, so I still need the first question answered.
The good news is it's pretty good content.
Thanks... Darcy
-
Take the new four-letter domain name you can market and brand, and redirect the old domain as logically as you can to the specific pages on the four-letter domain.
Four letters are much easier to market: usernames on Twitter, Facebook, etc. You could use xyxy Seattle, xyxy New York, and so on as branding or social handles for local markets.
#marketing #branding #worlddomination
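The redirect advice above can be sketched with a handful of 301 rules. A minimal example, assuming an Apache host with mod_rewrite; the xyxy.com domain and paths are hypothetical stand-ins:

```apache
# Hypothetical .htaccess on the old long-tail domain: send each city
# page to its counterpart on the short, brandable domain with a 301.
RewriteEngine On

# Redirect a specific page to a specific page...
RewriteRule ^seattle-washington/?$ https://xyxy.com/wa/seattle [R=301,L]

# ...and everything else to the matching path on the new domain.
RewriteRule ^(.*)$ https://xyxy.com/$1 [R=301,L]
```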
-
Thanks for the answers Richard, Tobey & Lesley. Good points all.
Another option is to repurpose a domain name/one-page site (it used to be 1,000 pages) that has been up for a long time and gained a bunch of authority/links for a totally unrelated subject, then had a tragic developer experience that wiped out its old content; it could be used for this project. Currently it's a one-page placeholder. That old domain name is equally meaningless for the new subject matter and could be anything.
So, if the choice were a new four-letter meaningless .com domain or an old meaningless 13-letter domain name with links for its old purpose and lots of old pages gone, which would you prefer? Is it hard to get Google to see an old domain name as covering a new subject? Any harder than establishing relevance through content for a brand-new domain name?
Thanks... Darcy
-
If it's a new domain then I definitely wouldn't go with anything like WhatAreTheBestRestaurantsIn.com. I would rather go with besteat.com or bestin.com, and I could rank those domains much more easily too. Don't start with a long, spammy domain; build a brand instead. New domains with keywords help very little these days.
Most of the words in your domain examples are "stop words" and shouldn't even be in domain names (words like "are", "best", "in"). Even if you had categories for states, they still don't belong in the final URL. For example, whatarethebestrestaurantsin.com/wa/seattle should still resolve to whatarethebestrestaurantsin.com/seattle. Although you could still visit whatarethebestrestaurantsin.com/wa/, when you click on Seattle the URL should rewrite to whatarethebestrestaurantsin.com/seattle.
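One way to implement the "browse via /wa/, land on /seattle" behavior described above is a 301 from the state-prefixed URL to the short city URL. A minimal mod_rewrite sketch; the pattern is illustrative only:

```apache
# Hypothetical rule: any /xx/city URL (two-letter state code) 301s to
# the short /city URL, so the state folder stays browsable but each
# city page has a single canonical address.
RewriteEngine On
RewriteRule ^[a-z]{2}/([a-z0-9-]+)/?$ /$1 [R=301,L]
```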
For longevity, quality, branding, trust, and non-spammy purposes, I would build the site using clean, short URLs like the made-up examples below. EMDs (exact-match domains) are all but dead, especially long ones like whatarethebestrestaurantsin.com.
tastyeat.com/seattle/
bestin.com/seattle/
tastytown.com/seattle/
dinein.com/seattle/
-
Personally, I would go for something much shorter. Long domain names can appear spammy, and I believe domain length is one of the metrics used in Moz's Spam Score. The other problem with a long domain name is that pages and posts on your site may have titles much too long to display fully in search results, although you may be able to tweak this. You may well be better off with a very short domain name, so that as new keywords you want to target come along, you can target them effectively without ending up with overly long URLs.
-
I would prefer this one: whatarethebestrestaurantsin.com/wa/seattle. It keeps the state code in the URL for when you grow large enough that you start running into different cities that share the same name. Plus, people are lazy; they abbreviate states, and I think that helps the case for this URL structure as well.
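The duplicate-city-name point is easy to see with a quick sketch. This is a hypothetical helper (the function name and city list are made up), showing how the state segment keeps otherwise identical slugs unique:

```python
def city_url(state_abbr: str, city: str) -> str:
    """Build a /wa/seattle style path from a state code and city name."""
    slug = city.lower().strip().replace(" ", "-")
    return f"/{state_abbr.lower()}/{slug}"

# Springfield exists in many states; without the state segment, all of
# these would collide at /springfield.
cities = [("IL", "Springfield"), ("MO", "Springfield"), ("MA", "Springfield")]
urls = [city_url(st, c) for st, c in cities]
print(urls)  # ['/il/springfield', '/mo/springfield', '/ma/springfield']
```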
Related Questions
-
Link Structure June 2019
Question: Which link structure is better in 2019 for SEO best practice? Example A) https://www.fishingtackleshop.com.au/soft-plastic-lures/ or B) https://www.fishingtackleshop.com.au/fishing/fishing-lures/soft-plastic-lures/ We're on the BigCommerce platform and used to have https://www.fishingtackleshop.com.au/categories/soft-plastic-lures/ Last year we went from BigCommerce's long URL to the short one to stop link juice being sent to /categories. Now we have an SEO company trying to sell me their services after a bit of a steady decline since September 2018; they told me that we should have the link structure of example B due to breadcrumbing, and that our structure is likely the reason for the dip. True or false?
Intermediate & Advanced SEO | oceanstorm
I explained that I had breadcrumbs, as shown on https://www.fishingtackleshop.com.au/berkley-powerbait-t-tail-minnow/, but the SEO guy said no, the hierarchy needs to be in the URL structure too. I was under the impression that short URLs, as opposed to long ones, are better these days, and that link juice is passed better if the URL is short and direct to the point. Am I wrong?
-
My URL disappeared from Google, but Search Console shows it as indexed. This URL has been indexed for more than a year. Please help!
Super weird problem that I haven't been able to solve for the last 5 hours. One of my URLs, https://www.dcacar.com/lax-car-service.html, has been indexed for more than a year and also has an AMP version. A few hours ago I realized that it had disappeared from the SERPs. We were ranking on page 1 for several key terms. When I perform a search for "site:dcacar.com" the URL is nowhere to be found on all 5 pages, but when I check Google Search Console it shows as indexed. I requested indexing again, but nothing changed. All other 50 or so URLs are not affected at all; this is the only URL that has gone missing. Can someone solve this mystery for me please? Thanks a lot in advance.
Intermediate & Advanced SEO | Davit1985
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters, such as http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results, which we don't want, as the URLs with specific locations are more important and each has its own unique list of houses for sale. We have Yoast set up with all of these ?location values added to our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml I also tried going into the old Google Search Console and setting the "location" URL parameter to Crawl Every URL with the Specifies effect enabled, and I even see the two URLs I mentioned above in Google's list of parameter samples, but the pages are still not being added to Google. Even after requesting indexing again after making all of these changes a few days ago, these URLs are still showing as Allowing Indexing, but Not On Google in the Search Console, and they don't show up on Google when I manually search for the entire URL. Why are these pages not showing up on Google, and how can we get them to display? The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions, but I'm guessing that's probably not a good solution, for multiple reasons.
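One thing worth checking, offered as a guess since the page source isn't shown: if every location version of the page carries a canonical tag pointing at the bare /local-house-values/ URL, Google will fold the parameter versions into the root. Each location version would need its own self-referencing canonical, along these lines:

```html
<!-- Hypothetical sketch: on /local-house-values/?location=great-falls-virginia,
     the canonical should reference the full parameterized URL, not the root. -->
<link rel="canonical"
      href="http://www.castlemap.com/local-house-values/?location=great-falls-virginia" />
```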
Intermediate & Advanced SEO | Nitruc
-
Robots.txt question
I noticed something weird in the Google robots.txt Tester. I have the line Disallow: display= in my robots.txt to block pages like http://www.abc.com/lamps/floorlamps?display=table, but whatever URL I give it to test, it says blocked and points to that line. For example, if I test http://www.abc.com/lamps/floorlamps or any other page, it shows as blocked due to Disallow: display= Am I doing something wrong, or is Google just acting strange? I don't think pages without display= are actually blocked.
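For reference, a bare Disallow: display= value doesn't match URL paths, which start with /. Tester quirks aside, a query-string pattern is normally written with Google's * wildcard. A hypothetical fragment:

```txt
User-agent: *
# Block any URL whose path or query string contains "display="
Disallow: /*display=
```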
Intermediate & Advanced SEO | rbai
-
Migration Challenge Question
I work for a company that recently acquired another company, and we are in the process of merging the brands. Right now we have two websites; let's call them: www.parentcompanyalpha.com www.acquiredcompanyalpha.com We are working with a web development company that is designing our brand-new site, which will launch at the end of September; we can call that www.parentacquired.com. Normally it would be simple enough to just 301 redirect all content from www.parentcompanyalpha.com and www.acquiredcompanyalpha.com to the mapped migrated content on www.parentacquired.com. But that would be too simple. The reality is that only 30% of www.acquiredcompanyalpha.com will be migrating over, as part of that acquired business is remaining independent of the merged brands and might be sold off. So someone over there mirrored www.acquiredcompanyalpha.com and created an exact duplicate at www.acquiredcompanybravo.com. Now we have duplicate content for that site out there (I was unaware they were doing this now; we thought they were waiting until our new site launched). Eventually we will want some of the content from acquiredcompanyalpha.com to redirect to acquiredcompanybravo.com and the remainder to parentacquired.com. What is the best interim solution to maintain as much of the domain value as possible? The new site won't launch until the end of September, and it could slip into October. I have two sites that are mirrors of each other, one with a domain value of 67 and the new one a lowly 17. I am concerned about the duplicate site dragging down that 67 score. I can ask them to use rel=canonical tags temporarily if both sites are going to remain until the Sept/Oct timeframe, but which way should they point? I am inclined to think the best result would be to have acquiredcompanybravo.com rel=canonical back to acquiredcompanyalpha.com for now, and when the new site launches, remove those and redirect as appropriate.
But will that have a long-term negative impact on acquiredcompanybravo.com? Sorry if this is convoluted; it is a little crazy, with people in different companies doing different things that are not coordinated.
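The interim setup described above is a one-line tag per page. A minimal sketch, using the made-up placeholder domains from the question: every page on the mirror points back to its counterpart on the established domain.

```html
<!-- Hypothetical: on each page of the duplicate site, e.g.
     https://www.acquiredcompanybravo.com/some-page/, reference the
     matching page on the original, higher-authority domain. -->
<link rel="canonical" href="https://www.acquiredcompanyalpha.com/some-page/" />
```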
Intermediate & Advanced SEO | Kenn_Gold
-
301 forwarding old URLs to new URLs - when should you update the sitemap?
Hello Mozzers, if you are amending your URLs (301ing old URLs to new ones), at what point in the process should you update your sitemap to reflect the new URLs? I have heard some suggest you should submit a new sitemap alongside the old sitemap to support indexing of the new URLs, but I've no idea whether that advice is valid or not. Thanks in advance, Luke
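A common approach, offered as a suggestion rather than an official rule: once the 301s are live, the sitemap should list only the new destination URLs, since redirecting URLs left in a sitemap just generate redirect notices in Search Console. A made-up fragment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the new, 200-status URLs; drop the redirected old ones. -->
  <url>
    <loc>https://www.example.com/new-url/</loc>
  </url>
</urlset>
```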
Intermediate & Advanced SEO | McTaggart
-
A very basic SEO question
Sorry, it's been a long day and I wanted a second opinion on this, please... I am developing an affiliate store which will have dozens of products in each category. We will not be indexing the product pages themselves, as they are all duplicate content. The plan is to have just the first page of each category's results indexed, as this will have unique content about the products in that section. The later paginated pages (i.e. pages 2, 3, 4, 5, etc.) will have 12 products each but no unique content. Would the best advice be to add a canonical tag to all pages in the "chairs" category pointing to the page with the first 12 results and the descriptions? This would ensure that visitors are able to browse many pages of products, but Google won't index products 13 and onwards. Am I right in my thinking? A supplemental question: what is the best way to block Google from indexing/crawling 90,000 product listings which are pulled directly from the merchant and so are not unique in the least? I have previously played with banning Google from the product folder, but it reports health issues in Webmaster Tools. Would the best route be a noindex tag on all the product pages and to nofollow all the products in the category listings? Many thanks, Carl
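The two mechanisms described above look roughly like this in page markup; the URLs are made-up examples:

```html
<!-- On paginated category pages (/chairs/page-2/ etc.): point the
     canonical at the first page, which carries the unique content. -->
<link rel="canonical" href="https://www.example.com/chairs/" />

<!-- On each duplicate merchant product page: keep it out of the index
     while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```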
Intermediate & Advanced SEO | Grumpy_Carl
-
Is My Competitor Beating Me With A Better URL Structure?
A competitor is consistently beating my website on non-competitive, long-tail keywords. His DA is 32 compared to my 46. His average PA is 23 to my 28. His average On-Page Optimization Grade is a C compared to my A. His page speed score using YSlow is 71 compared to my 78. The only thing I can think of at this point is that he has a better URL structure. We both have the keyword in the URL, but his structure goes like this (keyword: apw wyott parts): www.competitor.com/apw-wyott/parts while mine goes like this (I had nothing to do with this site's architecture; this is what I'm stuck with for the time being): http://www.etundra.com/APW_Wyott_Parts-C347.html It should be noted that the last word in these keywords is always the same: "parts." These keywords are for parts by different manufacturers, so they follow a consistent pattern: [manufacturer-name] followed by "parts." Also, the "C347" at the end of my URL is the category number given to this particular category of products in our database. Are his URLs beating mine, or should I continue to look for other factors? If so, what other factors should I consider?
Intermediate & Advanced SEO | eTundra