Which URL would you choose?
-
1. www.company.com/subfolder/subfolder/keyword-keyword-product (lets me match keywords in the URL)
or
2. www.company.com/subfolder/subfolder/product (no keyword match in the URL)
Which would you choose: a URL that is short but still relevant, or a more descriptive URL that allows a keyword match?
It'd be great to get your feedback, guys.
Many thanks
Gary
-
Good share, thx Dean
-
Good article on Moz: "Should I Change My URLs for SEO?", in particular number 4 on keyword stuffing.
-
My thoughts also. We're limiting URLs to two subfolders to keep everything manageable, and this naturally contributes to keyword targeting as well.
Thanks Leonie
-
Hi, I'd choose the short URL and put the keywords in the title, description and content. If possible, use only one subfolder.
Grtz, Leonie
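For what it's worth, the "short but relevant" option can be enforced programmatically. A minimal Python sketch (the function names and the word cap are my own assumptions, not anyone's standard) that builds a short slug and limits subfolder depth as Leonie suggests:

```python
import re

def slugify(text, max_words=4):
    """Lowercase, strip non-alphanumerics, join words with hyphens.
    Capping the word count keeps slugs short rather than keyword-stuffed."""
    words = re.sub(r"[^a-z0-9\s-]", "", text.lower()).split()
    return "-".join(words[:max_words])

def build_url(domain, subfolders, product_name, max_depth=2):
    """Assemble a URL, limiting subfolder depth (illustrative helper)."""
    path = "/".join(subfolders[:max_depth])
    return f"https://{domain}/{path}/{slugify(product_name)}"

print(build_url("www.company.com", ["shop", "widgets"], "Blue Widget Pro"))
# https://www.company.com/shop/widgets/blue-widget-pro
```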
Related Questions
-
International URL Structures
Hi everyone! I've read a bunch of articles on the topic, but I can't seem to figure out a solution that works for this specific case. We are creating a site for a service agency with offices around the world. The site has a global version (in English, French and Spanish) and some country-specific versions. Here is where it gets tricky: in some countries each office has a different version of the site, so for Canada, for example, we have a French and an English version. For cost and maintenance reasons, we want a single domain: www.example.com.

We want to indicate via Search Console that each subfolder is attached to a different country, but how should we go about it? I've seen some examples with subfolders like this:

Global FR: www.example.com/fr-GL
Canada FR: www.example.com/fr-ca
France: www.example.com/fr-fr

Does this work? It seems to make more sense to use a gTLD with subdirectories, but I'm not sure how that would work to distinguish my global French version from the France site:

Global FR: www.example.com/fr
France: www.example.com/fr/fr

Am I going about this the right way? The more I dig into the issue, the less it seems there is a good solution for indicating to Google which version of my site is geo-targeted to each country. Thanks in advance!
Technical SEO | sarahcoutu15
-
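With the subfolder approach, each page variant would declare its alternates with hreflang tags. A hedged Python sketch (the folder paths and codes are illustrative, not a recommendation for any specific site) that generates the tag set for one page:

```python
# Map of language-region codes to subfolder paths (illustrative values).
# "x-default" tells search engines which version to serve when no locale matches.
VARIANTS = {
    "fr": "/fr/",        # global French
    "fr-ca": "/fr-ca/",  # Canadian French
    "fr-fr": "/fr-fr/",  # France
    "x-default": "/",
}

def hreflang_tags(domain, page_path):
    """Emit one <link rel="alternate"> per variant; every variant lists
    the full set, including itself, as valid hreflang requires."""
    return [
        f'<link rel="alternate" hreflang="{code}" '
        f'href="https://{domain}{folder}{page_path}" />'
        for code, folder in VARIANTS.items()
    ]

for tag in hreflang_tags("www.example.com", "services"):
    print(tag)
```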
Backlinks that go to a redirected URL
Hey guys, just wondering: my client has 3 websites, 2 of which will be closed down and their domains permanently redirected to the 1 primary domain. However, they have some high-quality backlinks pointing to the domains that will be redirected. How does this affect SEO?

Domain One (primary, getting redesigned and rebuilt) - not many backlinks
Domain Two (will redirect to Domain One) - has quality backlinks
Domain Three (will redirect to Domain One) - has quality backlinks

When the new website is launched on Domain One I will contact the backlink providers and request they update their URL - I assume that would be the best approach.
Technical SEO | thinkLukeSEO
-
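One way to preserve the link equity from the backlinked domains is a page-level 301 map rather than a blanket redirect to the homepage. A rough Python sketch of the mapping logic (all domains and paths here are placeholders, not the client's actual URLs):

```python
# Page-level redirect map (placeholder paths): backlinked URLs on the
# retired domains should 301 to the most similar page, not the homepage.
REDIRECT_MAP = {
    "domain-two.com": {"/pricing": "/pricing", "/blog/guide": "/resources/guide"},
    "domain-three.com": {"/services": "/what-we-do"},
}
PRIMARY = "https://domain-one.com"

def redirect_target(host, path):
    """Return the 301 target on the primary domain; unmapped paths
    fall back to the primary homepage."""
    new_path = REDIRECT_MAP.get(host, {}).get(path, "/")
    return PRIMARY + new_path

print(redirect_target("domain-two.com", "/blog/guide"))
# https://domain-one.com/resources/guide
```

The same mapping would normally be expressed as server-level redirect rules; the sketch just shows the lookup logic.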
Redirecting a Few URLs from One Domain to Another
Hello, I have two websites within a similar niche. I'd like to redirect some of the top organic-traffic-driving pages on Website B to a similar page on Website A. The reason is that Website A is bigger, better, and monetized much better as well. I only want to redirect a few of the main URLs on Website B, and only those for which I have similar content on my main Website A. Is this process safe for SEO? What is the best way to go about it? I am not really concerned with Website B and what happens to its rankings; in the meantime, I'd like to redirect the traffic from some of its main organic-traffic-driving pages to my main Website A and its similar pages. I am also concerned with making sure my main Website A stays white hat and doesn't receive any negative impact from these redirects. Thanks.
Technical SEO | juicyresults
-
Title Tags & URL Structure
So I'm working on a website for a client in the tourism industry. We've got a comprehensive list of museums and other attractions in a number of cities that have to go online, and we have to come up with the correct URL structure, title tags, and obviously content. My current line of thought was to structure the URLs in the following way:

http://domain.com/type-of-attraction/city/name-of-attraction/

This is mainly because we think the type of attraction is far more important than the city (SEO-wise), as the country as a whole receives more searches. However, we require a city in the URL to make it unique, because some attractions across cities happen to share names and we don't want the names of attractions littered with city names. For title tags I wanted to go the other way around, again due to the attraction type being more important than the city:

Name of Attraction - Type of Attraction - City - Brand Name
or
Name of Attraction - Type of Attraction in City - Brand Name

I am quite confident in working it this way; however, I would appreciate some feedback on this structure - do you think it's good, or would you make any suggestions or alterations? One last thing: there's the possibility of many URLs ending up with the same city names (one for each type of attraction). I would think that just providing a list of links and duplicate text is not enough; would you suggest a canonical pointing to a page containing just information on the city, and using the other pages for user navigation only? Or should I set variables in the text which are replaced by the types of attraction so that the text looks different for each one?
Technical SEO | jonmifsud
-
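The URL and title-tag scheme described above is mechanical enough to generate. A hedged sketch showing both patterns side by side (the example attraction, city and brand names are invented for illustration):

```python
def attraction_url(domain, attraction_type, city, name):
    """URL pattern: /type-of-attraction/city/name-of-attraction/
    The city segment disambiguates attractions that share a name."""
    seg = lambda s: s.lower().replace(" ", "-")
    return f"https://{domain}/{seg(attraction_type)}/{seg(city)}/{seg(name)}/"

def title_tag(name, attraction_type, city, brand):
    """Title pattern: Name - Type in City - Brand, putting the
    attraction type ahead of the city as the question reasons."""
    return f"{name} - {attraction_type} in {city} - {brand}"

print(attraction_url("example.com", "Museums", "Valletta", "War Museum"))
print(title_tag("War Museum", "Museum", "Valletta", "VisitMalta"))
```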
Spider Indexed Disallowed URLs
Hi there, In order to reduce the huge amount of duplicate content and titles for a client, we disallowed all spiders for some areas of the site in August via the robots.txt file. This was followed by a huge decrease in errors in our SEOmoz crawl report, which, of course, made us satisfied. In the meantime, we haven't changed anything in the back-end, robots.txt file, FTP, website or anything else. But our crawl report came in this November and all of a sudden all the errors were back. We've checked the errors and noticed URLs that are definitely disallowed. The disallowing of these URLs is also verified by Google Webmaster Tools and other robots.txt checkers, and when we search for a disallowed URL in Google, it says that it's blocked for spiders. Where did these errors come from? Did the SEOmoz spider ignore our disallow rules, or is it something else? You can see the drop and the increase in errors in the attached image. Thanks in advance.
Technical SEO | ooseoo
-
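Disallow rules like these can be sanity-checked locally with Python's standard-library robots.txt parser before trusting any crawler's report. A small sketch (the rules themselves are illustrative, not the site's actual robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules, parsed from memory rather than fetched,
# so the check runs without touching the live site.
rules = [
    "User-agent: *",
    "Disallow: /duplicate-area/",
]

parser = RobotFileParser()
parser.parse(rules)

# can_fetch answers: may this user agent crawl this URL?
print(parser.can_fetch("*", "https://example.com/duplicate-area/page1"))  # False
print(parser.can_fetch("*", "https://example.com/unique-area/page1"))     # True
```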
URLs with or without .html ending
Hello, Can anyone point me to some authoritative info on whether links are better with or without a .html ending? Thanks in advance
Technical SEO | sesertin
-
Lost Facebook Shares with URL change
I recently changed the URL of a page and used a 301 redirect from the old URL to the new one. I just realized I lost all my Facebook shares - the page now shows 0. What can I do to get my share count back? I cannot find any information.
Technical SEO | MiamiWebCompany
-
URLs for news content
We have made modifications to the URL structure for a particular client who publishes news articles in various niche industries. In line with SEO best practice we removed the article ID from the URL - an example is below:

http://www.website.com/news/123/news-article-title
http://www.website.com/news/read/news-article-title

Since this was done we have noticed a decline in traffic volumes (we have not yet assessed the impact on the number of pages indexed). Google have suggested that we need to include unique numerical IDs somewhere in the URL to aid spidering. Firstly, is this policy for news submissions? Secondly (if the previous answer is yes), is this to overcome the obvious issue with the velocity and trend-based nature of news submissions resulting in false duplicate URL/title tag violations? Thirdly, do you have any advice on the way to go?

P.S. One final one (you can count this as two question credits if required): is it possible to check the volume of pages indexed at various points in the past? I.e., if you think the number of pages being indexed may have declined, is there any way of confirming this after the event? Thanks again! Neil
Technical SEO | mccormackmorrison
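If the numeric-ID advice is followed, the ID can live in its own trailing segment without sacrificing the readable slug. A hedged sketch of one possible pattern (the layout shown is an assumption, not stated Google policy):

```python
def news_url(domain, article_id, title):
    """Keep a unique numeric ID alongside the slug so near-identical
    headlines never collide, e.g. /news/123/news-article-title."""
    slug = "-".join(title.lower().split())
    return f"https://{domain}/news/{article_id}/{slug}"

print(news_url("www.website.com", 123, "News Article Title"))
# https://www.website.com/news/123/news-article-title
```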