URL Length or Exact Breadcrumb Navigation URL? What's More Important
-
Basically, my question is as follows: which is better?
www.romancingdiamonds.com/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold (this would fully match the breadcrumbs)
or www.romancingdiamonds.com/amethyst-rings/purple-amethyst-ring-14k-white-gold (cutting out the first-level folder to keep the URL shorter and the important keywords closer to the root domain)?
In this question http://www.seomoz.org/qa/discuss/37982/url-length-vs-url-keywords I was advised to drop a folder in my URL because it may be too long. That's why I'm hesitant to keep the breadcrumb structure the same.
To the best of your knowledge, do you think it's best to drop a folder in the URL to keep it shorter and sweeter, or to have a longer URL that matches the breadcrumb structure?
Please advise,
Shawn
-
The question is a great one, and the responses to it are also great, but they directly contradict each other! Could the SEOMoz staff weigh in on this one?
-
Shawn,
I have noticed that when you have a long URL structure with multiple folders, Google tends to lose "interest" in your deep pages.
Let me give you an example: say you have a domain, www.website.com, with a category called gemstones. Within gemstones, you have diamonds as a subcategory and a solitaire as a page.
If you consider your homepage to have an importance of 1, no category page should have an importance greater than or equal to 1. So your category page gets a page weight value of, let's say, 0.9. Your subcategory page is treated the same way and gets a page weight of, say, 0.8, and your solitaire page gets a value less than 0.8. If you cut out one or more levels in your URL, you have a better chance of assigning a higher value to your page.
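To make that arithmetic concrete, here is a toy sketch in Python. It only illustrates the mental model above — a hypothetical per-level discount — and is not anything Google documents or computes.

```python
# A toy illustration of the mental model above: each extra folder level
# discounts a page's assumed "weight" a bit further. This is NOT how Google
# actually scores pages; it just makes the arithmetic concrete.
DECAY_PER_LEVEL = 0.9  # hypothetical discount per folder level


def page_weight(url_path: str, decay: float = DECAY_PER_LEVEL) -> float:
    """Return a hypothetical weight based on folder depth (homepage = 1.0)."""
    depth = len([seg for seg in url_path.strip("/").split("/") if seg])
    return decay ** depth


# Three levels deep (matches the breadcrumb) vs. two levels deep (folder dropped)
print(page_weight("/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold"))  # 0.729
print(page_weight("/amethyst-rings/purple-amethyst-ring-14k-white-gold"))                 # 0.81
```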
Now, coming to your question: breadcrumbs are essentially meant to help your users navigate better, so your website hierarchy (the folders and subfolders, or categories and subcategories) should be reflected in your breadcrumb.
So, keep your URLs short, but keep your breadcrumbs matching your website's flow.
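To illustrate, here is a minimal sketch (Python, using the URLs from the question) of how schema.org BreadcrumbList markup can describe the full Gemstone Rings > Amethyst Rings > product hierarchy even when the product URL itself drops the top-level folder. The exact category URLs here are assumptions based on the question.

```python
import json

# Breadcrumb trail reflecting the full site hierarchy, even though the
# product URL itself omits the /gemstone-rings/ folder (the shorter option).
breadcrumbs = [
    ("Gemstone Rings", "https://www.romancingdiamonds.com/gemstone-rings/"),
    ("Amethyst Rings", "https://www.romancingdiamonds.com/amethyst-rings/"),
    ("Purple Amethyst Ring 14K White Gold",
     "https://www.romancingdiamonds.com/amethyst-rings/purple-amethyst-ring-14k-white-gold"),
]

# Build schema.org BreadcrumbList markup (JSON-LD) from the trail.
breadcrumb_jsonld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(breadcrumbs, start=1)
    ],
}

print(json.dumps(breadcrumb_jsonld, indent=2))
```

The breadcrumb trail and its markup carry the full hierarchy for users and crawlers, while the canonical URL stays one level shorter.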
-
In my experience, Google does not rank longer URLs lower. I know Rand once published a correlation study suggesting otherwise, but I can show you many examples where I've used the first approach and got good rankings, and the same for the second approach.
I think you should keep your website architecture solid, following the full path and not removing a level.
Another thing: dropping that specific level may cause duplicate URL problems, since products from different top-level categories can collapse onto the same shortened path. You will need some database checks before enabling each URL. Pay attention to this if you choose the second approach.
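As a rough sketch of the kind of check that means in practice: before flattening the URLs, confirm that no two products in different top-level categories would end up sharing the same shortened path. The records below are hypothetical; in a real catalogue you would run the equivalent query against your database.

```python
from collections import defaultdict

# Hypothetical product records: (category, subcategory, product_slug).
products = [
    ("gemstone-rings", "amethyst-rings", "purple-amethyst-ring-14k-white-gold"),
    ("gemstone-rings", "ruby-rings", "classic-ruby-ring-14k-white-gold"),
    ("estate-jewelry", "ruby-rings", "classic-ruby-ring-14k-white-gold"),  # would collide
]


def find_collisions(records):
    """Group products by the shortened path (subcategory/slug) and report clashes."""
    by_short_path = defaultdict(list)
    for category, subcategory, slug in records:
        by_short_path[f"/{subcategory}/{slug}"].append(category)
    return {path: cats for path, cats in by_short_path.items() if len(cats) > 1}


print(find_collisions(products))
# {'/ruby-rings/classic-ruby-ring-14k-white-gold': ['gemstone-rings', 'estate-jewelry']}
```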
-
The URL length should not be ridiculously long; I think 255 characters is the safe limit. As long as you keep within that, it's OK. But if you have a long breadcrumb trail, your page is probably also many clicks from the home page, and that is the real issue. Many people think the problem is having pages deep in the folder structure, but it is not; it is the number of clicks from the home page that is the problem.
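As a rough sketch of what "clicks from the home page" means, here is a breadth-first walk over an internal link graph that counts the clicks needed to reach each page. The link graph is a made-up stand-in for a real crawl; the point is that a product can sit several folders deep in its URL yet still be only a click or two from the home page if you link to it prominently.

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/gemstone-rings/", "/about/"],
    "/gemstone-rings/": ["/gemstone-rings/amethyst-rings/"],
    "/gemstone-rings/amethyst-rings/": [
        "/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold"
    ],
    "/about/": [],
}


def click_depths(graph, home="/"):
    """Breadth-first search from the homepage, returning clicks needed per page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths


for page, depth in click_depths(links).items():
    print(f"{depth} clicks: {page}")
```

In other words, you can keep the full folder structure in the URL and still keep the page "shallow" by linking to it from category hubs or the home page.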