URLs Too Long - Should I Shorten Them?
-
On a crawl of our website we received a warning that 157 URLs are too long.
When I look at those URLs, they are generally from 2016 or earlier. Should I leave them as they are, or shorten the URLs and redirect them to new ones?
Thanks
-
Long URLs are a pet peeve of mine! But I agree with Martin on the traffic analogy: modify the pages that need the most help and measure them against the pages you don't change.
-
Look at the potential: are these URLs currently receiving search traffic? If not, I would hesitate to change them, as it's work that might not provide any benefit.
-
If there is an opportunity for those blog posts to rank for relevant searches, I absolutely think it is worth going back in to optimize them. You can shorten the URLs, create 301 redirects from the old URLs to the new ones, and re-optimize the blog posts as a whole, as well as add internal links to the pages whose rankings you want to improve.
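For the redirect step, here is a minimal sketch of what the 301s could look like on an Apache server via .htaccess; the paths are hypothetical stand-ins for your real old and new URLs, and an nginx rewrite or a CMS redirect plugin would do the same job:

    # Hypothetical example: send one long 2016-era URL to its shortened replacement.
    # Repeat one line per shortened URL (or use RewriteRule for pattern-based cases).
    Redirect 301 /blog/2016/03/10-things-you-need-to-know-about-long-urls /blog/long-urls

The key point is that each old URL returns a single 301 hop to exactly one new URL, which keeps the redirects easy to audit after the change.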
Related Questions
-
AJAX and High Number of URLs Indexed
I recently took over as the SEO for a large ecommerce site. Every month or so our Webmaster Tools account is hit with a warning for a high number of URLs, and each message includes a sample of the problematic URLs. 98% of each sample is not an actual URL on our site but an AJAX request URL that users are making. This is a server-side request, so the URL does not change when users make narrowing selections for things like size, color, etc. Here is an example of what one of those looks like: Tire?0-1.IBehaviorListener.0-border-border_body-VehicleFilter-VehicleSelectPanel-VehicleAttrsForm-Makes. We have over 3 million indexed URLs according to Google because of this. We are not submitting these URLs in our sitemaps; Googlebot is making lots of AJAX selections according to our server data. I have used the URL parameter handling tool to target some of the parameters that were set to "let Google decide" and changed them to "no URLs" with those parameters to be indexed. I still need more time to see how effective that will be, but it does seem to have slowed the number of URLs being indexed. Other notes: 1. Overall traffic to the site has been steady and even increasing. 2. Googlebot crawls an average of 241,000 URLs each day according to our crawl stats; we are a large ecommerce site that sells parts, accessories, and apparel in the powersports industry. 3. We are using the Wicket framework for our website. Thanks for your time.
Technical SEO | RMATVMC
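A sketch of one way to keep crawlers away from those Wicket listener URLs; this is an assumption about a possible fix, not something confirmed for this site. Googlebot honors the * wildcard in robots.txt Disallow rules:

    User-agent: *
    # Block crawling of Wicket AJAX listener requests such as
    # Tire?0-1.IBehaviorListener.0-border-...
    Disallow: /*IBehaviorListener

Note that robots.txt only stops crawling; it does not by itself remove URLs that are already indexed, so it would sit alongside the parameter handling settings described above.
-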
Question about creating friendly URLs
I am working on creating new SEO-friendly URLs for my company website. The products are the items with the highest search volume, and each is very geo-specific. There is not a high search volume for the geo-location associated with the product, but the searches we do get convert well. Do you think it is preferable to leave the location out of the URL or include it?
Technical SEO | theLotter
-
How to handle lots of URL parameters
Howdy mozzers, I'm hoping you can lend some advice. I'm dealing with a site now with loads of URL parameters. It's a vehicle dealership group which hosts its entire inventory from multiple locations on one page, sorted by parameters. Example inventory URL: www.dealership.com/car-inventory.asp?pa=&ns=10&so=m&sor=DESC&ma=&mod=&mt=&yr=&bs=&pr=&t=used&ln= Where pa (page no.); ns (number of vehicles shown); so (sort by condition); sor (sort order); ma (make); mod (model); yr (year); bs (body style); pr (price range); t (type - new, used, etc.); ln (location no.). As you can imagine, this generates a gazillion URLs (or slightly less). Any thoughts on the best canonicalization options? Thanks as always
Technical SEO | jamesm5i
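A sketch of the canonical-tag side of this; the inventory URL is taken from the example in the question, and the choice of an unparameterised canonical target is an assumption. Each filtered or sorted variant of the page would carry the same tag in its <head>:

    <!-- placed on every parameterised variant of the inventory page -->
    <link rel="canonical" href="http://www.dealership.com/car-inventory.asp" />

Whether the canonical should drop every parameter or preserve a meaningful one (for example t=used versus t=new) depends on which versions of the page you actually want to rank.
-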
URL Folders and Naming Convention Changes?
1. We're looking for some clarification regarding our URL structure. Currently, at the product level we have http://www.ties.com/v/a/elite-solid-black-black-tie, but the parent URL is http://www.ties.com/black-ties. a. So here is the question: how much is this hurting us, given that the naming convention of this URL is semantically weird and doesn't follow logical patterns? In other words, should the product page be http://ties.com/black-ties/elite-solid-black-tie, and how badly is the current structure hurting us? b. If we were to change the URL structure, should we do it in phases or all at once? We don't want to get penalized. We have well over 3,000 product pages.
Technical SEO | Ties.com
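If the structure does change, the mechanics are the same per-URL 301s sketched earlier; the two paths below come straight from the question and are only an illustration, since the full map for 3,000+ products would normally be generated from the product catalog:

    # Hypothetical per-product 301 once the new folder structure goes live
    Redirect 301 /v/a/elite-solid-black-black-tie /black-ties/elite-solid-black-tie

Rolling the redirects out in phases, category by category, and watching crawl stats and rankings after each phase is a common way to limit risk.
-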
Should we block URL param in Webmaster tools after URL migration?
Hi, we have just released a new version of our website that now has human-readable, nice URLs. Our old, ugly URLs are still accessible and cannot be blocked or redirected. These old URLs use a URL parameter with an XPath-like expression language to define the location in our catalog. We have about 2 million pages indexed with this old URL parameter, while we have approximately 70k nice URLs after the migration. This high number of old URLs is due to faceting that was done using the URL parameter. I wonder if we should now completely block this URL parameter via Google Webmaster Tools so that these ugly URLs will be removed from the Google index. Or will this harm our position in Google? Thanks, Chris
Technical SEO | eCommerceSEO
-
Hyphen in URL
Hi, I would like to know whether the following statement still holds true today, or whether it no longer matters if we use hyphens or underscores: If you have a URL like keyword1_keyword2, Google will only return that page if the user searches for keyword1_keyword2 (highly unlikely). But if you have a URL like keyword1-keyword2, that page can be returned for the searches keyword1, keyword2, and even “keyword1keyword2”. Thanks
Technical SEO | seoug_2005
-
/$1 URL Showing Up
Whenever I crawl my site with any kind of bot or sitemap generator, it comes up with a /$1 version of my URLs. For example, it gives me hdiconference.com & hdiconference.com/$1, and hdiconference.com/purchases & hdiconference.com/purchases/$1. Then I get warnings saying that it's duplicate content. Here's the problem: I can't find these /$1 URLs anywhere. Even when I type them in, I get a 404 error. I don't know what they are or where they came from, and I can't find them when I scour my code. So I'm trying to figure out where the crawlers are picking this up. Where are these things? If sitemap generators and other site crawlers are seeing them, I have to assume that Googlebot is seeing them as well. Any help? My developers are at a loss as well.
Technical SEO | HDI
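For what it's worth, a literal /$1 usually means a $1 backreference was written somewhere that never expands it; whether that is what happened on this site is an assumption, but a classic culprit is using $1 with Apache's plain Redirect directive, which (unlike RewriteRule) does not substitute backreferences:

    # Plain Redirect does NOT expand $1, so this target goes out literally as
    # https://hdiconference.com/$1 and crawlers then request the /$1 path.
    Redirect 301 /old-page https://hdiconference.com/$1

    # What was probably intended, using mod_rewrite, where $1 is a real backreference:
    RewriteEngine On
    RewriteRule ^old-section/(.*)$ https://hdiconference.com/new-section/$1 [R=301,L]

Searching templates, redirect rules, and plugin settings for the literal string "$1" is a reasonable first step; /old-page and /old-section here are hypothetical placeholders.
-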
URL Structure
Hi guys, I'm in the process of creating a very exciting startup aimed at the baby industry. It's essentially a social commerce site where parents can shop for products, create lists of products, and ask questions. The challenge I'm facing is how best to structure my URLs from an SEO standpoint. For example, a common baby topic such as "feeding" can sit in all three categories: the Shopping category aggregates all products related to feeding, the List category aggregates all lists related to feeding, and the Question category aggregates all questions and answers on feeding. So for the keyword "feeding" you have three potential landing pages. What I was wondering is: what is the most effective way of doing it? I was thinking of something along these lines: /shopping/feeding, /baby_list/feeding, /ask/feeding. Would love to hear your points of view on this. Thanks! Walid
Technical SEO | walidalsaqqaf