What is the proper URL length in SEO?
-
I learned that having 50 to 60 characters in a URL is OK and that fewer is preferable to Google.
But since I'm going to include keywords in my URLs, I'm afraid that will increase the length. Is that going to hurt me slightly?
My competitors have an 8-character domain and keyword slugs 13 characters long,
and my site has a 15-character domain and keyword slugs 13 characters long.
Which one will Google prefer?
-
Well, to me a proper URL shouldn't be more than 50 characters in length. I stay a bit below that, or at most around 60, on my website Timetocare.
-
In terms of SEO (Search Engine Optimization), while there's no strict rule for the optimal URL length, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines and considerations:
-
Short and Descriptive:
- Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. Avoid unnecessary parameters or overly complex structures.
-
Keywords:
- Include relevant keywords in the URL, especially in the domain and the path. This can help search engines understand the topic of the page.
-
Readability:
- Keep URLs readable by using hyphens to separate words instead of underscores. For example, use "example.com/important-page" instead of "example.com/important_page."
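For illustration only (not from the thread), here is a minimal sketch of how a page title could be turned into a lowercase, hyphen-separated slug; the function name and sample title are hypothetical:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Collapse any run of non-alphanumeric characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Strip hyphens left over from leading/trailing punctuation.
    return slug.strip("-")

print(slugify("Important Page!"))  # -> important-page
```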
-
Avoid Dynamic Parameters:
- If possible, avoid using dynamic parameters in URLs (e.g., "example.com/page?id=123"). Static, keyword-rich URLs are generally more SEO-friendly.
-
Consistency:
- Maintain consistency in your URL structure across your website. This helps both users and search engines navigate and understand the organization of your content.
-
301 Redirects for Changes:
- If you need to change a URL, use 301 redirects to inform search engines that the content has permanently moved. This preserves SEO value.
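As a hedged sketch (the thread doesn't specify a stack), a permanent redirect for a moved page might look like this in a small Flask app; the routes and slugs are hypothetical:

```python
from flask import Flask, redirect

app = Flask(__name__)

# The old slug answers with a 301 so search engines transfer value to the new URL.
@app.route("/old-important-page")
def old_page():
    return redirect("/important-page", code=301)

@app.route("/important-page")
def important_page():
    return "Current content lives here."

if __name__ == "__main__":
    app.run()
```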
-
Limit Length:
- While there's no strict character limit for URLs, it's advisable to keep them reasonably short, ideally under 100 characters. Shorter URLs are easier to remember and share.
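As a rough illustration only, a quick audit like the following flags URLs that run long or carry query parameters; the 100-character threshold simply mirrors the guideline above and is not an official limit:

```python
from urllib.parse import urlparse

MAX_LENGTH = 100  # assumed threshold, taken from the guideline above

def audit_url(url: str) -> list[str]:
    """Return a list of warnings for a single URL."""
    warnings = []
    parsed = urlparse(url)
    if len(url) > MAX_LENGTH:
        warnings.append(f"{len(url)} characters (over {MAX_LENGTH})")
    if parsed.query:
        warnings.append(f"carries query parameters: {parsed.query}")
    if "_" in parsed.path:
        warnings.append("path uses underscores instead of hyphens")
    return warnings

print(audit_url("https://example.com/page?id=123"))  # -> ['carries query parameters: id=123']
```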
-
HTTPS:
- Use HTTPS for secure connections. Search engines tend to favor secure websites, and HTTPS is considered a ranking factor.
Remember that the primary goal is to create URLs that are user-friendly and provide a clear indication of the content. Search engines use URLs to understand the context and relevance of a page, so optimizing them for readability and keywords can positively impact your SEO efforts. Additionally, creating a logical URL structure helps users navigate your site more easily.
-
-
The ideal URL length for SEO is typically under 60 characters. Shorter URLs are easier for search engines to crawl and for users to read and remember. Keeping URLs concise, relevant to the page content, and including keywords can positively impact SEO performance. Avoid lengthy URLs with unnecessary parameters or characters.
-
The appropriate page URL length is about 75 characters, and the maximum URL length in the address bar is 2,049 characters.
-
In SEO, there is no strict rule for an ideal URL length, but it's generally recommended to keep URLs concise, relevant, and user-friendly. Here are some guidelines to consider:
Short and Descriptive: Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. A concise URL is easier to remember and share.
Include Keywords: If possible, include relevant keywords in your URL. This can contribute to the page's SEO, but don't over-optimize by stuffing too many keywords.
Avoid Dynamic Parameters: Clean, static URLs are preferred over URLs with dynamic parameters (e.g., example.com/page?id=123). Search engines prefer URLs that are easily readable and don't contain unnecessary parameters.
Hyphens Between Words: Use hyphens (-) rather than underscores (_) to separate words in the URL. Search engines treat hyphens as spaces, but underscores are not recognized as separators.
Avoid Stop Words: Consider omitting unnecessary stop words (e.g., "and," "or," "but") from your URLs. Focus on the main keywords that represent the page's content (a small sketch of this appears at the end of this reply).
Be Consistent: Maintain a consistent URL structure across your site. Consistency makes it easier for both users and search engines to navigate and understand your website.
HTTPS: Ensure that your URLs use the secure HTTPS protocol. Google tends to favor secure websites, and HTTPS is a ranking factor.
While there's no strict character limit for URLs, it's generally advisable to keep them under 255 characters. This is because longer URLs may be truncated in search results, making them less user-friendly.
Remember that user experience is crucial, so prioritize creating URLs that are easy to read and understand. Additionally, focus on providing valuable content on your pages, as content quality is a key factor in SEO.
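To illustrate the stop-word point above (a sketch only; the stop-word list is a hypothetical, minimal one, not an official list), a slug can be trimmed like this:

```python
# Hypothetical, minimal stop-word list; a real project would use a fuller one.
STOP_WORDS = {"a", "an", "and", "but", "for", "how", "of", "or", "the", "to"}

def shorten_slug(slug: str) -> str:
    """Drop common stop words from a hyphen-separated slug."""
    kept = [word for word in slug.split("-") if word not in STOP_WORDS]
    return "-".join(kept)

print(shorten_slug("how-to-choose-the-best-running-shoes"))  # -> choose-best-running-shoes
```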
-
The proper URL length for SEO is generally recommended to be under 256 characters. It's important to keep your URLs concise and descriptive. Short and relevant URLs tend to perform better in search engine rankings and are easier for users to remember and share. Including relevant keywords in your URL can also help search engines and users understand the content of the page. Additionally, using hyphens to separate words in the URL is preferred over underscores or other special characters. Overall, aim for clear, concise, and keyword-rich URLs that accurately represent the content of your web pages.
-
50-60 characters in a URL is good enough and will not be considered spam by Google. However, the vital aspect is how you use the keywords and whether they are elegantly placed or simply stuffed in. Try to be descriptive for the search engine, make the URL scannable, and break it down.
Aim for a low character count, because a shorter URL is less likely to be mistaken for spam.
-
Excessive length can be detected as spam, so you have to pay attention to the length.
-
The optimal length is 50-60 characters. If you're using a plugin like Rank Math or Yoast, it will also tell you what's optimal.
I'm following Rank Math's guide to URL length, and it's working perfectly; I'm getting great results on my courier tracking website.
-
It is crucial to consistently conduct competitor analysis, paying close attention to the length of their URLs.
A common mistake that many people make is incorporating long-tail keywords into their URLs, which is not considered a good SEO practice.
Personally, I strive to limit my site article URLs to a maximum of 4-5 words. In certain cases where the search volume is relatively low, I may include additional words, but the general best practice is to keep the URL as short as possible.
Once again, I cannot emphasize enough the importance of competitor analysis in shaping your approach.
-
When it comes to URL length for SEO, there is no definitive answer. However, it's generally recommended to keep URLs concise, include relevant keywords, avoid excessive parameters and unnecessary characters, use hyphens as word separators, maintain consistency, and prioritize usability and readability. Remember, URL length is just one factor among many that affect SEO.
-
Somewhere up to 75 characters max, from what I read. Longer than that could cause some difficulties in ranking.
-
While the length of a URL can have some impact on search engine optimization (SEO), it is generally recommended to keep URLs concise and relevant to the content of the page. URLs with fewer words tend to be easier for users to read and remember, and they also tend to be more user-friendly for sharing and linking purposes.
The impact of URL length on SEO is relatively small compared to other factors such as the quality and relevance of the content on your website, backlinks, site speed, user experience, and overall website optimization.
In terms of your specific scenario, where your competitors have 8-character domain URLs and keywords with a length of 13, and your site has a 15-character domain URL and keywords of the same length, it's unlikely that the slight difference in URL length alone would significantly impact your search engine rankings.
Google's algorithms consider numerous factors when determining the relevance and ranking of a website, and URL length is just one of them. It's important to focus on creating high-quality content, using relevant keywords, and ensuring a positive user experience on your website. These factors are likely to have a more substantial impact on your search engine rankings than the length of your URL.
-
I have tried to use a proper URL length on my site, but in some instances long-tail keywords mess it up. Then you have no option but to accept a longer-than-ideal URL.
-
But sometimes the long-tail keyword makes it difficult to keep the URL short, for example "how many questions can you ask chatgpt".
-
When it comes to URL length in SEO (Search Engine Optimization), there is no strict rule for the maximum or ideal length. However, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines to consider:
Descriptive and Relevant: A URL should give users and search engines a clear idea of what the page is about. Including relevant keywords or a brief description of the content can help improve understanding and visibility.
Concise and Readable: Aim for shorter URLs that are easy to read and remember. Long, complex URLs can be confusing and difficult to share. Use hyphens (-) to separate words within the URL, and avoid unnecessary characters, numbers, or special characters.
Avoid Keyword Stuffing: While it's important to include relevant keywords, avoid keyword stuffing in URLs. Maintain a natural flow and readability, and prioritize clarity over excessive keyword usage.
Maintain Consistency: Consistency in URL structure can benefit both users and search engines. Use a consistent format throughout your website, which can include using lowercase letters, eliminating unnecessary parameters, and organizing URLs in a logical and hierarchical manner.
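Purely as an assumption-laden sketch (the thread doesn't prescribe any code), a simple normalization step like the one below enforces lowercase hosts and paths and drops query strings and fragments, so the same page is always referenced by one URL:

```python
from urllib.parse import urlparse, urlunparse

def canonicalize(url: str) -> str:
    """Lowercase the host and path and drop query parameters and fragments."""
    p = urlparse(url)
    return urlunparse((p.scheme, p.netloc.lower(), p.path.lower(), "", "", ""))

print(canonicalize("https://Example.com/Important-Page?utm_source=newsletter"))
# -> https://example.com/important-page
```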
-
@calvinkj Always analyze your competitors and the length of their URLs.
Most people make the big mistake of adding long-tail keywords to the URL, which isn't good SEO practice.
I always use a maximum of 4-5 words in the URL for my site's articles; in some articles where search volume is relatively low, I do add more words, but the best practice is to keep the URL as short as possible.
Again, competitor analysis is the key.
-
Some experience with words and hyphens in domain names:
I used a hyphenated site, www.octopus-energy-referral.co.uk, and it is not doing too well compared to the non-hyphenated name. Similarly, I have a site, www.octopuscode.co.uk, and it is doing really well compared to the hyphenated name because it is short and has fewer keywords.
I know this is not a forensic comparison, but I believe a non-hyphenated short name with fewer keywords is best if you have a choice.
-
If you haven't read this yet, please do (best practices for URLs).
So, it's a combination of things. As Devi Allen said, less is more. You want to use (and not over-use) descriptive words, separated by hyphens, "keeping URLs as simple, relevant, compelling, and accurate as possible". "To correctly render in all browsers, URLs must be shorter than 2,083 characters."
Which is better, your URL or your competitors? They sound pretty close based on your description but what matters is the actual words used in the URL, the site structure represented by that construct, whether the words truly represent what a visitor will find on the page, and whether the page content will provide visitors with the information they came looking for. URL length is but one of many factors that go into determining whether you or your competitor will rank higher.
-
You already answered it: fewer words is better.