What is the proper URL length in SEO?
-
I learned that having 50 to 60 characters in a URL is fine and that shorter is preferred by Google.
But I'd like to know: since I'm going to include keywords in my URLs, I'm afraid that will increase the length. Is it going to hurt me slightly?
My competitors have an 8-character domain and keywords 13 characters long, while my site has a 15-character domain and keywords of the same length.
Which one will Google prefer?
-
Well, to me a proper URL shouldn't be more than 50 characters in length. I use a bit less than that, or at most about 60, on my website Timetocare.
-
In terms of SEO (Search Engine Optimization), while there's no strict rule for the optimal URL length, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines and considerations:
-
Short and Descriptive:
- Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. Avoid unnecessary parameters or overly complex structures.
-
Keywords:
- Include relevant keywords in the URL, especially in the domain and the path. This can help search engines understand the topic of the page.
-
Readability:
- Keep URLs readable by using hyphens to separate words instead of underscores. For example, use "example.com/important-page" instead of "example.com/important_page."
-
Avoid Dynamic Parameters:
- If possible, avoid using dynamic parameters in URLs (e.g., "example.com/page?id=123"). Static, keyword-rich URLs are generally more SEO-friendly.
-
Consistency:
- Maintain consistency in your URL structure across your website. This helps both users and search engines navigate and understand the organization of your content.
-
301 Redirects for Changes:
- If you need to change a URL, use 301 redirects to inform search engines that the content has permanently moved. This preserves SEO value (a minimal code sketch follows this answer).
-
Limit Length:
- While there's no strict character limit for URLs, it's advisable to keep them reasonably short, ideally under 100 characters. Shorter URLs are easier to remember and share.
-
HTTPS:
- Use HTTPS for secure connections. Search engines tend to favor secure websites, and HTTPS is considered a ranking factor.
Remember that the primary goal is to create URLs that are user-friendly and provide a clear indication of the content. Search engines use URLs to understand the context and relevance of a page, so optimizing them for readability and keywords can positively impact your SEO efforts. Additionally, creating a logical URL structure helps users navigate your site more easily.
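Following up on the 301-redirect point above, here is a minimal sketch of what that looks like in practice. This is an illustration only: the framework (Flask) and the path names are assumptions, not something prescribed in this thread.

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/important_page")
def old_url():
    # A 301 tells browsers and crawlers the page has moved permanently,
    # so links pointing at the old URL pass their value to the new one.
    return redirect("/important-page", code=301)

@app.route("/important-page")
def new_url():
    return "This is the important page."

if __name__ == "__main__":
    app.run()
```

The same idea works at the web-server level, e.g., a "return 301" rule in nginx or a "Redirect permanent" directive in Apache.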
-
-
The ideal URL length for SEO is typically under 60 characters. Shorter URLs are easier for search engines to crawl and for users to read and remember. Keeping URLs concise, relevant to the page content, and including keywords can positively impact SEO performance. Avoid lengthy URLs with unnecessary parameters or characters.
-
The appropriate page URL length is about 75 characters, and the maximum URL length in the address bar is around 2,049 characters.
-
In SEO, there is no strict rule for an ideal URL length, but it's generally recommended to keep URLs concise, relevant, and user-friendly. Here are some guidelines to consider:
Short and Descriptive: Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. A concise URL is easier to remember and share.
Include Keywords: If possible, include relevant keywords in your URL. This can contribute to the page's SEO, but don't over-optimize by stuffing too many keywords.
Avoid Dynamic Parameters: Clean, static URLs are preferred over URLs with dynamic parameters (e.g., example.com/page?id=123). Search engines prefer URLs that are easily readable and don't contain unnecessary parameters.
Hyphens Between Words: Use hyphens (-) rather than underscores (_) to separate words in the URL. Search engines treat hyphens as spaces, while underscores are not recognized as word separators.
Avoid Stop Words: Consider omitting unnecessary stop words (e.g., "and," "or," "but") from your URLs. Focus on the main keywords that represent the page's content (a short sketch after this answer shows one way to automate this).
Be Consistent: Maintain a consistent URL structure across your site. Consistency makes it easier for both users and search engines to navigate and understand your website.
HTTPS: Ensure that your URLs use the secure HTTPS protocol. Google tends to favor secure websites, and HTTPS is a ranking factor.
While there's no strict character limit for URLs, it's generally advisable to keep them under 255 characters. This is because longer URLs may be truncated in search results, making them less user-friendly.
Remember that user experience is crucial, so prioritize creating URLs that are easy to read and understand. Additionally, focus on providing valuable content on your pages, as content quality is a key factor in SEO.
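To make several of the points above concrete (hyphens between words, stop-word removal, a length budget), here is a minimal slug-generator sketch. The stop-word list and the 60-character cap are assumptions drawn from the advice in this thread, not official rules:

```python
import re

# Assumed stop words to drop, per the "avoid stop words" guideline above.
STOP_WORDS = {"a", "an", "and", "but", "or", "the", "of", "to", "in", "for"}

def make_slug(title: str, max_length: int = 60) -> str:
    """Turn a page title into a short, hyphenated, lowercase URL slug."""
    words = re.findall(r"[a-z0-9]+", title.lower())    # keep only letters/digits
    words = [w for w in words if w not in STOP_WORDS]  # drop filler words
    slug = ""
    for word in words:
        candidate = f"{slug}-{word}" if slug else word
        if len(candidate) > max_length:                # respect the length budget
            break
        slug = candidate
    return slug

print(make_slug("What Is the Proper URL Length for SEO?"))
# -> "what-is-proper-url-length-seo"
```

CMSs and SEO plugins apply similar normalization when they generate permalinks, though their exact rules differ.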
-
The proper URL length for SEO is generally recommended to be under 256 characters. It's important to keep your URLs concise and descriptive. Short and relevant URLs tend to perform better in search engine rankings and are easier for users to remember and share. Including relevant keywords in your URL can also help search engines and users understand the content of the page. Additionally, using hyphens to separate words in the URL is preferred over underscores or other special characters. Overall, aim for clear, concise, and keyword-rich URLs that accurately represent the content of your web pages.
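If you want to check how an existing site measures up against a budget like this, a quick audit over the XML sitemap works well. A rough sketch; the sitemap URL is a placeholder and the threshold is arbitrary:

```python
from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder: point at your own sitemap
MAX_LENGTH = 75  # assumed budget; adjust to whatever limit you are targeting

def find_long_urls(sitemap_url: str, max_length: int) -> list[str]:
    """Return every <loc> URL in a standard XML sitemap longer than max_length."""
    with urlopen(sitemap_url) as response:
        tree = ElementTree.parse(response)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]
    return [u for u in urls if u and len(u) > max_length]

if __name__ == "__main__":
    for url in find_long_urls(SITEMAP_URL, MAX_LENGTH):
        print(f"{len(url):4d}  {url}")
```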
-
50-60 characters in a URL is good enough and will not be considered spam by Google. However, the vital aspect is how you use the keywords: whether they are elegantly placed or stuffed in. Try to be descriptive for the search engine, make the URL scannable, and break it down.
Aim for a low character count, because a shorter URL is less likely to be mistaken for spam.
-
Excessive length can be detected as spam, so you have to pay attention to it.
-
The optimal length is 50-60 characters. If you're using a plugin like Rank Math or Yoast, it will also tell you what's optimal.
I'm following Rank Math's guide to URL length, and it's working perfectly; I'm getting great results on my courier tracking website.
-
It is crucial to consistently conduct competitor analysis, paying close attention to the length of their URLs.
A common mistake that many people make is incorporating long-tail keywords into their URLs, which is not considered a good SEO practice.
Personally, I strive to limit my site article URLs to a maximum of 4-5 words. In certain cases where the search volume is relatively low, I may include additional words, but the general best practice is to keep the URL as short as possible.
Once again, I cannot emphasize enough the importance of competitor analysis in shaping your approach.
-
When it comes to URL length for SEO, there is no definitive answer. However, it's generally recommended to keep URLs concise, include relevant keywords, avoid excessive parameters and unnecessary characters, use hyphens as word separators, maintain consistency, and prioritize usability and readability. Remember, URL length is just one factor among many that affect SEO.
-
Somewhere up to 75 characters max, from what I read. Longer than that could cause some difficulties in ranking.
-
While the length of a URL can have some impact on search engine optimization (SEO), it is generally recommended to keep URLs concise and relevant to the content of the page. URLs with fewer words tend to be easier for users to read and remember, and they also tend to be more user-friendly for sharing and linking purposes.
The impact of URL length on SEO is relatively small compared to other factors such as the quality and relevance of the content on your website, backlinks, site speed, user experience, and overall website optimization.
In terms of your specific scenario, where your competitors have 8-character domain URLs and keywords with a length of 13, and your site has a 15-character domain URL and keywords of the same length, it's unlikely that the slight difference in URL length alone would significantly impact your search engine rankings.
Google's algorithms consider numerous factors when determining the relevance and ranking of a website, and URL length is just one of them. It's important to focus on creating high-quality content, using relevant keywords, and ensuring a positive user experience on your website. These factors are likely to have a more substantial impact on your search engine rankings than the length of your URL.
-
I have tried to use a proper URL length on my site, but in some instances long-tail keywords mess it up. Then you have no option but a longer-than-ideal URL.
-
But sometimes the long-tail keyword makes it difficult to keep the URL length short, for example "how many questions can you ask chatgpt".
-
When it comes to URL length in SEO (Search Engine Optimization), there is no strict rule for the maximum or ideal length. However, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines to consider:
Descriptive and Relevant: A URL should give users and search engines a clear idea of what the page is about. Including relevant keywords or a brief description of the content can help improve understanding and visibility.
Concise and Readable: Aim for shorter URLs that are easy to read and remember. Long, complex URLs can be confusing and difficult to share. Use hyphens (-) to separate words within the URL, and avoid using unnecessary characters, numbers, or special characters.
Avoid Keyword Stuffing: While it's important to include relevant keywords, avoid keyword stuffing in URLs. Maintain a natural flow and readability, and prioritize clarity over excessive keyword usage.
Maintain Consistency: Consistency in URL structure can benefit both users and search engines. Use a consistent format throughout your website, which can include using lowercase letters, eliminating unnecessary parameters, and organizing URLs in a logical and hierarchical manner.
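As a rough illustration of the consistency point (lowercase, no unnecessary parameters), here is a small normalization sketch using Python's standard library. The set of tracking parameters to strip is an assumption, and whether to lowercase paths is a per-site policy decision:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed tracking parameters to drop; extend this for your own analytics setup.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def normalize_url(url: str) -> str:
    """Lowercase the scheme, host, and path, and drop common tracking parameters."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme.lower(), netloc.lower(), path.lower(),
                       urlencode(kept), ""))

print(normalize_url("HTTPS://Example.com/Important-Page?utm_source=x&id=7"))
# -> "https://example.com/important-page?id=7"
```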
-
@calvinkj Always analyze your competitors and the length of their URLs.
Most people make the big mistake of adding a long-tail keyword to the URL, which isn't good SEO practice.
I always use a maximum of 4-5 words in the URL for my site's articles. In some articles where search volume is relatively low, I do add more words, but the best practice is to keep the URL as short as possible.
Again, competitor analysis is the key.
-
Some experience with words and hyphens in domain names:
I used a hyphenated site, www.octopus-energy-referral.co.uk, and it is not doing too well compared to a non-hyphenated name. Similarly, I have a site, www.octopuscode.co.uk, and it is doing really well compared to the hyphenated name because it is short and has fewer keywords.
I know this is not a forensic comparison, but I believe a non-hyphenated short name with fewer keywords is best if you have a choice.
-
If you haven't read this yet, please do (best practices for URLs).
So, it's a combination of things. As Devi Allen said, less is more. You want to use (and not over-use) descriptive words, separated by hyphens, "keeping URLs as simple, relevant, compelling, and accurate as possible". "To correctly render in all browsers, URLs must be shorter than 2,083 characters."
Which is better, your URL or your competitors? They sound pretty close based on your description but what matters is the actual words used in the URL, the site structure represented by that construct, whether the words truly represent what a visitor will find on the page, and whether the page content will provide visitors with the information they came looking for. URL length is but one of many factors that go into determining whether you or your competitor will rank higher.
-
You already answered it: fewer words are better.