What is the proper URL length in SEO?
-
I learned that having 50 to 60 words in a URL is OK and that having fewer words is preferred by Google.
But I plan to include keywords in my URLs, and I'm afraid that will increase the length. Is it going to hurt me slightly?
My competitors have an 8-character domain and keyword slugs of 13 characters,
and my site has a 15-character domain and keyword slugs of 13 characters.
Which one will Google prefer?
-
Well, to me a proper URL shouldn't be more than 50 characters in length. I mostly stay below that, or at most around 60, on my website Timetocare.
-
In terms of SEO (Search Engine Optimization), while there's no strict rule for the optimal URL length, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines and considerations:
-
Short and Descriptive:
- Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. Avoid unnecessary parameters or overly complex structures.
-
Keywords:
- Include relevant keywords in the URL, especially in the domain and the path. This can help search engines understand the topic of the page.
-
Readability:
- Keep URLs readable by using hyphens to separate words instead of underscores. For example, use "example.com/important-page" instead of "example.com/important_page."
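The hyphen convention above can be sketched as a tiny slug helper (a hypothetical example, not taken from any particular CMS):

```python
import re

def slugify(title):
    """Lowercase the title and join its words with hyphens,
    e.g. "Important Page" -> "important-page"."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)
```

For example, `slugify("Important Page")` yields `important-page`, which reads naturally in a URL like "example.com/important-page".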
-
Avoid Dynamic Parameters:
- If possible, avoid using dynamic parameters in URLs (e.g., "example.com/page?id=123"). Static, keyword-rich URLs are generally more SEO-friendly.
-
Consistency:
- Maintain consistency in your URL structure across your website. This helps both users and search engines navigate and understand the organization of your content.
-
301 Redirects for Changes:
- If you need to change a URL, use 301 redirects to inform search engines that the content has permanently moved. This preserves SEO value.
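As a minimal, framework-agnostic sketch of that idea (the paths here are made-up examples; a real site would configure redirects in its server or framework):

```python
# Map of permanently moved paths to their new locations (made-up examples).
REDIRECTS = {
    "/old-page": "/important-page",
    "/page?id=123": "/blue-widgets",
}

def resolve(path):
    """Return (status, location): 301 with the new URL if the path
    has permanently moved, otherwise 200 with the path unchanged."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

Serving a 301 this way tells crawlers the move is permanent, so the old URL's SEO value is passed to the new one.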
-
Limit Length:
- While there's no strict character limit for URLs, it's advisable to keep them reasonably short, ideally under 100 characters. Shorter URLs are easier to remember and share.
-
HTTPS:
- Use HTTPS for secure connections. Search engines tend to favor secure websites, and HTTPS is considered a ranking factor.
Remember that the primary goal is to create URLs that are user-friendly and provide a clear indication of the content. Search engines use URLs to understand the context and relevance of a page, so optimizing them for readability and keywords can positively impact your SEO efforts. Additionally, creating a logical URL structure helps users navigate your site more easily.
-
-
The ideal URL length for SEO is typically under 60 characters. Shorter URLs are easier for search engines to crawl and for users to read and remember. Keeping URLs concise, relevant to the page content, and including keywords can positively impact SEO performance. Avoid lengthy URLs with unnecessary parameters or characters.
-
An appropriate page URL is up to about 75 characters in length, and the maximum URL length supported in the address bar is around 2,049 characters.
-
In SEO, there is no strict rule for an ideal URL length, but it's generally recommended to keep URLs concise, relevant, and user-friendly. Here are some guidelines to consider:
Short and Descriptive: Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. A concise URL is easier to remember and share.
Include Keywords: If possible, include relevant keywords in your URL. This can contribute to the page's SEO, but don't over-optimize by stuffing too many keywords.
Avoid Dynamic Parameters: Clean, static URLs are preferred over URLs with dynamic parameters (e.g., "example.com/page?id=123"). Search engines prefer URLs that are easily readable and don't contain unnecessary parameters.
Hyphens Between Words: Use hyphens (-) rather than underscores (_) to separate words in the URL. Search engines treat hyphens as spaces, while underscores are not recognized as word separators.
Avoid Stop Words: Consider omitting unnecessary stop words (e.g., "and," "or," "but") from your URLs. Focus on the main keywords that represent the page's content.
Be Consistent: Maintain a consistent URL structure across your site. Consistency makes it easier for both users and search engines to navigate and understand your website.
HTTPS: Ensure that your URLs use the secure HTTPS protocol. Google tends to favor secure websites, and HTTPS is a ranking factor.
While there's no strict character limit for URLs, it's generally advisable to keep them under 255 characters. This is because longer URLs may be truncated in search results, making them less user-friendly.
Remember that user experience is crucial, so prioritize creating URLs that are easy to read and understand. Additionally, focus on providing valuable content on your pages, as content quality is a key factor in SEO.
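The checklist in this answer can be folded into a small audit helper. This is a hedged sketch: the 255-character ceiling and the other checks are this thread's rough guidelines, not official limits, and `audit_url` is a made-up name:

```python
from urllib.parse import urlparse

def audit_url(url, max_length=255):
    """Return a list of issues flagged by the guidelines above."""
    issues = []
    parsed = urlparse(url)
    if len(url) > max_length:
        issues.append("too long")
    if parsed.query:
        issues.append("dynamic parameters")
    if "_" in parsed.path:
        issues.append("underscores instead of hyphens")
    if parsed.scheme != "https":
        issues.append("not HTTPS")
    return issues
```

A clean URL such as `https://example.com/blue-widgets` comes back with no issues, while `http://example.com/my_page?id=1` trips the parameter, underscore, and HTTPS checks.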
-
The proper URL length for SEO is generally recommended to be under 256 characters. It's important to keep your URLs concise and descriptive. Short and relevant URLs tend to perform better in search engine rankings and are easier for users to remember and share. Including relevant keywords in your URL can also help search engines and users understand the content of the page. Additionally, using hyphens to separate words in the URL is preferred over underscores or other special characters. Overall, aim for clear, concise, and keyword-rich URLs that accurately represent the content of your web pages.
-
50-60 characters in a URL is good enough and will not be considered spam by Google. However, the vital aspect is how you use the keywords: whether they are elegantly placed or stuffed in. Try to be descriptive for the search engine, make the URL scannable, and break it down.
Try to aim for a low-character URL because it is less likely to be mistaken as spam.
-
Excessive URL length can be detected as spam, so you have to pay attention to it.
-
The optimal length is 50-60 characters. If you're using a plugin like Rankmath or Yoast, they will also tell you which is optimum.
I'm following Rank Math's guide to URL length and it's working perfectly, getting amazing results on my courier tracking website.
-
It is crucial to consistently conduct competitor analysis, paying close attention to the length of their URLs.
A common mistake that many people make is incorporating long-tail keywords into their URLs, which is not considered a good SEO practice.
Personally, I strive to limit my site article URLs to a maximum of 4-5 words. In certain cases where the search volume is relatively low, I may include additional words, but the general best practice is to keep the URL as short as possible.
Once again, I cannot emphasize enough the importance of competitor analysis in shaping your approach.
-
When it comes to URL length for SEO, there is no definitive answer. However, it's generally recommended to keep URLs concise, include relevant keywords, avoid excessive parameters and unnecessary characters, use hyphens as word separators, maintain consistency, and prioritize usability and readability. Remember, URL length is just one factor among many that affect SEO.
-
Somewhere up to 75 characters max, from what I read. Longer than that could cause some difficulties in ranking.
-
While the length of a URL can have some impact on search engine optimization (SEO), it is generally recommended to keep URLs concise and relevant to the content of the page. URLs with fewer words tend to be easier for users to read and remember, and they also tend to be more user-friendly for sharing and linking purposes.
The impact of URL length on SEO is relatively small compared to other factors such as the quality and relevance of the content on your website, backlinks, site speed, user experience, and overall website optimization.
In terms of your specific scenario, where your competitors have 8-character domain URLs and keywords with a length of 13, and your site has a 15-character domain URL and keywords of the same length, it's unlikely that the slight difference in URL length alone would significantly impact your search engine rankings.
Google's algorithms consider numerous factors when determining the relevance and ranking of a website, and URL length is just one of them. It's important to focus on creating high-quality content, using relevant keywords, and ensuring a positive user experience on your website. These factors are likely to have a more substantial impact on your search engine rankings than the length of your URL.
-
I have tried to use proper URL lengths on my site, but in some instances long-tail keywords mess it up. Then you have no option but to use a longer-than-ideal URL.
-
But sometimes the long-tail keyword makes it difficult to keep the URL short, for example "how many questions can you ask chatgpt".
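One way to tame such a long-tail phrase is to drop stop words and cap the word count, as several replies here suggest. A hypothetical sketch (the stop-word list is illustrative, not canonical):

```python
import re

# Illustrative stop-word list; a real one would be longer.
STOP_WORDS = {"a", "an", "and", "but", "can", "for", "how", "of", "or", "the", "to", "you"}

def short_slug(phrase, max_words=5):
    """Drop stop words and keep at most max_words, joined by hyphens."""
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    return "-".join(kept[:max_words])
```

With these assumptions, `short_slug("how many questions can you ask chatgpt")` comes out as `many-questions-ask-chatgpt`, which keeps the distinctive keywords while shedding filler.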
-
When it comes to URL length in SEO (Search Engine Optimization), there is no strict rule for the maximum or ideal length. However, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines to consider:
Descriptive and Relevant: A URL should give users and search engines a clear idea of what the page is about. Including relevant keywords or a brief description of the content can help improve understanding and visibility.
Concise and Readable: Aim for shorter URLs that are easy to read and remember. Long, complex URLs can be confusing and difficult to share. Use hyphens (-) to separate words within the URL, and avoid unnecessary characters, numbers, or special symbols.
Avoid Keyword Stuffing: While it's important to include relevant keywords, avoid keyword stuffing in URLs. Maintain a natural flow and readability, and prioritize clarity over excessive keyword usage.
Maintain Consistency: Consistency in URL structure can benefit both users and search engines. Use a consistent format throughout your website, which can include using lowercase letters, eliminating unnecessary parameters, and organizing URLs in a logical and hierarchical manner.
-
@calvinkj Always analyze your competitors and analyze the length of their URLs.
Most people make the big mistake of adding a long-tail keyword to the URL, which isn't good SEO practice.
I always use a maximum of 4-5 words in the URL for my site's articles. In some articles where search volume is relatively lower I do add more words, but the best practice is to keep the URL as short as possible.
Again, competitor analysis is the key
-
Some experience with words and hyphens in domain names:
I used a hyphenated site, www.octopus-energy-referral.co.uk, and it is not doing too well compared to a non-hyphenated name. Similarly, I have a site, www.octopuscode.co.uk, and it is doing really well compared to the hyphenated name because it is short and has fewer keywords.
I know this is not a forensic comparison, but I believe a non-hyphenated short name with fewer keywords is best if you have a choice.
-
If you haven't read this yet, please do (best practices for URLs).
So, it's a combination of things. As Devi Allen said, less is more. You want to use (and not over-use) descriptive words, separated by hyphens, "keeping URLs as simple, relevant, compelling, and accurate as possible". "To correctly render in all browsers, URLs must be shorter than 2,083 characters."
Which is better, your URL or your competitors? They sound pretty close based on your description but what matters is the actual words used in the URL, the site structure represented by that construct, whether the words truly represent what a visitor will find on the page, and whether the page content will provide visitors with the information they came looking for. URL length is but one of many factors that go into determining whether you or your competitor will rank higher.
-
You already answered it: fewer words are better.