URL or Domain length
-
Hi All,
I am wondering whether Google still gives weight to the length of a domain or URL. If so, what is an acceptable length for a domain and for a URL?
Many Thanks!
-
Thank you guys for the responses. I will take your advice into consideration.
-
The longer the domain name, the harder it is to remember or type in.
URLs should be kept as short as possible while still being descriptive, without keyword stuffing; that includes directories created just for the sake of adding keywords. People have run experiments showing that Google will index pages with extremely long URLs, but that doesn't mean long URLs are good.
Basically, just use common sense when purchasing domains and naming pages. Which looks better to you:
or
www.wayneinspections.com/10-tips-on-choosing-the-right-home-inspector-in-nj
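If you generate page slugs programmatically, the same common-sense rule can be sketched in a few lines of Python. This is only an illustration: the 60-character cap is an arbitrary choice for the example, not a limit Google publishes.

```python
import re

def make_slug(title: str, max_len: int = 60) -> str:
    """Build a short, descriptive URL slug from a page title."""
    # Keep letters and digits, hyphenate everything else.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    if len(slug) > max_len:
        # Truncate at a word boundary rather than mid-word.
        slug = slug[:max_len].rsplit("-", 1)[0]
    return slug

print(make_slug("10 Tips on Choosing the Right Home Inspector in NJ"))
# 10-tips-on-choosing-the-right-home-inspector-in-nj
```

The point is that a slug built this way stays readable and descriptive without any extra keyword directories bolted on.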
-
Normally I would go with a URL that is less than 20 characters, as most people can remember 20 characters or fewer. The recommendations I have seen range from 3 to 40 characters, but as far as I know Google does not publish a recommended length. Remember, though, that Google looks for a good user experience, and a shorter URL is easier for people to remember, which in turn makes for a better user experience.
-
I don't think domain length itself is a problem, but you should avoid a domain name that is an exact match for the keyword or phrase you want to target.
Related Questions
-
URL indexed but not submitted in sitemap, however the URL is in the sitemap
Dear Community, I have the following problem, and it would be super helpful if you could help. Cheers.

Symptoms: In Search Console, Google says that some of our old URLs are "indexed, not submitted in sitemap". However, those URLs are in the sitemap, and the sitemap has been submitted successfully with no error message.

Potential explanation: We have an automatic cache-clearing process that runs once a day, and we use that as the last-modification date in the sitemap. Imagine www.example.com/hello was last modified in 2017; because the cache is cleared daily, the sitemap will say "last modified: yesterday" even though the content of the page has not changed since 2017. We also have a Z after the sitemap time. Could it be that the bot does not understand the time format? Finally, the sitemap contains only HTTP URLs, and our HTTPS URLs are not in the sitemap. What do you think?
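On the time-format question: a trailing "Z" in a sitemap `lastmod` value is valid W3C Datetime notation (it simply means UTC), so the format itself is unlikely to be the problem. A quick sketch in Python, with the caveat that `sitemap_lastmod` and its parameter names are illustrative, not a real API:

```python
from datetime import datetime, timezone

# A trailing "Z" in <lastmod> is valid W3C Datetime notation and simply
# means UTC, so the time format itself should not confuse Googlebot.
lastmod = "2017-06-02T04:20:00Z"
parsed = datetime.strptime(lastmod, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
print(parsed.isoformat())  # 2017-06-02T04:20:00+00:00

def sitemap_lastmod(content_modified: datetime, cache_cleared: datetime) -> str:
    """Emit <lastmod> from the date the content actually changed,
    deliberately ignoring the daily cache-clear timestamp."""
    return content_modified.strftime("%Y-%m-%dT%H:%M:%SZ")
```

In other words, the more likely fix is on the generation side: emit the content's real modification date rather than the cache-clear date.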
Intermediate & Advanced SEO | ZozoMe0
-
Why is a canonicalized URL still in index?
Hi Mozers, we recently canonicalized a few thousand URLs, but when I search for these pages using the site: operator I can see that they are all still in Google's index. Why is that? Is it reasonable to expect them to be taken out of the index, or should we only expect that they won't rank as high as the canonical URLs? Thanks!
Intermediate & Advanced SEO | yaelslater0
-
Duplicate URLs ending with #!
Hi guys, does anyone know why a site can contain duplicate URLs ending with a hashbang (hashtag and exclamation mark), e.g. https://site.com.au/#! ? We are finding a lot of these URLs (as duplicates), and I was wondering what causes them, from a developer's standpoint. And do you think it's worth the time and effort adding a rel=canonical tag or a 301 to these URLs, even though they're not getting indexed by Google? Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright0
-
Does Google read URLs if they include a # tag? Re: SEO value of clean URLs
An ECWID rep stated, in response to an inquiry about how ECWID URLs are not customizable, that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 — basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891. That is a snippet from a conversation where ECWID said that dirty URLs don't matter beyond a hashtag. Is that true? I haven't found any rule saying that Google or other search engines (Google is really the most important) don't index, read, or place value on the part of the URL after a # tag.
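For what it's worth, the fragment really is a client-side construct: browsers never send anything after the # to the server (Google's old #! AJAX-crawling scheme, which treated these specially, was deprecated in 2015). You can see what a URL parser makes of the example URL from the question with Python's standard library:

```python
from urllib.parse import urlsplit

url = ("http://www.runningboards4less.com/general-motors"
       "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891")

parts = urlsplit(url)
# An HTTP request line is built from path (+ query) only, so the server
# receives "GET /general-motors"; the fragment never leaves the client.
print(parts.path)      # /general-motors
print(parts.fragment)  # !/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
```

So whether the content behind the fragment gets indexed depends on whether the site also exposes it at a crawlable path, not on the fragment itself.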
Intermediate & Advanced SEO | Atlanta-SMO0
-
Community inside the domain or in a separate domain
Hi there, I work for an ecommerce company as an online marketing consultant. They make kitchenware, microware and so on. They are reviewing their overall strategy, and as part of that they want to build up a community.

Ideally, they would want the community on a separate domain. This domain wouldn't carry the brand's logo, and the community wouldn't promote the brand itself; the brand would post content occasionally and link to the store domain. The reasoning behind this approach is to stay out of the community users' way, plus the fact that the branded traffic acquired doesn't end up buying at the store.

I like this approach, but I am concerned because the brand is not big enough to run two separate domains and lose all the authority associated with one strong domain. I would definitely keep everything under the same domain, store and community; otherwise we would have to acquire traffic for two domains.

1. What do you think of the two scenarios, one domain versus two? Which is better?
2. Do you know any examples of ecommerce companies with successful communities within the store domain?

Thanks and regards
Intermediate & Advanced SEO | footd0
-
Why do some domains outrank stronger authority domains
Hi, taking the Moz stats into account, how come domains with weak Moz stats sometimes outrank domains with strong Moz stats? For example, an inner page with DA 56 / PA 40 is outranking a Wikipedia inner page with DA 100 / PA 82. That's a massive difference, basically twice as strong on the Wikipedia side, yet it is being outranked. In this case I assume on-page SEO is playing a big part, but can on-page optimisation really be that powerful? I see this all the time; what SEO factors cause it? Thanks.
Intermediate & Advanced SEO | Bondara0
-
Redirecting Powerful Domains
What do you do if you have a client that never implemented a 301 redirect on their domain? For example, here are the OSE stats for the two URLs:

http://url.com - PA: 48, DA: 50, LRD: 65, TL: 1,084, FB: 178, FB: 14, T: 5
http://www.url.com - PA: 51, DA: 50, LRD: 165, TL: 2,271, FB: 178, FB: 14, T: 5, G+1: 3

My first instinct is to redirect the first one to the second, but is it too late for that? Will that wipe out their established stats? Any input or examples of past experiences with this would be great.
Intermediate & Advanced SEO | MichaelWeisbaum0
-
Domain migration strategy
Imagine you have a large site on an aged and authoritative domain. For commercial reasons the site has to be moved to a new domain, and in the process it is going to be revamped significantly. Not an ideal starting scenario, obviously, to be biting off so much all at once, but unavoidable.

The plan is to run the new site in beta for about 4 weeks, giving users the opportunity to play with it and provide feedback. After that there will be a hard cut-over, with all URLs permanently redirected to the new domain. The hard cut-over is necessary for business continuity reasons, and because of the real complexity of trying to maintain a complex UI and client reporting across multiple domains. Of course we'll endeavour to mitigate the impact of the change by telling Google about it in WMC, monitoring crawl errors, etc.

My question is whether we should allow the new site to be indexed during the beta period. My gut feeling is yes, for the following reasons:

1. It's only 4 weeks, and until we start redirecting the old site the new domain won't have much whuffie, so there's next to no chance the site will rank for much.
2. It gives Googlebot a head start on indexing a lot of URLs, so they won't all be new when we cut over the redirects.

Is that sound reasoning? Is the duplication during that 4-week beta period likely to have some negative impact that I am underestimating?
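One sanity check that helps with a hard cut-over of this size is validating the old-to-new URL map for redirect chains and loops before going live, so every old URL 301s straight to its final target. A minimal sketch (the URLs and function name are hypothetical):

```python
def check_redirect_map(redirects: dict[str, str]) -> list[str]:
    """Flag self-redirects and chains (a target that is itself
    redirected) in an old-URL -> new-URL map before cut-over."""
    problems = []
    for old, new in redirects.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirects:
            problems.append(f"chain: {old} -> {new} -> {redirects[new]}")
    return problems

redirects = {
    "https://old-domain.com/about": "https://new-domain.com/about-us",
    "https://old-domain.com/about-us": "https://new-domain.com/about-us",
}
print(check_redirect_map(redirects))  # [] -> every old URL 301s straight to its target
```

Collapsing chains matters because each extra hop delays Googlebot's discovery of the final URL during exactly the window when you want consolidation to happen fast.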
Intermediate & Advanced SEO | Charlie_Coxhead0