Is using dots in the URL path really a problem?
-
We have a couple of pages with a dot in the URL path, like
domain.com/mr.smith/widget-mr.smith
It displays fine in Chrome, Firefox, and IE, and for the user it may actually look better than replacing the dot with _ or -.
Has this ever caused problems for anybody?
Is there any statement from Google about it?
Should I change existing URLs? If so, which other characters can I use in the URL instead of underscore and dash, since in our system dash and underscore are already used for rewriting other characters?
Thanks
-
Hi Andrews,
While the difference between dashes and underscores used to be a big issue a few years back, it's something that seems to hold minimal merit now. The two can be used more or less interchangeably without any major impact. This was phased out around the same time as exact-match-domain value, as far too many people were abusing the long-tail dashed-page method.
-
While I've never come across this exact problem before, I can share one of my mantras that applies here:
"If a system (browser, search engine, etc.) needs to perform a data rewrite, you aren't accessible enough."
Google loves accessibility. It always wants the user to be able to easily access information, and it wants its spiders to be able to easily index and categorize that information. When accessibility factors such as JavaScript-dependent content or the use of Flash have an impact, it logically follows that more obvious structural access issues come into play as well.
From a technology standpoint, I can tell you that the "." traditionally carries special meaning in URL and file structures: it marks file extensions, and the path segments "." and ".." are reserved for relative references, so your system is presumably rewriting the structure to display those dots. It's much like internationalized domain names: the Chinese extension .中国 is essentially a display decoding of the Punycode form xn--fiqs8s. For the sake of accessibility, proper structural formatting, and system practicality, you should avoid non-standard characters such as the "." in your URLs.
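To make the encoding point concrete, here is a small sketch in standard-library Python (illustrative only) showing the gap between what a browser displays and what is actually transmitted, and also that the dot itself survives percent-encoding untouched:

import urllib.parse

# IDN example from above: browsers display 中国, but the wire format
# is the Punycode/ASCII form.
print("中国".encode("idna"))            # b'xn--fiqs8s'

# Percent-encoding a path segment: RFC 3986's unreserved set is
# letters, digits, and "-", ".", "_", "~", so the dot passes through
# unchanged while a space does not.
print(urllib.parse.quote("mr.smith"))   # 'mr.smith'
print(urllib.parse.quote("mr smith"))   # 'mr%20smith'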
-
Hi!
As far as I know, this really isn't a huge problem (though I could be mistaken). I guess it depends...
Regarding readability, I prefer dashes (-), as they tend to be easier to read (underscores may be mistaken for spaces). Here's what Matt Cutts had to say about this some years ago: http://www.mattcutts.com/blog/whitehat-seo-tips-for-bloggers/ (and http://www.mattcutts.com/blog/dashes-vs-underscores/)
I believe I have read that Google and other search engines read URLs like this when looking for semantic meaning:
- /this-is-part-of-a-website-address = this is part of a website address
- /this_is_part_of_a_website_address = thisispartofawebsiteaddress
At least, that used to be the case... it could have changed by now.
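As a rough illustration of why the two read differently to a word-based tokenizer (a simplified model, not Google's actual parser), note that in regular expressions \w covers letters, digits, and the underscore, so underscores glue words together while dashes split them:

import re

print(re.findall(r"\w+", "this-is-part-of-a-website-address"))
# ['this', 'is', 'part', 'of', 'a', 'website', 'address']
print(re.findall(r"\w+", "this_is_part_of_a_website_address"))
# ['this_is_part_of_a_website_address']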
In your example I would not obsess too much about it, as the URL carries perfectly good semantic meaning. Have you considered removing special characters instead of replacing them with a "-"?
Hope this helps.
Best regards,
Anders
-
Related Questions
-
Consolidate URLs on WordPress?
Hi guys, a WordPress site we are working with currently has multiple versions of each URL per page. See screenshot: https://d.pr/i/ZC8bZt Data example: https://tinyurl.com/y8suzh6c Right now the non-https versions redirect to the equivalent https versions, while some of the https versions don't redirect and return status code 200. We want all of them to redirect to the highlighted blue version (row a). Is this easily doable in WordPress, and how would one go about it? Cheers.
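The mapping being asked for, where every protocol and host variant collapses onto one canonical https URL, can be sketched like this (Python, purely to illustrate the intent; the real redirects would live in WordPress or the server config, and the host name here is a placeholder):

from urllib.parse import urlsplit, urlunsplit

def canonical(url, host="www.example.com"):  # placeholder canonical host
    # Collapse scheme and host variants onto a single https URL,
    # keeping the path and query intact.
    p = urlsplit(url)
    return urlunsplit(("https", host, p.path, p.query, ""))

print(canonical("http://example.com/page/"))  # https://www.example.com/page/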
Intermediate & Advanced SEO | wickstar
-
What is the redirect rule for redirecting https URLs to the corresponding https URLs on a new domain?
Two sites have the same URLs, but the owner wants just the one site, so I will be doing a 301 redirect with .htaccess from https://www.example.co.uk/sportsbook/SOCCER/today/ to https://www.example.com/sportsbook/SOCCER/today/. There are a lot of URLs that are the same, so I was wondering what rule to put in the file that will change them all to the corresponding URLs. Would this be correct?
RewriteEngine on
RewriteCond %{HTTPS_HOST} ^example.co.uk [NC,OR]
RewriteCond %{HTTPS_HOST} ^www.example.co.uk [NC]
RewriteRule ^(.*)$ https://example.com$1 [L,R=301,NC]
Or would a simple rule like this work?
redirect 301 / http://www.newdomain.com/
If not correct, could you please give me the correct rule? Then of course I'd do a change of address in Webmaster Tools after. Also... do I still need to set up forwarding from the https://www.example.co.uk/ domain provider afterwards as well? Many thanks for your help in advance.
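For reference, the substitution those rules are meant to perform can be sanity-checked outside Apache (a Python sketch using the question's own URLs; it models only the intended old-host to new-host mapping, not the .htaccess syntax itself):

import re

old = "https://www.example.co.uk/sportsbook/SOCCER/today/"
new = re.sub(r"^https://(www\.)?example\.co\.uk", "https://example.com", old)
print(new)  # https://example.com/sportsbook/SOCCER/today/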
Intermediate & Advanced SEO | WSIDW
-
Should I include URLs that are 301'd or only include 200 status URLs in my sitemap.xml?
I'm not sure whether I should include old URLs (content) that are being 301-redirected to new URLs (content) in my sitemap.xml. Does anyone know if it is best to include or leave out 301-redirected URLs in an XML sitemap?
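One way to audit a URL list before it goes into the sitemap is to keep only the addresses that answer 200 directly, since a 301 source just points elsewhere (a sketch assuming the third-party requests library, not a complete tool):

import requests

def sitemap_candidates(urls):
    # Keep URLs that answer 200 without redirecting; redirect
    # sources would just send crawlers somewhere else.
    for url in urls:
        r = requests.head(url, allow_redirects=False, timeout=10)
        if r.status_code == 200:
            yield url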
Intermediate & Advanced SEO | Jonathan.Smith
-
Problem with Google finding our website
We have an issue with Google finding our website: (URL removed) When we google "(keyword removed)" in google.com.au, our website doesn't come up anywhere, despite having a suitable title tag and on-site copy for SEO. We found this strange and investigated further. When we google the website URL itself in google.com.au, our site appears at the top, but with this description: "A description for this result is not available because of this site's robots.txt – learn more." We can also see that the wrong title tag is appearing. From this we assumed there must be an issue with the robots.txt file, so we put up a new robots.txt file: (URL removed) This hasn't solved the problem, though, and we still have the same issue. We suspect there may be another robots.txt file we can't find that is causing issues, or something else we're not sure of. We want to get to the bottom of it so the site can be found properly. Any help here would be most appreciated!
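A quick way to test what a crawler is currently allowed to fetch is the robots.txt parser in Python's standard library (a sketch; substitute the real domain for the placeholder):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
rp.read()
# False here would confirm that the live robots.txt is blocking Google's crawler.
print(rp.can_fetch("Googlebot", "https://www.example.com/"))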
Intermediate & Advanced SEO | Gavo
-
Question about using an abbreviation
Hello, my domain name contains an abbreviation. For a page URL, do you recommend using the full word (the one whose shortened form appears in the domain name) in the page name? Or, when the domain name contains the abbreviation, is using the full word in a page name not a good idea? It comes down to how reliably Google recognizes the abbreviation as the full word and gives it the same value. Do I risk anything by not using the full word? Hope I made myself clear :) Thanks.
Intermediate & Advanced SEO | mdmoz
-
For those of you who have used LINK DETOX:
Did you go ahead and remove all the TOXIC and HIGH RISK links? Just the toxic? Were you successful with the tool?
Intermediate & Advanced SEO | netviper
-
Overly-Dynamic URL
Hi, we have over 5,000 pages showing under the Overly-Dynamic URL error. Our ecommerce site uses Ajax, and we have several different filters like size, color, and brand; we therefore have many different URLs like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there, like:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should be disallowed? So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct. Any help would be greatly appreciated. Thank you, Kami
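The patterns above all target filter parameters, so one way to sanity-check them is to model the intent directly (a Python sketch, with the parameter list taken from the question's own URLs; the helper name is purely illustrative):

from urllib.parse import urlsplit, parse_qsl

FILTER_PARAMS = {"sort", "sort_direction", "use_selected_filter", "view", "clearall"}

def caught_by_filters(url):
    # True if the URL carries any of the filter parameters the
    # Disallow patterns are trying to match.
    params = {k for k, _ in parse_qsl(urlsplit(url).query)}
    return bool(params & FILTER_PARAMS)

print(caught_by_filters("http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1"))  # True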
Intermediate & Advanced SEO | dellamoda
-
Long URL with QueryStrings
Hi, I have a search page that generates some query strings (with the term, current page, number of pages, etc.). Is this long URL bad for Google indexing? Thanks.
Intermediate & Advanced SEO | GDB