Special characters in URL
-
Hi there,
We're in the process of changing our URL structure to be more SEO-friendly. Right now I'm struggling to find a good way to handle slashes that are part of a targeted keyword.
For example, if I have a product page and my product title is "1/2 ct Diamond Earrings in 14K Gold" which of the following URLs is the right way to go if I'm targeting the product title as the search keyword?
- example.com/jewelry/1-2-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/12-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/1_2-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/1%2F2-ct-diamond-earrings-in-14k-gold
Thanks!
-
Jonaz, just to add to what the others have said:
#1 would be the most logical answer.
A forward slash ( / ) indicates a new directory level in a URL, so you can't use it literally.
% is reserved for percent-encoding, so you shouldn't use that either.
An underscore ( _ ) joins the characters into a single word, so "1_2" would be treated as one token.
And "12-ct" would simply be wrong - it reads as twelve carats, not one half.
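To make the slash point concrete, here is a small Python illustration using the title from the question. It only shows how a literal "/" has to be percent-encoded when kept inside a single path segment, which is where option 4 comes from; it is not a recommendation to do so.

```python
from urllib.parse import quote

title = "1/2 ct Diamond Earrings in 14K Gold"

# By default quote() treats "/" as a path separator and leaves it alone,
# which would split the slug into two directory levels.
print(quote(title))
# 1/2%20ct%20Diamond%20Earrings%20in%2014K%20Gold

# Encoding the slash as well gives the %2F form from option 4.
print(quote(title, safe=""))
# 1%2F2%20ct%20Diamond%20Earrings%20in%2014K%20Gold
```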
-
It doesn't seem to, no.
-
Quick follow-up question: does Google treat "half" and "1/2" as the same phrase?
-
You could totally replace common occurrences:
- 1/2 = half
- 1/4 = quarter
- 1/3 = third
- etc.
Then just remove the less common ones entirely.
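A minimal sketch of that approach in Python; the fraction-to-word map and the slugify helper are just illustrative names, not anything Google prescribes:

```python
import re

# Map the common fractions to words; anything not in the map simply falls
# back to hyphen-separated digits (i.e. option 1 from the question).
FRACTION_WORDS = {
    "1/2": "half",
    "1/4": "quarter",
    "1/3": "third",
    "3/4": "three-quarter",
}

def slugify(title: str) -> str:
    slug = title.lower()
    for fraction, word in FRACTION_WORDS.items():
        slug = slug.replace(fraction, word)
    # Collapse every remaining non-alphanumeric run into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("1/2 ct Diamond Earrings in 14K Gold"))
# half-ct-diamond-earrings-in-14k-gold
```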
-
I personally would go with #1, and definitely not #4 - you never want special characters in the URL. The reason I say #1 is that it keeps the 1 and the 2 of your 1/2 separated. #2 could be misread as a 12 ct diamond earring - wow. As for #3, I avoid underscores in URLs altogether.
To sum up, my choice is #1. It looks the cleanest, and when you optimize your page copy with the "1/2 ct" wording, Google is smart enough to make the connection. Overall, it probably won't make a huge difference in the end.
Related Questions
-
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content because there was no place to redirect to. Unfortunately, all of these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and a Screaming Frog crawl shows none of these old pages being linked to on our domain. We are considering a few approaches to get Google to drop these pages from the SERPs and would welcome your input:
- Remove the 301 redirects entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). Anyone visiting those URLs would no longer be forwarded, but Google may not drop the redirects from the SERPs otherwise.
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
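Before choosing between those options, it can help to confirm what each old URL actually returns today. A minimal sketch, assuming the old URLs are collected one per line in a plain text file (the file name is a placeholder):

```python
import requests

# Print the status code for each old URL and, for redirects, the target.
with open("old-urls.txt") as handle:
    for url in (line.strip() for line in handle if line.strip()):
        response = requests.head(url, allow_redirects=False, timeout=10)
        print(response.status_code, url, response.headers.get("Location", ""))
```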
-
Category URL Pagination where URLs don't change between pages
Hello, I am working on an e-commerce site that has categories with multiple pages. To avoid pagination issues I was thinking of using rel=next and rel=prev plus canonical tags. I noticed a site where the URL doesn't change between pages, so whether you're on page 1, 2, or 3 of the same category, the URL stays the same. Would this be a cleaner way of dealing with pagination?
-
Is it Detrimental to Repeat a Word in Our URL?
Hey guys! We run a tour company in Barcelona. Our company name is Barcelona Experience. We're customizing our URLs to include keywords, which can also be found in all the important areas of the page (title tag, meta description, etc.). We want to change "www.barcelonaexperience.com/bike-tours" to "www.barcelonaexperience.com/barcelona-bike-tours", but we're worried the repetition of "barcelona" could be a bad thing. True, or not true? Thanks!
-
XML Sitemap and unwanted URL parameters
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it would require time and effort that I currently don't have. I also think that having a sitemap could help us on Bing. So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? Thanks!
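One low-effort way to clean the export before submitting it is to strip the query strings from every <loc> entry. A minimal sketch, assuming a standard single-file sitemap (the file names are placeholders); note that it drops every query string, so if some parameters must survive they would need to be filtered individually:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit, urlunsplit

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

tree = ET.parse("sitemap.xml")
for loc in tree.getroot().iter(f"{{{NS}}}loc"):
    parts = urlsplit(loc.text.strip())
    # Rebuild the URL without the query string (e.g. ?ref=...) or fragment.
    loc.text = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

tree.write("sitemap-clean.xml", encoding="utf-8", xml_declaration=True)
```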
-
Cyrillic letter in URL - Encoding
Hi all, we are launching our site in Russia. As far as I can see by searching Google, all sites have URLs in Latin letters. Is there a special reason for this? Cyrillic letters also seem to work. My technical staff say it might cause some encoding problems. Can anyone give me some insight into this? Thanks in advance. / Kenneth
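For what it's worth, Cyrillic URLs do work, but browsers and crawlers transform them before sending the request: the path is percent-encoded as UTF-8 and the hostname is converted to its ASCII (punycode) form, which is why such URLs can look "encoded" in logs or older browsers. A small illustration (the path and domain below are made-up examples):

```python
from urllib.parse import quote

# The Cyrillic path segments are percent-encoded as UTF-8 on the wire.
print(quote("/категория/серьги"))
# /%D0%BA%D0%B0%D1%82%D0%B5%D0%B3%D0%BE%D1%80%D0%B8%D1%8F/%D1%81%D0%B5%D1%80%D1%8C%D0%B3%D0%B8

# The hostname is converted to punycode via IDNA.
print("пример.рф".encode("idna").decode("ascii"))
# xn--e1afmkfd.xn--p1ai
```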
-
URL rewriting from subcategory to category
Hello everybody! I have a fairly simple question about URL rewriting from a subcategory to a category, yet I can't find a solution (due to my limited Apache knowledge). Here is the problem: we have two URL structures that cause duplicate-content issues:
1. www.website.lt/language/category/
2. www.website.lt/language/category/1/
Pages 1 and 2 are exactly the same (both also return 200 OK). What we need is a 301 redirect from 2 to 1, without redirecting any deeper categories (such as www.website.com/language/category/1/169/ redirecting to .../category/1/ or .../category/). Here are our .htaccess URL rewrite rules:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&par4=$6&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/$ /index.php?lang=$1&idr=$2&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/$ /index.php?lang=$1&%{QUERY_STRING} [L]
There are other redirects that handle non-www to www and related issues:
RedirectMatch 301 ^/lt/$ http://www.domain.lt/
RewriteCond %{HTTP_HOST} ^domain.lt
RewriteRule (.*) http://www.domain.lt/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.lt/$1/ [R=301,L]
At the moment we cannot solve this with rel=canonical (due to the limits of our CMS). Thanks for your help! If you need any other details about our code, just let me know.
-
Double byte characters in the URL - best avoided?
We are doing some optimisation on sites in the APAC region, namely China, Hong Kong, Taiwan and Japan. We have set the URL generator to automatically use the heading of the page in the URL, which works fine for countries using Latin characters but is causing problems, particularly in IE, for the double-byte languages. For some reason IE struggles with double-byte characters and displays the URLs in their rather ugly, encoded form. Does anybody have any suggestions on whether we should persist with the keyword URLs or revert to non-descriptive URLs for the double-byte languages? I ask because it's a balance of SEO benefit versus not scaring IE users off with ugly URLs that look dreadful and spammy.