What is the best way to handle special characters in URLs
-
What is the best way to handle special characters? We have some URLs that use special characters, and when a sitemap is generated using Xenu it changes the characters to something different. Do we need to physically change the URL back to display the correct character?
Example:
URL: http://petstreetmall.com/Feeding-&-Watering/361.html
Sitemap Link: http://www.petstreetmall.com/Feeding-%26-Watering/361.html
-
As far as "search-engine-spider-stoppers" go, does this also apply to "%" in on-page copy?
-
Hi there,
The special characters in your website's URLs are changed because URLs are sent over the Internet using the ASCII character set, and characters outside the safe set (or with a reserved meaning in URLs, like &) are percent-encoded. That's why the & is converted to %26.
If you're trying to make your site rank, it is better to simplify or shorten the URL, because search engines can have trouble indexing pages whose URLs contain special characters.
The special characters below are known to be "search-engine-spider-stoppers":
- ampersand (&)
- dollar sign ($)
- equals sign (=)
- percent sign (%)
- question mark (?)
Hope that helps!
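As an illustration (my sketch, not part of the original answer), Python's standard library shows exactly how this percent-encoding works, using the URL from the question:

```python
from urllib.parse import quote, unquote

# "&" has a reserved meaning in URLs, so tools like Xenu percent-encode
# it as %26 (0x26 is the ASCII code for "&").
path = "/Feeding-&-Watering/361.html"
encoded = quote(path)   # "&" becomes "%26"; "/" is left alone by default
print(encoded)          # /Feeding-%26-Watering/361.html

# Decoding reverses it, so both forms refer to the same resource.
print(unquote(encoded))  # /Feeding-&-Watering/361.html
```

Both forms are equivalent to a correctly behaving server, which is why the sitemap link still works; the encoded form is simply the on-the-wire representation.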
Related Questions
-
Changing URLs: from a short, well-optimised URL to a longer one – what's the traffic risk?
I'm working with a client whose website is relatively well optimised, though it has a pretty flat structure and a lot of top-level pages. They've invested in their content over the years and managed to rank well for key search terms. They're currently changing CMS, and as a result of the new folder structure the URLs for some pages look to have changed significantly. E.g. the existing URL is website.com/grampians-luxury-accommodation, which ranked quite well for "luxury accommodation grampians"; the new URL when the site launches on the new CMS would be website.com/destinations/victoria/grampians. My feeling is that the client is going to lose a bit of traffic as a result. I'm looking for information, methods, or case studies to demonstrate the degree of risk, and to help make a recommendation to mitigate it.
Intermediate & Advanced SEO | moge
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub-URLs from pagination are showing as 404s. Should we 410 these sub-URLs?
Hi everyone! We recently 410'ed some URLs to decrease the number of URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are showing as 404s in Google. These sub-URLs were canonical to the main URLs and not included in our sitemap. Ex: We assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
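The fix being asked about can be sketched as follows (my hypothetical illustration, not code from the thread): if a parent URL has been deliberately removed, the server should answer 410 for its pagination sub-URLs too, rather than falling through to 404:

```python
# Hypothetical helper: given the set of paths that were deliberately
# removed (410 Gone), decide what status a requested path should get.
# Pagination sub-URLs such as /url/page1 inherit the parent's 410.

GONE_PATHS = {"/url"}  # parents that were 410'ed


def status_for(path):
    # Exact match: the removed page itself.
    if path in GONE_PATHS:
        return 410
    # Anything nested under a removed parent (e.g. /url/page1)
    # should also signal "gone", not "not found".
    if any(path.startswith(gone + "/") for gone in GONE_PATHS):
        return 410
    return 200  # everything else is served normally


print(status_for("/url"))        # 410
print(status_for("/url/page1"))  # 410
print(status_for("/other"))      # 200
```

In practice this kind of rule would live in the server or framework routing layer; the sketch only shows the status-code logic.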
Intermediate & Advanced SEO | jeffchen
-
What are the best practices for microdata?
Not too long ago, Dublin Core was all the rage. Then Open Graph data exploded, and Schema seems to be highly regarded. In a best-case scenario, on a site that's already got the basics like good content, clean URLs, rich and useful page titles and meta descriptions, well-named and alt-tagged images and document outlines, what are today's best practices for microdata? Should Open Graph information be added? Should the old Dublin Core be resurrected? I'm trying to find a way to keep markup light and minimal, but include enough microdata for crawlers to get a better sense of the content and its relationships to other subdomains and sites.
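For context (my illustrative sketch, not part of the question): the Schema.org markup mentioned above is most often emitted today as a JSON-LD block, which keeps the markup light because it lives in one script tag instead of attributes scattered through the HTML. The values below are invented:

```python
import json

# Hypothetical page data; in practice this would come from your CMS.
page = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What are the best practices for microdata?",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# The JSON-LD is embedded in the page inside a script tag:
# <script type="application/ld+json"> ... </script>
jsonld = json.dumps(page, indent=2)
print(jsonld)
```

Open Graph tags can coexist with this; they serve social sharing rather than search crawlers, so the two are not mutually exclusive.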
Intermediate & Advanced SEO | WebElaine
-
Are these URLs too Keyword-packed?
Hi guys, Here is the URL: http://www.consumerbase.com/mailing-lists/dog-stores-mailing-list.html The target keywords are "Dog stores mailing list" and "Dog stores mailing lists" Does having "mailing-list" and "mailing-lists" in my URL hurt me?
Intermediate & Advanced SEO | Travis-W
-
What's the best way to revive a directory that was 301'd, now that I want to remove the redirect?
Last year I 301'd one of the directories on my site, pointing everything to a different directory. Long story short, I am going to sell this product line again and would like to simply remove the 301 so the original directory works again, but I am reading that 301s are also cached by most browsers for a long time. Has anyone successfully done this, and if so, what did you have to do? Thanks, Mike
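One reason the caching problem above bites is that a 301 is, by definition, permanent. A hedged sketch of the distinction (mine, not from the thread; the header-building helper is hypothetical): if a redirect might ever need to be undone, a 302 with no-store caching is easier to walk back.

```python
# Hypothetical helper that builds response headers for a redirect.
# A 301 ("Moved Permanently") may be cached aggressively by browsers;
# a 302 with Cache-Control: no-store asks them not to remember it.

def redirect_headers(location, permanent):
    headers = {"Location": location}
    if permanent:
        headers["Status"] = "301 Moved Permanently"
    else:
        headers["Status"] = "302 Found"
        # Discourage caching so the redirect is easy to remove later.
        headers["Cache-Control"] = "no-store"
    return headers


h = redirect_headers("/new-directory/", permanent=False)
print(h["Status"])         # 302 Found
print(h["Cache-Control"])  # no-store
```

For a 301 that is already cached in visitors' browsers, there is no server-side switch to flush it; the old redirect simply has to age out of each browser's cache once it stops being served.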
Intermediate & Advanced SEO | SandyEggo
-
URL tracking on offline material
Hi there, hope someone can give some advice. We are doing some magazine advertising. The main purpose of the advert is to promote one of our new products, but the URL goes something like this: http://www.domain.com/products/new-product-libra-furniture/, which is just too long for anyone to remember. I think it should simply be domain.com/libra, which redirects to the product page. But how can I track this in Google Analytics? If I use a 301, is that impossible? Any advice would be appreciated.
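A common way to make such a vanity URL trackable (my sketch; the campaign names are invented, not from the question) is to have the short URL 301 to the product page with UTM campaign parameters appended, so Google Analytics attributes those visits to the print ad:

```python
from urllib.parse import urlencode

# Hypothetical setup: domain.com/libra redirects to the full product
# URL with UTM parameters appended, so magazine traffic shows up as
# its own source/medium/campaign in analytics.
product_url = "http://www.domain.com/products/new-product-libra-furniture/"
utm = {
    "utm_source": "magazine",      # invented example values
    "utm_medium": "print",
    "utm_campaign": "libra-launch",
}
redirect_target = product_url + "?" + urlencode(utm)
print(redirect_target)
```

The redirect itself stays a plain 301 from /libra; the tracking lives entirely in the query string of the destination URL.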
Intermediate & Advanced SEO | Paul78
-
List of Best SEO forums
Could I please get some input on a list of the best SEO forums out there besides SEOmoz? (Both mainstream and non-mainstream.)
Intermediate & Advanced SEO | Luia
-
Best solution to get mass URLs out of the search engines' index
Hi, I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has caused tens of thousands of URLs in the search engines' indexes which shouldn't be there. Say, for example, the problem URLs look like www.mysite.com/incorrect-directory/folder1/page1/. It seems I can correct this by doing one of the following:
1. Use robots.txt to disallow access to /incorrect-directory/*
2. 301 the URLs one-to-one, e.g. www.mysite.com/incorrect-directory/folder1/page1/ 301s to www.mysite.com/correct-directory/folder1/page1/
3. 301 all the problem URLs (www.mysite.com/incorrect-directory/folder1/page1/, www.mysite.com/incorrect-directory/folder1/page2/, www.mysite.com/incorrect-directory/folder2/, and so on) to the root of the correct directory, www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link-juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong ones.
Intermediate & Advanced SEO | James77