Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Use of the tilde in URLs
-
I just signed up for SEOMoz and ran my site through the first crawl. I use the tilde in my rewritten URLs, and this threw my entire site into the Notice section with 301 (permanent redirect) notices, since each page appears to redirect to the same URL with the literal ~ instead of the encoded %7E.
I find conflicting information on the web: more recent guidelines allow the tilde in URLs, where older ones didn't.
It would be a huge job to change every page on my site to use an underscore instead of a tilde in the URL. If Google is like SEOMoz and is 301 redirecting every page on the site, then I'll do it, but is it just an SEOMoz thing?
I ran my site through Firebug and all my pages show a 200 response header, not a 301 redirect.
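For context on that notice: the tilde and its percent-encoded form are the same character once decoded, which is why the two URL spellings point at the same page. A minimal Python check (the example URL here is made up, not from my site):

```python
from urllib.parse import unquote

raw = "http://example.com/~user/page"        # hypothetical URL with a literal tilde
encoded = "http://example.com/%7Euser/page"  # same URL with the tilde percent-encoded

# Percent-decoding makes the two spellings identical.
print(unquote(raw) == unquote(encoded))  # prints: True
```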
Thanks for any help you can provide.
-
Thanks for all the advice! I realized that Google doesn't care about the tilde -- or at least is not doing the same redirect as SEOMoz. Recently one of my older sitemaps was flagged by Google with errors because too many of the files were redirecting. All of my sitemaps would be flagged if pages were redirecting on a wide scale.
My pages generally rank in the top 5 in Google and maybe losing the tilde would get me to #1, so I'll keep it in mind for the future. Thanks again for the help.
-
We use tildes pretty heavily on our new site, and they seem to be okay with Google. However, I didn't want to use them, because some foreign keyboards, such as those in Mexico, do not include the character.
So... do folks in Mexico type in our URLs by hand? Probably not common... but it is a potential problem, and the character is missing from other keyboards as well.
We use the tilde because we think it helps break up words we don't want to be seen as "together" in a string. All my product URLs contain the product name with words separated by dashes, then a tilde, then the product number. We think it may help Google see the product title as a complete string and not include the product number. Not sure if it works or not.
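A sketch of that slug convention in Python (the function name and example slug are hypothetical, just modeling the format described above):

```python
def split_product_slug(slug: str):
    """Split a slug like 'red-widget~12345' into (product_title, product_number).

    The dash-separated title and '~' separator follow the convention
    described above; this is an illustration, not production code.
    """
    title, sep, number = slug.partition("~")
    return (title, number) if sep else (title, None)

print(split_product_slug("red-widget~12345"))  # prints: ('red-widget', '12345')
```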
-
Tildes are okay these days, though older specs classed them as 'unsafe'.
http://www.cs.tut.fi/~jkorpela/rfc/2396/full.html#2.3
The tilde was (begrudgingly) added to the unreserved character list a while ago, so Google should treat it fine without encoding.
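You can see this in Python, whose `urllib.parse` follows RFC 3986: the tilde is in the unreserved set, so `quote()` leaves it alone (the path below is just an example):

```python
import string
from urllib.parse import quote

# RFC 3986 section 2.3 unreserved characters: ALPHA / DIGIT / "-" / "." / "_" / "~"
unreserved = set(string.ascii_letters + string.digits + "-._~")
print("~" in unreserved)              # prints: True

# quote() never encodes unreserved characters, so the tilde survives as-is.
print(quote("/~jkorpela/rfc/2396/"))  # prints: /~jkorpela/rfc/2396/
```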
However, if you can avoid them I would: leave the old addresses as they are, but from now on use a hyphen (still in preference to an underscore) instead of a tilde.
-
Hi,
Since this is really about the way that the tool works, the quickest and most accurate way of getting the correct answer would be to email help@seomoz.org.
That being said, avoiding special characters would be our company's preferred option. This thread from Google Webmaster Central would be worth a read.
Sha