Use of the tilde in URLs
-
I just signed up for SEOmoz and ran my site through its first crawl. I use the tilde in my rewritten URLs. This threw my entire site into the Notices section as a 301 (permanent redirect), since each page redirects to the exact same URL with the ~ rather than the %7e.
I find conflicting information on the web: more recent coding guidelines allow the tilde in URLs, where older ones didn't.
It would be a huge job to change every page on my site to use an underscore instead of a tilde in the URL. If Google is like SEOmoz and treats every page on the site as a 301 redirect, then I'll do it, but is it just an SEOmoz thing?
I ran my site through Firebug and all my pages show a 200 response header, not a 301 redirect.
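For anyone who wants to reproduce that check outside of Firebug, something like the rough sketch below works (Python standard library only, no redirects followed); the example.com URLs are just placeholders, not my real pages.

```python
# Rough sketch: request the literal-tilde form and the %7E-encoded form of the
# same page and print the status code and any Location header. http.client does
# not follow redirects, so a 301 shows up as-is. Host and path are placeholders.
import http.client
from urllib.parse import urlsplit

def check(url):
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc)
    conn.request("GET", parts.path or "/")   # the path is sent verbatim, so %7E stays %7E
    resp = conn.getresponse()
    print(url, "->", resp.status, resp.getheader("Location") or "")
    conn.close()

check("http://www.example.com/widget~123.html")    # literal tilde
check("http://www.example.com/widget%7E123.html")  # percent-encoded tilde
```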
Thanks for any help you can provide.
-
Thanks for all the advice! I realized that Google doesn't care about the tilde -- or at least isn't flagging the same redirects that SEOmoz does. Recently one of my older sitemaps was flagged by Google with errors because too many of the files were redirecting. All of my sitemaps would be flagged if pages were redirecting on a wide scale.
My pages generally rank in the top 5 in Google and maybe losing the tilde would get me to #1, so I'll keep it in mind for the future. Thanks again for the help.
-
We use tildes pretty heavily on our new site, and they seem to be okay with Google. However, I didn't want to use them because some foreign keyboards don't include the character... keyboards in Mexico, for example.
So... do folks in Mexico type in our URLs by hand? Probably not common... but it is a potential problem. It is missing from other keyboards as well.
We use the tilde because we think it helps break up words we don't want to be seen as "together" in a string. All of our product URLs have the product name separated by dashes, then a tilde, then the product number. We think this may help Google see the product title as a complete string without the product number mixed in. Not sure if it works or not.
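Purely to illustrate that pattern (the product name, number, and helper functions here are made up), it looks roughly like this:

```python
# Hypothetical example of the slug pattern described above: product name words
# joined by hyphens, then a tilde, then the product number.
def build_slug(product_name, product_number):
    name_part = "-".join(product_name.lower().split())
    return f"{name_part}~{product_number}"

def split_slug(slug):
    # split on the last tilde to separate the name from the product number
    name_part, product_number = slug.rsplit("~", 1)
    return name_part, product_number

slug = build_slug("Blue Widget Deluxe", "BW-1042")
print(slug)              # blue-widget-deluxe~BW-1042
print(split_slug(slug))  # ('blue-widget-deluxe', 'BW-1042')
```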
-
Tildes are okay these days, though older specs classed them as 'unsafe'.
http://www.cs.tut.fi/~jkorpela/rfc/2396/full.html#2.3
Tilde was (begrudgingly) added to the unreserved character list a while ago, so Google should treat them fine without encoding.
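To make that concrete: the tilde sits in the unreserved set, so a spec-compliant encoder leaves it alone, and %7E decodes back to the same URL. A quick standard-library check (Python 3.7+, where quote() follows RFC 3986; the path is just an example):

```python
# "~" is unreserved under RFC 3986, so encoders should leave it alone and
# "%7E" should decode back to the same URL. The example path is made up.
from urllib.parse import quote, unquote

path = "/widget~123.html"

print(quote(path))                    # '/widget~123.html' on Python 3.7+ (RFC 3986 rules)
print(unquote("/widget%7E123.html"))  # '/widget~123.html'
```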
However, I'd still avoid them if you can: leave the old addresses as they are, but from now on use a hyphen (still in preference to an underscore) rather than a tilde.
-
Hi,
Since this is really about the way that the tool works, the quickest and most accurate way of getting the correct answer would be to email help@seomoz.org.
That being said, avoiding special characters would be our company's preferred option. This thread from Google Webmaster Central would be worth a read.
Sha