Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Ok to Put a Decimal in a URL?
-
I'm in the process of creating new product-specific URLs for my company. Some of our product names include a decimal as part of a unit of measurement.
For example, the URL for a .050" widget would be something like:
http://www.example.com/product/category/.050-inch-widget
My question is - Can I use a decimal in the URL without ticking off the search engines, and/or causing any other unexpected effects?
-
Thanks guys. This is an interesting case indeed. Maybe it's just me being an SEO, but I tend to look at the URL on a good majority of the pages I hit. I don't want to confuse my users or provide them with inaccurate information.
.4mm and 4mm is a big difference.
I'm going to try out the decimal point and see how it behaves. I'll report back once we get the pages up.
-
I'd be inclined to remove the decimal point; it reduces the URL by one character. Use the title and meta tags to convey the exact measurement.
-
As long as it's not a reserved URL character it should be fine. A file name that starts with a dot might cause issues with some web servers, but not with Google. Do avoid reserved punctuation, however (e.g. ?, &, etc.).
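The "reserved character" point can be checked directly against RFC 3986, which lists the dot among the unreserved characters that never need escaping in a URL path. A quick sketch in Python (using the standard library's urllib.parse) illustrates it with the widget slug from the question:

```python
from urllib.parse import quote

# RFC 3986 unreserved characters (letters, digits, - . _ ~) are safe
# anywhere in a path; quote() leaves them untouched and percent-encodes
# everything else.
slug = ".050-inch-widget"
assert quote(slug) == slug  # the leading dot survives unencoded

# A reserved character, by contrast, gets escaped:
print(quote("size?large"))  # size%3Flarge
```

If the slug round-trips through quote() unchanged, every character in it is already URL-safe.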
Related Questions
-
Google appending blog URL between my homepage and product page: is it an issue with the base URL?
Hi all, Google is appending my blog URL in between my homepage and product page. Is it an issue with the base URL or a relative URL? Can you please guide me? If you look at both tiny URLs you will see what I mean. Please help. Thanks!
Technical SEO | amu1230
-
Duplicate content w/ same URLs
I am getting high priority issues for our privacy & terms pages that have the same URL. Why would this show up as duplicate content? Thanks!
Technical SEO | RanvirGujral0
-
Are my Domain URLs correctly set up?
Hi, I'm struggling with what is probably an easy concept, so I am sure one of you can answer it fairly easily! My website is over50choices.co.uk, and while using the free tools in Majestic it said that I had: 77 referring domains pointing to www.over50choices.co.uk and only 35 pointing to www.over50choices.co.uk/. And in Moz it said: "The URL you've entered redirects to another URL. We're showing results for www.over50choices.co.uk/ since it is likely to have more accurate link metrics. See data for over50choices.co.uk instead?" Does this mean that my domains aren't set up correctly and are acting as separate domains? Should one be pointing to the other? Your help appreciated. Ash
Technical SEO | AshShep10
-
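On the trailing-slash point in the question above: for the root of a domain, the bare host and host-plus-slash are the same resource, which is why tools often fold one form into the other before reporting metrics. A small sketch of that normalization (the `normalize` helper is hypothetical, not from Majestic or Moz):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str) -> str:
    """Treat a bare domain and domain-plus-slash as the same URL
    by ensuring the root path is always '/'."""
    parts = urlsplit(url)
    path = parts.path or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(normalize("http://www.over50choices.co.uk"))
# http://www.over50choices.co.uk/
```

Counting referring domains against the normalized form avoids splitting the totals across two spellings of the same page.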
Still OK to use noarchive?
This is the flag that prevents Google from storing a cached copy of your webpage. I want to use it for good reasons, but in 2013 is it still safe to use? My website's not spammy, but it's still very fresh with little to no links. Each item I sell takes a lot of research to both buy and sell with the correct info. Once one is sold I may just come across another, and I want to hold my advantage of having already done my research, and keep my sold price to myself. Competitors will easily find my old page from a long-tail search. Some of my old sold pages keep getting hits and high bounce rates from people using them as research and a price benchmark. I want to stop this. So: noarchive first, then a 301 to the category page once sold. Will the two cause a problem in Google's eyes?
Technical SEO | Peter24680
-
Ok to internally link to pages with NOINDEX?
I manage a directory site with hundreds of thousands of indexed pages. I want to remove a significant number of these pages from the index using NOINDEX and have 2 questions about this: 1. Is NOINDEX the most effective way to remove large numbers of pages from Google's index? 2. The IA of our site means that we will have thousands of internal links pointing to these noindexed pages if we make this change. Is it a problem to link to pages with a noindex directive on them? Thanks in advance for all responses.
Technical SEO | OMGPyrmont0
-
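One way to quantify question 2 above before flipping the switch is to audit how much of the site's internal linking would point at noindexed pages. A minimal sketch with hypothetical data and a hypothetical helper name:

```python
# URLs we plan to mark noindex (sample data for illustration).
noindexed = {"/listing/123", "/listing/456"}

def links_to_noindex(page_links):
    """Return the internal links on a page that point at pages
    search engines are being told not to index."""
    return [href for href in page_links if href in noindexed]

print(links_to_noindex(["/about", "/listing/123", "/contact"]))
# ['/listing/123']
```

Running this over a crawl of the site shows how many of the "thousands of internal links" in the question would end up pointing at noindexed targets.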
Should I change my URLs?
I started with a static website and then moved to WordPress. At the time I had a few hundred pages and wanted to keep the same URL structure, so I used a plugin that adds .html to every page. Should I change the structure to a more common URL structure and do 301 redirects from the .html pages to the regular pages?
Technical SEO | JillB20130
-
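The .html-to-clean-URL change described above boils down to a simple path rewrite; the 301 itself would be issued by the web server or a WordPress redirect plugin. A sketch of the mapping (the helper name is mine):

```python
def legacy_to_clean(path: str) -> str:
    """Map a legacy '.html' path to its clean equivalent by
    stripping the extension; other paths pass through unchanged."""
    return path[: -len(".html")] if path.endswith(".html") else path

print(legacy_to_clean("/about-us.html"))  # /about-us
print(legacy_to_clean("/shop"))           # /shop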
Exact URL Match For Ranking
Has anyone else run into this issue? I have a competitor that purchases domain names matching popular inner pages we are trying to rank for. We are trying to build a brand; our competitors have a lower domain authority but rank higher for inner pages in the SERPs with VERY little content, backlinks, or SEO work. They host a single page and do a redirect to their main site. Would this be a good long-term strategy? Example: we sell golf clubs, our brand name is golfcity (example only), and we carry Callaway clubs; our competitor is also building a brand, but they purchased callawayclubs.net and do a redirect. They rank on page one for the keywords "callaway clubs". If I do try this, does one TLD (.com, .net, .org) have an advantage over another? I've seen them all used and ranking on page 1. Thank you!!!
Technical SEO | TP_Marketing0
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, Love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I've already found a number of suggestions, but it looks like I am dealing with a number of challenges that I need to resolve in a single release. Here is the current setup:

The website is an ecommerce site (department store) with around 30k products, using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not descriptive at all to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all other nasty URL params).

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I'm not 100% sure this counts as cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select and the large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.

My planned approach to improve this:

1. Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.

2. Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if anything is unclear. Thanks! Chris
Technical SEO | eCommerceSEO0
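Step 1 of the plan above, keeping multi-select URLs out of the index, can be sketched as a simple classifier. Note that the site's real URLs use an unusual slash-delimited parameter format; this sketch assumes standard query parameters, and the function name is hypothetical:

```python
from urllib.parse import urlsplit, parse_qs

def is_multi_select(url: str) -> bool:
    """Flag a URL as a multi-select (faceted) page when any query
    parameter carries a comma-separated list of values, as in the
    brand=100,104,506 example. Such pages would get noindex."""
    params = parse_qs(urlsplit(url).query)
    return any("," in value for values in params.values() for value in values)

print(is_multi_select("http://www.domain.tld/browse?brand=100,104,506&color=575"))  # True
print(is_multi_select("http://www.domain.tld/browse?brand=100&color=575"))          # False
```

A check like this could decide, per request, whether to emit a noindex robots meta tag on the rendered page.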