Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be possible to view), we have locked both new posts and new replies.
Capital Letters in URLs?
-
Having capital letters in URLs is not bad for SEO in itself; Google does not treat it as negative SEO, and it will not directly affect your rankings. However, I recommend using lowercase in URLs because it is user-friendly and search-engine-friendly, and because you may run into duplicate content issues if search engines see upper- and lower-case variations of URLs that all evidently point to the same content. Read Matt Cutts's advice on URLs: http://www.seosean.com/blog/matt-cutts-advice-on-urls-page-names
-
I agree with Neil. It's not bad, just good practice to keep them lowercase so that there's no confusion. Your best bet would be to use a consistent format and mirror it in your canonical URLs so that only that variation gets crawled and indexed.
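The "consistent format" advice above can be enforced programmatically before URLs are stored or emitted as canonicals. Here is a minimal sketch (the helper name and exact rules are my own, not from any answer in this thread) that lowercases the scheme, host, and path, and drops the fragment, while leaving the query string alone since query values can be case-sensitive:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Return a consistent, lowercase form of a URL for use as the canonical.

    Lowercasing the path is only safe when your server actually treats
    paths case-insensitively (or redirects to the lowercase form).
    The query string is left untouched because its values may be
    case-sensitive; the fragment is dropped because it never reaches
    the server.
    """
    scheme, netloc, path, query, _fragment = urlsplit(url)
    return urlunsplit((scheme.lower(), netloc.lower(), path.lower(), query, ""))
```

For example, `canonical_url("HTTP://Example.com/About-Us/?q=Test")` yields `http://example.com/about-us/?q=Test`.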
-
Whilst it's not necessarily "bad" per se, the implications can be, so this kind of canonicalisation issue needs to be taken care of using URL rewrites and permanent (301) redirects.
Typically, on a Windows-based server (without any URL rewriting), a 200 (OK) status code will be returned for each version regardless of the combination of upper/lower-case letters used, giving search engines duplicate content to index and giving others duplicate URLs to link to. This naturally dilutes rankings and link equity across the two (or more) identical pages.
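One way to see how much of this duplication exists in a crawl is to group URLs case-insensitively: on a case-insensitive server, any group with more than one member is a single page reachable at several addresses, which is exactly the duplicate-content situation described above. A hypothetical sketch (function and variable names are mine):

```python
from collections import defaultdict

def find_case_duplicates(urls):
    """Group crawled URLs that differ only by letter case.

    Returns a dict mapping the lowercase key to the list of case
    variants seen; only groups with two or more variants are kept,
    since those are the potential duplicate-content clusters.
    """
    groups = defaultdict(list)
    for url in urls:
        groups[url.lower()].append(url)
    return {key: variants for key, variants in groups.items() if len(variants) > 1}
```

Running this over an exported crawl list gives you a quick inventory of which pages need a rewrite/redirect rule.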
There is an excellent section on solving canonicalisation issues on Windows IIS servers in this SEOmoz article by Dave Sottimano.
On a Linux server (without any URL rewriting) you will usually get a 200 for the lower-case version and a 404 (Not Found) for versions with upper-case characters. Whilst search engines won't index the 404s, you are potentially wasting link equity passed to non-existent pages, and it can be really confusing for users, too.
There is a lot of info around the web about solving Linux canonicalisation issues (here is an article from YouMoz). If your site uses a CMS like Joomla or WordPress, most of these issues are solved by the default .htaccess file, and completely eliminated when you combine this with a well-chosen extension or two.
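If you can't touch the server config or .htaccess, the same redirect logic can also live at the application level. The following is an illustrative sketch in Python (as WSGI middleware), not code from any of the articles mentioned above: it 301-redirects any request whose path contains upper-case letters to the all-lowercase version, so only one variant ever returns a 200.

```python
def lowercase_redirect(app):
    """WSGI middleware: 301-redirect mixed-case paths to lowercase.

    Requests whose path is already lowercase pass through to the
    wrapped app untouched; everything else gets a permanent redirect,
    preserving the query string.
    """
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if path != path.lower():
            query = environ.get("QUERY_STRING", "")
            location = path.lower() + ("?" + query if query else "")
            start_response("301 Moved Permanently", [("Location", location)])
            return [b"Moved Permanently"]
        return app(environ, start_response)
    return middleware
```

A permanent (301) redirect is used deliberately, as in the answer above: it tells search engines to consolidate signals on the lowercase URL rather than treating the redirect as temporary.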
You can help the search engines figure out which version of a page you regard as the original by using the rel="canonical" link element in the page's HTML head. This consolidates link equity and ranking signals from the duplicate versions onto the one canonical version.
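For reference, the canonical hint is a link element placed in the page's head. A trivial sketch of a template helper that builds it (the function is hypothetical, named by me for illustration):

```python
import html

def canonical_link_tag(url):
    """Build the <link rel="canonical"> element for a page's head,
    pointing search engines at the preferred version of the URL.
    The URL is HTML-escaped so characters like & are safe in markup."""
    return '<link rel="canonical" href="{}" />'.format(html.escape(url, quote=True))
```

Whatever generates it, the important part is that every case variant of the page emits the same canonical URL.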
Related Questions
-
Can 'Jump link'/'Anchor tag' urls rank in Google for keywords?
E.g. www.website.com/page/#keyword-anchor-text, where the part after the # is a section of the page you can jump to, and the title of that section is a secondary keyword you want the page to rank for?
Algorithm Updates | rwat0
-
Should we use brand name of product in URL
Hi all, What is best for SEO? We sell products online. Is it good to mention the brand in the product detail page URL key if (part of) the brand is also in the home URL? So our URL is: www.brandXstore.com. Is it best to do www.brandXstore.com/brandX-productA.html or just www.brandXstore.com/ProductA.html? Thanks for quick answering 😉
Algorithm Updates | RetailClicks1
-
Sitemap Question - Should I exclude or make a separate sitemap for Old URLs?
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, so I decided not to drop redirects on some of the irrelevant pages. People still hit the pages, but bounce... I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set the priority to zero, would that possibly help? No redirects, and the content is still valid for people looking for it, but maybe these old pages wouldn't show up above my new content? Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated. Thx 😄
Algorithm Updates | Southbay_Carnivorous_Plants0
-
What are the advantages and disadvantages of having multiple folders in URL?
Example: http://www.domain.com.ph/property-for-sale/city/area/ (3 folders) Would it be better if we just used http://www.domain.com.ph/property-for-sale-area-city/ (all pages under one folder)? Thanks in advance! 🙂
Algorithm Updates | esiow20130
-
Google is forcing a 301 by truncating our URLs
Just recently we noticed that Google has indexed truncated URLs for many of our pages that get 301'd to the correct page. For example, we have http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html as the URL linked everywhere, and that's the only version of that page that we use. Google somehow figured out that it would still go to the right place via 301 if they removed the HTML filename from the end, so they indexed just: http://www.eventective.com/USA/Massachusetts/Bedford/107/
The 301 is not new. It used to 404, but (probably 5 years ago) we saw a few links come in with the HTML file missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it has the name in it, and users who pay attention to the URL can feel more confident they are going to the right place. We've always used the full (longer) URL, and Google used to index them all that way, but just recently we noticed about 1/2 of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good.
You can look at site:www.eventective.com/usa/massachusetts/bedford/ and you'll notice all of the URLs to businesses at the top of the listings go to the truncated version, but toward the bottom they have the full URL. Can you explain to me why Google would index a page that is 301'd to the right page and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am.
One small detail that shouldn't affect this, but I'll mention it anyway: we have a mobile site with the same URL pattern, http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html. We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m sites. I'm also interested in any thoughts on how this may affect rankings, since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us.
Thank you,
Michael0
Algorithm Updates | mmac
-
Should We Switch from Several Exact Match URLs to Subdomains Instead?
We are a company with one product customized for different vertical markets. Our sites are each set up on their own unique domains:
- contactatonce.com (Brand)
- autodealerchat.com (Auto Vertical)
- apartmentchat.com (Apartment Vertical)
- chatforrealestate.com (Real Estate Vertical)
We currently rank well on the respective keyword niches, including:
- auto dealer chat (exact match), automotive chat, dealer chat
- apartment chat (exact match), property chat, multifamily chat
- chat for real estate (exact match), real estate chat
To simplify the user experience we are considering moving to a single domain and subdomain structure:
- contactatonce.com
- auto.contactatonce.com
- apartment.contactatonce.com
- realestate.contactatonce.com
QUESTIONS:
1. Considering current Google ranking strategies, do we stand to lose keyword-related traffic by making this switch?
2. Are there specific examples you can point to where an individual domain and subdomains each ranked high on Google across a variety of different niches? (I'm not talking about Wikipedia, Blogger, Blogspot, Wordpress, Yahoo Answers, etc., which are in their own class, but a small to mid-size brand.)
Thank you,
Aaron
Algorithm Updates | contactatonce
-
URL is starting to appear capitalized in Google Search Results. How come?
Our domain (www.absoluteautomation.com) has just today started appearing in search results as www.AbsoluteAutomation.com. Any ideas why?
Algorithm Updates | absoauto0
-
Local SEO url format & structure: ".com/albany-tummy-tuck" vs ".com/tummy-tuck" vs ".com/procedures/tummy-tuck-albany-ny" etc."
We have a relatively new site (re: August '10) for a plastic surgeon who opened his own solo practice after 25+ years with a large group. Our current URL structure goes three folders deep to arrive at our tummy tuck procedure landing page. The site architecture is solid, and each plastic surgery procedure page (e.g. rhinoplasty, liposuction, facelift, etc.) is no more than a couple of clicks away. So far, so good. But given all that is known about local SEO (which is a very different beast than national SEO), quite a bit of on-page/architecture work can still be done to further improve our local rank. So here are a couple of big questions facing us at present:
First, regarding format, is it a given that using geo keywords within the URL indisputably and dramatically impacts a site's local rank for the better (e.g. the #2 result for "tummy tuck" and its SHENANIGANS-level use of "NYC", "Manhattan", "newyorkcity", etc.)? Assuming that it is, would we be better off updating our cosmetic procedure landing page URLs to "/albany-tummy-tuck" or "/albany-ny-tummy-tuck" or "/tummy-tuck-albany", etc.?
Second, regarding structure, would we be better off locating every procedure page within the root directory (re: "/rhinoplasty-albany-ny/") or within each procedure's proper parent category (re: "/facial-rejuvenation/rhinoplasty-albany-ny/")? From what I've read within the SEOmoz Q&A, adding that parent category (e.g. "/breast-enhancement/breast-lift") is better than having every link in the root (i.e. completely flat).
Third, how long before Google updates their algorithm so that geo-optimized URLs like http://www.kolkermd.com/newyorkplasticsurgeon/tummytucknewyorkcity.htm don't beat other sites that do not optimize so aggressively or locally?
Fourth, assuming that each cosmetic procedure page will eventually have strong link profiles (via diligent, long-term link building efforts), is it possible that geo-targeted URLs will negatively impact our ability to rank for regional or less geo-specific searches? Thanks!
Algorithm Updates | WDeLuca0