How to remove trailing slashes in URLs using .htaccess (Apache)?
-
I want my URLs to look like these:
http://www.domain.com/buy/shoes
http://www.domain.com/buy/shoes/red
Thanks in advance!
-
I am sorry, my friend. Somehow I missed your response.
It should have been: RewriteRule ^(.*)/$ /$1 [R,L]
So you missed the leading / before $1.
It also removes the trailing slash from URLs, just like the one I gave you earlier.
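For reference, a minimal sketch of a complete .htaccess file using this rule might look like the one below; it assumes the file sits in the site's document root, that mod_rewrite is enabled, and that the slashed URLs do not map to real directories on disk (with [R] alone the redirect is a temporary 302):

RewriteEngine On

# Redirect any URL ending in a slash to the same URL without it, e.g.
#   http://www.domain.com/buy/shoes/  ->  http://www.domain.com/buy/shoes
# Assumes /buy/shoes is a rewritten path, not a physical directory.
RewriteRule ^(.*)/$ /$1 [R,L]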
Best,
Devanur Rafi
-
And what is the difference between this and the one you answered?
RewriteBase /
RewriteRule ^(.*)/$ $1 [R,L]
-
Thanks Devanur! Big help. By the way, I'm still looking for an opinion about this and I'm sure you can help me with this issue: http://moz.com/community/q/how-can-i-301-redirect-a-series-of-pages-ex-page-1-page-2-page-3-to-one-landing-page-new-url
-
Hi, here you go.....
RewriteRule ^(.*)\/(\?.*)?$ $1$2 [R=301,L]
Please note that what looks like a capital V in the rule above is actually a backslash followed by a forward slash (\/).
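A sketch of how this rule might sit in a complete .htaccess file follows; the RewriteEngine, RewriteBase and RewriteCond lines are assumptions added for context, the last one so that URLs mapping to real directories keep their trailing slash:

RewriteEngine On
RewriteBase /

# Leave URLs that map to real directories alone so DirectoryIndex keeps working
RewriteCond %{REQUEST_FILENAME} !-d

# 301-redirect any URL ending in a slash to the same URL without it, e.g.
#   http://www.domain.com/buy/shoes/red/  ->  http://www.domain.com/buy/shoes/red
RewriteRule ^(.*)\/(\?.*)?$ $1$2 [R=301,L]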
Best,
Devanur Rafi
-
Related Questions
-
How Important is it to Use Keywords in the URL
I wanted to know how important this measure is for rankings. For example, if I have pages named "chair.html" or "sofa.html" and I want to rank for the terms seagrass chair or rattan sofa, should I start creating new pages with the targeted keywords ("seagrass-chair.html"), copy everything from the old page to the new one, and set up 301 redirects? Will this hurt my SEO rankings in the short term? I have over 40 pages I would have to rename and redirect if doing so would really help in the long run. Appreciate your input.
White Hat / Black Hat SEO | wickerparadise
-
Want to Remove Numbers from an Old Post URL - Will It Affect Its Ranking?
Hi. I have a number of posts that are ranking in Google for several keywords. However, the URLs contain numbers, for example 2011, 2014 and 35. I want to remove these numbers to make the URLs look more current. If I use a 301 redirect from the old URL to the new one, will I retain the same rankings for these blog posts, or can it affect the ranking? Has anyone tried this in the past? I would like to get your opinion on this. Thanks in advance.
White Hat / Black Hat SEO | techmaish
-
Partial Manual penalty to a URL
Hi Mozers, I have a website which has got a partial manual penalty on a specific URL. That URL is of no use to the website now and is going to be taken off in 3 months' time, as the website is going to be completely redesigned. Until then, I don't want to live with the partial manual penalty for this URL. I have a few things in mind to tackle this: 1. Take the URL off the website now (as the new redesign will take 3 months). 2. Take out internal links pointing to the URL in question. 3. File for reconsideration with Google, stating that we have taken the URL off, have not generated any backlinks, and that the existing backlinks are organic (no link-building activity has been done on this website or the URL). Please let me know if this will work, or whether I will have to get the backlinks removed, then file the disavow, then the reconsideration. Looking forward to your response 🙂
White Hat / Black Hat SEO | HiteshBharucha
-
Creating pages as exact match URLs - good or an over-optimization indicator?
We all know that exact match domains are not getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have any experience or know whether that also applies to having an exact match URL page (not domain)? Example keyword: cars that start with A. Which way is better when creating your pages on a non-exact-match domain site: www.sample.com/cars-that-start-with-a/ (which has "cars that start with A" as the) or www.sample.com/starts-with-a/ (which again has "cars that start with A" as the)? Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters in the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here at the Moz community can help out. Thanks so much.
White Hat / Black Hat SEO | lidush
-
How should I use the 2nd link if a site allows 2 in the body of a guest post?
I've been doing some guest posting, and some sites allow one link while others allow more. I'm worried I might be getting too many guest posts with multiple links. I'd appreciate your thoughts on the following: 1. If there are 50+ guest posts linking to my website (posted over the span of several months), each with 2 links pointing back only to my site, is that too much of a pattern? How would you use the 2nd link in a guest post if not to link to your own site? 2. Does linking to a .edu or .gov site in the guest post make the post more valuable in terms of SEO? Some people recommend using the 2nd link to do this. Thanks!
White Hat / Black Hat SEO | pbhatt
-
301 redirect a set of pages to one landing page/URL?
I'm planning to redirect the following pages to one new URL/landing page: Old URLs: http://www.olddomain.com/folder/page/1 http://www.olddomain.com/folder/page/2 http://www.olddomain.com/folder/page/3 http://www.olddomain.com/folder/page/4 http://www.olddomain.com/folder/page/5 http://www.olddomain.com/folder/page/6 New URL: http://www.newdomain.com/new-folder/new-page Code in .htaccess that I will be using: RedirectMatch 301 /folder/page/(.*) http://www.newdomain.com/new-folder/new-page Let me know if this is correct. Thanks!
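To spell out the intended mapping (a sketch, assuming mod_alias is enabled on the old domain's server; the RedirectMatch line itself is unchanged from above):

# Every URL whose path contains /folder/page/ is sent to the single new landing
# page; the captured group is not used in the target, so pages 1 through 6 all
# end up at the same URL, e.g.
#   http://www.olddomain.com/folder/page/1 -> http://www.newdomain.com/new-folder/new-page
#   http://www.olddomain.com/folder/page/6 -> http://www.newdomain.com/new-folder/new-page
RedirectMatch 301 /folder/page/(.*) http://www.newdomain.com/new-folder/new-page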
White Hat / Black Hat SEO | esiow2013
-
How do you remove unwanted links, built by your previous SEO company?
We dropped significantly (from page 1 for 4 keywords...to ranking over 75 for all) after the Penguin update. I understand trustworthy content and links (along with site structure) are the big reasons for staying strong through the update, and sites that did these things wrong were penalized. In an effort to gain Google's trust again, we are reviewing our site structure and making sure to produce fresh and relevant content on our site and social media channels on a weekly basis. But how do we remove links that were built by our SEO company, some of which could be untrustworthy/irrelevant sites with low rankings? Should we try to email the webmaster of each site (using data from Open Site Explorer)?
White Hat / Black Hat SEO | clairerichards
-
Stuffing keywords into URLs
The following site ranks #1 in Google for almost every key phrase in their URL path for almost every page on their site. Example: themarketinganalysts.com/en/pages/medical-translation-interpretation-pharmaceutical-equipment-specifications-medical-literature-hippa/ The last folder in this URL uses 9 keywords and I've seen as many as 18 on the same site. Curious: every page is a "default.html" under one of these kinds of folders (so much architecture?). Question: How much does stuffing keywords into URL paths affect ranking? If it has an effect, will Google eventually ferret it out and penalize it?
White Hat / Black Hat SEO | PaulKMia