How to force a trailing slash after the domain name
-
My campaign analysis is predictably listing domain.com and domain.com/ as duplicate content. I've searched and searched but cannot find a way to force a trailing slash on the end of the domain name unless there's a file or directory after it.
Is there a way to accomplish this using .htaccess?
-
I've gone with this .htaccess, based on your soulgorithm.com example:

Options +FollowSymlinks
RewriteEngine on
RewriteBase /

# Redirect www to the bare domain
RewriteCond %{HTTP_HOST} ^www.domain.co.uk [NC]
RewriteRule (.*) http://domain.co.uk/$1 [L,R=301]

# Trailing-slash URLs with a matching .php file: serve the file internally
RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule (.*)/$ $1.php [L]

# URLs that aren't directories but have a matching .php file: redirect to the trailing-slash version
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule .* %{REQUEST_FILENAME}/ [R=301,L]

and I'm now getting the results I'm after. I'm getting similar behaviour to you in Firefox and IE, which explains a lot. I really appreciate the lengths you've gone to in helping me here, so a big thank you!
-
Test Site: soulgorithm.com
In the .htaccess file for this site:
Options +FollowSymlinks
RewriteEngine on
RewriteBase /

# Redirect www to the bare domain
RewriteCond %{HTTP_HOST} ^www.soulgorithm.com [NC]
RewriteRule (.*) http://soulgorithm.com/$1 [L,R=301]

# Trailing-slash URLs with a matching .html or .php file: serve the file internally
RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule (.*)/$ $1.html [L]

RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule (.*)/$ $1.php [L]

# URLs that aren't directories but have a matching .html or .php file: redirect to the trailing-slash version
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f [OR]
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule .* %{REQUEST_FILENAME}/ [R=301,L]

Which has the following effect:
soulgorithm.com > soulgorithm.com/
(slash is added, but it only shows in IE; it looks like it's being stripped by Firefox, though the page still loads fine)
soulgorithm.com/ > soulgorithm.com/
(loads fine; again the slash only shows in IE and looks like it's being stripped by Firefox)
soulgorithm.com/test > soulgorithm.com/test/
(loads fine, slash even shows in FF)
soulgorithm.com/test/ > soulgorithm.com/test/
(loads fine)
soulgorithm.com/testdir > soulgorithm.com/testdir/
(loads fine, slash even shows in FF)
soulgorithm.com/testdir/ > soulgorithm.com/testdir/
(loads fine, slash even shows in FF)
Let me know if this is what you see. I feel like it's getting close to working.
-
Thanks for sticking with this. Rather than me sharing the domain, do you know of any example sites using your code (or similar) which add a trailing slash after the domain name? I'd like to rule out my browser stripping it out.
-
Man, my mind is blown right now. I'm not giving up, and hopefully someone else can chime in on this discussion and shed some light on the issue.
The code provided should have worked. Let me look into it some more. Also, if you don't mind, what is the actual domain name?
-
That's right - nothing in there but the code you supplied.
-
Is this the only thing you have in your .htaccess file?
If not, I would remove everything in the file and keep only what I posted above, then let me know if it works.
-
Nope. Still no trailing slashes being added.
-
Try just the following, and let me know if it works for you:

RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !index.php
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.com/$1/ [L,R=301]
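A variation on the same idea (just a sketch, not something I've tested on your server) uses %{HTTP_HOST} so the domain name isn't hard-coded:

# Sketch: redirect any non-file URL without a trailing slash to the same
# URL with one, on whatever host the request came in on
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1/ [L,R=301]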
-
Thanks for the reply, but this looks like all the other examples I've found. My .htaccess file looks like this:

DirectoryIndex index.php
RewriteEngine On

# Hide the .php extension: rewrite extensionless URLs to the matching .php file
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ $1.php [L,QSA]

# Add a trailing slash to URLs that don't already end in one
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.co.uk/$1/ [L,R=301]

But I get the following redirects going on:
domain.co.uk > domain.co.uk (i.e. nothing happens)
domain.co.uk/ > domain.co.uk (i.e. the slash is removed)
domain.co.uk/page2 > domain.co.uk/page2 (i.e. nothing happens, but the page loads)
domain.co.uk/page2/ > internal server error
Any ideas?
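My own guess, though I haven't confirmed it: on domain.co.uk/page2/ the first block keeps appending .php on every internal rewrite pass (page2/.php, then page2/.php.php, and so on) until Apache gives up and returns the error. A guard that only appends .php when the matching file actually exists would presumably stop that, something like:

# Sketch, untested: only rewrite to $1.php when that file really exists
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L,QSA]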
-
Hi Clive.
Yes, you can easily do this with an .htaccess file. Here is the code:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.com/$1/ [L,R=301]

Just replace "domain.com" with the proper URL for your site. This should be all that is needed.
Hope this helps!
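If the site also serves real directories or static assets from the same document root, a slightly more defensive variation (just a sketch; domain.com is still a placeholder) leaves existing directories alone as well:

RewriteEngine On
# Leave real files and real directories untouched
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Only touch URLs that don't already end in a slash
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.com/$1/ [L,R=301]

Requests for real directories without a slash are normally redirected to the slashed version by mod_dir's DirectorySlash anyway.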
Related Questions
-
Images on sub domain fed from CDN
I have a client that uses a CDN to serve images from a sub domain (images.domain.com). We've made sure that the sub domain itself is not blocked. We've added a robots.txt file, we're creating an image sitemap file, and we've verified ownership of the domain within GWT. Yet any crawler that I use only sees the first page of the sub domain (which is .html) but none of the subsequent URLs, which are all .jpeg. Is there something simple I'm missing here?
Technical SEO | TammyWood
-
Redirect typo domains
Hi, What's the "correct" way of redirecting typo domains? DNS A record goes to the same ip address as the correct domain name Then 301 redirects for each typo domain in the .htaccess Subdomains on typo urls still redirect to www or should they redirect to the subdomain on the correct url in case the subdomain exists?
Technical SEO | kuchenchef
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs, and the paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can crawl and list all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file), and crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them). Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick
-
Umbrella company and multiple domains
I'm really sorry for asking this question yet again. I have searched through previous answers but couldn't see anything quite like this. There is a website called example.com. It is a sort of umbrella company for 4 other separate domains within it - 4 separate companies. The home page of the "umbrella" company website is example.com. It is just an image with no content except navigation on it to direct to the 4 company websites. The other pages of the website example.com are the 4 separate companies' domains. So on the navigation bar there is: home page = example.com, company1 page = company1domain.com, company2 page = company2domain.com, etc. Clicking "home" will take you back to example.com (which is just an image). How bad or good is this structure for SEO? Would you recommend any changes to help them rank better? The "home" page has no authority or links, and neither do 3 out of the 4 other domains. The 4 companies' websites are independent in content (although the theme is the same). What's bringing them all together is being under this umbrella website - example.com. Thank you
Technical SEO | AL123al
-
Block Domain in robots.txt
Hi. We had some URLs that were indexed in Google from a www1 subdomain. We have now disabled the URLs (returning a 404 - for other reasons we cannot do a redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
Technical SEO | zeepartner
-
Mobile Domain Setup
Hi, if I want to serve a subset of pages on my mobile site compared to my desktop site, or the content is significantly different (i.e. it is not one-to-one, or pages are a summarised version of the desktop), should I use m.site.com, or is it still better to use site.com? Many thanks, any help appreciated.
Technical SEO | MarkChambers
-
301 Redirect with an Exact Domain name Match
My client had a site that ranked for a pretty competitive two-word phrase, but for a variety of reasons had to transfer the site to a different domain name (with none of the previous keywords). We've 301'd everything just fine to the new site, but our traffic for that two-word phrase, as well as related long-tail traffic, is beginning to drop. Could the drop be related to something that we didn't do well in the transfer? Or is it due to the new domain name no longer being an exact match? Side note question: our Google Analytics is still set up for the former domain name and shows data just fine. Is there any reason to switch GA to the new domain? What are the pros/cons? Much thanks in advance!
Technical SEO | TrevorMcKendrick
-
What is best practice for redirecting "secondary" domain names?
For sites with multiple top-level domains that have been secured for a business or organization, I'm curious as to what is considered best practice for setting up 301 redirects for secondary domains. Is it best to do the 301 redirects at the registrar level or the hosting level, so that .net, .biz, or other secondary domains funnel visitors to the correct primary/main domain name? I'm looking for the "best practice" answer and want to avoid duplicate content problems or penalties from the search engines. I'm not trying to game the system with dozens of domain names, simply the handful of domains that are important to the client. I've seen some registrars recommend hosting secondary domains and doing redirects from the hosting level (and they use meta refresh for "domain forwarding," which I want to avoid). It seems rather wasteful to set up hosting for a secondary domain and then 301 each URL.
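For what it's worth, a hosting-level 301 doesn't have to mean a full hosting account per secondary domain. A minimal sketch (all names are placeholders) is a small virtual host, or an .htaccess on the parked domain, that permanently redirects every path to the primary domain:

# Sketch: catch-all vhost for the secondary TLDs, 301ing everything
# to the primary domain while preserving the requested path
<VirtualHost *:80>
    ServerName example.net
    ServerAlias www.example.net example.biz www.example.biz
    Redirect permanent / https://www.example.com/
</VirtualHost>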
Technical SEO | Scott-Thomas