Looking for a tool that will display the contents of the htaccess file
-
I am looking for an online tool that will display the contents of the .htaccess file. I came across a tool a month ago, but I cannot recall the name of the tool. Thanks
-
Hi Irving,
As Yusuf suggested, .htaccess files are not publicly viewable - Apache blocks access to them by default, so visitors to a site can't read them.
For your own site's .htaccess file, simply downloading the file via FTP and opening it in Notepad or any other text editor will work. But I take it you were looking for a tool that checks the .htaccess file of public sites - if such a tool exists, it's news to me.
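For reference, the reason a direct request for an .htaccess file comes back as 403 Forbidden is a rule in the stock Apache configuration along these lines (Apache 2.4 syntax shown; 2.2 used the older Order/Deny directives for the same effect):

# Deny all web access to .htaccess, .htpasswd and similar files
<Files ".ht*">
    Require all denied
</Files>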
Best,
Mike -
Hi Irving,
Sometimes .htaccess files are hidden, so you just need to change your FTP client settings to show hidden files:
http://www.rackspace.com/knowledge_center/article/how-can-i-see-my-htaccess-file
After that, you should be able to view the contents of a .htaccess file using Notepad or any other text editor.
Related Questions
-
Duplicate content question
Hey Mozzers! I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit. I have four pages with HubSpot forms embedded in them. (Only two of these pages have shown up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc.), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer. I'm not sure how to respond to this crawl issue:
Using a 301 redirect doesn't seem right, because each page/form combo is independent and serves a separate purpose.
Using a rel=canonical link doesn't seem right for the same reason that a 301 redirect doesn't seem right.
Using the Google Search Console URL Parameters tool is clearly contraindicated by Google's documentation (I don't have enough pages on my site).
Is a meta robots noindex the best way to deal with duplicate content in this case? Thanks in advance for your help. AK
Technical SEO | AndyKubrin -
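If noindex does turn out to be the right answer here, it can be set either with a meta robots tag in each page's head or at the server level via an X-Robots-Tag header. A minimal .htaccess sketch of the header approach is below; the two consultation paths are hypothetical stand-ins for the real URLs, and it assumes Apache 2.4+ with mod_headers enabled:

<IfModule mod_headers.c>
    # Hypothetical paths - swap in the actual consultation/form pages
    <If "%{REQUEST_URI} =~ m#^/(content-marketing|copywriting)-consultation/?$#">
        Header set X-Robots-Tag "noindex, follow"
    </If>
</IfModule>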
.htaccess redirect question
Hi guys and girls. Please forgive me for being an Apache noob, but I've been trawling for a while now and I can't seem to find a definitive guide for my current scenario. I've walked into a bit of a cluster$%*! of a job, to rescue a horribly set up site. One of many, many problems is that they have 132 302 redirects set up. Some of these are identical pages redirected http to https, others are the same but https to http, and some are redirects to different content pages, http to http. A uniform redirect of http to https is not an option, so I'm looking to find out the best practice for reconfiguring these 302s to 301s within .htaccess? Thanks in advance 🙂
Technical SEO | craig.gto -
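For anyone hitting the same situation, the change itself is usually small; the sketch below (hypothetical example.com URLs) shows the same redirect written the temporary way and the permanent way. Whether the existing rules use Redirect or mod_rewrite's R flag (which defaults to 302 when no code is given), the fix is just the status code:

# Temporary - what a 302 rule typically looks like:
# Redirect 302 /old-page/ https://www.example.com/new-page/

# Permanent - same rule, consolidating link equity:
Redirect 301 /old-page/ https://www.example.com/new-page/

# mod_rewrite equivalent - change [R] or [R=302] to [R=301]:
RewriteEngine On
RewriteRule ^old-page/?$ https://www.example.com/new-page/ [R=301,L]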
Htaccess Rewrites
Hi, I'm battling with duplicate content and would love to fix some redirections in my .htaccess file:
1. I'm trying to force the www version of my site via 301.
2. I'm trying to remove ALL /index.php from my URLs.
I currently have the following code, but my home page /index.php is still not redirecting to the root?

Options +FollowSymLinks
Options +Indexes
<IfModule mod_rewrite.c>
RewriteEngine On
#RewriteBase /

# Force the www hostname via 301
RewriteCond %{HTTP_HOST} ^funeralcoverfinder.co.za$ [NC]
RewriteRule ^(.*)$ http://www.FuneralCoverFinder.co.za/$1 [R=301,NC,L]

# Block common query-string exploit attempts
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule .* index.php [F]

# Pass the Authorization header through, then route SEF URLs to index.php
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteCond %{REQUEST_URI} !^/index\.php
RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . index.php [L]
</IfModule>

Any advice? Thanks so much!
Technical SEO | Klement69 -
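On the specific symptom in this question, /index.php not redirecting to the root, one common fix is a rule like the sketch below, placed just after RewriteEngine On and before the SEF block. It is only a sketch: the THE_REQUEST condition makes the redirect fire only on requests a browser actually made for /index.php, not on the internal rewrites to index.php further down the file, which would otherwise loop:

# 301 direct requests for /index.php (with or without a query string) to the root
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
RewriteRule ^index\.php$ http://www.FuneralCoverFinder.co.za/ [R=301,L]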
Broken links - shall I put them in my .htaccess file to generate juice?
Hi, when I had to rebuild my website after my hosting company made an error, I lost over 10,000 pages and many thousands of links coming to my site. What I want to know is: instead of trying to recreate those pages, which would take me a long time, should I put them into my .htaccess file and have them point back into my site? So for example, if I have a link coming to my site for an article such as "holidays in Benidorm are not selling well", would it be a good idea to have that link pointed at the main Benidorm section, which is Benidorm news? And if I had an article which was "people are finding it hard to lose weight", instead of writing a new article could I have the link pointing to my health section? If this is the correct way of doing it to grab back some link juice, would it slow my site down, and how many links could I put in my .htaccess file? So what I am trying to say is, if I put in, say, 1,000 redirects into my .htaccess file, would it slow my site down, and is this a wise thing to do, or should I just let the links go?
Technical SEO | ClaireH-184886 -
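A sketch of what that looks like in practice is below; the old article paths are hypothetical since the real URLs aren't given, but the pattern is one Redirect 301 line per lost URL pointing at the closest relevant section, or a RedirectMatch to sweep up a whole directory at once. A list of a few hundred or even a thousand lines like this is generally not a problem, though .htaccess is re-read on every request, so tens of thousands of lines can start to add measurable overhead:

# Hypothetical old URLs mapped to the nearest relevant section
Redirect 301 /articles/holidays-in-benidorm-not-selling-well/ http://www.example.com/benidorm-news/
Redirect 301 /articles/hard-to-lose-weight/ http://www.example.com/health/

# Or catch an entire lost directory with one pattern
RedirectMatch 301 ^/articles/benidorm.* http://www.example.com/benidorm-news/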
Limits to 301s in .htaccess?
I'm about to launch a redesign of my company's main website, and we've updated most of the URLs to be more user-friendly and SEO-optimized. I've just finished editing my spreadsheet and see that I'll need to implement 244 redirects. My question is: are there performance issues with loading your .htaccess file up with almost 250 301 redirect commands? I've heard a bloated .htaccess file can really slow down Apache - should I be approaching this a different way, maybe with PHP?
Technical SEO | AdoptionHelp -
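For scale, 244 one-to-one rules is a fairly ordinary .htaccess and is unlikely to cause a noticeable slowdown on its own. The sketch below (hypothetical paths) shows the plain form, plus a commented-out RewriteMap variant that scales better into the thousands but must live in the main server or vhost config rather than in .htaccess:

# A few hundred simple one-to-one rules like these are normally fine:
Redirect 301 /old-about.html /about-us/
Redirect 301 /old-contact.html /contact/

# If the list ever grows into the thousands, a RewriteMap lookup scales better,
# but RewriteMap can only be defined in the server/vhost config, not .htaccess:
# RewriteMap redirectmap "txt:/etc/apache2/redirect-map.txt"
# RewriteCond ${redirectmap:%{REQUEST_URI}} !=""
# RewriteRule ^ ${redirectmap:%{REQUEST_URI}} [R=301,L]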
Tracking Links Tool
I think someone may be trying to harm my site by adding spammy links, so I want to track the links going to my site on a daily basis. Any tool suggestions? Majestic SEO is great for getting an overall picture of my links, but it is not updated daily. Thanks!
Technical SEO | theLotter -
Up to my you-know-what in duplicate content
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3 to 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google, and the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google. The duplicate content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Get rid of the subdomain version and sometimes link between two obviously related sites, or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword on the front of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages? Thanks.
Technical SEO | Hondaspeder
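Whichever version ends up as the keeper, the mechanics are the same: pick one canonical hostname and 301 everything else to it. A minimal .htaccess sketch, with a hypothetical forum-example.com standing in for the real domain:

RewriteEngine On
# Send the bare (non-WWW) hostname to the WWW version
RewriteCond %{HTTP_HOST} ^forum-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.forum-example.com/$1 [R=301,L]
# The copy living as a subdomain of the other site can be 301'd to the same canonical host in the same way.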