Converting files from .html to .php or editing .htaccess file
-
Good day all,
I have a bunch of files that are .html and I want to add some PHP to them.
It seems my two options are:
- Convert .html to .php and 301 redirect
or
- add this line of code to my .htaccess file and keep all files that are .html as .html
AddType application/x-httpd-php .html
My gut says the second option is better, so as not to alter any SEO rankings, but I wanted to see if anybody has experience with this line of code in their .htaccess file, as I definitely don't want to mess up my entire site.
Thanks for any help!
John
-
Hi John
The first line removes the existing handler for those extensions.
The second line adds them back in a specific order, i.e., you want PHP to execute first.
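For reference, the two lines being described here are presumably the RemoveHandler/AddType pair quoted later in this thread:
RemoveHandler .html .htm
AddType application/x-httpd-php .php .htm .html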
If you got it going, that is what counts.
Good luck,
Don
-
Thanks so much for this, Don. This is what I added that seemed to work on my server:
AddHandler application/x-httpd-php .html .htm
The AddType version caused errors on my setup, but after some further research I found the code above.
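From what I can tell (and this depends on how PHP is hooked into Apache, so treat it as my working assumption rather than gospel), the two directives aim at the same result:
# maps the extensions to the PHP MIME type; common on mod_php setups
AddType application/x-httpd-php .html .htm
# maps the extensions to a handler; often what CGI/FastCGI setups expect
AddHandler application/x-httpd-php .html .htm
That would explain why one form works on a given server while the other throws errors.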
I wonder if what you propose would accomplish what I did?
Thanks and all the best,
John
-
Hi John,
If the URLs are well indexed and doing well, you "may" not want to change them. To simply add the ability to run PHP, you can do it very easily with just what you thought: .htaccess.
In fact, when I took over as webmaster on my corporate site, which was indexed very well, I had to do just that.
Add this to your .htaccess file:
RemoveHandler .html .htm
AddType application/x-httpd-php .php .htm .html
If you really want to change the URLs instead, add this to your site's .htaccess:
RewriteEngine On
RewriteCond %{SCRIPT_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]
So domain.com/file will access file.html.
Again, the caveat is that there is a short-term SEO hit for doing this. Long term, you should be fine.
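For completeness, here is a rough sketch of the matching redirects (my own example, assuming mod_rewrite is enabled): the old .html URLs can be 301-redirected to the new extensionless form without looping, because %{THE_REQUEST} reflects the original browser request rather than any internal rewrite:
# send /file.html requests to /file with a 301
RewriteCond %{THE_REQUEST} \s/([^.?\s]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
The internal rule above then quietly maps /file back to file.html on disk.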
-
This is a sweet idea... any tutorial on this? How does it affect existing links pointed at the .html and .php pages?
Thanks Keri!
-
Have you considered just rewriting your URLs so they don't use extensions at all? That way, if you ever switch technologies, you won't need to rewrite your URLs again. If you look at SEOmoz, you'll see they don't use .php or .html extensions; their URLs have no extensions at all.
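Here's a rough sketch of how that can look in .htaccess (my own example, assuming mod_rewrite; /about is a made-up URL): the same extensionless URL is served from whichever file actually exists, so switching from .html to .php later never changes the public address.
RewriteEngine On
# /about -> about.php if it exists...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]
# ...otherwise /about -> about.html
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]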
-
I did option 1 on one of my websites some time ago and it works fine; rankings are the same. It took about two months to get the same visits on all the links again.
-
We use the AddType approach all the time when updating websites. It's far easier to do that than to recreate everything and redirect it.
It allows all of your internal navigation to remain as-is, and it keeps all of your inbound links from becoming redirected links. Also, remember it has been announced that 301-redirected links lose value over time, so that's another reason not to do it the hard way.
-
Just make sure that you don't redirect all HTML files. I suspect that either way is equal. What you are telling Google in either case is:
"Hi Google, we have moved, but don't worry, we have moved here."
-
I would pick #2, where you process .html files with PHP. Changing URLs involves taking a temporary SEO hit and I would not recommend doing it.
Related Questions
-
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one of the issues I've spotted is the vast difference between the number of pages crawled by Googlebot according to our log files versus the number of pages indexed in GWT. Consider:
Number of pages crawled per log files: 2993
Crawl frequency (i.e. number of times those pages were crawled): 61438
Number of pages indexed by GWT: 17,182,818 (yes, that's right: more than 17 million pages)
We have a bunch of XML sitemaps (around 350) that are linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
Technical SEO | ufmedia
-
How to add specific Tumblr blogs into a disavow file?
Hi guys, I am about to send a reconsideration letter and am still finalizing my disavow file. The format of the disavow file is Domain:badlink.com (stripping out to the root domain), but what about toxic links that are located on Tumblr, such as badlink.tumblr.com? The issue is that there are good Tumblr links we got, so I don't want to add just tumblr.com. Do you guys think I will have issues submitting badlink.tumblr.com and not tumblr.com? Thank you!
Technical SEO | Ideas-Money-Art
-
Creating a CSV file for uploading 301 redirect URL map
Hi, if I'm bulk uploading 301 redirects, what's needed to create a CSV file? Is it just a case of creating an Excel spreadsheet with the old URLs in column A and the new URLs in column B, then converting to CSV and uploading? Or do I need to put in other details or parameters, etc.? Cheers, Dan
Technical SEO | Dan-Lawrence
-
Pointing a sub-domain to a sub-folder in htaccess
I have a client who currently uses Shopify for their blog. I want to set them up with a separate blog away from the Shopify system and host it in Australia. I know the best option is using a subfolder, but as Shopify is an unmovable CMS, can I somehow point my subdomain to a subfolder and get the benefits of the domain name? I could do this by using the rewrite rule in the .htaccess file. If I were to do this, would it end up cloaking the URLs of the articles?
Technical SEO | acs111
-
Htaccess Rewrites
Hi, I'm battling with duplicate content and would love to fix some redirections in my htaccess file:
1. I'm trying to use the www version of my site via 301
2. I'm trying to remove ALL /index.php from my URLs
I currently have the following code, but my home page /index.php is still not redirecting to the root?

Options +FollowSymLinks
Options +Indexes
<IfModule mod_rewrite.c>
RewriteEngine On
#RewriteBase /
RewriteCond %{HTTP_HOST} ^funeralcoverfinder.co.za$ [NC]
RewriteRule ^(.*)$ http://www.FuneralCoverFinder.co.za/$1 [R=301,NC,L]
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule .* index.php [F]
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteCond %{REQUEST_URI} !^/index\.php
RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [L]
</IfModule>

Any advice? Thanks so much!
Technical SEO | Klement69
-
Do index.php extensions count as duplicate content on Joomla sites?
When I run my error report, I see 2 duplicate pages, but both are the main domain and then the /index.php extension. How do I fix this? Does it really count as duplicate content?
Technical SEO | valetseo
-
Remove a directory using htaccess
Hi, can someone tell me if there's a way using .htaccess to say that everything in a particular directory, let's call it "A", is gone (HTTP 410 code), i.e. all the links should be de-indexed? Right now, I'm using the robots file to deny access. I'm not sure if it's the right thing to do, since Google Webmaster Tools is still showing the links as indexed, with a 403 error code. Thanks.
Technical SEO | webtarget
-
How long does it take for customized Google Site Search to show results from pdf files?
The site in question is http://www.ejmh.eu. I am pretty unsatisfied with the results I am getting from the Site Search provided by Google. We have over 160 PDF files in this subfolder: http://www.ejmh.eu/mellekletek. The files are the digital versions of articles. When I search for content in those PDF files, Google does not show results. It does show results from older pages, dating back 1-2 years, but it is certainly not showing anything from PDF files that I put up just 3 weeks ago. My questions:
1. If I place a Google Search on a site, does it not automatically display results from ALL the content in the root domain?
2. Is there any correlation between how the Site Search is indexing the files and how Google is indexing the URLs in general?
3. Should I just wait and see whether Site Search performance improves, or should I switch to other search software like Zoom Search?
It is vital to have a proper, high-quality search functioning on that site in the very near future. What are your experiences? Any tips are greatly appreciated.
Technical SEO | Lauroca