Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content — many posts will remain viewable — we have locked both new posts and new replies.
Converting files from .html to .php or editing .htaccess file
-
Good day all,
I have a bunch of files that are .html and I want to add some .php to them.
It seems my 2 options are
- Convert .html to .php and 301 redirect
or
- add this line of code to my .htaccess file and keep all files that are .html as .html
AddType application/x-httpd-php .html
My gut says the second option is better, so as not to alter any SEO rankings, but I wanted to see if anybody has experience with this line of code in their .htaccess file, as I definitely don't want to mess up my entire site.
Thanks for any help!
John
-
Hi John
The first line removes the existing handlers for those extensions.
The second line adds them back in a specific order, i.e. so that PHP executes first.
If you got it going that is what counts.
Good luck,
Don
-
Thanks so much for this, Don. This is what I added that seemed to work on my server:
AddHandler application/x-httpd-php .html .htm
The AddType directive caused errors on my server, but after some further research I found the code above.
I wonder whether what you propose would accomplish the same thing as what I did?
Thanks and all the best,
John
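For anyone landing here later, a minimal sketch of what this looks like in context. The handler/type name is host-dependent (mod_php vs. CGI/FastCGI builds differ), so treat the variants below as starting points rather than a universal fix — one may throw a 500 error where another works:

```apache
# Sketch only: tell Apache to run .html/.htm files through PHP.
# Which directive works depends on how your host has PHP wired up.

# Variant 1 (common on mod_php servers; caused errors on John's host):
# AddType application/x-httpd-php .html .htm

# Variant 2 (what worked for John's server):
AddHandler application/x-httpd-php .html .htm

# Some FastCGI/FPM hosts use a versioned handler name instead, e.g.:
# AddHandler application/x-httpd-php74 .html .htm
```

A quick way to verify is to drop a line of PHP such as `<?php echo "PHP is running"; ?>` into an .html file and load it in a browser: if you see the raw PHP source, the handler isn't taking effect.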
-
Hi John,
If the URLs are well indexed and doing well, you may not want to change them. To simply add the ability to run PHP, you can do it very easily with just what you thought: .htaccess.
In fact when I took over as webmaster on my corporate site which was indexed very well I had to do just that.
Add this to your .htaccess file:
RemoveHandler .html .htm
AddType application/x-httpd-php .php .htm .html
-
If you really want to go the extensionless-URL route, add this to your site's .htaccess:
RewriteCond %{SCRIPT_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]
So domain.com/file will serve file.html.
Again, the caveat is there is a short term SEO hit for doing this. Long term, you should be fine.
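Putting the two snippets above together, here is a hedged sketch of what a complete extensionless setup might look like. This assumes mod_rewrite is enabled; RewriteEngine On is required before any rules, and a !-f check is commonly added so requests for real files pass through untouched:

```apache
# Sketch only, assuming mod_rewrite is available.
RewriteEngine On

# 301 old /page.html URLs to the new extensionless /page.
# THE_REQUEST is matched so already-rewritten internal requests
# don't loop back through the redirect.
RewriteCond %{THE_REQUEST} \s/([^.]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally serve /page from page.html.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([^.]+)$ $1.html [NC,L]
```

The %1 backreference picks up the capture from the preceding RewriteCond. Test on a staging copy first: a loop between the redirect and the internal rewrite is the classic failure mode here.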
-
This is a sweet idea. Any tutorial on this? How does it affect existing links pointing at the .html and .php pages?
Thanks Keri!
-
Have you considered just rewriting your URLs so they don't use extensions at all? That way, when you use a different technology, you don't need to rewrite your URLs once again. If you look at SEOmoz, you see they don't use .php or .html as extensions, but instead have no extensions.
-
I did option 1 on one of my websites some time ago and it works fine; rankings are the same. It took about two months to get the same visits on all the links again.
-
We use the AddType approach all the time when updating websites. It's far easier to do that than to recreate everything and redirect it.
It allows all of your internal navigation to remain as is, and it keeps your inbound links from becoming redirected links. Also, remember that it has been announced that 301-redirected links lose value over time, so that's another reason not to do it the hard way.
-
Just make sure that you don't redirect all HTML files. I suspect that either way is equal. What you are telling Google in either case is:
"Hi Google, we have moved, but don't worry, we have moved here."
-
I would pick #2, where you process .html files with PHP. Changing URLs involves taking a temporary SEO hit and I would not recommend doing it.