Converting files from .html to .php, or editing the .htaccess file
-
Good day all,
I have a bunch of files that are .html, and I want to add some PHP to them.
It seems my two options are:
- Convert .html to .php and 301 redirect
or
- Add this line of code to my .htaccess file and keep all .html files as .html:
AddType application/x-httpd-php .html
My gut says the second way is better, so as not to alter any SEO rankings, but I wanted to see if anybody has experience with this line of code in their .htaccess file, as I definitely don't want to mess up my entire site.
Thanks for any help!
John
-
Hi John
The first line removes the handler for the extension.
The second line adds them back in a specific order, i.e. you want PHP to execute first.
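For context, here is a minimal sketch of the kind of two-line pair being described; the same RemoveHandler/AddType approach appears later in this thread. Handler behavior varies by host, so treat this as a starting point to test, not a guaranteed fix:
# Line 1: detach the default handler from the .html/.htm extensions
RemoveHandler .html .htm
# Line 2: map the extensions back so Apache hands them to PHP first
AddType application/x-httpd-php .php .htm .html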
If you got it going, that is what counts.
Good luck,
Don
-
Thanks so much for this, Don. This is what I added, which seemed to work on my server:
AddHandler application/x-httpd-php .html .htm
The AddType version caused errors on my server, but with some further research I found the code above.
I wonder if what you propose would accomplish the same thing as what I did?
Thanks and all the best,
John
-
Hi John,
If the URLs are well indexed and doing well, you "may" not want to change them. To simply add the ability to run PHP first, you can do it very easily with just what you thought: .htaccess.
In fact, when I took over as webmaster on my corporate site, which was indexed very well, I had to do just that.
Add this to your .htaccess file:
RemoveHandler .html .htm
AddType application/x-httpd-php .php .htm .html
-
If you really want to go this route, add this to your site .htaccess
RewriteEngine On
RewriteCond %{SCRIPT_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]
So domain.com/file will access file.html.
Again, the caveat is that there is a short-term SEO hit for doing this. Long term, you should be fine.
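One thing not shown above (so this is an assumption on my part, not from the thread): if you drop the extensions, you would also want an external 301 so old .html URLs and inbound links follow along. A hedged sketch, building on the RewriteEngine On block above:
# Externally 301 requests for /file.html to /file; matching THE_REQUEST (the raw
# client request line) avoids a redirect loop with the internal rewrite above
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]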
-
This is a sweet idea... any tutorial on this? How does it affect existing links directed at the .html and .php pages?
Thanks Keri!
-
Have you considered just rewriting your URLs so they don't use extensions at all? That way, when you switch to a different technology, you don't need to rewrite your URLs yet again. If you look at SEOmoz, you'll see they don't use .php or .html extensions; their URLs have none at all.
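As a side note not raised in the thread: on Apache, one low-effort way to get extensionless URLs is content negotiation. MultiViews can behave surprisingly when similarly named files exist, so this is a sketch to test, not a drop-in:
# A request for /file is automatically served by file.html or file.php if present
Options +MultiViews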
-
I did option 1 on one of my websites some time ago and it works fine; rankings are the same. It took about two months to get the same visits on all the links again.
-
We use the AddType directive all the time when updating websites. It's far easier to do that than to recreate everything and redirect it.
It allows all of your internal navigation to remain as is, and it keeps all of your inbound links from becoming redirected links. Also, remember that it has been announced that 301-redirected links lose value over time, so this is another reason not to do it the hard way.
-
Just make sure that you don't redirect all HTML files. I suspect that either way is equal. What you are telling Google in either case is:
"Hi Google, we have moved, but don't worry, we have moved here."
-
I would pick #2, where you process .html files with PHP. Changing URLs involves taking a temporary SEO hit and I would not recommend doing it.
-
Related Questions
-
Should I add my html sitemap to Robots?
I have already added the .xml version to robots.txt. But should I also add the HTML version?
Technical SEO | Trazo
-
Where does rel=canonical go? One file that manages sort order, view, filters, etc...
Where do I put the rel=canonical when the search.cfm page (using URL rewrite) is the one and only page, just using URL parameters to control sort, filter, view, etc.? Do I just put the rel=canonical at the top of the search.cfm page? The duplicate content issues I am getting are: https://www.domain.com/tx/austin/ https://www.domain.com/tx/austin/?d=25&h=&s=r&t=&v=l&a= Just want to be clear since Moz Pro is picking up both URLs, but it's only really one file, search.cfm. Thanks in advance for your help.
Technical SEO | ErnieB
-
Getting Google to index a large PDF file
Hello! We have a 100+ MB PDF with multiple pages that we want Google to fully index on our server/website. First of all, is it even possible for Google to index a PDF file of this size? It's been up on our server for a few days, and my colleague did a Googlebot fetch via Webmaster Tools, but it still hasn't happened yet. My theories as to why this may not work: A) We have no actual link(s) to the pdf anywhere on our website. B) This PDF is approx 130 MB and very slow to load. I added some compression to it, but that only got it down to 105 MB. Any tips or suggestions on getting this thing indexed in Google would be appreciated. Thanks!
Technical SEO | BBEXNinja
-
HTML Encoding Error
Okay, so this is driving me nuts because I should know how to find and fix this but for the life of me cannot. One of the sites I work for has a long-standing crawl error in Google WMT tools for the URL /a%3E that appears on nearly every page of the site. I know that a%3E is an improperly encoded > but I can't seem to find where exactly in the code it's coming from. So I keep putting it off and coming back to it every week or two, only to rack my brain and give up on it after about an hour (since it's not a priority and it's not really hurting anything). The site in question is https://www.deckanddockboxes.com/ and some of the pages it can be found on are /small-trash-can.html, /Dock-Step-Storage-Bin.html, and /Standard-Dock-Box-Maxi.html (among others). I figured it was about time to ask for another set of eyes to look at this for me. Any help would be greatly appreciated. Thanks!
Technical SEO | MikeRoberts
-
How can I make Google Webmaster Tools see the robots.txt file when I am doing a .htacces redirec?
We are moving a site to a new domain. I have set up an .htaccess file and it is working fine. My problem is that Google Webmaster Tools now says it cannot access the robots.txt file on the old site. How can I make it still see the robots.txt file when the .htaccess is doing a full site redirect? The .htaccess currently has:
Options +FollowSymLinks -MultiViews
RewriteEngine on
RewriteCond %{HTTP_HOST} ^(www.)?michaelswilderhr.com$ [NC]
RewriteRule ^ http://www.s2esolutions.com/ [R=301,L]
Google Webmaster Tools is reporting: "Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%."
Technical SEO | RalphinAZ
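A hedged sketch of one common fix for this scenario (my addition, not part of the original question): exempt robots.txt from the site-wide redirect so crawlers can still fetch it on the old domain:
# Serve the old domain's robots.txt directly; 301 everything else
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/robots\.txt$ [NC]
RewriteCond %{HTTP_HOST} ^(www\.)?michaelswilderhr\.com$ [NC]
RewriteRule ^ http://www.s2esolutions.com/ [R=301,L]
-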
Htaccess file
I need to redirect web pages which do not exist to a 404 error; this needs to be done in the .htaccess file. I am using a Linux server. The web pages I want to redirect are my domain name followed by a question mark, e.g. www.mydomain.com/?dfdds. I am using the following snippet in my .htaccess file; it redirects to bing.com so far. Please tell me how to change the snippet so that it redirects to the 404 error page.
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
Technical SEO | semer
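A hedged sketch of the change being asked for (my addition, not part of the original question): keep the same query-string condition but return a 404 instead of redirecting. Note this answers any URL carrying a query string, just as the original snippet did:
RewriteEngine on
RewriteCond %{QUERY_STRING} .
# With a status outside the 3xx range, the substitution is dropped and a 404 is returned
RewriteRule .* - [R=404,L]
-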
Is this tabbed implementation of SEO copy correct (i.e. good for getting indexed and in an ok spot in the html as viewed by search bots?
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, but where all tabs (only 2 right now) are viewed as one document by the search engines. I am pretty sure we have this working for the most part, but would love some quick feedback from you all as I have never worked with this approach before and these pages are some of our most important. Resources: http://www.ericpender.com/blog/tabs-and-seo http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643 Sample in use: http://www.seomoz.org/article/search-ranking-factors **Old Version: ** http://screencast.com/t/BWn0OgZsXt http://seatgeek.com/boston-celtics-tickets/ New Version with tabs: http://screencast.com/t/VW6QzDaGt http://screencast.com/t/RPvYv8sT2 http://seatgeek.com/miami-heat-tickets/ Notes: Content not displayed stacked on browser when Javascript turned off, but it is in the source code. Content shows up in Google cache of new page in the text version. In our implementation the JS is currently forcing the event to end before the default behavior of adding #about in this case to the url string - this can be changed, should it be? Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with copy, which I imagine could be considered spammy from a human review perspective (this wasn't intentional). This portion of the code is below the truncated view of the fetch as Googlebot, so we didn't have that resource. Are there any issues with hidden text / is this too far down in the html? Any/all feedback appreciated. I know our copy is old, we are in the process of updating it for this season.
Technical SEO | chadburgess
-
Is having a sitemap.xml file still beneficial?
Hi, I'm pretty new to SEO and something I've noticed is that a lot of things become relevant and irrelevant like the weather. I was just wondering if having a sitemap.xml file for Google's use is still a good idea and beneficial? Logically thinking, my websites would get crawled faster by having one. Cheers.
Technical SEO | davieshussein