Are htm files stronger than aspx files?
-
Hello All,
I once read that .htm files are considered stronger (SEO-wise) than .aspx files, and I wondered if that is correct.
Obviously, I mean the static parts of a site — for example, making my about-us page in .htm rather than .aspx. Among the advantages of aspx is the use of a master page (a template) for the design, etc.
Any thoughts?
Thanks
-
File extensions don't make any difference. I think you must have read about static pages vs. dynamic pages.
Generally, ASPX or PHP is used for developing dynamic websites (database-driven websites). But even if you're developing such a website, you can deal with this by URL rewriting. You can ask your developer/programmer to rewrite the URLs into SEO-friendly static URLs, i.e. without a query string (?) in the URL.
Hope this answers your query properly.
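To make that concrete: on IIS/ASP.NET, URL rewriting is commonly done with the IIS URL Rewrite module in web.config. This is just an illustrative sketch — the rule name, URL pattern, and target page (product.aspx) are hypothetical placeholders you'd adapt to your own site:

```xml
<!-- web.config fragment (sketch): serve /products/123 from product.aspx?id=123 -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="FriendlyProductUrls" stopProcessing="true">
        <!-- Match clean URLs like /products/123 -->
        <match url="^products/([0-9]+)/?$" />
        <!-- Internally rewrite to the dynamic page; the query string never shows in the address bar -->
        <action type="Rewrite" url="product.aspx?id={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

With a rule like this, visitors and search engines see the clean /products/123 URL, while the server still runs the same dynamic .aspx page behind the scenes.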
-
One Word: No
-
I agree with you; I love how WordPress does that automatically. You should definitely consider doing that within your .htaccess file.
-
I agree completely with Zach and would like to add one little thing. I prefer to rewrite my URLs to remove the extension altogether. (So does SEOmoz — take this page for example: http://www.seomoz.org/q/are-htm-files-stronger-than-aspx-files)
This makes it slightly more user-friendly, as it's one less piece of info for the user to remember/type, and makes for a slightly cleaner-looking URI.
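On an Apache server, extension-free URLs like that can be done with a few lines of mod_rewrite in .htaccess. A minimal sketch, assuming your pages are plain .html files in the document root (adjust for .htm or subfolders):

```apache
# .htaccess sketch: serve /about-us from /about-us.html
RewriteEngine On
# If the requested path isn't an existing file or directory...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# ...but the .html version does exist, serve it internally
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```

If the site already ranks with the .html URLs, you'd also want 301 redirects from the old extensions to the clean URLs so the link equity consolidates on one version.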
-
There is no difference for any file extension you use (htm, html, php, asp, aspx). When you're coding/programming a website, you need to remember that even though the file is written in ASPX, the file is processed by the server, which outputs HTML. The key is that the HTML is valid, and that any HTML5 follows generally accepted practices. Beyond this, the file extension has no bearing on SEO.
I hope this helps!