Are htm files stronger than aspx files?
-
Hello All,
I once read that .htm files are considered stronger (SEO-wise) than .aspx files, and I wondered whether that is correct.
Obviously, I mean the static parts of the site — for example, building my About Us page as .htm rather than .aspx. Among the advantages of .aspx is the use of a master page (a template) for the design, etc.
Any thoughts?
Thanks
-
File extensions don't make any difference. I think you must have read about static pages vs. dynamic pages.
Generally, .aspx or .php is used for developing dynamic (database-driven) websites. But even if you're building such a site, you can deal with this through URL rewriting. You can ask your developer/programmer to rewrite the URLs into SEO-friendly, static-looking URLs, i.e. without a query string (?) in the URL.
Hope this answers your question properly.
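On an ASP.NET/IIS site, this kind of rewriting is typically done in web.config. As a minimal sketch (assuming the IIS URL Rewrite module is installed; the rule name, URL pattern, and page name here are made up for illustration):

```xml
<!-- Hypothetical rule: lets visitors and crawlers use /products/42
     while the server internally serves product.aspx?id=42 -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="FriendlyProductUrl">
        <match url="^products/([0-9]+)/?$" />
        <action type="Rewrite" url="product.aspx?id={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The `type="Rewrite"` action keeps the friendly URL in the address bar; the query string never appears publicly.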
-
One Word: No
-
I agree with you — I love how WordPress does that automatically. You should definitely consider doing that within your .htaccess file.
-
I agree completely with Zach and would like to add one little thing. I prefer to rewrite my URLs to remove the extension altogether. (So does SEOmoz — take this page, for example: http://www.seomoz.org/q/are-htm-files-stronger-than-aspx-files)
This makes the site slightly more user-friendly, as it's one less piece of info for the user to remember/type, and it makes for a slightly cleaner-looking URI.
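On an Apache server, the extension can be hidden with a couple of mod_rewrite lines in .htaccess. A minimal sketch, assuming mod_rewrite is enabled and your pages are plain .html files (adjust the extension to match your setup):

```apache
RewriteEngine On
# If the request isn't a directory but a matching .html file exists,
# serve that file while keeping the extensionless URL in the browser
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```

With this in place, /about quietly serves /about.html without exposing the extension.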
-
There is no difference between any of the file extensions you might use (htm, html, php, asp, aspx). When you're coding a website, remember that even though the file is written in ASPX, it is processed by the server, which outputs HTML. The key is that the HTML is valid, or that your HTML5 follows generally accepted practices. Beyond that, the file extension has no bearing on SEO.
I hope this helps!