What's the SEO impact of URL suffixes?
-
Is there an advantage or disadvantage to adding a .html suffix to URLs in a CMS like WordPress? Plugins exist to do it, but it seems better for the user to leave it off. What do search engines prefer?
-
After I finish work, I will dig through my history and can hopefully deliver.
Update: I spent an hour going through my browser history and was not able to find it. Kinda freaks me out. Sorry.
-
I would love to see the study if you can find the link later. I agree that study results sometimes conflict with prior conceptions, and I have been mistaken before, but those results really sound counterintuitive.
-
There are some studies floating around the internet suggesting that CTR correlates with the extension shown in the SERP; .html supposedly improves CTR by up to 200%.
I tried hard to find the website I read it on, but I couldn't turn it up quickly. It included a "study" with eye tracking, and it made sense.
I've personally never run a test on it, but I decided to add .html to our documents by means of mod_rewrite (a sketch of that kind of rule is below).
As far as search engines are concerned, it does not matter at all. But the visitor should be more important, and since it does not negatively impact rankings, it's worth a try.
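For anyone who wants to try the same thing, here is a minimal sketch of the kind of .htaccess rule that could do it, assuming Apache with mod_rewrite enabled; the script name and query parameter are made-up placeholders, not taken from the post above.

```apache
# Hypothetical sketch: expose pages under an .html suffix via mod_rewrite.
RewriteEngine On

# A request such as /some-page.html is handled by a script internally,
# so the URL that visitors and search engines see always ends in .html.
RewriteRule ^([a-z0-9-]+)\.html$ /index.php?page=$1 [L,QSA]
```

The exact rule depends on how the CMS builds its URLs; in WordPress, the same effect is usually achieved by ending the permalink structure with .html rather than by hand-written rewrites.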
-
When possible, always remove .html or any other technology suffix such as .php, .htm, etc. A few reasons:
-
this information offers no value to users or search engines
-
it needlessly increases the length of your URL
-
it offers hackers an additional piece of information about your web server's files. You want to make the bad guys work as hard as possible.
-
it helps a lot with SEO when you change technologies. There are good reasons for .html pages to move to .php pages, and when that change is made, every page on the site needs to be 301-redirected. That process wastes link juice across the entire site, which could be avoided if the technology extension did not exist in the first place. While .html and .php pages are common today, next year .seo might be the popular extension. (A sketch of this kind of redirect is below.)
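To illustrate that migration scenario, here is a minimal, hypothetical .htaccess sketch (assuming Apache with mod_rewrite) that 301-redirects old .html URLs to extension-less ones and then maps the clean URL back to the underlying file, so the extension can change later without another round of redirects; the paths are examples only.

```apache
RewriteEngine On

# Hypothetical sketch: permanently redirect /about.html to /about
# so existing links and rankings are consolidated on the clean URL.
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally serve /about from about.html (swap the extension here
# if the site later moves to .php) without changing the public URL.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}\.html -f
RewriteRule ^(.+)$ $1.html [L]
```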
-