Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Removing .html from URLs - impact on rankings?
-
Good evening Mozzers. A couple of questions which I hope you can help with. Here's the first.
I am wondering: are we likely to see ranking changes if we remove the .html from the site's URLs?
For example
website.com/category/sub-category.html
Change to: website.com/category/sub-category/
We will of course make sure we 301 redirect to the new, user-friendly URLs, but I am wondering if anyone has previous experience of implementing this change and how it affected rankings.
By having the .html in the URLs, does this stop link juice from flowing back to the root category?
Second question:
If one page can be loaded both with and without a trailing forward slash "/", is this a duplicate page, or would Google consider them the same page? I would like to eliminate duplicate content issues if this is the case.
For example: website.com/category/ and website.com/category
Duplicate content/pages?
-
This applies to any link, not just a 301:
"The amount of PageRank that dissipates through a 301 is currently identical to the amount of PageRank that dissipates through a link."
So 301s are just fine.
-
Matt Cutts said, in 2013, that about 15% of PageRank is lost through a 301 redirect.
-
Thanks for the speedy answer, I had suspected the same thing so I'm glad we've come to the same conclusion. Thanks for your help.
-
Hi Joshua
subcategory.htm pages will perform just as well as subcategory/, and having .htm in the URL doesn't affect link juice flow at all. .htm and .html are perfectly valid extensions for HTML files; however, some people prefer shorter, "nicer"-looking URLs. If that's the case and the website is still in the early stages of SEO, then 301 redirect the .htm URLs and make sure every navigation element links to the non-.htm URLs in the future.
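For reference, the 301 redirect step described above might look something like this in an Apache .htaccess file. This is just a minimal sketch: it assumes Apache with mod_rewrite enabled and a server already configured to serve the extensionless URLs; other servers and CMSs will need a different approach.

```apache
RewriteEngine On

# 301-redirect any .htm/.html URL to its extensionless, slash-terminated form,
# e.g. /category/sub-category.html -> /category/sub-category/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.html?$ /$1/ [R=301,L]
```

The `!-f` condition skips real files on disk so assets aren't accidentally redirected.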
In some cases, the slash-ending and non-slash URLs can be treated as duplicate pages (even though I'm pretty sure Google will understand the honest mistake), so it's one of the basic SEO recommendations to set up redirects and make sure the website navigation doesn't mix the two. Also, SEO tools will keep sending you duplicate page title warnings, so it's better to clean it up as soon as possible.
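To enforce a single version for the trailing-slash question, a common pattern (again, a sketch assuming Apache with mod_rewrite; adjust for your stack) is to redirect the non-slash form:

```apache
RewriteEngine On

# Force a trailing slash so /category and /category/ resolve to one URL
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

Adding a rel=canonical tag on each page pointing at the slash version is a sensible belt-and-braces addition on top of the redirect.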
Hope it helps.
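If you're planning the migration in bulk, a quick way to sanity-check the old-to-new mapping discussed in this thread is a small script. This is purely an illustrative sketch (`normalize_url` is a hypothetical helper, not part of any Moz tool):

```python
def normalize_url(url: str) -> str:
    """Map a legacy URL to its clean, slash-terminated form.

    - strips a trailing .htm/.html extension
    - ensures exactly one trailing slash
    """
    for ext in (".html", ".htm"):
        if url.endswith(ext):
            url = url[: -len(ext)]
            break
    return url.rstrip("/") + "/"


# Build the 301 map for a batch of legacy URLs, skipping ones already clean
legacy = [
    "https://website.com/category/sub-category.html",
    "https://website.com/category",
    "https://website.com/category/",
]
redirect_map = {u: normalize_url(u) for u in legacy if u != normalize_url(u)}
for old, new in redirect_map.items():
    print(f"{old} -> {new}")
```

Feeding your full URL list through something like this before writing the redirect rules makes it easy to spot collisions (two old URLs mapping to the same new one) ahead of time.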