Taking out a .html plugin
-
We are currently working on a new site structure and would like to remove a WP plugin that adds a .html to the end of every URL.
From my understanding it is best not to use this plugin. Once I remove it, will I need to do anything for all the external links to still count? Will the link juice pass through? If you type my URL now without the .html, the browser adds the .html back in. However, all the external links we built have the .html in the URL.
Do I need any 301s or canonicals to pass link juice, or will I be fine after removing the plugin?
-
After removing the plugin, configure a sitewide 301 redirect that strips the .html and sends requests to the version without the file extension. That way, neither internal nor external links will lead to error pages, and you won't lose any link juice.
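For an Apache/WordPress setup, a sitewide rule along these lines would handle it. A minimal .htaccess sketch (generic, not taken from the asker's site):

```apache
# Permanently (301) redirect /any/path.html to /any/path
RewriteEngine On
# Skip real .html files on disk so only plugin-generated URLs redirect
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```

Place this above the standard WordPress rewrite block in .htaccess so it runs before WordPress's own routing.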
You'll also want to make sure your canonical tags point to the non-.html version of each page, especially if they're hand-coded.
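If the canonicals are hand-coded, each page's head should reference the extensionless URL. For example (the URL is hypothetical):

```html
<!-- Canonical points to the URL without the .html extension -->
<link rel="canonical" href="https://www.example.com/services" />
```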
Related Questions
-
I'm looking for a bulk way to remove over 600 old, nonexistent pages from the Google search results?
When I search on Google, site:alexanders.co.nz still shows over 900 results. There are over 600 nonexistent pages, and the 404/410 errors aren't working. The only way I can think of is doing it manually in Search Console using the "Removing URLs" tool, but that is going to take ages. Any idea how I can take down all those zombie pages from the search results?
Intermediate & Advanced SEO | Alexanders1 -
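One common bulk approach for cases like this (a sketch, not from the thread itself, with hypothetical paths) is to return a 410 Gone for whole retired sections in .htaccess, which Google generally drops faster than 404s or manual removals:

```apache
# Return 410 Gone for retired sections so crawlers drop them sooner
RedirectMatch 410 ^/old-catalogue/.*$
RedirectMatch 410 ^/archive/2012/.*$
```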
Improve Site Performance by Removing/Replacing Widgets & Plugins?
We completed a WordPress redesign of our website in December. The project took about 8 months. Important URLs on the new site are performing slowly according to Google PageSpeed Insights. For instance, a key product page scores 18 on mobile and 61 on desktop. The home page scores 37 on mobile and 80 on desktop. My new SEO believes the website is hindered by an excessive number of plugins and widgets, and that reducing their number may increase performance. Also, my developers were unable to get W3 Total Cache to work with our InMotion server and have used about 3 plugins for caching. We purchased a real estate theme (wpcasa) and heavily customized it. Any suggestions for improving performance? If we recoded the website from scratch without a pre-existing theme (using the existing design), would that speed up performance? Is there anything we can do to remove complexity and improve URL download speeds? We are in a very competitive niche and we need decent performance in order to rank. Thanks,
Intermediate & Advanced SEO | Kingalan1 -
How can I use AMP HTML on a CMS?
I have been trying to research using AMP to improve our mobile speed. We have a whole lot of sites on the same platform, managed by a CMS. From what I have read, AMP HTML can only be used on static pages. Does that mean we would not be able to incorporate it into the HTML through our CMS? I would like to implement this across all our homepages to test its effectiveness if possible, but there is no way to rebuild all our homepages statically. Any advice is much appreciated!
Intermediate & Advanced SEO | chrisvogel0 -
How to remove an international URL from the Google US index / hreflang help
Hi Moz Community, Weird/confusing question, so I'll try my best. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search, the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website. The Australian site owner removed this redirect per my boss's request, and now it leads to an unavailable webpage. I'm confused as to the best approach; is there a way to noindex the au.brand.com URL from US-based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US-based sub-category pages when using a site search. Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue. Thanks,
Intermediate & Advanced SEO | IceIcebaby
-Reed
-
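Where both regional versions should stay indexed for their own markets, hreflang annotations on each homepage are the usual mechanism. A sketch, with the URLs assumed from the question:

```html
<!-- Place on both the US and AU pages, each declaring every regional alternate -->
<link rel="alternate" hreflang="en-us" href="https://www.brand.com/" />
<link rel="alternate" hreflang="en-au" href="https://www.brand.com.au/" />
```

The tags must be reciprocal: each page listed has to carry the same set of alternates back, or Google ignores them.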
Redirecting index.html to the root
Hi, I was wondering if there is a safe way to consolidate link juice on a single version of a home page. I find incoming links to my site pointing to both mysite.com/ and mysite.com/index.html. I've decided to go with mysite.com/ as the main and only URL for the site, and now I'd like to transfer all link juice from mysite.com/index.html to mysite.com/.
Intermediate & Advanced SEO | romanbond
When I tried a 301 redirect from index.html to the root, it created an infinite loop, of course. I know I can use a RewriteRule, but will it transfer the juice? Please help!
-
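The loop usually happens because Apache internally serves index.html for /, which then matches the redirect rule again. Matching on the original request line avoids that. A minimal .htaccess sketch:

```apache
# Only redirect when the browser actually asked for /index.html,
# not when Apache internally maps / to index.html
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/index\.html[\s?]
RewriteRule ^index\.html$ / [R=301,L]
```

A 301 issued this way does pass link equity through to the root URL.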
HTML entities and SEO
I recently came across an article on HTML entities that discussed how they appear in search results. The same article also mentioned that their use might be considered spam. Since I know nothing of them (other than what I read in the one article), are they a good or bad idea for making meta descriptions stand out from the crowd?
Intermediate & Advanced SEO | casper4340 -
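For reference, an entity in a meta description is just an encoded character; for example, a check mark or bullet written as a numeric entity (a hypothetical snippet, and no guarantee Google will display it):

```html
<!-- &#10003; renders as a check mark, &#8226; as a bullet, in the snippet -->
<meta name="description" content="&#10003; Free shipping &#8226; 30-day returns on all orders." />
```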
How to resolve a Duplicate Page Content issue for the root domain & index.html?
SEOmoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this has not had an impact, and we have since removed it. What's the best way (in an HTML website) to ensure all index.html links are automatically redirected to the root domain and these aren't seen as two separate pages?
Intermediate & Advanced SEO | ContentWriterMicky0 -
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors (~9,000) on our eCommerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content/titles due to URLs like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
Intermediate & Advanced SEO | MonsterWeb280
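A common way to keep crawlers out of parameterised URLs like those (a sketch, and note it only stops crawling, not deindexing of pages already indexed) is a wildcard Disallow in robots.txt:

```text
# Block any URL whose path contains ".html?" (faceted/sorted duplicates)
User-agent: *
Disallow: /*.html?
```

Google and Bing honour the * wildcard, though it is not part of the original robots.txt standard; canonical tags on the filtered pages pointing at the clean category URL are usually the safer long-term fix.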