Multiple 301 redirects for an HTTPS URL. Good or bad?
-
I'm working on an ecommerce website that has a few snags and issues with its coding.
The site uses HTTPS, and when you access it via domain.com, there's a 301 redirect to http://www.domain.com, which is then, in turn, redirected to https://www.domain.com.
Would this have a detrimental effect, or is that considered the best way to do it - have the website redirect to http and then redirect all http access to the https URL?
Thanks
-
My personal rule of thumb - as few redirect jumps as possible. Three main reasons:
1. User journey + browsers - When too many redirects are chained together, some browsers struggle to follow them and simply won't load the page. Even with only two or three, the page may load, but users on slower connections may find it tiresome waiting for content to appear.
2. As ThompsonPaul highlights, you COULD lose some link value due to dilution through 301 redirects.
3. Multiple 301 redirects are often used by spammers, and I expect chained redirects to cause a lot of ranking headaches in the near future. The older the site, the longer the chain can end up - for example, imagine you had a product at:
https://domain.com/product1
Links to that page exist at domain.com/product1. The journey would be: domain.com/product1 > http://domain.com/product1 > https://domain.com/product1
Now imagine a year down the line, product 1 is discontinued and you decide to redirect https://domain.com/product1 to domain.com/product2
Imagine your journey now:
domain.com/product1 > http://domain.com/product1 > https://domain.com/product1 > domain.com/product2 > http://domain.com/product2 > https://domain.com/product2
This could carry on indefinitely in the lifetime of the site...
Best solution: decide which version of the site you want to use and stick to a single redirect, not a chain. Periodically check for chained redirects and resolve them as you go along - for example, point retired product URLs straight at their final destination, as in the sketch below. (I try to do this twice a year.)
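As a rough sketch in Apache .htaccess terms - assuming the canonical version of the site is https://www.domain.com, and with /product1 and /product2 as made-up paths - the discontinued-product redirect might look something like this:

RewriteEngine On

# Hypothetical: product1 is discontinued, so send its old URL straight
# to the final, canonical destination in a single 301...
RewriteRule ^product1$ https://www.domain.com/product2 [R=301,L]

# ...rather than to a non-canonical URL that the protocol/host redirects
# would then pick up, turning one move into a chain:
# RewriteRule ^product1$ http://domain.com/product2 [R=301,L]

The only difference between the two rules is the target, but the first takes visitors (and crawlers) to the final page in one hop.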
-
To answer your specific question, Jason, yes, there's an issue with those URLs going through two consecutive redirects.
Each redirect, like any link, costs a little bit of "link juice". So running through two consecutive redirects wastes roughly twice as much link juice as having the origin URL redirect straight to the final URL without the intermediate step. It's not a massive difference, but on an e-commerce site especially, there's no point in wasting any. (Some folks reckon the loss could be as high as 15% per link/redirect.) Plus, I've occasionally seen problems with referrer data being maintained across multiple redirects (anecdotal).
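To put a rough number on it, using that (unconfirmed) 15% figure purely for illustration: a single 301 would pass about 85% of the value, while two chained 301s would pass about 0.85 x 0.85 ≈ 72% - so the unnecessary intermediate hop costs roughly another 13 percentage points.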
Hope that answers your specific question?
Paul
-
I agree with Jane. Unless there are reasons why the whole site needs to be secure, it makes more sense for just the areas where sensitive information is being submitted to be SSL encrypted.
http: requests are processed more quickly than https: ones due to the SSL handshake required to produce the cryptographic parameters for the user's session - so your site would be a little quicker if you weren't using SSL.
However, if you do decide to use http: rather than https: for the product and category pages, as Jane has suggested, you'd need to ensure that the https: versions of those pages redirect to http: - again, to avoid duplicate content. (A rough sketch of that kind of rule is below.)
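As a minimal .htaccess sketch of that split setup - assuming Apache mod_rewrite, and with /checkout and /account as made-up stand-ins for whatever paths actually handle sensitive data (those would stay on https:):

RewriteEngine On

# Hypothetical: send HTTPS requests for non-sensitive pages back to HTTP
# so there is only one indexable version of each product/category URL.
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(checkout|account) [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]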
-
Hi Jason,
To add to what Yusuf has said, is there a specific reason why the whole site has to use SSL, rather than just the parts of the website where sensitive information is passed? If not, I would be tempted to recommend that the e-commerce pages (products, categories, etc.) remain on HTTP URLs.
Cheers,
Jane
-
Hi Jason,
It's fine to 301 redirect from http: to https: and it's quite common for sites that use SSL. It's exactly the same principle as redirecting from a non-www to www (e.g. http://example.com to http://www.example.com) - which is considered to be good practice. But there should only be a single redirect. So you should ensure that http://example.com redirects to https://www.example.com without first redirecting to http://www.example.com.
I would also make sure that all pages (not just the homepage) redirect from http: to https:, to ensure there are no duplicate content issues on the rest of the site. (A rough sketch of a single-hop rule is below.)
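As a minimal .htaccess sketch of that single-hop rule - assuming Apache mod_rewrite and that https://www.example.com is the canonical version:

RewriteEngine On

# Anything that is either on HTTP or missing the www goes straight to
# https://www.example.com/... in one 301, keeping the requested path.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Because both conditions feed the same rule, http://example.com/page, https://example.com/page and http://www.example.com/page all land on https://www.example.com/page in a single redirect rather than bouncing through an intermediate URL first.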
Related Questions
-
Default Wordpress 301 Redirects of JS and CSS files. Bad for SEO & How to Fix?
Hi there: We are developers with some digital marketing expertise, but a current issue has us perplexed. An outside SEO firm has asked us to clean up a large number of 301 redirects. Most of these are 'default' Wordpress behavior relating to calling the latest version of a JS or CSS file. For instance, a JS file is called with this: https://websitexyz.com/wp-includes/js/wp-embed.min.js?ver=4.9.1 but ultimately redirects to this: https://websitexyz.com/wp-includes/js/wp-embed.min.js. We are being asked to prevent the redirect from happening by, presumably, calling the ultimate file to begin with. The issue is that, as far as we know, there's no easy way to alter WP behavior to call the ultimate file to begin with. Does anyone have any thoughts on this? Thanks.
Intermediate & Advanced SEO | Daaveey
-
301 redirect hops from non-https and www
It's best practice to minimize the number of 301 redirect hops - ideally only one hop. It's also best practice to 301 redirect (or at least canonical) your non-https and/or your non-www (or www) to the canonical protocol/subdomain. The simplest (and possibly most common) way to implement canonical protocol/subdomain redirects is through a load balancer, or before your app processes the request. Both of these will just blanket-301 to the canonical domain/protocol regardless of whether the path exists or not. In which case, you could have:
1. Two hops - e.g. hop #1: http://example.com/foo to https://example.com/foo; hop #2: https://example.com/foo to https://example.com/bar.
2. A 301 to a 404 - let's say https://example.com/dog never existed, but somebody for whatever reason linked to it (maybe a typo). If I request https://www.example.com/dog, the load balancer would 301 to a 404 page.
Either scenario above should be fairly rare. However, you can't control how people link to you. Should I care about either scenario above? I could have my app attempt to check whether the page exists before forwarding, but that code could be complicated.
Intermediate & Advanced SEO | dsbud
-
Why is rel="canonical" pointing at a URL with parameters bad?
Context: our website has a large number of crawl issues stemming from duplicate page content (source: Moz). According to an SEO firm which recently audited our website, some of these crawl issues are due to URL parameter usage. They have recommended that we "make sure every page has a Rel Canonical tag that points to the non-parameter version of that URL…parameters should never appear in Canonical tags." Here's an example URL where we have parameters in our canonical tag: http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/ carries <link rel="canonical" href="http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/?pageSize=0&pageSizeBottom=0" />. Our website runs on IBM WebSphere v7.
Questions: Why is it important that the rel canonical tag points to a non-parameter URL? What is the extent of the negative impact from having rel canonicals pointing to URLs that include parameters? Any advice for correcting this? Thanks for any help!
Intermediate & Advanced SEO | Solid_Gold
-
Geo-Redirect: good idea or not?
Hi Mozzers,
The background: I have this very corporate .com domain which is used worldwide. Next to that, we have another .com domain which was created specifically for US visitors. Within the organic rankings, we notice that our corporate domain is ranking much better in the US, and many visitors are arriving on this domain. As it is a corporate domain being used worldwide, they get lost.
My questions: I know there are ways to redirect by location. Would it be smart to automatically redirect US visitors on the corporate domain to the commercial US-specific domain? Is it possible to only redirect US visitors and leave the website as it is for visitors from other countries? Won't this harm the corporate website (organically) worldwide? If this would be a good idea, any recommended plugins or concrete procedures?
Thank you so much for helping me out!
Sander
Intermediate & Advanced SEO | WeAreDigital_BE
-
New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
We recently launched a redesign/redevelopment of a site but failed to put 301 redirects in place from the old URLs to the new pages. It's been about 2 months. Is it too late to even bother worrying about it at this point? The site has seen a notable decrease in traffic/visits, perhaps due to this issue. I assume that once the search engines get an error on a URL, they will remove it from search results after a period of time. I'm just not sure if they will try to re-crawl those old URLs at some point; if so, it may be worth having those 301 redirects in place. Thank you.
Intermediate & Advanced SEO | BrandBuilder
-
Can an incorrect 301 redirect or .htaccess code cause 500 errors?
Google Webmaster Tools is showing the following message: "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." Before I contact the person who manages the server and hosting (essentially asking if the error is on his end), is there a chance I could have created an issue with an incorrect 301 redirect or other code added to .htaccess incorrectly? Here is the 301 redirect code I am using in .htaccess:

RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/.]+/)*(index\.html|default\.asp)\ HTTP/
RewriteRule ^(([^/.]+/)*)(index|default) http://www.example.com/$1 [R=301,L]
RewriteCond %{HTTP_HOST} !^(www\.example\.com)?$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]

Could adding the following code after that in the .htaccess potentially cause any issues?

# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES

(Edit) I'd like to add that there is a Wordpress blog on the site too at www.example.com/blog with the following code in its .htaccess:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
# END WordPress

Thanks
Intermediate & Advanced SEO | kimmiedawn
-
Creating 100,000s of pages, good or bad idea
Hi Folks, Over the last 10 months we have focused on quality pages, but have been frustrated with competitor websites outranking us because they have bigger sites. Should we focus on the long tail again? One option for us is to take every town across the UK and create pages using our activities, e.g.:
Stirling
Stirling paintball
Stirling Go Karting
Stirling Clay shooting
We are not going to link to these pages directly from our main menus but from the site map. These pages would then show activities within a 50 mile radius of the towns. At the moment we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, focusing all the internal link juice on these regional pages, but we don't rank highly for towns that the activity sites are close to. With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? This is my main worry - or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail like we used to? Is there a limit to how big a site should be?
Intermediate & Advanced SEO | PottyScotty
-
Magento: URLs for Products in Multiple Categories
I am working in Magento to build out a large e-commerce site with several thousand products. It's a great platform, but I have run into the issue of what it does to URLs when you put a product into multiple categories. Basically, "a book" in two categories would make two URLs for one product: 1) /books/a-book, 2) /author-name/a-book. So I need to come up with a solution for this. It seems I have two options:
1. Found this from a Magento SEO article: 'Magento gives you the ability to add the name of categories to the path for product URLs. Because Magento doesn't support this functionality very well - it creates duplicate content issues - it is a very good idea to disable this. To do this, go to System => Configuration => Catalog => Search Engine Optimization and set "Use categories path for product URLs" to "no".' This would solve the issue and be a quick fix, but I think it's a double-edged sword, because then we lose the SEO value of our well-named categories being in the URL.
2. Use canonical tags. To be fair, I'm not even sure this is possible. Even though it is creating different URLs and, thus, poses a risk of "duplicate content" being crawled, there really is only one page on the admin side. So I can't go to all of the "duplicate" pages and put a canonical tag, because those duplicate pages don't really exist on the back-end. Does that make sense?
After typing this out, it seems like the best thing to do will probably be to just turn off categories in the URL from the admin side. However, I'd still love any input from the community on this. Thanks!
Intermediate & Advanced SEO | Marketing.SCG