HTTPS & 301s
-
Hi
- Like most sites, we have set up a redirect from HTTP to HTTPS.
- We also changed our website and set up redirects from .ASP pages to PHP pages.
- We are now seeing two chained redirects across the whole website:
http://www.domain.com/oldwebpage.asp → (1) https://www.domain.com/oldwebpage.asp → (2) https://www.domain.com/newwebpage.php
The question is: is there any way of making this one redirect instead of two?
thanks
Enver -
Just to make sure I understand, can you clarify the sequence of the changes and how long each version was live? Do you know if one set of URLs has links pointing to it, or was ever indexed?
Let me explain.
It sounds like you had an ASP site served over HTTP, so you had URLs like:
http://www.website.com/file.asp (we will call this URL type A)
You then converted to HTTPS, so the URLs looked like:
https://www.website.com/file.asp (we will call this URL type B)
You then updated to a PHP site, so now the URLs look like this:
https://www.website.com/file.php (we will call this URL type C)
You can set up 301s to go from A to B and then another set to go from B to C. Your question is: can you set up a 301 to go directly from A to C? The answer is yes, and you should do this. Any time you can reduce the number of hops, the better.
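As a sketch of what the single-hop rule could look like on an Apache server with mod_rewrite (the domain and filenames below are hypothetical placeholders, not your actual URLs):

```apache
# Hypothetical .htaccess sketch: collapse HTTP -> HTTPS and .asp -> .php
# into one 301 per page.
RewriteEngine On

# One 301 per old page: protocol upgrade and file rename in a single hop.
RewriteRule ^oldwebpage\.asp$ https://www.domain.com/newwebpage.php [R=301,L]

# Catch-all for any remaining HTTP requests: upgrade the protocol in one hop.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
```

Note that rule order matters here: the page-specific rule comes before the generic HTTPS rule, so a request for the old .asp URL over HTTP gets one hop straight to the final page instead of two.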
What you need to think about is: what about the A to B and the B to C redirects? At a minimum, you need to eliminate the A to B 301s, since you have now decided to skip B and go straight to C. What about the B to C 301s? It depends. If version B of the website was live for a while, was indexed by Google, and has links built to its URLs, then yes, you need to leave the B to C redirects in place. You don't want to lose any of that equity.
Likewise, let's say a version D of the site comes out a year later, with URLs like:
https://www.website.com/file.html (we will call this URL type D)
By then you have lots of links into the C version of the site.
You then need the A URLs to 301 to the D URLs (and get rid of the A to C 301s), you need the B URLs to 301 to the D URLs, and so on. In other words, go through another round of cleaning up the 301s and reducing the hops.
Why do all this? Two reasons. First, there will still be links to the A, B, and C versions of the site. Google will still find and crawl them, and you want to get credit for those links to your site. Second, Google keeps an internal log of URLs and will check them from time to time, even if no one is linking to them. You want Google to find the right URL. In either case, if Google hits a version A URL, it would otherwise have to go to version B via a 301 and then on to version C. It can do that, but it would rather have one hop.
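The cleanup step above (pointing every old URL directly at its final destination) is really just flattening a redirect map. Here is a minimal sketch of that idea; the URLs are the hypothetical A/B/C examples from above, not a real site:

```python
def flatten_redirects(redirects):
    """Resolve each source URL to its final target, collapsing multi-hop chains.

    redirects: dict mapping an old URL to the URL it 301s to.
    Returns a dict where every source points directly at its final destination.
    """
    flat = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we reach a URL that doesn't redirect further
        # (the `seen` set guards against accidental redirect loops).
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

# A two-hop history like the A -> B -> C example above:
chain = {
    "http://www.website.com/file.asp": "https://www.website.com/file.asp",   # A -> B
    "https://www.website.com/file.asp": "https://www.website.com/file.php",  # B -> C
}
print(flatten_redirects(chain))
```

After flattening, both the A and B URLs point straight at the C URL, so Google (or any visitor) hitting an old URL gets exactly one hop.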
Side note: try not to use global 301s, where you just 301 a bunch of pages to the home page. That does nothing for you as far as link equity. Try to make the 301s a one-to-one relationship as much as possible.
Take a look at this video; it backs up what I just said. The number of hops is discussed at about the 3-minute mark, but the whole video is worth watching: https://www.youtube.com/watch?v=r1lVPrYoBkA
-
I'm not sure I understand. What is wrong with the ASP -> PHP redirect?