HTTPS & 301s
-
Hi
- Like most sites, we have set up a redirect from HTTP to HTTPS.
- We also rebuilt our website and set up redirects from the old .asp pages to the new .php pages.
- We are now seeing two redirects in place across the whole website:
http://www.domain.com/oldwebpage.asp > https://www.domain.com/oldwebpage.asp (redirect 1) > https://www.domain.com/newwebpage.php (redirect 2)
The question is: is there any way of making this one redirect instead of two?
Thanks
Enver -
Just to make sure I understand, can you clarify the sequence of the changes and how long each version was live? Do you know whether one set of URLs has links pointing to it or was ever indexed?
Let me explain.
It sounds like you had an ASP site running over HTTP, so you had URLs like:
http://www.website.com/file.asp (we will call this URL type A)
You then converted to HTTPS, so the URLs looked like:
https://www.website.com/file.asp (we will call this URL type B)
You then updated to a PHP site, so now the URLs look like this:
https://www.website.com/file.php (we will call this URL type C)
You can set up 301s to go from A to B and then another set to go from B to C. Your question is whether you can set up a 301 to go straight from A to C, and the answer is yes. You should do this. Any time you can reduce the number of hops, the better.
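On an Apache server this is usually done with mod_rewrite rules in an .htaccess file. The sketch below is a hypothetical illustration, not your exact config: the domain is a placeholder and it assumes your .asp-to-.php mapping is a simple extension swap. Rule order is the whole trick here: the .asp rule has to come first, so an old HTTP .asp URL jumps straight to the final HTTPS .php URL in a single 301 instead of chaining through the HTTPS upgrade.

RewriteEngine On

# Old .asp URLs, whether requested over HTTP or HTTPS, go straight to
# the new .php URL in a single 301. Keeping this rule first is the key:
# it covers both the A-to-C and the B-to-C hops directly. (NC makes the
# match case-insensitive, so .ASP is caught too.)
RewriteRule ^(.+)\.asp$ https://www.website.com/$1.php [NC,R=301,L]

# Anything else still arriving over HTTP gets upgraded to HTTPS.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]

(If you're on Nginx or IIS instead, the same principle applies: order the rules so a legacy URL maps to its final destination in one jump.)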
What you need to think about is: what about the A to B and the B to C redirects? At a minimum, you need to eliminate the A to B 301s, since you have now decided to skip B and go straight to C. What about the B to C 301s? It depends. If version B of the website was live for a while, was indexed by Google, and has links built to its URLs, then yes, you need to leave the B to C redirects in place. You don't want to lose any of that equity.
Likewise, let's say a version D of the site comes out a year later, with URLs like this:
https://www.website.com/file.html
By then you have lots of links pointing into the C version of the site. You then need the A URLs to 301 to the D URLs (and get rid of the A to C 301s), the B URLs to 301 to the D URLs, and so on. In other words, go through another round of cleaning up the 301s and reducing the hops.
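To make that concrete, here is what that hypothetical cleanup pass might look like in the same .htaccess sketch once version D ships (again, placeholder domain and a mechanical extension swap assumed):

RewriteEngine On

# Every legacy pattern (A/B .asp and C .php) 301s straight to the
# final D URL in one hop.
RewriteRule ^(.+)\.asp$ https://www.website.com/$1.html [NC,R=301,L]
RewriteRule ^(.+)\.php$ https://www.website.com/$1.html [NC,R=301,L]

# Remaining HTTP traffic still gets the HTTPS upgrade.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]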
Why do all this? Two reasons. First, there will still be links to the A, B, and C versions of the site; Google will still find and crawl them, and you want to get credit for those links. Second, Google keeps an internal log of URLs and will recheck them from time to time, even if no one is linking to them, and you want Google to land on the right URL. In either case, if Google hits a version A URL, it would have to go to version B via a 301 and then on to version C. It can do it, but it would rather have one hop.
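An easy way to verify how many hops a given URL actually takes is a quick curl check from the command line (assuming curl is available; swap in one of your real legacy URLs):

curl -sIL http://www.website.com/file.asp | grep -iE "^(HTTP|location)"

Each HTTP status line and Location header in the output is one hop; ideally you see a single 301 followed by a 200. (The -I flag sends a HEAD request, which the odd server handles differently from GET, so sanity-check against a browser if the output looks strange.)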
Side note: try not to use global 301s, where you just 301 a bunch of pages to the home page. That does nothing for you as far as link equity. Try to make the 301s a one-to-one relationship as much as possible.
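As a hypothetical illustration of the one-to-one idea (the page names here are made up), mod_alias Redirect directives work well when the old and new paths don't follow a mechanical pattern:

# Avoid a catch-all that dumps every old URL on the home page:
#   RewriteRule .* https://www.website.com/ [R=301,L]

# Prefer explicit 1-to-1 mappings instead:
Redirect 301 /old-services.asp https://www.website.com/our-services.php
Redirect 301 /contact-us.asp https://www.website.com/contact.php

One caveat: if you mix these with mod_rewrite rules like the ones above, test carefully, since both modules can act on the same request.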
Take a look at this video, which backs up what I just said. The number of hops is discussed at about the 3-minute mark, and the whole video is worth watching: https://www.youtube.com/watch?v=r1lVPrYoBkA
-
I'm not sure I understand. What is wrong with the ASP -> PHP redirect?