Do you still lose 15% of the value of inbound links when you redirect your site from HTTP to HTTPS (so all inbound links to the HTTP version are redirected to the HTTPS version)?
-
I know that when you redesign your own website, you lose about 15% internally due to the 301 redirects (see this Moz article: https://moz.com/blog/accidental-seo-tests-how-301-redirects-are-likely-impacting-your-brand), but I'm wondering if that also applies to the value of inbound links when you redirect http://www.sitename.com to https://www.sitename.com.
I appreciate your help!
-
Exactly what Thomas said.
-
If it's easy for you to get them to change them, then yes, I would; however, I wouldn't spend more than a quick email on getting it done.
-
Thanks, Logan. So do you think there is any value in having the top inbound links (highest domain authority) change their links to the https version of the site?
-
There is no value or authority lost with the redirects from non-secure to secure. Google gives slightly more weight to sites that go secure, so the trade-off is that they don't ding those types of redirects anymore.
Gary Illyes also recently stated that no type of 3xx redirect loses PageRank anymore. Take that with a grain of salt, though; I only mention it to put you at ease about HTTP > HTTPS redirects.
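If you want to verify your own redirects, here's a minimal sketch in Python using the third-party requests library. It assumes each HTTP URL should answer with a single 301 hop straight to its HTTPS counterpart; the domain is the placeholder one from the question.

```python
import requests

def check_https_redirect(http_url):
    # allow_redirects=False lets us inspect the first response directly,
    # so redirect chains (301 -> 301 -> 200) can't hide extra hops
    response = requests.get(http_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    single_301 = status == 301 and location.startswith("https://")
    print(f"{http_url} -> {status} {location} (single 301 hop: {single_301})")
    return single_301

# Placeholder URL; swap in your own URLs to test
check_https_redirect("http://www.sitename.com/")
```

Either way, a single clean hop from every HTTP URL to its exact HTTPS counterpart (rather than chaining through intermediate redirects) is the setup most people aim for, even if PageRank loss is no longer the concern it was.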
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to only access via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of accessing via HTTP/2 working
- 301 redirects set up for non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to only go through HTTP/1.1, not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERP... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | | AKCAC1 -
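One thing worth ruling out is whether the server really negotiates HTTP/2 from the outside. A hedged sketch using the third-party httpx library (installed with its http2 extra: pip install 'httpx[http2]'); the URL is a placeholder. Note that even when HTTP/2 is available, Googlebot decides per site whether to crawl over it, so HTTP/1.1 crawling alone isn't necessarily a fault.

```python
import httpx

# http2=True enables HTTP/2 negotiation via ALPN; if the server only
# speaks HTTP/1.1, httpx silently falls back and http_version shows it
with httpx.Client(http2=True) as client:
    response = client.get("https://www.example.com/")  # placeholder URL
    print(response.http_version, response.status_code)
```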
Working out whether a site is http and https
Hi there, I can access the following site with both http and https, making me think there will be a duplicate content issue. How can I work out if this is the case? http://ionadebarge.com https://ionadebarge.com Thanks.
Technical SEO | | Bee1591 -
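One quick way to check, sketched in Python with the requests library: fetch both protocol versions without following redirects. If both return 200 (rather than one 301-ing to the other), both versions are live and you'd want a redirect or canonical tag to consolidate them.

```python
import requests

for url in ("http://ionadebarge.com", "https://ionadebarge.com"):
    # allow_redirects=False shows whether one version redirects to the other
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", response.status_code,
          response.headers.get("Location", "no redirect"))
```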
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Site Map section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
It then lists sub sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml. The sub sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the sitemap URLs below: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, and that despite the reported 0 pages indexed in the GSC sitemap section, the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > WebPages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
Technical SEO | | Dan-Lawrence0 -
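As a first diagnostic step, you could pull the sitemap index and flag any child sitemaps still referenced over plain HTTP. A rough sketch in Python (standard library plus requests); the sitemap URL is the placeholder one from the question.

```python
import requests
import xml.etree.ElementTree as ET

# Standard sitemap XML namespace
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

index_xml = requests.get("https://www.domain.com/sitemap_index.xml",
                         timeout=10).text
root = ET.fromstring(index_xml)

# A sitemap index lists child sitemaps in <loc> elements; flag any
# that still use the non-secure scheme
for loc in root.findall(".//sm:loc", NS):
    child_url = (loc.text or "").strip()
    if child_url.startswith("http://"):
        print("Non-HTTPS sitemap reference:", child_url)
```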
Really stuck, guys: how do you earn links for an ecommerce site?
Hi SEOmoz guys, I read a lot on SEOmoz and other sites, and see a lot of info about earning links, but it all seems to be aimed at blogs and brochure sites. Being blunt: we sell stuff. Our competitors sell the same stuff. We are limited in what we can do with product description content, as our suppliers expect us to use the content they supply. We are now putting together Q&A pages based on a great article on the SEOmoz blog. Does anyone have any great ideas for developing link bait, or just getting links to an ecommerce site? We know our service is good and our product is good, but that's not enough to convince someone to link to us. Help! Thanks, Paul
Technical SEO | | TheUniqueSEO0 -
Site Wide Links
I have a link on a PR 3 home page, placed in the sidebar. It is on a WordPress website that spans a couple hundred pages, and the sidebar is on every page. The majority of the pages are not ranked and don't have any PR. Can this affect me negatively?
Technical SEO | | raph39880 -
Asking to remove links from other sites
How hard is it to get people to take down links on their sites that point to yours? I have about 4 sites whose blogrolls I would like my link OFF of, because I think I was hit by the Penguin update because of them. Do you know if there is anything you can do if they DON'T take it off?
Technical SEO | | SeaC0 -
HTTPS pages still in the SERPs
Hi all, my problem is the following: our CMS (self-developed) produces https versions of our "normal" web pages, which means duplicate content. Our IT department put the noindex, nofollow meta tag on the https pages; that was about 6 weeks ago. I check the number of indexed pages once a week and still see a lot of these https pages in the Google index. I know that I may hit different data centers and that these numbers aren't 100% valid, but still... sometimes the number of indexed https pages even moves up. Any ideas/suggestions? Wait longer? Or take the time and go to Webmaster Tools to kick them out of the index? Another question: for a nice query, one https page ranks No. 1. If I kick the page out of the index, do you think the http page will replace it at the No. 1 position? Or will the ranking be lost? (It sends some nice traffic :-))... Thanks in advance 😉
Technical SEO | | accessKellyOCG0 -
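A quick way to confirm the noindex is actually being served on the HTTPS copies, sketched with requests and a simple regex. The URL is a placeholder, and the regex is only a heuristic; a proper HTML parser would be more robust.

```python
import re
import requests

def has_noindex(url):
    response = requests.get(url, timeout=10)
    # noindex can arrive as an HTTP header or as a meta robots tag
    header = response.headers.get("X-Robots-Tag", "")
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        response.text,
        re.IGNORECASE,
    )
    return "noindex" in header.lower() or meta is not None

print(has_noindex("https://www.example.com/some-page"))  # placeholder
```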
Redirect not picking up any link juice
Hi, We recently had a domain name change; since we had an established site, we had all pages redirected to the new domain. This was over a month ago, but despite the redirect, SEOmoz doesn't recognise any links to or from the site. Is this simply a matter of time, with SEOmoz unable to pick up on any redirected info yet, or could there be a problem with the redirect? Thanks, Adam
Technical SEO | | adamgthorndike0