HTTPS pages - To meta no-index or not to meta no-index?
-
I am working on a client's site at the moment and I noticed that both the HTTP and HTTPS versions of certain pages are indexed by Google, and both show in the SERPs when you search for the content of those pages.
I just wanted to get some opinions on whether the HTTPS pages should be noindexed (e.g. with an X-Robots-Tag header added via an .htaccess rule) or whether they should be left as is.
-
Hi Jamie,
If you don’t need the HTTP version accessible and want to force HTTPS, you could simply redirect all traffic to the secure site with a 301, passing your PageRank along to the main site.
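For reference, a force-HTTPS rule in .htaccess typically looks something like this (a minimal sketch assuming Apache with mod_rewrite enabled; adjust and test for your own setup):

RewriteEngine On
# Any request that arrived over plain HTTP gets a permanent redirect to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]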
If you need both versions of the site accessible, for instance if you only need HTTPS for logged-in users, and you only want one version to appear in the SERPs, the best approach would be a canonical tag to consolidate all that SEO juice into the version you wish to rank.
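The tag itself is just a single line in the head of each page, pointing at whichever version you want to rank (example.com here is a placeholder, of course):

<link rel="canonical" href="http://www.example.com/your-page/" />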
If there are only a few secure pages with links to other non-secure pages, then meta robots noindex,follow would work well, since the SEO juice will flow through those noindexed pages and into the rest of your site. But if the whole site is duplicated across both versions, this could be a big mistake.
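That's done with the standard meta robots tag in the head of each secure page, and, since you mentioned .htaccess, the same directive can instead be sent as an X-Robots-Tag HTTP header (a sketch assuming Apache with mod_headers; note this example sets the header on every response, so you'd want to scope it to just the secure pages):

<meta name="robots" content="noindex,follow" />

<IfModule mod_headers.c>
Header set X-Robots-Tag "noindex, follow"
</IfModule>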
Noindexing an entire HTTPS version would be a bad move even with noindex,follow, since your internal linking will point to the secure pages. Even though PageRank will be passed through those pages, it will eventually come to a dead end or leave through an external link. With the canonical tag, any links pointing to your secure version will pass their SEO juice to the non-secure site, rather than being lost in the noindexed version where it has nowhere to go.
Have a read of this interview with Matt Cutts from a few years back for further clarification; it’s got a good quote about how PageRank flows through noindexed, followed pages: http://www.stonetemple.com/articles/interview-matt-cutts.shtml
Matt Cutts: A NoIndex page can accumulate PageRank, because the links are still followed outwards from a NoIndex page.
Eric Enge: So, it can accumulate and pass PageRank.
Matt Cutts: Right, and it will still accumulate PageRank, but it won't be showing in our Index. So, I wouldn't make a NoIndex page that itself is a dead end. You can make a NoIndex page that has links to lots of other pages.
So the right answer will differ depending on your circumstances, but if you’re in doubt, the canonical tag is your best bet, as you’re only consolidating those pages in Google’s eyes. If those pages perform well and you noindex them without sending that PageRank somewhere useful, you could be throwing away all that benefit.
Hope that helps,
Tom
-
Why not rel=canonical them?
-
Related Questions
-
How will canonicalizing an HTTPS page affect the SERP-ranked HTTP version of that page?
Hey guys,
Until recently, my site has been serving traffic over both HTTP and HTTPS depending on the user request. Because I only want to serve traffic over HTTPS, I've begun redirecting HTTP traffic to HTTPS. Reviewing my SEO performance in Moz, I see that for some search terms an HTTP page shows up on the SERP, and for other search terms an HTTPS page shows. (There aren't really any duplicate pages, just the same pages being served over either HTTP or HTTPS.)
My question is about canonical tags in this context. Suppose I canonicalize the HTTPS version of a page which is already ranked on the SERP as HTTP. Will the link juice from the SERP-ranked HTTP version of that page immediately flow to the now-canonical HTTPS version? Will the HTTPS version of the page immediately replace the HTTP version on the SERP, with the same ranking?
Thank you for your time!
-
Does it make sense to create new pages with friendlier URLs and then redirect the old pages to the new ones?
Hi Moz! My client has messy URLs. Does it make sense to write new, clean URLs, then 301 redirect all the old URLs to the new ones? Thanks for reading!
-
Best practice to prevent pages from being indexed?
Generally speaking, is it better to use robots.txt or a meta robots noindex tag to prevent duplicate pages from being indexed?
-
WordPress site, Moz showing missing meta descriptions, but pages do not exist on the backend
I've got a WordPress website (a client's) and Moz keeps showing missing meta descriptions. When I look at the pages, these are nonsense pages; they do exist somewhere, but I am not seeing them on the backend. Questions: 1) How do I fix this? Maybe it's a rel=canonical issue? 2) Why is this referring to "nonsense" pages? When I go to the page there is nothing on it except maybe an image or the headline; it's very strange. Any input out there I greatly appreciate. Thank you
-
Any downsides of (permanently) redirecting 404 pages to more generic pages (category pages)?
Hi, We have a site which is somewhat like eBay: it has several categories and advertisements posted by customers/clients. These advertisements disappear over time and turn into 404 pages. We have the option to redirect the user to the corresponding category page, but we're afraid of any negative impact of this change. Are there any downsides, and is this really the best option we have? Thanks in advance!
-
Best possible linking on site with 100K indexed pages
Hello All,
First of all I would like to thank everybody here for sharing such great knowledge with such amazing and heartfelt passion. It really is good to see. Thank you.
My story / question: I recently sold a site with more than 100K pages indexed in Google. I was allowed to keep links on the site, these links being actual anchor-text links on both the home page and the 100K news articles. On top of that, my site syndicates its RSS feed (just links and titles, no content) to this site. However, the new owner made a mess, and now the site could possibly be seen as linking badly to my site. Google tells me in Webmaster Tools that this particular site gives me more than 400K backlinks. I have NEVER received a single notice from Google that I have bad links, that first. But I was worried that this site could have been the reason why MY site tanked as badly as it did; it's the only source linking so massively to me.
Just a few days ago I got in contact with the new site owner, and he has taken my offer to help him improve his site. Although getting the site up to date for him is my main purpose, since I am there I will also put effort into optimizing the links back to my site.
My question: what would be the best to do for the most SEO gain out of this? The site is a newspaper-type site, catering for news within the exact niche my site is trying to rank in. The difference being, his is a news site; mine is not, it is commercial. Once I fix his site, there will be regular news updates, all within the niche we both are in; regularly, as in several times per day. It's news, in the niche.
Should I leave my RSS feed in the sidebars of all the content? Should I leave an anchor-text link in the sidebar (on all news etc.)? If so, there can be just one keyword... 407K pages linking with just one keyword?? Should I keep it to just one link on the home page?
I would love to hear what you guys think. (My domain is from 2001. Like a quality wine. However, it still tanked like a submarine.) All SEO reports I got here are now Grade A; the site is finally fully optimized. Truly nice to have that confirmation. Now I hope someone will be able to tell me what is best to do in order to get the most SEO gain out of this for my site. Thank you.
-
Indexing specified entry pages
Hi, We are currently working on location-based info. Basically, when someone searches from Florida they will get specific Florida results, and when they search from California they will get specific California results. How does this location-based info affect crawling and indexing? Let's say we have location info for Googlebot: sometimes they crawl from a New York IP address, sometimes from Texas, and sometimes from California. In this case Google will index three different pages with three different prices and slightly different text, and I'm afraid they might see these as some kind of cloaking or suspicious activity because we serve different versions of the page. What's the best way to handle this?
-
Best way to stop pages being indexed while keeping PageRank
If, for example, on a discussion forum, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed, without diluting PageRank either? If we added them to the Disallow list in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.