HTTP vs HTTPS Duplicate Issues
-
Hello,
I noticed an issue on my site earlier.
http://mysite.com and https://mysite.com both had canonical links pointing to themselves, in effect creating duplicate content.
I have now taken steps to ensure the https version has a canonical that points to the http version, but I was wondering what other steps people would recommend. Is it safe to NOINDEX the https pages? Or block them via robots.txt, or both?
We are not quite ready to go fully HTTPS with our site yet (I know Google now prefers this).
Any thoughts would be very much appreciated.
-
Since HTTPS is now a ranking signal, it is better to use the HTTPS version as the canonical. I would personally make every page of the site HTTPS via 301 redirects (or rel=canonical, but that can be trickier to implement):
http://site.com --301--> https://site.com
http://site.com/page1/ --301--> https://site.com/page1/
etc.
This may require a few changes to the site (internal links shouldn't go through unnecessary redirects, the HTTPS site needs to be added to Search Console (formerly Webmaster Tools), etc.), so make sure you look around for resources on migration.
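If you're on Apache, a minimal sketch of that site-wide 301 in .htaccess might look like this (assuming mod_rewrite is available; adapt to your own server setup):
# Redirect every HTTP request to its HTTPS counterpart with a permanent 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
nginx and IIS have their own equivalent single-rule redirects.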
If you decide to keep HTTP only, do not noindex or disallow HTTPS because you may have valuable links pointing to HTTPS which help your ranking.
-
Thanks for your replies, although I'm still confused.
I have areas of the site that are and should be https (checkout, etc.), and these pages have canonical links pointing to the https version.
The rest of my site, however, is still on http, but the https versions can be accessed via their URLs. What I have done today is add a canonical tag to the https pages pointing to the http pages. Is this the correct thing to do to avoid a duplicate content issue?
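For reference, a tag like that sits in the head of each https page and points at the matching http URL, along these lines (the /page1/ path here is just a placeholder):
<!-- in the <head> of https://mysite.com/page1/ -->
<link rel="canonical" href="http://mysite.com/page1/" />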
-
Hi,
I agree with Patrick: if you are not using the https version, then the safest way to avoid duplicate content is to remove it altogether.
If you are using it partially, such as for checkouts and user areas, then you could 301 redirect the https traffic for all other pages to their http counterparts until you are ready to go full https.
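A rough sketch of that in Apache, assuming the secure areas live under /checkout/ and /account/ (swap in your own paths):
# Send HTTPS traffic back to HTTP, except for the secure areas
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(checkout|account)/
RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]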
Kind Regards
-
Hi there
If your site is not ready to go fully https, I would hold off until you are, unless you have a checkout process or an information-gathering portion of the site that should be https.
The reason being: the https version isn't providing any ranking benefit, as it's being canonicalized to the http version of your site, so you're not getting the value.
When you are ready to go https, I recommend taking a look at this Moz resource, specifically the section under "SEO checklist to preserve your rankings."
Hope this helps! Good luck!