HTTP vs HTTPS Duplicate Issues
-
Hello,
I recently noticed an issue on my site.
http://mysite.com and https://mysite.com both had canonical links pointing to themselves, in effect creating duplicate content.
I have now taken steps to ensure the https version has a canonical that points to the http version, but I was wondering what other steps people would recommend. Is it safe to NOINDEX the https pages? Or block them via robots.txt, or both?
We are not quite ready to go fully HTTPS with our site yet (I know Google now prefers this).
Any thoughts would be very much appreciated.
-
Since HTTPS is now a ranking signal, it is better to use the HTTPS version as the canonical. I would personally make every page of the site HTTPS via 301 redirects (or rel=canonical, though that can be trickier to implement).
http://site.com --301--> https://site.com
http://site.com/page1/ --301--> https://site.com/page1/
etc. This may require a few changes to the site (internal links shouldn't go through unnecessary redirects, the HTTPS site needs to be added to Search Console (Webmaster Tools), etc.), so make sure you look around for resources on migration.
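As a rough sketch, assuming an Apache server with mod_rewrite enabled, the sitewide 301 from HTTP to HTTPS described above could look something like this in .htaccess (the host is taken from the request, so no domain is hard-coded):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...permanently redirect to the same path on the HTTPS host.
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

On Nginx or IIS the equivalent is a server-level redirect rule; the key point is that it must be a 301 (permanent), not a 302.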
If you decide to keep HTTP only, do not noindex or disallow HTTPS because you may have valuable links pointing to HTTPS which help your ranking.
-
Thanks for your replies, although I'm still confused.
I have areas of the site that are, and should be, https (checkout, etc.), and these pages have canonical links pointing to the https version.
The rest of my site, however, is still on http, but the https versions can be accessed via their URLs. What I have done today is add a canonical tag to the https pages pointing to the http pages. Is this the correct thing to do to avoid a duplicate content issue?
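To be clear, the tag I've added to the head of each https page looks like this (example URL):

```html
<!-- On https://mysite.com/page1/ -->
<link rel="canonical" href="http://mysite.com/page1/">
```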
-
Hi,
I agree with Patrick: if you are not using the https version, then the safest way to avoid duplicate content is to remove it altogether.
If you are using it partially, such as for checkouts and user areas, then you could 301 redirect the https traffic for the other pages to their http counterparts until you are ready to go full https.
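As a sketch of that approach, assuming Apache and a checkout area living under /checkout/ (adjust the path to match your site), the conditional redirect might look like:

```apache
RewriteEngine On
# Only act on requests that arrived over HTTPS...
RewriteCond %{HTTPS} on
# ...and are not part of the secure checkout area...
RewriteCond %{REQUEST_URI} !^/checkout/
# ...then send them to the HTTP counterpart with a 301.
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

When you do eventually go full HTTPS, this rule would simply be removed and replaced with the reverse redirect.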
Kind Regards
-
Hi there
If your site is not ready to go fully https, I would hold off on it until you are, unless you have a checkout process or information gathering portion of the site that should be https.
Reason being: the HTTPS version isn't providing any ranking value, as it's being canonicalized to the HTTP version of your site, so you're not getting the benefit.
When you are ready to go https, I recommend taking a look at this Moz resource, specifically the section under SEO checklist to preserve your rankings.
Hope this helps! Good luck!