HTTP vs. HTTPS duplication where the HTTPS version shouldn't exist
-
Hey Guys,
**My site is** http://www.citymetrocarpetcleaning.com.au/
Goal: I am checking if there is an HTTPS version of my site (duplication issue)
What I did:
1. I crawled https://www.citymetrocarpetcleaning.com.au/ in Screaming Frog. It returned 200 OK, so the HTTPS version resolves and duplication is possible.
2. Next, I opened a browser and manually replaced HTTP with HTTPS. The result was the warning shown in "Image 1", which doesn't indicate duplication. But if we go deeper via Advanced > Proceed to www.citymetrocarpetcleaning.com.au (unsafe) ("Image 2"), it displays the content ("Image 3").
Question:
1. Is there an HTTP vs. HTTPS duplication issue here?
2. Do I need to implement 301 redirects or canonical tags on the HTTPS pages pointing to HTTP to solve the duplication?
Please help! Cheers!
-
Hi There!
The best practice here would be to implement those 301 redirects to prevent the possibility of duplicate content. It does not appear that any https://www. pages are in Google's index right now (screenshot), but the potential is there, since both http://www. and https://www. URLs are able to resolve and both contain self-referencing canonical tags.
I would recommend making the switch to https:// given Google's recent announcement regarding the 'Not Secure' warning in Chrome 62 starting in October (all http:// pages with forms will be marked as 'Not Secure' - more on that here). This warning will be triggered on every page because of the 'request a quote' form in the right-hand sidebar. If you are interested in looking into this further, I recommend checking out the 'Secure your site with HTTPS' Search Console Help article.
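For reference, a site-wide HTTP-to-HTTPS redirect can be sketched in .htaccess (a minimal sketch, assuming Apache with mod_rewrite enabled; adjust for your own server setup):

```apache
# .htaccess sketch: 301-redirect every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Once that is in place, update internal links and canonical tags to reference the https:// URLs as well.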
Hope this helps!
Related Questions
-
HTTP vs. HTTPS - duplicate content
Hi, I have recently come across a new issue on our site, where https and http titles are showing as duplicates. I read https://moz.com/community/q/duplicate-content-and-http-and-https; however, I am wondering, since https is now a ranking factor, whether leaving this blocked can't be a good thing? We aren't in a position to roll out https everywhere, so what would be the best thing to do next? I thought about implementing canonicals? Thank you
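If it helps, the canonical approach I was considering (a sketch with a placeholder domain and page, assuming Apache with mod_headers, since our pages are dynamic) would send a Link header from the https:// version pointing back at its http:// counterpart:

```apache
# Hypothetical example for a single page: serve a canonical Link header
# from the HTTPS version pointing at the HTTP original
<Files "example-page">
Header add Link '<http://www.example.com/example-page/>; rel="canonical"'
</Files>
```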
Intermediate & Advanced SEO | BeckyKey0 -
Duplicate content - how to diagnose duplicate content from another domain before publishing pages?
Hi, 🙂 My company has signed a new distributor contract, and we are starting to sell products on our own webshop. Biotechnology is the industry in question, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to re-write them. With permission from our contractors, we will import their product descriptions into our webshop. But I am concerned about penalties from Google for duplicate content. If we re-write them, we should be fine, I guess. But how can we be sure? Is there any good tool for comparing only text (because I don't want to publish the pages just to compare URLs)? What else should we be aware of besides checking the product descriptions for duplicate content? Duplicate content is a big issue for all of us; I hope these answers will be helpful for many of us. Keep up the hard work and thank you very much for your answers, Cheers, Dusan
Intermediate & Advanced SEO | Chemometec0 -
Hreflang in <head> vs. sitemap?
Hi all, I decided to identify the alternate language pages of my site via the sitemap to save our development team some time. I also like the idea of having leaner markup. However, my site has many alternate language and country page variations, so after creating a sitemap that includes mostly tier 1 and tier 2 level URLs, I now have a sitemap file that's 17 MB. I did a couple of Google searches to see if sitemap file size can ever be an issue and found a discussion or two that suggested keeping the size small, and a really old article that recommended keeping it under 10 MB. Does the sitemap file size matter? GWT has verified the sitemap and appears to be indexing the URLs fine. Are there any particular benefits to specifying alternate versions of a URL in the <head> vs. the sitemap? Thanks, -Eugene
Intermediate & Advanced SEO | eugene_bgb0 -
Converting to HTTPS
I have a 10-year-old website, and we are just now adding a Symantec SSL certificate with Extended Validation. I've seen some older posts about whether switching URLs to HTTPS affects Google rankings, and I understand it may in the short run, but I wondered if anyone had any updated info about how best to go about this. Are there any step-by-step articles that could walk me through it? Our certificate is already installed, but we haven't forced it out there yet. If I understand right, we will use HTTPS on the entire site. I am not very experienced with 301s, but I believe I can set this up in GoDaddy.com, where our domain is registered, so that our old HTTP URLs forward to HTTPS. Also, I don't think this affects anything within GWT, so I don't think I have to make any changes there. Am I missing anything? FYI, the prices for this on the Symantec site are pretty high for a small business like ours. I looked around and found that https://www.thesslstore.com/, an SSL reseller, had cheaper prices listed on their site. As it turns out, I called to ask a technical question and the salesperson offered to email me a custom quote that was even cheaper than what was listed on their site. So if you are dealing with a limited budget, I might recommend you call The SSL Store and get a quote from them. I am not an affiliate and have nothing to do with them; I was just happy with their service, and I believe it cost me about half of the price on the Symantec site. Hope that helps someone.
Intermediate & Advanced SEO | jacksghost0 -
Duplicate Sub-domains Being Indexed
Hi all, I have a site with a subdomain that is meant to be a "support" area for clients: some sort of FAQ pages, if you will. A lot of them are dynamic URLs; hence, the titles and most of the content are duplicated. Crawl Diagnostics found 52 duplicate content errors, 138 duplicate title errors, and a lot of others. My question is, what would be the best practice to fix this issue? Should I noindex and nofollow all of its subdomains? Thanks in advance.
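One option I've been considering (a sketch, assuming the subdomain runs on Apache with mod_headers; I'm asking whether this is actually best practice) is serving an X-Robots-Tag header from the support subdomain's .htaccess:

```apache
# .htaccess sketch for the support subdomain's document root:
# ask crawlers not to index or follow anything served from it
Header set X-Robots-Tag "noindex, nofollow"
```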
Intermediate & Advanced SEO | EdwardDennis0 -
HTTP Header Canonical Tags
I want to be able to add canonical tags to the HTTP headers of individual URLs using .htaccess, but I can't find any examples of how to do this. The only example I found was when specifying a file: http://www.seomoz.org/blog/how-to-advanced-relcanonical-http-headers N.B. It's not possible to add regular canonical tags to the <head> of my pages, as they're dynamically generated. I was trying to add the following to the .htaccess in order to add a canonical tag in the header of the page http://frugal-father.com/is-finance-in-the-uk-too-london-centric/, but I've checked with Live HTTP Headers and the canonical line isn't showing: <Files "is-finance-in-the-uk-too-london-centric">Header add Link '<http://frugal-father.com/is-finance-in-the-uk-too-london-centric/>; rel="canonical"'</Files> Any ideas?
Intermediate & Advanced SEO | AndrewAkesson0 -
Rel canonical and duplicate subdomains
Hi, I'm working with a site that has multiple subdomains of entirely duplicate content. So, the production-level site that visitors see is (for a made-up illustrative example): 123abc456.edu Then, there are subdomains which are used by different developers to work on their own changes to the production site before those changes are pushed to production: Larry.123abc456.edu Moe.123abc456.edu Curly.123abc456.edu Google ends up indexing these duplicate subdomains, which is of course not good. If we add a canonical tag to the head section of the production page (and therefore all of the duplicate subdomains), will that cause some kind of problem... having a canonical tag on a page pointing to itself? Is it okay to have a canonical tag on a page pointing to that same page? To complete the example... In this example, where our production page is 123abc456.edu, our canonical tag on all pages (this page and therefore the duplicate subdomains) would be: <link rel="canonical" href="http://123abc456.edu/" /> Is that going to be okay and fix this without causing some new problem of a canonical tag pointing to the page it's on? Thanks!
Intermediate & Advanced SEO | 945010 -
Push for site-wide https, but all pages in index are http. Should I fight the tide?
Hi there, First Q&A question 🙂 So I understand the problems caused by having a few secure pages on a site. A few links to the https version of a page and you have duplicate content issues. While there are several posts here at SEOmoz that talk about the different ways of dealing with this issue with respect to secure pages, the majority of this content assumes that the goal of the SEO is to make sure no duplicate https pages end up in the index. The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc. That's the root of my problem. I'm facing the prospect of switching to https across an entire site. In light of other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it. I work for a certificate authority: a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages. The stronger push, however, stems from our membership in the Online Trust Alliance. https://otalliance.org/ Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to utilize HSTS headers and force site-wide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers/banks/etc. will no doubt follow suit. Regardless of what you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https. The bottom line for me is: I have a site of ~800 pages that I will need to switch to https. I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index to what amounts to a site-wide migration. So, here are a few general questions. What are the major considerations for such a switch? Are there any less obvious pitfalls lurking?
Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or having Googlebot replace) the old pages with https versions? Is that something that can be done with canonicalization, or would something at the server level be necessary? How is that going to affect my page authority in general? What obvious questions am I not asking? Sorry to be so long-winded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible. Any input will be very much appreciated. Thanks, Dennis
Intermediate & Advanced SEO | dennis.globalsign0