HTTP and HTTPS duplicate content?
-
Hello,
This is a quick one or two.
If I have a page accessible on both HTTP and HTTPS, does that count as duplicate content?
And what about external links that point to my website, some at the HTTP page and some at the HTTPS page?
Regards,
Cornel
-
Cornel
I suggest adding a rel=canonical tag to your pages pointing at the version you'd like indexed. That way both the HTTP and HTTPS versions of the same page will declare the same canonical URL, which avoids the duplicate content issue and acts much like a 301 from HTTPS to HTTP as far as the bots are concerned. Users would still be able to reach the HTTPS homepage if they are logged into your website or browsing over HTTPS, so they don't have to toggle back and forth between HTTPS and HTTP.
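For illustration, a minimal sketch of what that tag could look like, assuming a hypothetical example.com page and assuming you want the HTTP version indexed:

    <!-- placed in the <head> of BOTH the http:// and https:// versions of the page -->
    <link rel="canonical" href="http://www.example.com/some-page/" />

If you prefer the HTTPS version to be the indexed one (as suggested further down this thread), the same tag works; just point href at the https:// URL instead.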
-
The article is still applicable today, and here is more info on your topic from a Q&A earlier this year: http://www.seomoz.org/q/duplicate-content-and-http-and-https
You will notice the top endorsed answer (endorsed by Dr. Pete) uses this method and points to this article.
You will also notice that a 301 redirect is the recommended solution, as I mentioned above, though they do it the other way around. I would redirect to the secure version, as that is the one I imagine you want to be the page that gets used, since it is secure. Search engines can index both protocols, so doing it either way won't harm you...
-
Hi Matt,
Thank you for your response, but I have one more question.
The article is from 2008; is it still applicable today?
Cornel
-
Yes, it does.
Have a look here:
http://www.seomoz.org/ugc/solving-duplicate-content-issues-with-http-and-https
I would 301 redirect the http:// version to the https:// version to take care of any link issues.
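For reference, here is a minimal .htaccess sketch of that kind of redirect, assuming an Apache server with mod_rewrite enabled (the rules are generic; adjust them to your own setup):

    # Send every HTTP request to its HTTPS equivalent with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

You can confirm it is working by requesting an http:// URL (for example with curl -I) and checking that the response is a 301 whose Location header points at the https:// version.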
Related Questions
-
Google Indexing Of Pages As HTTPS vs HTTP
We recently updated our site to be mobile optimized. As part of the update, we had also planned on adding SSL security to the site. However, we use an iframe on a lot of our site pages from a third party vendor for real estate listings, and that iframe was not SSL friendly and the vendor does not have that solution yet. So, those iframes weren't displaying the content. As a result, we had to shift gears and go back to just being HTTP and not the new HTTPS that we were hoping for. However, Google seems to have indexed a lot of our pages as HTTPS and gives a security error to any visitors. The new site was launched about a week ago and there was code in the .htaccess file that was pushing to www and HTTPS. I have fixed the .htaccess file to no longer have HTTPS. My question is: will Google "reindex" the site once it recognizes the new .htaccess commands in the next couple of weeks?
Intermediate & Advanced SEO | vikasnwu1
-
Domain Level Redirects - HTTP and HTTPS
About 2 years ago (well before I started with the company), we did an HTTP => HTTPS migration. It was not done correctly. The HTTP => HTTPS redirect was never inserted into the .htaccess file. In essence, we have 2 websites. According to Google Search Console, we have 19,000 HTTP URLs indexed and 9,500 HTTPS URLs indexed. I've done a larger scale HTTP => HTTPS migration (60,000 SKUs), and our rankings dropped significantly for 6-8 weeks. We did this the right way, using sitemaps and HTTP and HTTPS GSC properties. Google came out recently and said that this type of rankings drop is normal for large sites. I need to set the appropriate expectations for management. Questions:
1. How badly is the domain split affecting our rankings, if at all? Our rankings aren't bad, but I believe we are underperforming our backlink profile.
2. Can we expect a net rankings gain when the smoke clears? There are a number of other technical SEO issues going on as well.
3. How badly will our rankings drop (temporarily), and for how long, when we add the redirect to the .htaccess file?
4. Is there a way to mitigate the rankings impact? For example, only submitting partial sitemaps to our GSC HTTP property?
5. Has anyone gone through this before?
Intermediate & Advanced SEO | Satans_Apprentice0
-
I added an SSL certificate this morning and now I noticed duplicate content
Ok, so I'm a newbie, therefore I make mistakes! Lots of them. I added an SSL certificate this morning because it was free and I read it can help my rankings. Now I just checked it in Screaming Frog and saw two duplicate content pages due to the HTTPS. So I'm panicking! What's the easiest way to fix this? Can I undo an SSL certificate? I guess, what's the easiest fix that will also be best for ranking? Thank you!! Rena
Intermediate & Advanced SEO | palila0
-
HTTP vs HTTPS duplication where HTTPS is non-existent
Hey Guys, my site is http://www.citymetrocarpetcleaning.com.au/
Goal: I am checking whether there is an HTTPS version of my site (a possible duplication issue).
What I did: 1. I went to Screaming Frog and ran https://www.citymetrocarpetcleaning.com.au/. The result is 200 OK (the HTTPS version exists, so duplication is possible). 2. Next, I opened a browser and manually replaced HTTP with HTTPS; the result is "Image 1", which doesn't indicate a duplication. But if we go deeper via Advanced > Proceed to www.citymetrocarpetcleaning.com.au (unsafe) ("Image 2"), it displays the content ("Image 3").
Questions: 1. Is there an HTTP vs HTTPS duplication here? 2. Do I need to implement 301 redirects/canonical tags on HTTPS pointing to HTTP to solve the duplication?
Please help! Cheers!
Intermediate & Advanced SEO | gamajunova0
-
Cross Domain duplicate content...
Does anyone have any experience with this situation? We have 2 ecommerce websites that carry 90% of the same products, with mostly duplicate product descriptions across domains. We will be running some tests shortly.
Question 1: If we deindex a group of product pages on Site A, should we see an increase in ranking for the same products on Site B? I know nothing is certain, just curious to hear your input. The same 2 domains have different niche authorities. One is healthcare products, the other is general merchandise. We've seen this because different products rank higher on one domain or the other. Both sites have the same Moz Domain Authority (42, go figure). We are strongly considering cross domain canonicals.
Question 2: Does niche authority transfer with a cross domain canonical? In other words, for a particular product, will it rank the same on both domains regardless of which direction we canonical? Ex: Site A: Healthcare Products, Site B: General Merchandise. I have a health product that ranks #15 on Site A and #30 on Site B. If I use rel=canonical for this product on Site B pointing at the same product on Site A, will the ranking be the same as if I use rel=canonical from Site A to Site B?
Question 3: These domains have similar category page structures, URLs, etc., but feature different products for a particular category. Since the pages are different, will cross domain canonicals be honored by Google?
Intermediate & Advanced SEO | AMHC1
-
Scraping / Duplicate Content Question
Hi All, I understand the way to protect content, such as a feature-rich article, is to create authorship by linking to your Google+ account.
My question: you have created a webpage that is informative but not substantial enough to be an article, hence there is no need to create authorship in Google+. If a competitor comes along and steals this content word for word, or something similar, and creates their own Google+ page, can you be penalised? Is there any way to protect yourself without authorship and Google+?
Regards, Mark
Intermediate & Advanced SEO | Mark_Ch
Robots.txt & Duplicate Content
In reviewing my crawl results I have 5,666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example http://www.careerbags.com/catalogsearch/result/index/?q=Mobile. I have 10,074 of those links indexed according to my Moz crawl. Of those, 5,349 are tagged as duplicate content. Another 4,725 are not. Here are some additional sample links:
http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
All of these links are just different ways of searching through our product catalog. My question is: should we disallow /catalogsearch/ via the robots.txt file? Are these links doing more harm than good?
Intermediate & Advanced SEO | Careerbags
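For what it's worth, a minimal robots.txt sketch of that kind of disallow rule might look like the lines below (the path is taken from the sample URLs above); keep in mind that robots.txt only blocks crawling, so URLs that are already indexed or that attract external links can still show up in results:

    User-agent: *
    Disallow: /catalogsearch/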
Best tools for identifying internal duplicate content
Hello again Mozzers! Other than the Moz tool, are there any other tools out there for identifying internal duplicate content? Thanks, Luke
Intermediate & Advanced SEO | McTaggart0