How bad is it to have duplicate content across http:// and https:// versions of the site?
-
A lot of pages on our website are currently indexed on both their http:// and https:// URLs. I realise that this is a duplicate content problem, but how major an issue is this in practice?
Also, am I right in saying that the best solution would be to use rel canonical tags to highlight the https pages as the canonical versions?
-
Thank you both - and sorry for not replying earlier. It sounds like we have some work to do.
-
There isn't a single "best" solution in this case.
What you need to do is pick one protocol (https is the usual choice) and migrate everything to it. That means you should:
- Set up 301 redirects from the old protocol to the new one
- Fix all canonical tags
- Do a site move in Search Console. This is tricky because you need to verify both properties there and then move one of them to the other.
- Fix all internal links in your pages so they point directly to the new protocol and don't rely on 301 redirects
- Fix all images, scripts, and other resources for the same reason
That's the theory, at least. As you can see, "fix canonicals" is just one small step in the process.
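To illustrate the "fix all internal links" step above, here is a minimal sketch in Python (standard library only). It rewrites internal http:// references in href/src attributes to https:// so visitors and crawlers never hit the 301 redirect; `example.com` is a placeholder for your own domain, and the regex is deliberately simplistic - a real migration would use an HTML parser or a CMS search-and-replace.

```python
import re

# Matches href/src attributes pointing at the site's own domain over http.
# "example.com" is a placeholder - substitute your own domain.
INTERNAL_HTTP = re.compile(r'(href|src)="http://(www\.)?example\.com')

def force_https(html: str) -> str:
    """Rewrite internal http:// URLs in href/src attributes to https://.
    External links are left untouched."""
    return INTERNAL_HTTP.sub(
        lambda m: f'{m.group(1)}="https://{m.group(2) or ""}example.com',
        html,
    )
```

Note that external links (other domains) are intentionally left alone; only your own URLs should be rewritten.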
-
The biggest problem with duplicate content (in most cases) is that you leave it up to Google (or search engines in general) to decide which version to send the traffic to.
I assume that if you have a site on https you want all visits to go to the https version. Rather than using a canonical URL (which is merely a friendly request), I would 301 the http version to the https version.
Dirk
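As a sketch of the 301 approach described above (how you configure it depends on your server - in Apache it would typically be a RewriteRule, in nginx a `return 301`), the redirect logic itself is just: given a requested http:// URL, compute the https:// target for the Location header. A minimal illustration in Python:

```python
from urllib.parse import urlsplit, urlunsplit

def https_target(url: str) -> str:
    """Return the https:// equivalent of an http:// URL,
    suitable for the Location header of a 301 redirect."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url  # already https (or something else): leave untouched
    # Keep host, path, query, and fragment; swap only the scheme.
    return urlunsplit(("https",) + tuple(parts)[1:])
```

Because the redirect is permanent (301), search engines transfer the indexing signals to the https version rather than treating it as a polite suggestion, which is the practical difference from rel=canonical.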