Duplicate Content, http vs https
-
Hi All!
I just discovered that a client of ours has a duplicate content issue. Essentially, they have approximately 20 pages that each exist in both an http and an https version. Is there a better way to handle this than a simple 301?
Regards,
Frank
-
Hi Frank,
A 301 would take care of the problem very well, but where that isn't possible, a canonical tag will do the trick. If you make sure that the page, in both its http and https forms, uses a consistent canonical (one version for both), that will protect you from duplicate content issues as well.
For more information on canonicals you can read the Moz guide on them.
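To illustrate both options (a sketch only, using a placeholder example.com domain; the exact file and host names depend on your setup): the canonical is a single line in each page's `<head>`, e.g. `<link rel="canonical" href="https://www.example.com/page/" />`, while on an Apache server a blanket 301 from http to https can live in .htaccess:

```apache
# .htaccess sketch: send every http request to its https twin with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Whichever route you take, keep it consistent: a 301 to https combined with a canonical pointing at http would send mixed signals.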
Hope this helps.
Related Questions
-
Products description from third party vendor creating duplicate content issues?
Hi, I am running my client's e-store. The store sells different products from various vendors, and the vendors provide us with product descriptions. The problem is that these vendors also give the same descriptions to other similar sites to display their products, hence creating a duplicate content issue. Thanks.
On-Page Optimization | Kashif-Amin0
-
What to do about resellers duplicating content?
Just went through a big redevelopment for a client and now have fresh images and updated content, but all the resellers have grabbed the new images/content and pasted them on their own sites. My client is a manufacturer that sells directly online and over the phone for large orders. I'm just not sure how to handle the resellers' duplicate content. Any thoughts on this? Am I being silly for worrying about this?
On-Page Optimization | ericnkatz0
-
Duplicate Home Page
Hi, I have a question around best practise on duplicate home pages. The /index.aspx page is showing up as a top referrer in my analytics. I have the rel=canonical tag implemented for the www.mysite.com on both pages. Do I need to 301 the /index.aspx to the mysite.com? I have a lot of links pointing to the /index.aspx (half of those are coming from the mysite.com). www.mysite.com/index.aspx www.mysite.com Many thanks Jon
On-Page Optimization | JonRaubenheimer0
-
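Since the question above concerns .aspx pages, here is how the fix is often sketched on IIS (an illustration only, assuming the URL Rewrite module is installed; the rule name is a placeholder): 301 /index.aspx to the root so the links pointing at both URLs consolidate on one.

```xml
<!-- web.config fragment (sketch): 301 /index.aspx to the site root -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="IndexToRoot" stopProcessing="true">
        <match url="^index\.aspx$" />
        <action type="Redirect" url="/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```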
Does Bing consider http://www.domain.com the same as https://www.domain.com?
Bing Webmaster Tools showed me that it sometimes displays https://www.domain.com in its results and sometimes http://www.domain.com. That got me thinking: does Bing consider https to be a separate duplicate copy of the http version? I.e., does my site get knocked down for duplicate content because of this? In Google Webmaster Tools, I can tell it whether I want https or http, but I don't know how to tell Bing. Any pointers will be appreciated. Thanks Dan
On-Page Optimization | DanFromUK0
-
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages, however according to our Google Webmaster Tools account Google has around 26,000 pages for us in its index. I have run through half a dozen sitemap generators and they all only discover the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages, without success. It has been over six months since we did any structural changes (at which point we did 301's to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week. I'm fairly certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, however this only returns the first 1,000 results, which all check out fine. I was wondering if anybody knew of any methods or tools that we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and that they're not going to cause us a problem? Thanks guys!
On-Page Optimization | ChrisHolgate0
-
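One way to chip away at the question above is to diff the sitemap against whatever partial list of indexed URLs you can export (site: scrapes, Webmaster Tools downloads). A minimal sketch of that comparison (all URLs and the XML below are made-up placeholders):

```python
# Sketch: diff a sitemap against a list of indexed URLs to spot strays.
# All URLs and XML below are made-up placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Collect every <loc> value from a sitemap XML string."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def urls_not_in_sitemap(indexed, xml_text):
    """Indexed URLs the sitemap doesn't know about -- stale-page candidates."""
    return sorted(set(indexed) - sitemap_urls(xml_text))

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
</urlset>"""

indexed = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://www.example.com/old-page",
]

print(urls_not_in_sitemap(indexed, sitemap))  # only the stray old page remains
```

Run against each 1,000-URL batch you can extract, this at least tells you which indexed URLs are unknown to your sitemap.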
Meta Data definition for multiple pages. Potential duplicate content risk?
Hi all, One of our clients needs to redefine their meta title and description tags. They publish very similar information almost every day, so the structure they propose is the following: Structure 1: type of analysis + periodicity + date + brand name. Examples 1: Monthly Market Analysis, 1/5/2012 - Brand Name; Weekly Technical Analysis, 7/5/2012 - Brand Name. Structure 2: company name + investment recommendation + periodicity. Example 2: Iberdrola + investment recommendation (this text doesn't vary) + 2T12 (which means 2012, 2nd trimester). Regarding the meta description they want to follow a similar approach, replicating the same info every time with a slight variation for each publication. I'm afraid this may cause a duplicate content problem because of the resemblance between every "Market Analysis" and every "Investment Recommendation" published in the future. My initial suggestion is to define specific, unique meta data for each page, but this is not possible for them given the time it takes to do it for every page. Failing that, I've asked them to include the date in each meta title of published content, in order to add something different each time and avoid a duplicate content penalty. Will this be enough to avoid duplicate content issues? Thanks in advance for your help folks! Alex
On-Page Optimization | elisainteractive0
-
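As a sketch of the "add the date" idea in the question above (function names and example inputs are hypothetical), generating the two proposed structures programmatically makes it easy to guarantee the varying part is always present:

```python
# Sketch: generate the two proposed meta title structures, with the
# publication date / trimester baked in so no two titles collide.
# Function names and the example inputs are hypothetical.
from datetime import date

def analysis_title(kind, periodicity, published, brand):
    """Structure 1: type of analysis + periodicity + date + brand name."""
    day = f"{published.day}/{published.month}/{published.year}"
    return f"{periodicity} {kind} Analysis, {day} - {brand}"

def recommendation_title(company, trimester):
    """Structure 2: company + fixed recommendation text + trimester."""
    return f"{company} Investment Recommendation {trimester}"

print(analysis_title("Market", "Monthly", date(2012, 5, 1), "Brand Name"))
print(recommendation_title("Iberdrola", "2T12"))
```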
Https and secure pages
Hello, I have a Facebook badge in my footer. Is it okay if I make the code call on https? It makes the page secure for IE. I have also done this for images. These secure urls are also being called on non-secure pages, but I don't think that matters, does it? Code below. Thanks Tyler
On-Page Optimization | tylerfraser0
-
Duplicate page content errors
The site was just crawled and the report shows many duplicate pages, but it doesn't tell me which ones are dups of each other. For you experienced duplicate page experts: do you have a subscription with Copyscape and pay $.05 per test? What is the best way to clear these? Thanks in advance
On-Page Optimization | joemas990