Duplicate pages with http and https
-
Hi all,
We changed the payment section of our site from http to https a while ago. However, the header and footer links on the https pages are all relative URLs, so once users reach the payment pages and then navigate back to other pages on our site, they stay on https. Over time this has led Google to index all of our pages as https (something we did not want to happen), and now our homepage listing on Google is https rather than http.
We would prefer the organic listings to be http rather than https. Having read a lot on this subject (including the great posts on the Moz blog (still feels odd not referring to it as SEOmoz!)), possible solutions include redirects or canonical tags.
My additional questions around these options are:
1. We already have two redirects on some pages (long story). Will adding another one negatively impact our rankings?
2. Is a canonical tag a strong enough hint to stop Google indexing the https versions of these pages, to the extent that our http pages will appear in the natural listings again?
If anyone has any other suggestions or other ideas of how to address this issue, that would be great!
Thanks
Diana
-
Hi Dan, thanks for the link!
-
Hi Diana
This may have been implied, but is it not an option to change your internal linking? It's obviously best to have all your internal links point directly to the exact locations you want, without passing through 301 redirects. So absolute URLs pointing to the final pages are recommended in this case.
I think this article may help you: http://www.screamingfrog.co.uk/5-easy-steps-to-fix-secure-page-https-duplicate-content/ - I'd follow the steps there, it's pretty solid!
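The core fix described in that article can be sketched as an Apache rewrite rule. This is a minimal sketch, not the article's exact code, and it assumes the site runs on Apache with mod_rewrite and that the payment pages live under a hypothetical `/payment/` path:

```apache
# Redirect https requests back to http, except for the payment section.
# The /payment/ path is a placeholder; substitute your actual secure paths.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/payment/
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Combined with absolute internal links, this stops the https versions of non-payment pages from accumulating in the index.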
-Dan
-
a) Some authority will be lost. A 301 redirect does not pass 100% of the authority to the destination page. Refer to this: http://www.seroundtable.com/archives/021832.html
So the authority passed on keeps diminishing as the chain of redirects needed to reach a page grows.
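Purely as an illustration of why chains compound the loss: if we assume, hypothetically, that each 301 hop passes about 85% of authority (the actual figure is not published by Google), the retained authority shrinks multiplicatively with each extra hop:

```python
def authority_after_chain(hops: int, pass_rate: float = 0.85) -> float:
    """Fraction of authority retained after a chain of 301 redirects.

    The 0.85 pass rate is a hypothetical figure for illustration only;
    Google does not publish the real value.
    """
    return pass_rate ** hops

# A single redirect retains 85%; a three-hop chain retains only ~61%.
print(authority_after_chain(1))
print(authority_after_chain(3))
```

This is why collapsing A -> B -> home into direct A -> home and B -> home redirects, as suggested below, preserves more value.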
b) A canonical tag gives Google a strong indication that the original, main page is X. Google generally respects this and passes authority on to the canonical page.
Had I been in your position, this is what I would have done:
a) Removed all 301 redirect chains (i.e. A to B to home page, etc.), made every redirect point directly to the home page (A to home page, B to home page), and applied a canonical tag to the home page.
b) Used a standalone template for the home page and applied a canonical tag to it, say http://www.abc.com, so that if something changes again in the near future, the canonical in the template will still mark it as the main page.
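For reference, the canonical tag in the home page template would look something like this (http://www.abc.com is the placeholder domain from the example above):

```html
<!-- Placed in the <head> of the home page template.
     www.abc.com is a placeholder; use your own http URL. -->
<link rel="canonical" href="http://www.abc.com/" />
```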
-
Hi Diana,
-
Check this video from Matt Cutts about several redirects: http://www.youtube.com/watch?v=r1lVPrYoBkA
-
A canonical tag is strong enough. Check this video from Matt Cutts: http://www.youtube.com/watch?v=Cm9onOGTgeM
Good luck!
Gijsbert
-
Diana,
1. There are good reasons to limit your 301 redirects, especially regarding preserving link juice, but you are OK with chaining up to three 301s, as Matt Cutts describes here: http://www.youtube.com/watch?v=r1lVPrYoBkA
2. Yes, you can canonicalize those pages to the http versions to bring them back into the search results in place of the https versions. If you can 301 everything but the payment pages, you could use that method too.