Best practices for handling https content?
-
Hi Mozzers - I'm having an issue with https content on my site that I need help with.
Basically we have some pages that are meant to be secured (cart pages, auth pages, etc.), while the rest of the site isn't secured. I need both sets of pages to load correctly and independently of one another so that we're using both protocols correctly.
Problem is - when a secure page is rendered, the resources behind it (scripts, etc.) won't load over the unsecured paths that are currently hard-coded in our master page files.
One solution would be to render the entire site in https only, but that really scares me from an SEO standpoint. I don't know if I want to put all my eggs in that basket.
Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of re-structuring and new SOPs to be put in place.
I guess my question is really about best practices when using https.
- How can I avoid duplication issues?
- When do I need to use rel=canonical?
- What is the best way to do things here to avoid heavy maintenance moving forward?
-
Thanks for the RE Cyrus. One of my architects and I came to a similar conclusion, but it's definitely good to hear it from another source in the SEO community on the development side of things.
We decided to implement a site-wide rel=canonical pointing to the http URLs to avoid duplication issues, as well as ensure resources are loaded via relative links.
I'm hoping this solves each issue with minimal impact!
-
Hi Cody,
First of all, Google generally doesn't have much trouble today with HTTPS content, and treats and ranks it just like anything else.
In fact, I'd say in a couple more years this may be the norm.
As for using rel=canonical, you generally want to use it any time there is a risk of duplicate content. In this case, the important thing is to use the full, absolute URL, not a relative one (e.g. https://example.com/page/ rather than /page/). This should take care of your duplication issues.
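As a sketch of what that tag might look like in the head of an https page (the domain and path here are placeholders, and it points at the http version as canonical, per the approach you described):

```html
<!-- Placed in the <head> of https://www.example.com/some-page/ so search
     engines consolidate ranking signals on the http version. Domain and
     path are placeholder examples. -->
<link rel="canonical" href="http://www.example.com/some-page/" />
```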
I'm not an expert in https development (but I have a little experience). Without diving too deep into how you serve your content, it's usually fine to serve files like JavaScript and images from both secure and non-secure paths. In this instance, you want to make sure your pages call resources via relative file paths (as opposed to absolute http:// paths) and confirm the content loads. 9 times out of 10 this works fine.
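As a rough illustration of the difference (the filenames and domain are made up): a hard-coded http:// reference breaks on a secure page, while relative and protocol-relative references inherit the protocol of the page that loads them:

```html
<!-- Hard-coded absolute http path: triggers "insecure content" warnings
     when the page is served over https -->
<script src="http://www.example.com/js/site.js"></script>

<!-- Relative path: resolves against the current page's protocol and host -->
<script src="/js/site.js"></script>

<!-- Protocol-relative path: the browser substitutes http: or https: to
     match the page -->
<script src="//www.example.com/js/site.js"></script>
```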
Hope this helps. Best of luck with your SEO!
-
Any more input here? Are there any issues with using a sitewide rel=canonical to avoid the duplication of our https URLs?
-
Thanks for the RE, but I'm not sure that answers my question. I'm looking for best practice information about how to build https content. The noindex tip is good. I'll do that. Just wondering how the back end should work to make sure I don't get "insecure content" warnings.
-
Don't go the whole site https route. You are just creating duplicate site nightmares.
Since you are working with cart and auth pages, you need to add a noindex, nofollow meta tag on those pages to start with. That way they don't get into the index in the first place, and any pages that are already in the index will be dropped. Do not use robots.txt for this; use the meta tag noindex, nofollow.
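As a minimal sketch, the tag goes in the head of each cart and auth page:

```html
<!-- Keeps the page out of the index and tells crawlers not to follow its
     links. Unlike a robots.txt block, the page can still be crawled, so an
     already-indexed copy gets dropped rather than stranded. -->
<meta name="robots" content="noindex, nofollow" />
```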
You need to set up 301 redirects from the https to the http version for all pages except the cart and auth pages (i.e. those pages that are supposed to be https). If Google has indexed any https versions of pages that are supposed to be http, the 301 will correct that, plus it gets the user back to the right version of the page for bookmarking and other purposes.
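A hedged sketch of what those redirects might look like under Apache with mod_rewrite, assuming the secure pages live under /cart/ and /auth/ (both paths, and the domain, are placeholders for whatever your site actually uses):

```apache
RewriteEngine On
# If the request came in over https...
RewriteCond %{HTTPS} on
# ...and it is not a cart or auth page...
RewriteCond %{REQUEST_URI} !^/(cart|auth)/ [NC]
# ...send the visitor (and Google) to the http version with a permanent 301.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```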