Best practices for handling https content?
-
Hi Mozzers - I'm having an issue with https content on my site that I need help with.
Basically we have some pages that are meant to be secured (cart pages, auth pages, etc.), and then the rest of the site, which isn't secured. I need those pages to load correctly and independently of one another so that we are using both protocols correctly.
Problem is, when a secure page is rendered, the resources behind it (scripts, etc.) won't load, because our master page files currently reference them with unsecured http paths.
One solution would be to render the entire site in https only; however, this really scares me from an SEO standpoint. I don't know if I want to put all my eggs in that basket.
Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of re-structuring and new SOPs to be put in place.
I guess my question is really about best practices when using https.
- How can I avoid duplication issues?
- When do I need to use rel=canonical?
- What is the best way to do things here to avoid heavy maintenance moving forward?
-
Thanks for the RE, Cyrus. One of my architects and I came to a similar conclusion, but it's definitely good to hear it from another source in the SEO community on the development side of things.
We decided to implement a site-wide rel=canonical to the http URLs to avoid duplication issues, as well as to ensure resources are loaded via relative links.
I'm hoping this solves each issue with minimal impact!
-
Hi Cody,
First of all, Google generally doesn't have much trouble with HTTPS content today, and treats and ranks it just like anything else.
In fact, I'd say in a couple more years this may be the norm.
As for using rel=canonical, you generally want to use it any time there is a risk of duplicate content. In this case, the important thing is to use the full URL (e.g. https://example.com/page), and not a relative URL. This should take care of 100% of your duplication issues.
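As a quick sketch (the example.com path is made up), the tag would sit in the <head> of both protocol versions of a page and name one absolute URL:

    <!-- Served identically on http://example.com/some-page and https://example.com/some-page -->
    <link rel="canonical" href="http://example.com/some-page" />

Whether you canonicalize to the http or the https copy is your call; the point is that both versions agree on a single full URL.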
I'm not an expert in https development (but I have a little experience). Without diving too deep into how you serve your content, it's usually fine to serve files like JavaScript and images from both secure and non-secure paths. In this instance, you want to make sure your pages are calling relative file paths (as opposed to absolute ones) and make sure the content loads. Nine times out of ten this works fine.
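For illustration (the file names are hypothetical), the three reference styles behave differently on a secure page; relative and protocol-relative paths inherit the page's protocol, so they load without warnings:

    <!-- Absolute http path: triggers "insecure content" warnings on https pages -->
    <script src="http://example.com/js/app.js"></script>

    <!-- Relative path: inherits whatever protocol the current page uses -->
    <script src="/js/app.js"></script>

    <!-- Protocol-relative path: does the same for resources on another host -->
    <script src="//cdn.example.com/js/app.js"></script>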
Hope this helps. Best of luck with your SEO!
-
Any more input here? Are there any issues with using a sitewide rel=canonical to avoid the duplication of our https URLs?
-
Thanks for the RE, but I'm not sure that answers my question. I'm looking for best practice information about how to build https content. The noindex tip is good. I'll do that. Just wondering how the back end should work to make sure I don't get "insecure content" warnings.
-
Don't go the whole-site https route. You're just creating duplicate-site nightmares.
Since you are working with cart and auth pages, you need to add a noindex, nofollow meta tag on those pages to start with. That way they don't get into the index in the first place, and any pages that are in the index now will be dropped. Do not use robots.txt for this; use the meta robots noindex, nofollow tag.
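For reference, a minimal sketch of that tag as it would sit in the <head> of each cart and auth page:

    <!-- Keeps the page out of the index and tells crawlers not to follow its links -->
    <meta name="robots" content="noindex, nofollow" />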
You need to set up 301 redirects from the https version to the http version for all pages except the cart and auth pages (i.e., the pages that are supposed to be https). If Google has found https copies of pages that are supposed to be http, the 301 will correct that, plus it gets the user back to the right version of the page for bookmarking and other purposes.
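As a rough sketch, assuming an Apache server with mod_rewrite (the /cart/ and /account/ paths are hypothetical placeholders; swap in whatever paths your secure pages actually live under):

    # .htaccess: 301 https -> http for everything except the secure sections
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteCond %{REQUEST_URI} !^/cart/ [NC]
    RewriteCond %{REQUEST_URI} !^/account/ [NC]
    RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

Note that if SSL terminates at a load balancer rather than on the web server itself, %{HTTPS} may never be set to "on", and you would key off a forwarded-protocol header instead.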