Confused About Problems Regarding Adding an SSL
-
After reading Cyrus' article: http://moz.com/blog/seo-tips-https-ssl, I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic...however, if that's true, there wouldn't need to be like 10 things you're supposed to do (according to Cyrus' article) to ensure your rankings after the switch.
Can someone clarify this for me?
Thanks,
Ruben
-
Thanks Cyrus!
-
Hi Ruben,
Thanks for writing in. I'm unfamiliar with Bluehost's HTTPS service, but I assume they are taking care of top-level issues. You'll still want to go through the checklist to make sure everything is valid and you follow SEO best practices. In short:
- Check your links
- Check your assets (images, CSS, JavaScript)
- Canonical tags
- Register with Google Webmaster Tools
- Update your sitemaps and robots.txt files
This covers the important stuff. As you noted, a few more tips here: http://moz.com/blog/seo-tips-https-ssl
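As a rough illustration of the "check your assets" step, here is a minimal sketch (not from the article; the function name and sample HTML are invented for the example) that scans a page's HTML for resources still loaded over plain HTTP:

```python
from html.parser import HTMLParser

class InsecureResourceScanner(HTMLParser):
    """Collects http:// URLs found in common resource attributes."""
    WATCHED = {"src", "href", "action"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.WATCHED and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_insecure_resources(html):
    """Return (tag, url) pairs for every hard-coded http:// resource."""
    scanner = InsecureResourceScanner()
    scanner.feed(html)
    return scanner.insecure
```

Running something like this over your templates (or fetched pages) surfaces the mixed-content links that would trigger browser SSL warnings after the switch.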
-
Maybe this was obvious to everybody, but a 301 redirect for every single page is also a fundamental step; otherwise you are going to have broken external links, not to mention WMT, which I don't think would be satisfied by just the canonical update.
The sitemap must be updated as well.
We recently switched a website from HTTP to HTTPS, and in terms of performance there was no difference after the update, at least according to WMT and analytics.
I was kind of scared before the update, but in the end everything was smoother than expected; WMT took around 10 days to completely re-index the HTTPS version.
But of course we kept finding non-HTTPS links embedded here and there in some pages for days, and we had to manually edit some content to avoid SSL warnings from browsers.
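For reference, the blanket HTTP-to-HTTPS 301 described above is often done in an Apache .htaccess file with something like the following (a sketch; the exact rules depend on your server setup):

```apache
RewriteEngine On
# Redirect any plain-HTTP request to the same path over HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```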
-
I have no idea what CMS you are using, but check the server-side code generating the links, not just the code sent to the browser.
We recently switched to SSL, and our CMS was already building internal links on pages using the protocol of the HTTP request.
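As an illustration of that server-side behavior (a hypothetical sketch, not any particular CMS's code): a link builder should use the scheme of the incoming request, including the header a load balancer typically sets when it terminates SSL:

```python
def absolute_url(environ, path):
    """Build an absolute URL using the scheme of the current request."""
    # Behind a proxy or load balancer that terminates SSL, the original
    # scheme usually arrives in the X-Forwarded-Proto header rather than
    # in the connection itself.
    scheme = environ.get("HTTP_X_FORWARDED_PROTO") or environ.get("wsgi.url_scheme", "http")
    host = environ.get("HTTP_HOST", "example.com")
    return f"{scheme}://{host}{path}"
```

A CMS built this way emits HTTPS links automatically once the site is served over HTTPS, with no content edits needed.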
-
Thanks Highland!
-
Great, thanks!
-
Ruben, I had a look at your website and your URLs all have HTTP in them so these would need to be updated all across your site before you make the switch to HTTPS. Because you are using WordPress this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. This will allow you to quickly debug if there are non-HTTPS links remaining - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
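For WordPress specifically, the site URL can be updated under Settings → General, or pinned in wp-config.php (the domain shown is the one from this thread; adjust to your own):

```php
// wp-config.php — force the site and home URLs to HTTPS
define('WP_HOME', 'https://www.kempruge.com');
define('WP_SITEURL', 'https://www.kempruge.com');
```

Note this only changes the base URLs; http:// links hard-coded inside post content still need a search-and-replace (WP-CLI's `wp search-replace` is one common way to do that).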
-
Hi Alex,
I'm not really sure if we use a protocol-less linking pattern or not. I don't see http:// in any of our URLs, so if that's the criterion I'm guessing we don't? I included a screenshot of one of our URLs. Would you mind telling me if it's clear from the image whether we do or do not?
Thanks for your response. I really appreciate your time and input.
Best,
Ruben
-
One major tip I always point people to is that using protocol-less links for anything external is a great way to make sure your site always supports SSL without issue.
Firebug is a great way to make sure everything is loading HTTPS. Turn it on, switch to the Net tab, and load your page. It will show you every request sent as part of your page. It makes spotting non-SSL requests easy.
You can turn HSTS on yourself if your provider uses Apache and supports htaccess. (sorry I can't link an article, Moz won't let me). If they don't, you will have to have your host enable it on their end.
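For reference, enabling HSTS in .htaccess (requires mod_headers; only add this once everything is confirmed working over HTTPS, since browsers will cache it) looks roughly like:

```apache
# Tell browsers to use HTTPS for all future requests (one year)
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```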
-
Implementing SSL should be straightforward for the most part.
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link) you don't need to update the links.
You can also configure your web server to refuse plain-HTTP requests. If your web server is Apache you can do this with the SSLRequireSSL directive (note that this returns a 403 error for HTTP requests rather than redirecting them, so pair it with the 301 redirects discussed earlier in this thread).
<Location "/">
    SSLRequireSSL
</Location>
HTTPS also adds some overhead as the browser and the server negotiate a secure connection. If your site has already been optimized for speed it should not cause a problem, but if in doubt revisit that process and ensure that you are getting the best possible speed for your visitors.
The article by Cyrus has a great checklist to double check everything.