HTTP to HTTPS question (SSL)
-
Hi,
I recently made two big changes to a site - www.aerlawgroup.com (not smart, I know). First, I switched from Weebly to WordPress (WP Engine hosting with CDN + Cloudflare - is that overkill?), and I added SSL (HTTP to HTTPS). From a technical perspective, I think I made a better site: (1) blazing fast, (2) mobile responsive, (3) more secure.
I'm seeing the rankings fluctuate quite a bit, especially on the important keywords. I added SSL to my other sites, and saw no rankings change (they actually all went up slightly).
I'm wondering if anyone has had experience going to SSL and can give me feedback on something I might have overlooked. Again, it's strange that all the other sites responded positively, but the one listed above is going in the opposite direction. Maybe there are other problems, and the SSL is just a coincidence. Any feedback would be appreciated.
I followed this guide: http://moz.com/blog/seo-tips-https-ssl - which helped tremendously (FYI).
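In case it helps anyone else, the core of the move was the sitewide 301 redirect from HTTP to HTTPS. Here's a minimal Apache .htaccess sketch of that redirect (illustrative only - WP Engine handles redirects at the server level, so this isn't necessarily what runs on my host):

```apache
# Illustrative sketch: force every HTTP request to its HTTPS equivalent
# with a single sitewide 301 (permanent) redirect.
<IfModule mod_rewrite.c>
RewriteEngine On
# Only act when the request did not arrive over HTTPS
RewriteCond %{HTTPS} off
# Redirect to the same host and path over HTTPS (query string is preserved)
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
```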
-
I'm also a big fan of moving the complete domain to HTTPS. To enforce this, I'm using the HSTS response header. The great advantage is that browsers remember the site as HTTPS and skip any redirect you may have from HTTP to HTTPS. So it might be worth looking at this as well. We are using KeyCDN with the Force SSL feature enabled.
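For anyone who wants to try it, HSTS is a single response header. An Apache sketch (assuming mod_headers is enabled; test with a short max-age first, because once a browser has seen the header it will refuse plain HTTP for that host until the max-age expires):

```apache
# Send HSTS only on the HTTPS vhost: browsers will use HTTPS for this
# host (and subdomains) for the next year. Try a small max-age
# (e.g. 300) while testing before committing to a long duration.
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```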
-
It's a bit overkill, but if you want to get rid of something, you can get rid of WP Engine. I have a lot of websites running on cheap $5 hosts + Cloudflare, and once everything is cached, they are blazing fast.
Regarding the rankings, as Cyrus said, you'll see fluctuations depending on the niche; I have a website where I see movement in the SERPs every day or every other day.
Website looks nice, clean and professional.
-
Likely a coincidence, or at least it's highly probable that there are other circumstances at play.
If you changed platforms, content, links, or architecture at all; if there have been any changes in your backlinks; if the competition has made changes (something you can't control!); or if Google has made algorithm changes - even ones specific to your vertical - then you are bound to see ranking changes that might be hard to pinpoint or explain.
Attorneys, especially those in certain niches like DUI, are especially tough and prone to fluctuation. Might take some extra investigation on your part.
Regardless, the site looks good and fast. Nice work!
-
Cloudflare is good, particularly with SSL. If it works well (check Fetch and Render in Webmaster Tools), then I would keep it.
You shouldn't need W3 Total Cache with WP Engine's own caching, so I wouldn't mess around with your site performance any more if it is all working fine. You have good speeds as it is.
-
Would you recommend getting rid of CloudFlare? With 27 requests and a 300kb file size, I just don't think I need it. Especially if it's potentially causing fetch errors.
-
Hi,
Thank you for the detailed response. Yeah, I wondered if a new site + new host (WP Engine) + Cloudflare + SSL all at the same time was just too much.
I use WP Engine, which includes MaxCDN. With that said, WP Engine doesn't allow W3 Total Cache.
Thanks again for the feedback. I appreciate it.
-
Hi,
No, it is not overkill to use a CDN with Cloudflare. For my own site, I used MaxCDN with Cloudflare Railgun over HTTPS. Cloudflare Railgun (free with certain hosts) accelerates the dynamic content that normally can't be cached, so it's great for SSL.
Unfortunately, what I found was that Cloudflare gave Google fetch errors for certain files, so now I just use MaxCDN. Plus, I like my EV SSL certificate, which doesn't work with Cloudflare (unless you have the $200/month plan).
You may want to check out https://www.besthostnews.com/guide-to-w3-total-cache-settings-with-cloudflare/ as that guide will help optimize your site, although I think WP Engine has its own caching system.
Looking at your site: http://tools.pingdom.com/fpt/#!/dzqaUq/https://www.aerlawgroup.com/ it looks very lightweight, with only 27 requests. That is about as good as it gets, especially with your very low page size (300kb). I personally think you will struggle to optimize the site any further, as quite frankly... your site speed is excellent. Well done!
Regards
Jonathan