HTTPS slower site versus non-HTTPS faster site?
-
Hey all,
I know that everyone is going on about HTTPS as a ranking signal (from what I've read it is not a very important signal, just a lightweight one), but:
- Site speed is a ranking signal
- HTTPS is now a ranking signal as well
- HTTPS adds TLS negotiation overhead, which can make sites slower
So in view of the above, what's better?
- An HTTPS site that is slower
- A non-HTTPS site that is faster
Thanks!
-
While I do not know how Google treats HTTPS with regard to site speed, WebPageTest.org uses the following to score a site's Time To First Byte:
"The target time is the time needed for the DNS, socket and SSL negotiations + 100ms. A single letter grade will be deducted for every 100ms beyond the target."
Which means WebPageTest does not penalize a site for being secured.
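To make that rule concrete, here is a minimal sketch of the grading logic as described in the quote above. The timings and the exact rounding are illustrative assumptions (WebPageTest's quote doesn't specify how partial 100ms overruns round); the point is that SSL negotiation time is folded into the target, so HTTPS isn't penalized for the handshake itself:

```python
def ttfb_grade(dns_ms, socket_ms, ssl_ms, first_byte_ms):
    """Letter grade per the quoted rule: target = DNS + socket + SSL
    negotiation time + 100 ms; one grade deducted per full 100 ms over."""
    target = dns_ms + socket_ms + ssl_ms + 100
    over = max(0, first_byte_ms - target)
    deductions = min(int(over // 100), 5)  # assumption: floor, capped at F
    return "ABCDEF"[deductions]

# HTTPS site: 80 ms of SSL time raises the target, so the same TTFB
# still earns an A, while the 0-ms-SSL site is graded against a
# tighter target.
print(ttfb_grade(50, 40, 80, 300))  # A (target 270, only 30 ms over)
print(ttfb_grade(50, 40, 0, 300))   # B (target 190, 110 ms over)
```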
Edit: For redirections, 301 everything and update any previously added redirects to point straight at the HTTPS URLs so you don't end up with chained redirects.
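As a quick way to audit for the chained redirects mentioned above, here is a small sketch that works over a plain mapping of {source URL: 301 target} (the URLs are made-up examples; in practice you'd build the mapping from a crawl or your server config):

```python
def find_redirect_chains(redirects):
    """Given {source: 301 target}, return sources whose target is itself
    redirected (i.e. chained redirects), with the full hop list."""
    chains = {}
    for src in redirects:
        hops = [src]
        seen = {src}
        nxt = redirects[src]
        # Follow hops until we reach a URL that isn't redirected (or loop).
        while nxt in redirects and nxt not in seen:
            hops.append(nxt)
            seen.add(nxt)
            nxt = redirects[nxt]
        hops.append(nxt)
        if len(hops) > 2:  # more than one hop = a chain worth collapsing
            chains[src] = hops
    return chains

# Example: an old redirect now points at a URL that itself 301s to HTTPS.
redirects = {
    "http://site.com/old": "http://site.com/new",
    "http://site.com/new": "https://site.com/new",
}
print(find_redirect_chains(redirects))
# The fix is to repoint /old directly at https://site.com/new.
```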
As far as GWT is concerned, I would add both sites (http://site.com and https://site.com) and use the Change of Address feature to move from the HTTP property to the HTTPS one.
Hope this helps.
-
I would submit another sitemap in GWT for good measure after everything is changed and the 301s are in place, but I don't believe anything else is needed. Keep an eye on the important marketing blogs, because I'm sure there will be more information of this sort in the coming weeks.
-
So how do you change it all to HTTPS? Is there anything to do in GWT? Are any redirects needed? ...?
-
Something else to consider here is that Google specifically said "over time, we may decide to strengthen" that signal. I know Google says a lot of things but with this I'd rather be in front of my competition. And with e-commerce I've seen more than one study that shows SSL and security badges increase conversions and trust. Not sure how helpful this is, just my two cents.
-
Do you have any data on the few milliseconds of delay?
Our e-commerce site on Magento loads in 180-200ms; it took us a lot of effort to get our TTFB sorted!