HTTPS slower site versus non-HTTPS faster site?
-
Hey all,
I know everyone is going on about HTTPS as a ranking signal (from what I've read it's a lightweight signal, not a very important one), but:
- Site speed is a ranking signal
- HTTPS is now a ranking signal as well
- HTTPS makes sites slower
So, in view of the above, which is better?
- An HTTPS site that is slower
- A non-HTTPS site that is faster
Thanks!
-
While I do not know how Google treats HTTPS with regard to site speed, WebPageTest.org uses the following to score a site's Time To First Byte:
"The target time is the time needed for the DNS, socket and SSL negotiations + 100ms. A single letter grade will be deducted for every 100ms beyond the target."
In other words, WebPageTest does not penalize a site for being secured, because the SSL negotiation time is built into the target.
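To make the rule concrete, here is a rough sketch of that grading logic in Python (my own reading of the quote, not WebPageTest's actual code; the example timings are made up):

```python
# A rough sketch of the grading rule quoted above: the target is
# DNS + socket connect + SSL negotiation time + 100 ms, and one letter
# grade is deducted for every full 100 ms beyond that target.

GRADES = ["A", "B", "C", "D", "E", "F"]

def ttfb_grade(dns_ms, connect_ms, ssl_ms, ttfb_ms):
    target_ms = dns_ms + connect_ms + ssl_ms + 100
    over_ms = max(0.0, ttfb_ms - target_ms)
    steps = int(over_ms // 100)  # one grade lost per 100 ms over target
    return GRADES[min(steps, len(GRADES) - 1)]

# The SSL negotiation time is part of the target, so an HTTPS site is not
# marked down for the handshake itself:
print(ttfb_grade(dns_ms=30, connect_ms=40, ssl_ms=120, ttfb_ms=280))  # "A"
print(ttfb_grade(dns_ms=30, connect_ms=40, ssl_ms=0, ttfb_ms=280))    # "B" (plain HTTP, slow backend)
```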
Edit: For redirections, 301 everything and update any previously added redirects to point directly to the HTTPS URLs so you don't end up with chained redirects (see the sketch below).
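If it helps, here is a small check you could run once the rules are in place (a sketch using Python's requests library, not an official tool; site.com is a placeholder):

```python
# Fetch an old HTTP URL and confirm it reaches the HTTPS page in a single
# 301 hop, without any chained redirects left over from earlier rules.
# Requires the `requests` package; "site.com" is a placeholder.
import requests

def check_redirect(url):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    print(f"{url} -> {resp.url}")
    for hop in resp.history:  # every redirect that was followed
        print(f"  {hop.status_code} {hop.url}")
    if len(resp.history) > 1:
        print("  WARNING: chained redirects - point the old rule straight at the HTTPS URL")
    elif resp.history and resp.history[0].status_code != 301:
        print("  WARNING: not a 301 - use a permanent redirect")

check_redirect("http://site.com/some-page")
```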
As far as GWT is concerned, I would add both sites (http://site.com and https://site.com) and use the Change of Address feature on the HTTP property to point to the HTTPS one.
Hope this helps.
-
For good measure, I would submit another sitemap within GWT once everything is changed and the 301s are in place, but I don't believe anything else is needed. Keep an eye on the important marketing blogs, because I'm sure there will be more information of this sort in the coming weeks.
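As a minimal sketch of regenerating that sitemap with the new protocol (assuming a Python build step; the domain and paths are placeholders, and in practice your CMS or sitemap plugin would supply the URL list):

```python
# Minimal sketch of rebuilding the XML sitemap with HTTPS URLs after the
# switch, so the file resubmitted in GWT lists only the new protocol.
# "site.com" and the paths are placeholders.
from xml.etree.ElementTree import Element, SubElement, ElementTree

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemap(paths, out_file="sitemap.xml"):
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for path in paths:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = f"https://site.com{path}"  # HTTPS, not HTTP
    ElementTree(urlset).write(out_file, encoding="utf-8", xml_declaration=True)

write_sitemap(["/", "/category/", "/category/product/"])
```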
-
So how do you change it all over to HTTPS? Is there anything to do in GWT? Are any redirects needed?
-
Something else to consider here is that Google specifically said "over time, we may decide to strengthen" that signal. I know Google says a lot of things, but on this one I'd rather be ahead of my competition. And with e-commerce, I've seen more than one study showing that SSL and security badges increase conversions and trust. Not sure how helpful this is, just my two cents.
-
Do you have any data on the few milliseconds of delay?
Our ecommerce site on Magento loads in 180-200ms; it took us a lot of effort to get our TTFB sorted!
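For anyone who wants to see where their own first byte goes, here is a rough sketch of the kind of timing breakdown being discussed (it uses the pycurl package; site.com is a placeholder, and the HTTP/HTTPS comparison only makes sense against your own host):

```python
# A sketch of measuring where time-to-first-byte goes, so the TLS handshake
# cost can be compared directly between the HTTP and HTTPS versions of a
# page. Requires `pip install pycurl`; "site.com" is a placeholder.
import io
import pycurl

def ttfb_breakdown(url):
    buf = io.BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, url)
    c.setopt(pycurl.WRITEDATA, buf)
    c.perform()
    dns = c.getinfo(pycurl.NAMELOOKUP_TIME) * 1000
    connect = c.getinfo(pycurl.CONNECT_TIME) * 1000
    tls_done = c.getinfo(pycurl.APPCONNECT_TIME) * 1000  # 0 for plain HTTP
    ttfb = c.getinfo(pycurl.STARTTRANSFER_TIME) * 1000
    c.close()
    tls = max(0.0, tls_done - connect)
    print(f"{url}: dns={dns:.0f}ms tcp={connect - dns:.0f}ms "
          f"tls={tls:.0f}ms ttfb={ttfb:.0f}ms")

ttfb_breakdown("http://site.com/")
ttfb_breakdown("https://site.com/")
```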