Does anyone know if an increase in 804 HTTPS errors will affect SEO rankings?
-
We recently moved our whole site over from HTTP to HTTPS and we went from having 106 keywords in the top 3 positions to 80 in just one week. The only thing that I can think of that caused the drop is the HTTPS changes to our site. Any input would be greatly appreciated.
-
Jennifer,
Some of your pages still reference http images. Example: https://www.tsheets.com/proadvisors-we-trust.php calls the image http://cdn.tsheets.com/images/pros/denise-loter-koch.png, and the browser warns:
"Your connection to www.tsheets.com is encrypted with obsolete cryptography. However, this page includes other resources which are not secure. These resources can be viewed by others while in transit, and can be modified by an attacker to change the look of the page."
This is probably the reason for the 804 errors in Moz.
You should also check your internal links - some of them still point to the http version, which is then again redirected to the https version.
Example: https://www.tsheets.com/infographics/time-tracking-infographic-hr-industry links to http://www.tsheets.com/infographics/time-tracking-infographic, which is then redirected to https.
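Mixed-content references like the image above can also be found programmatically. A minimal sketch using only the Python standard library - it scans a page's HTML for `http://` resource URLs (you would fetch each page yourself, e.g. with `urllib`, and feed the markup in; the sample markup below just mirrors the example from this thread):

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collects http:// resource URLs referenced by a page served over https."""
    RESOURCE_ATTRS = {"img": "src", "script": "src", "link": "href", "iframe": "src"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        attr = self.RESOURCE_ATTRS.get(tag)
        if attr is None:
            return
        for name, value in attrs:
            if name == attr and value and value.startswith("http://"):
                self.insecure.append(value)

def find_mixed_content(html):
    parser = MixedContentFinder()
    parser.feed(html)
    return parser.insecure

page = '<img src="http://cdn.tsheets.com/images/pros/denise-loter-koch.png">'
print(find_mixed_content(page))
# ['http://cdn.tsheets.com/images/pros/denise-loter-koch.png']
```

Anything this returns on an https page will trigger the browser warning quoted above.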
Unrelated to the https - but you might want to optimise the image on https://www.tsheets.com/online-invoicing-and-billing/ (https://www.tsheets.com/online-invoicing-and-billing/images/main-image-billing.png)
rgds
Dirk
-
This is very helpful. Thank you Dirk.
Just for clarification, our site is https://www.tsheets.com
-
Hi Jennifer,
Migration to https carries certain risks (like any other migration of your site). Without the actual URL it's difficult to assess what's wrong with the site.
1. You can check here whether SSL was properly implemented: https://www.ssllabs.com/ssltest/
2. There is an interesting article on the technical migration on the Yoast site (https://yoast.com/move-website-https-ssl/) - and about the potential SEO impacts here: http://moz.com/blog/seo-tips-https-ssl - even if you have already migrated, you could go through the different steps and check whether you skipped one.
3. Try crawling the site with Screaming Frog - it has a Protocol tab that can show you whether all pages are on https or some are missing. You can also check that all your internal links are updated to the https version.
4. I guess you have created a WMT (Webmaster Tools) profile for the https version of your site - check whether specific errors are listed there.
5. Check page speed with Google's page speed analyser and webpagetest.org and compare your scores. It's possible that adding https also made your site slower.
6. Sample pages in different browsers - do you get security warnings when visiting pages? These messages can really frighten your visitors, have an impact on stats like bounce rate & avg. visit duration, and as a result have an impact on your rankings.
7. Check vital stats in Analytics - like bounce rate, pages/visit, avg. visit duration, avg. time on page... - did you see major changes after the migration? Also check whether you see an increase in 404 pages.
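For point 3, the internal-link check boils down to: every internal http URL should answer with a single 301 straight to its exact https equivalent (no 302s, no chains). A hedged sketch of that logic - offline, so the status and Location header are passed in rather than fetched:

```python
from urllib.parse import urlsplit, urlunsplit

def https_equivalent(url):
    """Rewrite an http URL to its https counterpart."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + parts[1:])

def is_clean_redirect(url, status, location):
    """True if the http URL 301s straight to its https twin."""
    return status == 301 and location == https_equivalent(url)

# The redirect spotted earlier in the thread would pass this check:
print(is_clean_redirect(
    "http://www.tsheets.com/infographics/time-tracking-infographic",
    301,
    "https://www.tsheets.com/infographics/time-tracking-infographic",
))  # True
```

In practice you'd request each URL with `urllib.request` (without following redirects) and feed the response in; even clean 301s waste a hop, so updating the internal links themselves is still the real fix.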
Hope this helps in solving your problem,
Dirk
-
Hi Jennifer
Take a look at the questions and answers here, as this has been discussed before, and it references a resource I would have posted:
http://moz.com/community/q/804-https-ssl-error
If crawlers see issues with your servers and protocol, that could potentially be a negative mark against your site in the SERPs. I would discuss with your web dev and SEO team how to properly implement changes to fix these issues.
Hope this helps! Good luck.
Related Questions
-
If I change Tags and Categories in Wordpress blog post, will it negatively affect SEO and cause 404s?
Hi, I have belatedly come to the conclusion that I have been using tags and categories incorrectly when blogging in WordPress. The result is that Google seems to prefer to show my archive and tag pages in search results rather than the posts themselves. Not good UX. As the site is only a few months old, am I best to learn my lesson and tag and categorize correctly moving forward, or should I go back into these posts, clean them up, and categorize and tag them correctly? If I do this, will it cause 404s and hurt my SEO? Thanks!
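If you do recategorize, the usual way to avoid 404s is to 301-redirect each retired tag/category archive URL to its replacement. A minimal sketch of such a redirect map (the paths are hypothetical examples, not from the question; in WordPress you'd implement this via a redirection plugin or server rewrite rules):

```python
# Hypothetical old-to-new archive paths - label: invented for illustration.
REDIRECTS = {
    "/tag/old-messy-tag/": "/category/clean-topic/",
    "/category/misc/": "/category/clean-topic/",
}

def resolve(path):
    """Return the (status, target) a request for an archive path should get."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/tag/old-messy-tag/"))  # (301, '/category/clean-topic/')
```

With redirects like this in place, the old archive URLs pass their signals to the new ones instead of 404ing.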
Technical SEO | johnyfiveisalive
-
Does a tag inside H1 affect SEO?
We are using a tag inside our H1 tag to display a second line of text, usually for extra information, making the tag appear like an intro text for the visitor and not just an optimized SEO tag. Example: https://www.denhollandsche.nl/grafmonumenten/ Does this technique affect SEO? Should we consider removing it? Natuursteen grafsteengraniet grafmonument voorbeelden
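For what it's worth, parsers generally read the full text content of an H1, nested tags included. A sketch with Python's stdlib HTML parser, assuming the nested tag is something like a `<span>` (the markup below is invented for illustration, not taken from the site in the question):

```python
from html.parser import HTMLParser

class H1Text(HTMLParser):
    """Collects all text inside <h1>, including text in nested tags."""
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.depth += 1

    def handle_endtag(self, tag):
        if tag == "h1":
            self.depth -= 1

    def handle_data(self, data):
        if self.depth > 0:
            self.chunks.append(data.strip())

def h1_text(html):
    p = H1Text()
    p.feed(html)
    return " ".join(c for c in p.chunks if c)

print(h1_text("<h1>Grafmonumenten <span>voorbeelden en prijzen</span></h1>"))
# Grafmonumenten voorbeelden en prijzen
```

The nested tag's text ends up as part of the heading text either way, which is why the question is really about whether that combined text still reads as a sensible heading.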
Technical SEO | stepsstones
-
Are W3C Validators too strict? Do errors create SEO problems?
I ran an HTML markup validation tool (http://validator.w3.org) on a website. There were 140+ errors and 40+ warnings. IT says "W3C Validators are overly strict and would deny many modern constructs that browsers and search engines understand." What a browser can understand and display to visitors is one thing, but what search engines can read has everything to do with the code. I ask this: if the search engine crawler is reading through the code and comes upon errors like these:

…ext/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');}

…t("?");document.write('>');}

both flagged with the same validator message:

"The element named above was found in a context where it is not allowed. This could mean that you have incorrectly nested elements -- such as a "style" element in the "body" section instead of inside "head" -- or two elements that overlap (which is not allowed). One common cause for this error is the use of XHTML syntax in HTML documents. Due to HTML's rules of implicitly closed elements, this error can create cascading effects. For instance, using XHTML's "self-closing" tags for "meta" and "link" in the "head" section of an HTML document may cause the parser to infer the end of the "head" section and the beginning of the "body" section (where "link" and "meta" are not allowed; hence the reported error)."

Does this mean that the crawlers don't know where the code ends and the body text begins, and what they should or shouldn't be focusing on?
Technical SEO | INCart
GWT crawl errors: How big a ranking issue?
For family reasons (a child to look after) I can't keep a close eye on my SEO and SERPs. But from top-10 rankings in January for a dozen keywords, I'm now not in the top 80 results -- save for one keyword, for which I'm ~18-20.
Not a sitewide penalty: some of my internal pages are still ranking top 3 or so. In GWT, in late March I received a warning of a rise in server errors:
17 server errors / 575 soft 404s / 17 not found / 1 access denied / 4 others
I've also got 2 very old sitemaps (from two different ex-SEO firms) and I'm guessing about 75% of the links on them no longer exist. Q: Could all this be behind my calamitous SERPs drop? Or should I be devoting my -- limited -- time to improving my links?
Technical SEO | Jeepster
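One low-effort triage step for stale sitemaps like those: pull the URLs out and check their status codes. A sketch of the extraction half using only the standard library (the sample sitemap is invented for illustration):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> URLs from a sitemap file."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

print(sitemap_urls(sample))
# ['https://example.com/', 'https://example.com/old-page']
```

You'd then request each URL (e.g. with `urllib`) and remove or redirect anything returning a 404 - if 75% of the sitemap URLs are dead, that alone could explain a pile of the soft-404 and not-found errors GWT is reporting.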
Remove 404 errors
I've got a site (www.dikelli.com.au) that has some 404 errors. I'm using Dreamweaver to manage the site, which was built for me. I can't seem to figure out how to remove the 404 pages, as they're not showing up in the directory. How would I fix this up?
Technical SEO | sterls
-
How Long To Wait For Rankings After Doing SEO - Very Nervous!!!
I have done SEO work like PR, AD, Web 2.0, link building and so on for one keyword, but after 1 week I didn't see the result I wanted. My keyword was stuck in 15th-20th position and appeared in the top 10 only once, and that was for only 1 day. Keyword competition according to the Google AdWords tool: Global - 1300, Local - 590, Competition - Low. Please advise; I am so disappointed now. My on-page report is an A+ according to the SEOmoz tool.
Technical SEO | xplodeguru
-
Local Keywords Not Ranking Well in a Geographic Location (but Rank Very Well Outside of Geographic Location)
Has anyone experienced, in the last few months, an issue where a website that once ranked well for 'local' terms in Google stopped ranking well for those terms (but saw a ranking decrease only within the geographic location contained within those keywords)? For example only, some 'root' keywords could be: Chicago dentist, Chicago dentists, dentist Chicago, dentists Chicago.

What happens is that when a searcher searches from within the geographic area of Chicago, IL, the target website no longer ranks on the 1st page for these types of keyword phrases, but they used to rank in the top 3 perhaps. However, if someone was to search for the same keyword phrases from another city outside of Chicago or set a custom location (such as Illinois or even Milwaukee, WI perhaps) in their Google search, the target website appears to have normal (high) 1st page rankings for these types of terms.

My own theory: At first I thought it was a Penguin related issue, but the client's rankings overall haven't appeared to have been affected on the date(s) of Penguin updates. Authority Labs and Raven Tools (which uses Authority Labs data) did not detect any ranking decrease and still report all the local keyword rankings as high on the 1st page of Google. However, when the client themselves goes to check their own rankings (as they are within the affected geographic area), they are nowhere to be found on the 1st page. :S

After some digging I found that (one of) the company's Google Places listings (the main office listing) became an 'unsupported' status in Google Maps. So now I am thinking that this phenomenon is due to the fact that other listings are now appearing in search results for the same location. For example, in this case, an individual dentist's Google Places listing (who works within the dental office) is being displayed instead of the actual dental office's listing. Also, the dentist's name on the Google Places listing is being swapped out by Google with the name of the dental office, but if you click through to the Google Places listing, it shows the name of the individual dentist.

Anyone encounter a similar issue or have any other theories besides the Google Places issue?
Technical SEO | OrionGroup
-
Will using http ping, lastmod increase our indexation with Google?
If Google knows about our sitemaps and they're being crawled on a daily basis, why should we use the http ping and/or list the index files in our robots.txt? Is there a benefit (i.e. improving indexability) to using both ping and listing index files in robots? Is there any benefit to listing the index sitemaps in robots if we're pinging? If we provide a decent <lastmod> date, is there going to be any difference in indexing rates between pinging and the normal crawl that they do today? Do we need to do all of these to cover our bases? Thanks, Marika
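For reference, the http ping is just a GET request to the search engine's ping endpoint with your sitemap URL encoded as a parameter, and <lastmod> is a per-URL date inside the sitemap itself. A sketch of both pieces - building the ping request URL and rendering a <url> entry - with no network calls made (the example.com URLs are placeholders):

```python
from urllib.parse import urlencode
from xml.sax.saxutils import escape

def ping_url(sitemap_url):
    """Build the classic Google sitemap-ping request URL."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

def url_entry(loc, lastmod):
    """Render one sitemap <url> entry with a lastmod date (W3C YYYY-MM-DD format)."""
    return f"<url><loc>{escape(loc)}</loc><lastmod>{lastmod}</lastmod></url>"

print(ping_url("https://example.com/sitemap.xml"))
# https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
print(url_entry("https://example.com/page", "2015-06-01"))
# <url><loc>https://example.com/page</loc><lastmod>2015-06-01</lastmod></url>
```

The ping only says "re-fetch this sitemap"; the lastmod dates inside it are what give the crawler a per-URL freshness hint, so the two are complementary rather than redundant.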
Technical SEO | marika-178619