Should You Move to HTTPS? Google Thinks So!
-
Just wanted to share that HTTPS is now a positive ranking factor.
Here is an article about how it can give a small boost in rankings: http://searchengineland.com/explainer-googles-new-ssl-https-ranking-factor-works-200492
Here is also a video from Google saying they think "every site should have a security certificate" and explaining how to implement one: https://www.youtube.com/watch?v=cBhZ6S0PFCY
People may already know about this; just thought it was important to share and couldn't find it thoroughly addressed anywhere on the site.
-
As a best practice, you should serve your website over HTTPS. I have done this for my blog, and it's working well.
-
Agreed. Re-issuing is a pain in the ass. Although I agree that HTTPS is a good thing, I do think it's overkill for a lot of smaller sites and for sites that only provide non-sensitive information to the public. Google is using it as a "lightweight" signal right now, but chances are they'll put more weight on it over time. Perhaps SSL will become a growth industry with Google putting its weight behind it.
-
I think that is the exact opposite of how security works. But once the certificate is installed and the site is properly configured, you should never have an issue with anything. I just wish more certificates were renewable instead of having to be re-issued.
-
If they are going to sell them to everyone, they are going to have to make them simple to implement.
For example:
Buy one > enter your URL > all redirection is taken care of for you > Done! (A sketch of that redirect step is below.)
End of story.
I am not going to beat my head against the wall checking every one of my pages all the time to make sure they are all working correctly, indexed correctly, etc. I already have enough to do without babysitting my sites' pages more than I already do.
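To be fair, the redirect piece of that wish list is already close to one-and-done on Apache. Here is a minimal sketch (assuming mod_rewrite is enabled and the certificate is already installed; your host's setup may differ) of the catch-all rule a one-click product would generate in your .htaccess:
RewriteEngine On
# Send every plain-HTTP request to its HTTPS twin with a permanent redirect
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
That handles the redirection part; checking that every page still works and is indexed correctly is, sadly, still on you.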
-
Agreed, EGOL.
I think I will sit on this one for a bit.
If starting a new site from scratch I would definitely start with a certificate and make the site https from inception.
I just see this as being a huge pain in the butt for most people to set up on already-established sites, especially since 95% of website owners are small business owners. They don't know how to pull something like this off without a serious time and/or money investment from someone who knows a fair amount about the back end of websites, which I personally don't.
I am a marketer, not a programmer, and most business owners are not programmers either. Only a small number of marketers would actually see the value in a switch to HTTPS unless they have a shopping cart set up.
-
So Google's SSL was sold to Google by Google? Makes perfect sense lol. "We will rank you higher if you use an SSL. By the way, have you noticed our lovely selection of SSLs?"
-
Close, but look at Google's certificate: http://i.imgur.com/Lw1rlH4.png They are actually a CA now, after the issues they had with their certificates before. I think they became a CA when they rolled out the tougher encryption. Soon, I imagine, they will start selling certificates to the public like they do with domain names.
-
Really technical in here lol.
Just wanted to say I can see why they would do it, but it's kind of silly to run out and grab one. Most server and site security can be locked down from the server side if you have your own root account. My guess is that they partnered with someone who sells SSL certificates and are now saying this to strengthen that alliance or help them make money. I could see this being worth the time and effort if you took sensitive information, but dangling a carrot saying "we will rank you higher if you use an SSL on your roofing/plumber/SEO/etc. website" is just silly.
I'm all for security and protecting people's information, don't get me wrong, but it seems like another attempt to control things.
-
I guess the main question about your other tests is: did you check whether the server was utilizing mod_spdy? That is what my whole argument centers on. Serving HTTPS up without it is way slower, I realize that. But if the server is running mod_spdy and the browser supports it, it will be just as fast. Just as a point of reference, cURL and Linux-only browsers do not support SPDY; only the mainstream browsers (Chrome, Firefox, IE, Safari) and the Kindle browser support it.
I think you are missing the point a little bit. Say you get shared hosting at any provider I can think of, or a VPS, or even a dedicated server. Nine times out of ten, what you are getting is a machine running either WHM or Plesk with PHP 5.4, Apache 2.2 with mod_php, and a current version of MySQL. You are getting a slow setup, basically. I have dealt with dozens of hosting companies in the last six months and I cannot name one that sets up mod_spdy on a default deployment. It just has not made it into the standard stack yet; like mod_pagespeed, they rely on you to set it up. With a setup like that, yes, HTTPS requests are going to be slower, there is no doubt about that. BUT that is not what I am talking about running on. Set the machine up with PHP 5.4, use an APC caching layer and FCGI as the interpreter, and install mod_spdy (I don't use mod_pagespeed; I optimize from the start), and the same machine can handle quadruple the requests it was taking before.
I don't know how familiar you are with how HTTP or SPDY works, but here is a brief overview. On a server without SPDY enabled, when you request a page, the browser makes a connection (establishing an insecure handshake) and the server delivers the page. As the page loads, the browser scans it for resources it needs to load, then creates a handshake for each resource and starts downloading it. The browser has a maximum of 8 channels it can download over at once. So if your page has 5 CSS files, 8 JS files, and 30 images, it can only download 8 of those files at a time; it creates a queue, and when one finishes, the next starts. Where all of your time really gets eaten up is the handshake for each resource.
SPDY gets around this because there is only one handshake (for either SSL or HTTP), and over that single connection it sends all of the resources. When your browser scans down the page and starts asking for resources, no queue is created; they are all sent as soon as they are asked for. That is how it is faster: it reduces connections. Think of it as a super keep-alive.
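For anyone wanting to try it, here is roughly what enabling mod_spdy looked like on a Debian/Ubuntu box with Apache 2.2. The package name and paths are from memory, so treat this as a sketch, not gospel:
# Google ships mod_spdy as a .deb; download and install it
wget https://dl-ssl.google.com/dl/linux/direct/mod-spdy-beta_current_amd64.deb
sudo dpkg -i mod-spdy-beta_current_amd64.deb
sudo apt-get -f install   # pull in any missing dependencies
# SPDY only negotiates over SSL vhosts, so mod_ssl has to be on too
sudo a2enmod ssl spdy
# confirm /etc/apache2/mods-available/spdy.conf contains: SpdyEnabled on
sudo service apache2 restart
After that, a SPDY-capable browser negotiates SPDY during the TLS handshake automatically, and browsers that don't support it just fall back to ordinary HTTPS.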
-
I tested smaller sites too, though, and I expected you to say it was the cache. But keep in mind I ran the non-SSL version first in that example, so why was the second test, run RIGHT after with SSL, slower even if the page was cached? By the way, I get the same results even when clearing the cache, testing from one of my many shell (SSH) accounts, or even over a VPN in New York. I tried with w3m as well. I hear you, but I still think HTTPS can and usually does slow down most pages, although it's a very small difference.
I'm just going by experience and tests, but I'll give you the benefit of the doubt as you seem to know more about setting up networks and SSL/TLS than I do. Sorry if I seemed like a jerk, I just find your conclusion surprising.
It's actually good news for me, as I'll be needing HTTPS soon for a site where I will be accepting payment. I already knew the difference is minuscule, but personally I still think that for most if not all servers, implementing encryption is going to slow things down a little bit. The client visiting the site and the server have to go through extra steps; how could it not take longer? Even if it is by a small amount, do you really think HTTPS is identical in speed to HTTP (if set up correctly on a LAMP (Linux, Apache, MySQL, PHP) server)? There are more calculations going on. Encryption/decryption takes time, even if it's done well; it's pure math. I am a total idiot (no sarcasm) and I can't fathom how you think it's a myth....
-
I totally believe that you didn't modify the timings, but you cannot use that as a test. The main reason is that one of two things could be happening: either your ISP has cached the content for the page, or Google is edge-caching with them. When you run an HTTPS request, it bypasses caching and edge servers. So technically, if you were someone like Google or Amazon, HTTPS could slow your site down. But for the ~90% of people hosting a site from a single location without edge servers, all requests lead to the same place anyway. Plus you have to factor in network integrity. I have a site launching at the end of the week that will have about 10k users on it at any given time during US daylight hours. We didn't just ramp the load tester up to 10k and call it a day; we have had the thing running non-stop since Monday with 15k users on it to find the weak points in the network. We are transferring almost 2 TB an hour on our AWS instances to see whether the database is going to fold or not. Before we installed mod_spdy (we are only using HTTPS on checkout and contact pages), we were spawning way too many frontend instances, 9 if I remember correctly. With mod_spdy we have reduced them to about 6, with a 7th firing up every now and then, depending on where the wave of load goes on the site.
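If you want to run a miniature version of that kind of soak test yourself, ApacheBench is the quick-and-dirty tool. A sketch (the numbers and domain are placeholders, and your ab build needs SSL support for the second run):
# 50k requests, 200 concurrent, against each scheme
ab -n 50000 -c 200 http://example.com/
ab -n 50000 -c 200 https://example.com/
Compare the "Requests per second" and "Time per request" lines between the two runs. Note that ab itself speaks plain HTTP/1.1, so like cURL it won't show the SPDY gains, only the raw TLS overhead.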
-
Uhh, most servers run Apache, not crappy Microsoft server software. I can tell you know more about networking, though, and I'm not saying I'm 100% certain... more like 60%. But I can literally see the results myself right now; I just ran a test:
I did wget on http://www.google.com and it's 0.01s for HTTP and 0.04s for HTTPS.... I didn't even modify the timing to be more precise....
[ <=> ] 19,425 --.-K/s in 0.01s
2014-08-19 22:19:44 (1.26 MB/s) - 'index.html' saved [19425]
unknownlocal-7:~ unknown_device$ wget https://www.google.com
--2014-08-19 22:19:58-- https://www.google.com/
Resolving www.google.com... 74.125.224.48, 74.125.224.49, 74.125.224.50, ...
Connecting to www.google.com|74.125.224.48|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'index.html.2'
[ <=> ] 19,376 --.-K/s in 0.04s
2014-08-19 22:19:58 (501 KB/s) - 'index.html.3' saved [19376]
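A slightly more controlled way to run the same comparison (a sketch; curl's -w timings are more granular than wget's rounded transfer time) is to ask curl for the handshake and total times directly:
# time_appconnect is the SSL/TLS handshake; it stays 0.000000 for plain HTTP
for url in http://www.google.com/ https://www.google.com/; do
  curl -so /dev/null -w "$url connect=%{time_connect} tls=%{time_appconnect} total=%{time_total}\n" "$url"
done
Running it several times and averaging helps, since a single request like the wget test above is mostly measuring network jitter.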
-
Read this: http://research.microsoft.com/pubs/170059/A%20comparison%20of%20SPDY%20and%20HTTP%20performance.pdf
The only time that plain HTTP is faster is when mod_pagespeed is running and you are pipelining. Most US ISPs do not support pipelining, so the only real option is a SPDY layer, which is exactly why it was created.
I set up a lot of high-traffic servers, and every one of them relies on mod_spdy to serve requests, because it crushes HTTP/1.1. Most sites don't use mod_spdy and most servers do not have it set up, so that is the basis for my "myth" claim. If you look at the test results, running SPDY with SSL is faster than Apache running HTTP/1.1, which accounts for, I would guess, at least 70% of all *nix machines on the net.
As for security, I agree that setting up SSL certificates wrong is an issue, but that does not mean there is an issue with using SSL. It is like someone not locking their door and getting their house broken into: it was not the lock's fault. I set up e-commerce sites and boxes for a living and do quite a bit of pen testing; not many of the attacks out there that assume someone is on a closed, non-public network will actually work in the wild. I mean, some of the things you list could possibly work, but in most cases it would be easier to drive over to someone's house and torture them for the information than to try to hack a router at a CA.
-
Yes, just not on all pages. The NSA only uses it on pages where encryption would be helpful. Not all sites configure the server to allow HTTPS even if you type it, but with a simple extension you can force it for the sites that do support it.
If the server has port 443 open, it's safe to assume they have HTTPS enabled on at least some pages. You are right, though, that I can't get state.gov's homepage to use HTTPS. So yes, state.gov DOES have HTTPS/SSL, just not on all pages. Do a port scan with nmap (sudo nmap -A -v state.gov) and you'll see 443 is wide open, bud. Look:
Starting Nmap 6.46 ( http://nmap.org ) at 2014-08-19 22:01 PDT
NSE: Loaded 118 scripts for scanning.
NSE: Script Pre-scanning.
Initiating Ping Scan at 22:01
Scanning state.gov (72.166.186.151) [4 ports]
Completed Ping Scan at 22:01, 0.09s elapsed (1 total hosts)
Initiating Parallel DNS resolution of 1 host. at 22:01
Completed Parallel DNS resolution of 1 host. at 22:01, 0.05s elapsed
Initiating SYN Stealth Scan at 22:01
Scanning state.gov (72.166.186.151) [1000 ports]
Discovered open port 443/tcp on 72.166.186.151
Discovered open port 80/tcp on 72.166.186.151
Increasing send delay for 72.166.186.151 from 0 to 5 due to 11 out of 17 dropped probes since last increase.
SYN Stealth Scan Timing: About 43.30% done; ETC: 22:02 (0:00:41 remaining)
Increasing send delay for 72.166.186.151 from 5 to 10 due to 11 out of 11 dropped probes since last increase.
SYN Stealth Scan Timing: About 50.65% done; ETC: 22:03 (0:00:59 remaining)
Completed SYN Stealth Scan at 22:02, 76.38s elapsed (1000 total ports)
Initiating Service scan at 22:02
Scanning 2 services on state.gov (72.166.186.151)
Completed Service scan at 22:02, 6.18s elapsed (2 services on 1 host)
Initiating OS detection (try #1) against state.gov (72.166.186.151)
Retrying OS detection (try #2) against state.gov (72.166.186.151)
Initiating Traceroute at 22:03
Completed Traceroute at 22:03, 0.03s elapsed
NSE: Script scanning 72.166.186.151.
Initiating NSE at 22:03
Completed NSE at 22:03, 6.22s elapsed
Nmap scan report for state.gov (72.166.186.151)
Host is up (0.082s latency).
rDNS record for 72.166.186.151: web2.dos.iad.qwest.net
Not shown: 995 filtered ports
PORT STATE SERVICE VERSION
80/tcp open http Apache httpd 2.0.63 ((Unix) JRun/4.0)
|_http-favicon: Unknown favicon MD5: 0713DD3D30C35F2474EFAA320D51F3C9
|_http-methods: No Allow or Public header in OPTIONS response (status code 302)
| http-robots.txt: 1717 disallowed entries (15 shown)
| /www/ /waterfall/ /menu/ /navitest/ /g/
| /documents/organization/*.htm /documents/cat_desc/ /documents/backup/
| organization/revisions organization/193543.pdf organization/193546.pdf
| organization/193567.pdf organization/120733.pdf organization/120738.pdf
|_organization/124827.pdf
| http-title: U.S. Department of State
|_Requested resource was http://www.state.gov/index.htm
81/tcp closed hosts2-ns
113/tcp closed ident
443/tcp open tcpwrapped
8080/tcp closed http-proxy
Device type: storage-misc
Running (JUST GUESSING): Netgear RAIDiator 4.X (98%)
OS CPE: cpe:/o:netgear:raidiator:4
Aggressive OS guesses: Netgear ReadyNAS Duo NAS device (RAIDiator 4.1.4) (98%)
No exact OS matches for host (test conditions non-ideal).
Uptime guess: 56.674 days (since Tue Jun 24 05:51:59 2014)
Network Distance: 1 hop
TRACEROUTE (using port 8080/tcp)
HOP RTT ADDRESS
1 22.27 ms web2.dos.iad.qwest.net (72.166.186.151)
NSE: Script Post-scanning.
Read data files from: /usr/local/bin/../share/nmap
OS and Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 94.63 seconds
Raw packets sent: 2136 (99.048KB) | Rcvd: 48 (2.300KB)
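One caveat: nmap reporting 443 as open ("tcpwrapped" here) only proves something is listening, not that a valid certificate is being served. A quick way to check that, sketched with openssl, is to complete a real TLS handshake and read the certificate back:
# Complete a handshake and print who the cert belongs to and its validity window
echo | openssl s_client -connect state.gov:443 -servername state.gov 2>/dev/null \
  | openssl x509 -noout -subject -dates
If that prints a subject and notBefore/notAfter dates, HTTPS is genuinely configured on at least part of the site.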
-
Myth? Lol, it's testable with curl, wget, lynx, or links (pretty much any textual browser). I've said it's an unnoticeable difference and that it BARELY slows things down... but it does. And yes, man-in-the-middle SSL attacks would be done more in a rogue Wi-Fi setting.
There are so many attacks on HTTPS, not just injections. Also, some lame old internet filters won't block the HTTPS equivalent of sites. I agree that it's more secure than using HTTP, and I agree the speed difference is minor, but a myth??? That's stretching it a bit.
Actually, I take that back: in a lot of ways I think HTTPS can make the situation for a webmaster less secure, not more secure. There are many attacks you can run on HTTPS sites that you can't on regular HTTP. Many people set it up wrong and get a false sense of security. Clients see the lock and think, "Oh, this site is secure; I don't need to practice good security habits."
Also, given how easy it is to crack HTTPS, even remotely, I just don't see it as justified unless we get proof that it really helps rankings. I personally like catering to the user more than the search engine. If your blog has sensitive content (I'm sure it does, as it probably has registration and login/logout), then it makes sense, but there are so many attacks... http://breachattack.com/ was a popular one back in 2012.
You really think it's a good idea for a site with 6 pages, ALL static, to invest in SSL/HTTPS instead of content? Please give me a link to one study showing how much better a site is likely to rank with HTTPS than without, and I'll shut up.
Please read the rest at: https://www.eff.org/deeplinks/2011/10/how-secure-https-today
More and more vulns are found every day.... We only know about the non-zero-day vulns and exploits, and there are new attacks coming out every few weeks. Look at this link from March 2014 about the new attacks: http://it.slashdot.org/story/14/03/07/1711215/https-more-vulnerable-to-traffic-analysis-attacks-than-suspected
- Break into any Certificate Authority (or compromise the web applications that feed into it). As we learned from the SSL Observatory project, there are 600+ Certificate Authorities that your browser will trust; the attacker only needs to find one of those 600 that she is capable of breaking into. This has been happening with catastrophic results.
- Compromise a router near any Certificate Authority, so that you can read the CA's outgoing email or alter incoming DNS packets, breaking domain validation. Or similarly, compromise a router near the victim site to read incoming email or outgoing DNS responses. Note that SMTPS email encryption does not help because STARTTLS is vulnerable to downgrade attacks.
- Compromise a recursive DNS server that is used by a Certificate Authority, or forge a DNS entry for a victim domain (which has sometimes been quite easy). Again, this defeats domain validation.
- Attack some other network protocol, such as TCP or BGP, in a way that grants access to emails to the victim domain.
- A government could order a Certificate Authority to produce a malicious certificate for any domain. There is circumstantial evidence that this may happen. And because CAs are located in 52+ countries, there are lots of governments that can do this, including some deeply authoritarian ones. Also, governments could easily perform any of the above network attacks against CAs in other countries.
In short: there are a lot of ways to break HTTPS/TLS/SSL today, even when websites do everything right. As currently implemented, the Web's security protocols may be good enough to protect against attackers with limited time and motivation, but they are inadequate for a world in which geopolitical and business contests are increasingly being played out through attacks against the security of computer systems.
-
I think that is one page you don't have to worry about using SSL on. The point of many people using SSL is because of them, after all.
-
Yes I am. On NSA.gov, for example, go to their apply-now page.... it's mandatory HTTPS. You can also manually type https:// on any page for almost all sites that have it, or just download HTTPS Everywhere like I do, which makes ALL pages on NSA.gov, for example, use SSL/TLS encryption through HTTPS. It does slow down sites, but we are talking microseconds here.
-
SSL doesn't slow things down; that is a myth. Sure, if someone is using an outdated browser it will, but loading all of the shims will slow the site down too, so the difference will not be noticeable. Run SPDY on your server and you can test it for yourself.
As for a man-in-the-middle attack, that has no SEO bearing. If I am running SSL on my blog, which only serves articles with no login, I am not too concerned about a man-in-the-middle attack. An injection-type attack, maybe, but plain HTTP is open to more of those than an SSL site is.
-
Ehhh, actually most major sites use HTTPS.... just some not on every page. I use a plugin called HTTPS Everywhere that forces sites to use HTTPS on pages where they don't require it (but it only works if they support HTTPS already). Banking sites use it, Google uses it, Facebook does; ummm, actually all major sites do.
Try opening an untouched browser like Safari or Opera and go to the top sites..... so far every one of them uses HTTPS. HTTPS isn't a bad thing; it does raise security, and if set up right it doesn't slow down the site enough to worry about. I just personally think it's not needed for most sites, and I doubt Google is going to give that much juice to people who run out and set up HTTPS/SSL for no reason other than rankings.
Check out http://www.alexa.com/topsites
Even the homepages are all using HTTPS. Chrome doesn't show the http/https part, so that's probably why you don't think most sites use it, but almost all the major sites do... You are right about Moz not having it, though. And a lot of sites have it, but not for all pages. I agree with you too that it's not worth the investment unless you need a higher level of security.
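If you want to spot-check this yourself rather than clicking through Alexa, a rough shell loop does it (the domain list is just an example; -k skips certificate validation, so you're only testing whether HTTPS answers at all):
for d in google.com facebook.com youtube.com wikipedia.org amazon.com; do
  # 000 from curl generally means the TLS connection itself failed
  echo "$d -> $(curl -sko /dev/null -m 10 -w '%{http_code}' "https://$d/")"
done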
-
I have learned not to behave like a trained dog and jump through hoops at Google's command. They often don't have their ducks in a row when they tell you to do stuff (the case here in WMT), they often change their mind about stuff that they tell you to do (authorship), and they often do things differently without tellin' anybody (nofollow).
If you have read what people who immediately jumped through this https hoop have to say about their experience you know that some of them are cussin' about it. (And, not one of them said that they see any positive difference in their SERPs.)
Amazon does not have https even when they say "Hi EGOL".... ebay doesn't have https even when I am in my bidding account,... and MattCutts.com does not have https. Wikipedia didn't do it. The WhiteHouse didn't do it. The Pope doesn't got it. And, NSA, Department of State and Interpol are still farting around.
Must not be too important because Moz doesn't have it yet.
So, keep fightin' the fight and do this https stuff after Christmas.... if then.
The only websites that I know of that have long been completely on https are Google, WikiLeaks, and the CIA.
-
On all the SEO podcasts and on another SEO forum, the general consensus was that it's not worth it unless your site would benefit from encryption. HTTPS also slows sites down a tiny bit, which is likely a factor in Google's algorithm. I think any site that takes private info like credit cards or SSNs, or even has login/logout where really personal data might be stored, should have HTTPS, but not the cheapest certificate you can find!! Nice that it should give a little boost to domain authority, though.
Also, HTTPS has its issues. It's very easy to attack in many forms (I learned a man-in-the-middle trick in school as a proof of concept). Also, a lot of people don't have a working CMOS battery, so their date and time drift off by years, causing certificate errors. Not to mention the other reasons people sometimes get certificate errors even though the site paid the bill and it's working for everyone else.
My guess, and what I've heard from the SEO experts, is that it's probably not going to give much of a boost. I highly recommend it if the site has sensitive data, though, because then you kill two birds with one stone. But going out and buying a dedicated IP and then an expensive SSL cert runs around $80 to $200. I would love to see some tests, though; if it really does help a lot, then I'd invest. But keep in mind it does slow the site down a tiny bit.
Personally, I think one of the best things to do right now (after proper keyword research and planning) is building killer content. Also, getting Google Plus reviews and +1's has been getting my attention, so much so that a VERY famous SEO podcast asks for them every episode and raves about how much they help, backed by tests and by ex-Google employees. NEVER do fake reviews. Personally, I think it's wrong to even ask for a good review; I just tell people it helps out a ton and to please be totally honest. I also ask if they can +1 me and add me to their circles.
SEO is all about content, especially right now. Build an excellent site with great content that engages the user and you will get natural links and, surely, better rankings. I like to pretend Google staff are always looking over my shoulder, and I actually read their TOS and try to remember and understand everything. lol
Everyone focuses on unnatural link building, which is such a bad idea, or at the very least a waste of time and money. That's because (my theory) if you are buying links but nobody clicks on them, Google sees the link as pretty much worthless. I've also noticed Google not caring as much about the "nofollow" tag. Anyway, people seem to forget that links should come naturally from building great content. On sites that use nofollow, as long as you think people will use the link, the worst-case scenario is that you get more traffic.
It's like people forgot that incoming links are for getting traffic to the site, not solely for ranking (which most people doing it the black-hat way get wrong anyway). Just my two cents; I'm no expert by any means, just obsessed with internet marketing for some reason.