Our site dropped after the April 2018 Google update about content relevance: how to recover?
-
Hi all,
After Google's confirmed core update in April 2018, we dropped globally and have not been able to recover since. We found the update was about content relevance, as officially stated by Google later. We wonder how our content could be judged not relevant when we have been ranking for the same keywords for years. We are hoping to find a solution to this. Are there any standard ways to measure content relevancy? Please suggest!
Thank you
-
Hi,
Thanks a TON for all the analysis and insights. Just mind blowing info.
Unfortunately we switched between different versions of the site, but the recent one will be stable for years, and further changes will be handled very carefully without a complete transformation.
Our open source CRM page dropped from April this year, but the link from Capterra was removed back in 2018. They removed our product from the list, and they no longer link directly to websites (you can see the page now). Not sure why we lost traffic for this page all of a sudden even though there is not much ranking difference for the main high-search-volume keywords. We are going to investigate this and bring the page back to its normal traffic.
Yes, we are trying to rank for "crm" as the primary keyword. Do you think that we are not doing well for "crm" because we dropped for the "open source crm" page?
Thanks
-
You kind of dropped a bit but not in a way which affects you very much (apparently, according to Ahrefs)
https://d.pr/i/HBzKpj.png (screenshot of estimated SEO keywords and traffic according to Ahrefs)
You did lose a lot of keywords, but many seem to have since recovered, and it didn't seem to actually impact your SEO traffic estimates much at all
SEMRush has a neat (relatively) new tool which looks at more accurate traffic estimates across the board (not just limited to SEO):
Again it does show a bit of a dent around April 2018. If I was going to use SEMRush data to look at this, I'd use the traffic analytics tool not the 'normal' SEO estimate charts from SEMRush (which IMO aren't very good, hence using the Ahrefs one in place of that)
This is what your site looked like in Feb 2018 before the keyword drops:
https://web.archive.org/web/20180224042824/https://www.vtiger.com/
This is what your site looked like later in June 2018:
https://web.archive.org/web/20180606021616/https://www.vtiger.com/
Completely different!
This is what your site looks like now: https://www.vtiger.com/
Again radically different. Maybe you just have a bad case of 'disruptive behavior' where Google is unwilling to rank you well, because the site keeps radically changing in terms of design and content. Sometimes doing too many changes too fast can really put Google off! 3 different designs inside of 1 year is pretty crazy
After each change, your home-page's Page Title was completely different:
Feb 2018 version: Customer Relationship Management | CRM Software - Vtiger
June 2018 version: Vtiger CRM | Customer Relationship Management Software
Current version: CRM | Customer Relationship Management System - Vtiger CRM
In my opinion, everything that was done around June 2018 was a huge mistake that you are suffering from now and gradually recovering from. The June 2018 design was horrible, way worse than either the Feb 2018 version or the current one. If a designer doesn't do a good job, don't just 'go ahead' with a terrible site design just because you paid for it
In addition, in June 2018 your page title didn't 'begin' with the term (or a synonym of the term) "CRM". In Feb 2018 and on the current version, you opened with "CRM" or a synonym of it, which is better for SEO. The June 2018 version of the site was really bad and also less well optimised (that seems really obvious to me)
Part of me actually feels that the Feb 2018 version of the site was best for SEO. It did a better job of making your USPs (value propositions) stand out to the user and search engines. It blended nice, app-styled UX with functionality that was more than just 'button links'
The current version isn't bad, it certainly looks nicer visually - but the June 2018 version was a bit of a house of horrors. It makes sense that it was live within the window in which you got dented, because it's just a bit shocking to be honest. In the Feb 2018 version of your site, more of the individual product links were listed in the top-line nav. Now they are still there but 'hidden' in drop-downs; that could be affecting things too
If I look at the technical SEO of the Feb 2018 site I can see it was relatively streamlined in terms of resource deployment:
... but by June 2018, there were way too many resources for one homepage to be pulling in. Not only did it look plainer and uglier than before (and less helpful, with worse SEO) it was probably also laggier to boot:
Ugh! 89 HTTP requests!? Get outta' here
Now things seem a lot better on that front:
So I think this is more evidence that the short-lived June 2018 site was pretty sucky and you guys bailed on it at light-speed (rightly so, it was terrible!)
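If you want to run this check yourself, counting the resource references in a page's delivered HTML gets you most of the way to that request count. A minimal sketch using Python's stdlib parser - the tag list and the sample HTML below are illustrative, not Vtiger's actual markup:

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Count tags that typically trigger extra HTTP requests."""
    RESOURCE_TAGS = {
        "script": "src",   # external JS
        "img": "src",      # images
        "iframe": "src",   # embedded frames
        "link": "href",    # stylesheets, icons, fonts
    }

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        wanted = self.RESOURCE_TAGS.get(tag)
        if wanted and any(name == wanted for name, _ in attrs):
            self.count += 1

# Illustrative snippet standing in for a fetched homepage
sample_html = """
<html><head>
<link rel="stylesheet" href="/main.css">
<script src="/app.js"></script>
</head><body>
<img src="/logo.png"><img src="/hero.jpg">
<script>inlineOnly();</script>
</body></html>
"""

parser = ResourceCounter()
parser.feed(sample_html)
print(parser.count)  # inline <script> without src is not counted
```

Run it against real fetched HTML to compare the three site versions; note it undercounts requests triggered by CSS (fonts, background images) and by scripts at runtime.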
The question: did you see ranking drops for "CRM" related keywords in the period surrounding April 2018? Say for example, in April, May, June and July of 2018?
I'd say that you did, according to an (extremely rough) ranking movements export from Ahrefs:
Actual data export (formatted) here: https://d.pr/f/pwnrIF.xlsx
So which CRM related URL was responsible for the most CRM related ranking losses which Ahrefs happened to pick up on?
https://d.pr/i/rCQ8LF.png (table image)
https://d.pr/i/7SJPbt.png (ugly bar chart)
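For anyone wanting to reproduce a chart like the one above from the raw export, the aggregation is just a group-by-URL over the negative position changes. A sketch with made-up rows, since the actual column layout of the .xlsx may differ:

```python
from collections import Counter

# Hypothetical (keyword, url, position_change) rows standing in for the export
rows = [
    ("open source crm", "https://www.vtiger.com/open-source-crm/", -37),
    ("free crm",        "https://www.vtiger.com/open-source-crm/", -12),
    ("crm software",    "https://www.vtiger.com/",                 -3),
    ("crm tool",        "https://www.vtiger.com/",                  2),
]

# Sum only the losses per URL, so gains on other keywords don't mask drops
losses = Counter()
for _keyword, url, change in rows:
    if change < 0:
        losses[url] += -change

for url, lost in losses.most_common():
    print(url, lost)
```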
Clearly the URL most responsible for all the drops was this one:
https://www.vtiger.com/open-source-crm/
... so how has this URL changed?
Infuriatingly, the Wayback Machine has barely any records of this URL, so the closest I can get to just before the end of April 2018 is actually December 2017:
https://web.archive.org/web/20171226021957/https://www.vtiger.com/open-source-crm/
... it looks basically the same as it looks now. No major changes. But wait! On the old version of your homepage, the footer links to the open source CRM were bigger and more prominent than they are now. Another thing, those footer links used to be marked up with itemprop=url, now they are not (could that be making a difference? All I can say is that the coding is different)
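If you hit the same wall with the Wayback Machine's calendar view, its CDX API will at least list every snapshot it does hold for a URL. A small sketch; the field selection here is one reasonable choice, not the only one:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def cdx_query(page_url, year_from, year_to):
    """Build a Wayback Machine CDX API query listing snapshots of a URL."""
    params = urlencode({
        "url": page_url,
        "output": "json",
        "from": year_from,
        "to": year_to,
        "fl": "timestamp,statuscode",
    })
    return "http://web.archive.org/cdx/search/cdx?" + params

def list_snapshots(page_url, year_from="2017", year_to="2018"):
    """Fetch the snapshot list (requires network access)."""
    with urlopen(cdx_query(page_url, year_from, year_to)) as resp:
        return json.load(resp)

print(cdx_query("vtiger.com/open-source-crm/", "2017", "2018"))
```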
Another question would be, between April and July 2018 - did you lose any CRM related links that were worth a lot?
Actually, apparently you did lose a few. Check some of these out:
https://d.pr/i/Zg5XER.png (MEGA screenshot, but first page of results only)
https://d.pr/f/NetqVM.png (full export, lost links which may be about 'CRM', April through July 2018 - raw and unformatted export, open the CSV file in Excel!)
Losing a CRM related link from Capterra, online peer review software experts? Yeah that could hit you hard. Most of the Mashable ones are still there, they are just redirected - but the Capterra one:
https://blog.capterra.com/free-and-open-source-crm/
... that could sting. You used to have a link with anchor text like this:
"for a price starting at about $700" - but now it's gone!
You might be thinking: aha Effect, you silly sausage! Clearly it was a comment link that got pushed down or removed by admins/mods, not a 'real' link Google would have counted. But no I say, and I have proof to back up that denial:
https://web.archive.org/web/20170930101939/http://blog.capterra.com/free-and-open-source-crm/
That is the same post as archived before the April 2018 drop; if you Ctrl+F for "for a price starting at about $700" you will FIND the in-content link, which actually did matter, and which Capterra have since removed from their content
I am sure that in the link data you will find other such examples of lost quality links. Some will be duds and false-positives (like the Mashable ones) but some will be legit removals
By the way, although the Mashable links to you are still live, Mashable have 302 redirected the old URLs for the blog posts instead of 301 redirecting them. This means those posts, if they were valued and had accrued a lot of backlinks, have been cut off from their own backlinks (as 302s pass no SEO juice). As such, links contained inside of them are largely nullified (d'oh! Thanks Mashable)
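To check which kind of redirect a URL is serving without trusting a crawler's label, you only need the status code of the first hop. A sketch with Python's stdlib; `first_hop_status` needs network access, while `classify_redirect` just maps status codes onto the permanent/temporary distinction discussed above:

```python
import urllib.error
import urllib.request

def classify_redirect(status):
    """Label an HTTP status code per the 301-vs-302 distinction above."""
    if status in (301, 308):
        return "permanent"   # signals the move is final
    if status in (302, 303, 307):
        return "temporary"   # the kind Mashable appears to be serving
    return "not a redirect"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the first hop is visible."""
    def redirect_request(self, *args, **kwargs):
        return None

def first_hop_status(url):
    """Return the status code of the first response (requires network)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as err:
        return err.code  # urllib raises on unfollowed 3xx responses

print(classify_redirect(302))  # temporary
print(classify_redirect(301))  # permanent
```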
What this illustrates is that your site changed too much, the way links are formed changed, the design went through a really bad patch, and you've also lost some high quality backlinks. An SEO legacy doesn't last forever; links get removed over time
In the end, this convergence of issues is almost assuredly dragging your site through a tough spot. That's what I'd imagine, from a very, very top-line look into this issue
-
Had a quick look at SEMrush...
-
Thanks for looking into this. We dropped after the April 2018 update as per the historical data we have, and not around Jan/Feb 2018.
Could you please let me know where you got the data? Then I will look into it and try to correlate it with what we have.
Thank you.
-
Hi
Why do you believe it was a penalty in April 2018? The site looks like it took a penalty of some sort in the UK in Jan/Feb 2018, and the pattern in the US etc. is clear.
Not clear on why April?
Regards
-
We dropped for "crm". The site is vtiger.com. Could you please give us some clues on this? We would be really grateful for any help.
-
Difficult to say without seeing the site, the content and the keywords. Because different query-spaces and search entities are thematically different, the ways to 'become relevant' to each of them can be highly variable in nature. If I could just see an example, it would be much easier to assess why Google has changed its mind in terms of your site's perceived relevance
What you should know about Google is that they truly believe, all of their updates make Google's search results generally more accurate (and better for users) on average, so a roll-back is extremely unlikely. If you have been pinned by a certain algorithm change, it's likely to keep hurting you until you adhere to Google's 'new standards' (which you might argue are lower in your particular niche, but regardless they're not listening)
Sometimes fairy-tales come true and 'Google glitches' get 'undone', resulting in some sites regaining their lost rankings. This is 0.001% of most situations. Usually what happens is, people get red in the face and angry with Google, argue the toss and see their sites disintegrate as a result. Mathematical algorithms don't care if you're mad or not, they don't care what you expect
With an example, I could give an un-biased 3rd party opinion on why Google is 'doing this' to your site, but it won't result in a quick fix. It will likely result in some weeks of hard graft and further investment
All of the 'standard' ways to measure content relevancy are things like seeing how many times your keyword(s) are mentioned in your content. But the highest relevancy you can demonstrate is nothing to do with keyword deployment, it's matching your site's unique 'value proposition' with Google's perception of the values which the searchers (within your query-space) hold
Maybe there's been a shift and they suddenly value price over service, thus Google shakes up their results to suit. I'm not saying keyword deployment isn't part of the issue, what I'm saying is that the most 'relevant' site is the one which the largest proportion of connected searches, wish to find. It's more than just linguistic semantics and keyword-play (hope that makes sense)
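To make the 'standard' keyword-count side of this concrete, the crudest relevancy check is just whole-word term frequency. A sketch (single-word terms only; real tools also weight placement in titles, headings and links, and none of this captures the value-proposition matching described above):

```python
import re
from collections import Counter

def keyword_mentions(text, keywords):
    """Crude relevancy check: whole-word counts and density per keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {
        kw: {"count": counts[kw], "density": counts[kw] / total}
        for kw in keywords
    }

page = ("Vtiger CRM is a customer relationship management system. "
        "The CRM helps sales teams manage pipeline and contacts.")

for kw, stats in keyword_mentions(page, ["crm", "pipeline"]).items():
    print(kw, stats["count"], round(stats["density"], 3))
```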
Related Questions
-
Google Site Links question
Are Google site links only ever shown on the top website? Or is it possible for certain queries for the site in position #2 or #3 or something to have site links but the #1 position not have them? If there are any guides, tips or write ups regarding site links and their behavior and optimization please share! Thanks.
Algorithm Updates | IrvCo_Interactive
-
SERP Drop
Hi, I have been trading online since 2006 and over the years I have built up some impressive SERPs for keywords such as "mens underwear", which I was in position 1 for. However, over the past 6 months I have pretty much dropped off the face of Google for a large proportion of my keywords. I suspect I have been hit by the Panda/Penguin updates and do not know how to recover. I have a mixture of what I consider to be relevant and healthy links, but there are also a few links in there that Google would no longer like. However, I believe that the majority of my links are OK. What should I do? Thanks
Algorithm Updates | UnderMe
-
Google Local Algorithm Changes?
I was wondering if you have heard about any Google Local algorithm changes. We have about 200 franchise locations. Some of our locations have dropped significantly over the past few weeks: locations that were showing up in positions 1-3 are now no longer showing on the first page. This is for very relevant phrases for our main line of business (which is also in our business name), i.e. 'Phrase, CITY NAME'. These locations have plenty of positive Google reviews, and we would typically rank well for a phrase like that based on our relevance. I did some brainstorming; do you think any of these could have an impact?

Google is all about things looking and feeling natural, including link building. We have used Yext, which made a lot of changes across the web to fix addresses, etc. Do you think Google may be seeing this as unnatural? Too many changes at too many sites in too short a period of time? Along those same lines, do you think Google may be penalizing some of our franchise pages for being too 'perfect'? It would be 'natural' for addresses to have some differences across the web and a bit unnatural to have them all match so perfectly.

I know that Google has always stated the business name should be listed in Google Local the way it is presented to the general public: something such as "Business Name Boston" should be listed as "Business Name". Each of our franchise locations is named in house to reflect its geo location: "Business Name Boston", "Business Name St. Louis". Many of our competitors also attach geo terms. Do you think we may be getting hit with a penalty now, even though we have listed things on Google with the geo term for years and that is how WE refer to each location? Is it possible that by working with Yext we drew attention to this practice? Should we remove the geo term from our Google Local listings? How about across the web?

We are in a business that does not require customers to come to our location. Some of our locations have not suppressed the address in their local listings while others have; many of our competitors have not. Do you think this could play into it? Some of our locations that are not showing in Local have good organic results. Have you heard anything about Google dropping Local results if a site shows in organic?

I know Google has been looking at social media more and more, and I believe they will continue to do so. If our local pages have no social presence, could this adversely affect things? (I think this is probably not the case, but wanted to throw it out there.) I have also noticed that in some cases where Local has dropped, we have multiple offices in that metro area. Is it possible that this could affect things?

Have you heard of any Local algorithm changes? I know they are rolling out a new dashboard sporadically; could this be in conjunction with a larger Local algorithm change? Finally, our CMS tool does not allow us to change the Title/Meta per page (I know... terrible!!), so every page has the same title and same meta description. (We are changing our CMS system! Can't wait!) Could this play into it?

Thanks for any feedback!
Algorithm Updates | MABES
-
Is it better to build a large site that covers many verticals or many sites dedicated to each vertical
Just wondering, from an SEO perspective is it better to build a large site that covers many verticals, or build out many sites, one for each vertical?
Algorithm Updates | tlhseo
-
Does anyone know if Google ranks a responsive site or a dedicated mobile site higher?
I have heard that Google favors dedicated m-dot sites over responsive designs in its rankings. Does anyone know if this is true? And if there is any supporting information? I have been in contact with our account team at Google but haven't had a response on this as yet. I appreciate any help on this. Cheers!
Algorithm Updates | Fasthosts
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question... "Googlebot found an extremely high number of URLs on your site:" Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations I would love to hear them. First off, the site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now to have each page URL created by the faceted nav filters point back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern obviously is wasting crawler time on all these pages when I am trying to do what they ask in these instances and tell them to ignore the filters and find the content on page X. So at this point I am thinking about possibly using the robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks to those who take the time to respond in advance.
Algorithm Updates | PeteGregory
-
Shouldn’t Google always rank a website for its own unique, exact +10 word content such as a whole sentence?
Hello fellow SEO's, I'm working with a new client who owns a property related website in the UK.
Algorithm Updates | Qasim_IMG
Recently (May onwards) they have experienced significant drops in nearly all non domain/brand related rankings. From page 1 to +5 or worse. Please see the attached webmaster tools traffic graph.
The 13th of June seemed to have the biggest drop (UK Panda update???). When we copy and paste individual 20+ word sentences from within top level content, Google does bring up exact results (the content is indexed), but the client's site nearly always appears at the bottom of SERPs. Even very new or small, 3-4 page domains that have clearly copied all of their content are outranking the original content on the client's site. As I'm sure you know, this is very annoying for the client! And this even happens when Google's cache date (that appears next to the results) for the client's content is clearly older than the other results! The only major activity was the client utilising Google optimiser, which redirects traffic to various test pages. These tests finished in June.

Details about the client's website: the domain has been around for 4+ years. The website doesn't have a huge amount of content, around 40 pages; I would consider 50% original, 20% thin and 30% duplicate (working on fixing this). There haven't been any significant sitewide or page changes. Webmaster Tools shows nothing abnormal or any error messages (some duplicate meta/title tags that are being fixed). All the pages of the site are indexed by Google. Domain/page authority is above average for the niche (around 45 for the domain in OSE). There are no ads of any kind on the site, and no special scripts or anything fancy that could cause problems.

I can't seem to figure it out. I know the site can be improved, but such a severe drop where even very weak domains are outranking us suggests a penalty of some sort? Can anyone help me out here?
How did NexTag.com Survive the Algorithm Update?
After going through numerous post-algo update articles, I find one price comparison site to have gone through unscathed: NexTag.com. Question: what contributed to their success? Was it sheer domain authority, content quality, a unique toolset... or something else?
Algorithm Updates | Dan-Petrovic