Our site dropped after the April 2018 Google update about content relevance: How do we recover?
-
Hi all,
After Google's confirmed core update in April 2018, we dropped globally and have not been able to recover since. We found that the update was about content relevance, as Google officially stated later. We don't understand how we could be considered irrelevant, in terms of content, for the same keywords we have been ranking for over the years, and we are hoping to find a solution. Are there any standard ways to measure content relevance? Please suggest!
Thank you
-
Hi,
Thanks a TON for all the analysis and insights. Just mind blowing info.
Unfortunately, we have switched between different versions of the site. The most recent version will remain stable for years, and any further changes will be handled very carefully, without another complete transformation.
Our open source CRM page dropped starting this April, but the link from Capterra was removed back in 2018. They removed our product from their list and no longer link directly to vendor websites (you can see the page now). We're not sure why this page suddenly lost traffic, even though there is not much ranking difference for its main high-search-volume keywords. We are going to investigate this and bring the page back to its normal traffic.
Yes, we are trying to rank for "crm" as our primary keyword. Do you think we are not doing well for "crm" because our "open source crm" page dropped?
Thanks
-
You did drop a bit, but not in a way that appears to have affected you very much (apparently, according to Ahrefs)
https://d.pr/i/HBzKpj.png (screenshot of estimated SEO keywords and traffic according to Ahrefs)
You did lose a lot of keywords, but many seem to have since recovered, and it didn't seem to actually impact your estimated SEO traffic much at all
SEMRush has a neat (relatively) new tool which looks at more accurate traffic estimates across the board (not just limited to SEO):
Again, it does show a bit of a dent around April 2018. If I were going to use SEMRush data to look at this, I'd use the traffic analytics tool, not the 'normal' SEO estimate charts from SEMRush (which IMO aren't very good, hence using the Ahrefs one in place of that)
This is what your site looked like in Feb 2018 before the keyword drops:
https://web.archive.org/web/20180224042824/https://www.vtiger.com/
This is what your site looked like later in June 2018:
https://web.archive.org/web/20180606021616/https://www.vtiger.com/
Completely different!
This is what your site looks like now: https://www.vtiger.com/
Again radically different. Maybe you just have a bad case of 'disruptive behavior' where Google is unwilling to rank you well, because the site keeps radically changing in terms of design and content. Sometimes doing too many changes too fast can really put Google off! 3 different designs inside of 1 year is pretty crazy
After each change, your home-page's Page Title was completely different:
Feb 2018 version: Customer Relationship Management | CRM Software - Vtiger
June 2018 version: Vtiger CRM | Customer Relationship Management Software
Current version: CRM | Customer Relationship Management System - Vtiger CRM
In my opinion, everything that was done around June 2018 was a huge mistake that you are suffering for now and gradually recovering from. The June 2018 design was horrible, far worse than either the Feb 2018 version or the current one. If a designer doesn't do a good job, don't just 'go ahead' with a terrible site design because you paid for it
In addition, the June 2018 page title didn't 'begin' with the term "CRM" (or a synonym of it). In Feb 2018, and on the current version, you open with "CRM" or a synonym of "CRM", which is better for SEO. The June 2018 version of the site was really bad and less well optimised too (that seems really obvious to me)
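If you ever want to sanity-check this across versions without eyeballing Wayback snapshots by hand, a rough Python sketch like the one below can do it. The function name and keyword list are mine, purely for illustration, and it assumes the requests and beautifulsoup4 packages are installed:

```python
# Rough sketch (not anyone's production code): does a page's <title>
# begin with a target keyword or one of its synonyms?
import requests
from bs4 import BeautifulSoup

def title_leads_with_keyword(url, keywords):
    """Return the page <title> and whether it begins with any of the
    given keywords (case-insensitive)."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    leads = any(title.lower().startswith(k.lower()) for k in keywords)
    return title, leads

# e.g. "CRM" plus the obvious synonym, as discussed above
title, ok = title_leads_with_keyword(
    "https://www.vtiger.com/",
    ["CRM", "Customer Relationship Management"])
print(title, "->", "keyword-leading" if ok else "keyword buried")
```

Point it at the Wayback URLs above to compare the Feb 2018, June 2018 and current titles side by side.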
Part of me actually feels that the Feb 2018 version of the site was best for SEO. It did a better job of making your USPs (value propositions) stand out to the user and search engines. It blended nice, app-styled UX with functionality that was more than just 'button links'
The current version isn't bad, and it certainly looks nicer visually, but the June 2018 version was a bit of a house of horrors. It makes sense that it was live within the window in which you got dented, because it's just a bit shocking to be honest. In the Feb 2018 version of your site, more of the individual product links were listed in the top-line nav. They are still there now, but 'hidden' in drop-downs, and that could be affecting things too
If I look at the technical SEO of the Feb 2018 site I can see it was relatively streamlined in terms of resource deployment:
... but by June 2018, there were way too many resources for one homepage to be pulling in. Not only did it look plainer and uglier than before (and less helpful, with worse SEO) it was probably also laggier to boot:
Ugh! 89 HTTP requests!? Get outta' here
Now things seem a lot better on that front:
So I think this is more evidence that the short-lived June 2018 site was pretty sucky and you guys bailed on it at light-speed (rightly so it was terrible!)
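If you want a quick, rough way to keep an eye on page weight going forward, something like this Python sketch counts the resources referenced directly in the homepage HTML. It's only an approximation of the waterfalls in my screenshots: a real browser will usually make more requests (fonts, XHR calls, JS-loaded assets), and it assumes requests and beautifulsoup4 are installed:

```python
# Rough approximation: count scripts, stylesheets and images referenced
# directly in the HTML. A real browser waterfall will usually be higher.
import requests
from bs4 import BeautifulSoup

def count_static_resources(url):
    """Count scripts, stylesheets and images referenced directly in the HTML."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    scripts = [s["src"] for s in soup.find_all("script", src=True)]
    styles = [l["href"] for l in soup.find_all("link", href=True)
              if "stylesheet" in (l.get("rel") or [])]
    images = [i["src"] for i in soup.find_all("img", src=True)]
    return len(scripts) + len(styles) + len(images)

print(count_static_resources("https://www.vtiger.com/"),
      "resources referenced in the homepage HTML")
```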
The question: did you see ranking drops for "CRM" related keywords in the period surrounding April 2018? Say for example, in April, May, June and July of 2018?
I'd say that you did, according to an (extremely rough) ranking movements export from Ahrefs:
Actual data export (formatted) here: https://d.pr/f/pwnrIF.xlsx
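If you want to slice that export yourself, a rough pandas sketch along these lines would do it. The filename and column headers ("Keyword", "URL", "Position change") are my own illustrative assumptions, not necessarily what Ahrefs actually exports, so adjust them to match your file:

```python
# Rough sketch of the kind of grouping behind the table and chart below.
# Requires pandas + openpyxl; column names are illustrative assumptions.
import pandas as pd

df = pd.read_excel("ahrefs_ranking_movements.xlsx")  # hypothetical filename

# Keep CRM-related keywords; assume a negative "Position change" means a drop
crm = df[df["Keyword"].str.contains("crm", case=False, na=False)]
losses = crm[crm["Position change"] < 0]

# Which URL accounts for the most CRM-related ranking losses?
summary = (losses.groupby("URL")["Position change"]
                 .agg(["count", "sum"])
                 .sort_values("count", ascending=False))
print(summary.head(10))
```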
So which CRM-related URL was responsible for the most CRM-related ranking losses that Ahrefs happened to pick up on?
https://d.pr/i/rCQ8LF.png (table image)
https://d.pr/i/7SJPbt.png (ugly bar chart)
Clearly the URL most responsible for all the drops was this one:
https://www.vtiger.com/open-source-crm/
... so how has this URL changed?
Infuriatingly, the Wayback Machine has barely any records of this URL, so the closest snapshot I can get to just before the end of April 2018 is actually from December 2017:
https://web.archive.org/web/20171226021957/https://www.vtiger.com/open-source-crm/
... it looks basically the same as it does now. No major changes. But wait! On the old version of your homepage, the footer links to the open source CRM page were bigger and more prominent than they are now. Another thing: those footer links used to be marked up with itemprop=url, and now they are not (could that be making a difference? All I can say is that the coding is different)
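If you'd rather not trust my eyeballing, a quick sketch like this (using requests and BeautifulSoup, comparing an archived snapshot against the live homepage) would surface the itemprop difference. The function name is mine and the archived URL is just the Feb 2018 snapshot linked earlier:

```python
# Rough sketch: list the links to /open-source-crm/ on a page and show
# whether each carries an itemprop attribute, for two page versions.
import requests
from bs4 import BeautifulSoup

def open_source_crm_links(url):
    """Return (href, itemprop) for every link pointing at /open-source-crm/."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [(a["href"], a.get("itemprop"))
            for a in soup.find_all("a", href=True)
            if "open-source-crm" in a["href"]]

# an archived homepage vs. the current one
for page in ("https://web.archive.org/web/20180224042824/https://www.vtiger.com/",
             "https://www.vtiger.com/"):
    print(page)
    for href, itemprop in open_source_crm_links(page):
        print("  ", href, "| itemprop:", itemprop)
```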
Another question would be, between April and July 2018 - did you lose any CRM related links that were worth a lot?
Actually, apparently you did lose a few. Check some of these out:
https://d.pr/i/Zg5XER.png (MEGA screenshot, but first page of results only)
https://d.pr/f/NetqVM.png (full export, lost links which may be about 'CRM', April through July 2018 - raw and unformatted export, open the CSV file in Excel!)
Losing a CRM related link from Capterra, online peer review software experts? Yeah that could hit you hard. Most of the Mashable ones are still there, they are just redirected - but the Capterra one:
https://blog.capterra.com/free-and-open-source-crm/
... that could sting. You used to have a link with anchor text like this:
"for a price starting at about $700" - but now it's gone!
You might be thinking: aha Effect, you silly sausage! Clearly it was a comment link that got pushed down or removed by admins/mods, not a 'real' link Google would have counted. But no, I say, and I have proof to back up that denial:
https://web.archive.org/web/20170930101939/http://blog.capterra.com/free-and-open-source-crm/
That is an archived copy of the same post: if you Ctrl+F for "for a price starting at about $700", you will FIND the in-content link, which actually did matter and which Capterra have since removed from their content
I am sure that in the link data you will find other such examples of lost quality links. Some will be duds and false-positives (like the Mashable ones) but some will be legit removals
By the way, although the Mashable links to you are still live, Mashable have 302 redirected the old blog post URLs instead of 301 redirecting them. This means those posts, if they were valued and had accrued a lot of backlinks, have been cut off from their own backlinks (as 302s pass no SEO juice). As such, the links contained inside them are largely nullified (d'oh! Thanks, Mashable)
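Checking whether a redirect is a 301 or a 302 is easy to script, by the way. Here's a rough sketch; the Mashable URL in it is a placeholder I made up, not one I actually tested, so swap in the real lost-link URLs from the export:

```python
# Fetch a URL without following redirects and report the status + target,
# so you can tell 301s from 302s in bulk.
import requests

def redirect_type(url):
    """Return the HTTP status code and Location header of the first response."""
    resp = requests.get(url, allow_redirects=False, timeout=10)
    return resp.status_code, resp.headers.get("Location")

# Hypothetical old Mashable post URL - replace with the real lost-link URLs
for old_url in ["https://mashable.com/2014/01/01/some-old-crm-post/"]:
    status, target = redirect_type(old_url)
    print(old_url, "->", status, target)
```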
What this illustrates is that your site changed too much, the way links are formed changed, the design went through a really bad patch, and you've also lost some high-quality backlinks. An SEO legacy doesn't last forever; links get removed over time
In the end, this convergence of issues is almost assuredly dragging your site through a tough spot. That's what I'd imagine from a very, very top-line look into this issue
-
Had a quick look at semrush..
-
Thanks for looking into this. According to the historical data we have, we dropped after the April 2018 update, not around Jan/Feb 2018.
Could you please let me know where you got the data? Then I will look into it and try to correlate it with what we have.
Thank you.
-
Hi
Why do you believe a penalty hit in April 2018? A penalty of some sort in Jan/Feb 2018, in the UK and the US etc., seems clear.
Not clear on why April?
Regards
-
We dropped for "crm". The site is vtiger.com. Could you please give us some clues on this? We would be really grateful, and it would be very helpful.
-
Difficult to say without seeing the site, the content and the keywords. Because different query-spaces and search entities are thematically different, the ways to 'become relevant' to each of them can be highly variable in nature. If I could just see an example, it would be much easier to assess why Google has changed its mind in terms of your site's perceived relevance
What you should know about Google is that they truly believe, all of their updates make Google's search results generally more accurate (and better for users) on average, so a roll-back is extremely unlikely. If you have been pinned by a certain algorithm change, it's likely to keep hurting you until you adhere to Google's 'new standards' (which you might argue are lower in your particular niche, but regardless they're not listening)
Sometimes fairy-tales come true and 'Google glitches' get 'undone', resulting in some sites regaining their lost rankings. That happens in maybe 0.001% of situations. Usually what happens is people get red in the face and angry with Google, argue the toss, and see their sites disintegrate as a result. Mathematical algorithms don't care if you're mad or not, and they don't care what you expect
With an example, I could give an un-biased 3rd party opinion on why Google is 'doing this' to your site, but it won't result in a quick fix. It will likely result in some weeks of hard graft and further investment
All of the 'standard' ways to measure content relevancy are things like seeing how many times your keyword(s) are mentioned in your content. But the highest relevancy you can demonstrate has nothing to do with keyword deployment; it's matching your site's unique 'value proposition' with Google's perception of the values which the searchers (within your query-space) hold
Maybe there's been a shift and searchers suddenly value price over service, so Google shakes up its results to suit. I'm not saying keyword deployment isn't part of the issue; what I'm saying is that the most 'relevant' site is the one which the largest proportion of connected searches wish to find. It's more than just linguistic semantics and keyword-play (hope that makes sense)
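For completeness, here's what that 'standard' keyword-count style of check looks like in practice, as a minimal sketch. I'm showing it only to illustrate how crude it is compared to the value-proposition point above; it assumes a single-word keyword and that requests and beautifulsoup4 are installed:

```python
# Deliberately simplistic keyword-count check - definitely not how Google
# measures relevance, just the 'standard' metric described above.
import re
import requests
from bs4 import BeautifulSoup

def keyword_density(url, keyword):
    """Count exact occurrences of a single-word keyword in the visible text."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    words = re.findall(r"[a-z0-9']+", text)
    hits = words.count(keyword.lower())
    return hits, 100.0 * hits / max(len(words), 1)

hits, density = keyword_density("https://www.vtiger.com/", "crm")
print(f"'crm' appears {hits} times ({density:.2f}% of on-page words)")
```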