Search ranking for a term dropped from 1st/2nd to 106th in 3 months
-
Hello all,
Just a couple notes first. I have been advised to be vague on the search term we've dropped on (in case this page ranks higher than our homepage for it). If you search for my name in Google though you should be able to figure out where I work (I'm not the soccer player).
While I am looking for an answer, I've also posted this question on a couple other forums (see https://www.webmasterworld.com/google/4934323.htm and https://productforums.google.com/forum/?utm_medium=email&utm_source=footer#!msg/webmasters/AQLD7lywuvo/2zfFRD6oGAAJ) which have thrown up more questions than answers. So I have posted this as a discussion.
We've also been told we may have been under a negative SEO attack. We saw in SEMRush a large number of backlinks in October/November/December - at about the same time we disavowed around 1m backlinks (more on this below) but we can't see this reflected in Moz. We just got off a call with someone at Moz to try and work this out and he suggested we post here - so here goes...
On 4th October for the search term 'example-term' we dropped from number 2 to number 9 on Google searches (this was confirmed in Google Search Console).
We also paid an external SEO consultant to review our site and see why we are dropping on the term 'example-term'.
We've implemented everything and we're still dropping; the consultant thinks we may have been penalised in error (as we are a legitimate business and not trying to do anything untoward).
In Search Console you can see from the graphs that we used to rank 1st and 2nd for the term (you can go back 2 or 3 years and still see this).
The thing we do find confusing is that we still rank very highly (if not 1st) for 'example-term + uk' and our brand name - which is very similar to 'example-term'.
Timeline of events of changes:
-
2nd October 2018, midday: Added a CTA using something called Wisepops over the homepage - a full-screen CTA for people to pledge on a project on our site helping with the tsunami in Indonesia (which may have had render-blocking elements).
-
4th October: we added a Google MyBusiness page showing our corporate headquarters as being in the UK (we did flag this on the Google MyBusiness forums and both people who responded said adding a MyBusiness page would not affect our drop in rankings).
-
4th October: dropped from number 2 to number 9 on Google searches (this was confirmed in Google Search Console)
-
4th October: Removed the Wisepops popup
-
5th November: Added a server redirect so anything coming in with a trailing / was redirected to the page without the /
-
12th November: Removed around 200 junk pages (old pages, test CMS pages etc. that were live and still indexed). Any resulting 404s were resolved with redirects
-
19th November: Updated sitemaps and video sitemaps to reflect new content and remove old content. Reviewed the whole site for duplicate meta tags and titles and updated accordingly with unique ones. Fixed the 404 and mobile usability issues flagged in Google Search Console. Removed embedded YouTube video from homepage.
-
11th December: Removed old content and content seen as not useful from indexing: 'honey pot' pages, old blog, map pages, user profile pages, and project sub-pages with little SEO value (comments, contact project owner, backers, report project); added 'nofollow' to widgets linking back to us
-
3rd January 2019: Changed the meta title to remove 'example-term' (we were concerned it may have been seen as keyword stuffing)
-
7th January: Disavow file updated to disavow a set of external API-powered sites linking to us (sites like example-term.externalsite.co.uk which used to link to us showing projects in local areas - our SEO expert felt these may be seen as a 'link farm')
-
11th January: Updated our ‘About us’ page with more relevant content
-
15th January: Changed homepage title to include 'example-term' again; updated footer links to point to internal pages rather than linking off to Intercom; changed the ordering of link elements on the homepage (moved an external rating-site link further down the page, removed underlines on one item that was not a link, fixed an instance where two h1 tags were used); removed another set of external subdomains (i.e. https://externalsite.sitename.co.uk) from our system (old sites we used to run for different clients, displaying projects in geographical areas)
-
18th January: Added the word 'example-term' to key content pages
We're at a loss as to why we are still dropping. Please note that the above changes were implemented after we'd been ranking fine for a couple years on the 'example-term' - the changes were to try and address the drop in ranking. Any advice would be greatly appreciated.
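One thing worth double-checking after a cleanup like the timeline above is that the redirect work (the trailing-slash normalisation and the redirects for removed junk pages) didn't leave any multi-hop chains behind. A minimal sketch of that check, assuming a simple old-path to new-path map - the paths below are placeholders, not the site's real URLs:

```python
# Redirect map: old path -> new path. All paths are illustrative placeholders.
REDIRECTS = {
    "/old-test-page": "/",
    "/cms-draft/": "/cms-draft",   # trailing-slash normalisation
    "/very-old-post": "/blog-old",
    "/blog-old": "/blog",          # creates a 2-hop chain from /very-old-post
}

def final_target(path, redirects, max_hops=10):
    """Follow the redirect map until the path no longer redirects; raise on loops."""
    hops = 0
    while path in redirects:
        path = redirects[path]
        hops += 1
        if hops > max_hops:
            raise ValueError(f"redirect loop involving {path}")
    return path, hops

for old in REDIRECTS:
    target, hops = final_target(old, REDIRECTS)
    flag = "chain - consider redirecting directly" if hops > 1 else "ok"
    print(f"{old} -> {target} ({hops} hop(s), {flag})")
```

Chains like /very-old-post -> /blog-old -> /blog are generally worth collapsing into a single hop so each old URL 301s straight to its final destination.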
-
-
Thanks for the info! It's good to get a bigger picture of the nefarious 'globe' network, which seems to link to every site on the entire internet with absolutely zero value-add for end users. It's interesting to see that you got hit by some variants of that pure-spam domain which didn't seem to hit us. Clearly the problem is far more widespread than we had at first anticipated.
We also disavowed a whole load of non-globe-related domains; those weren't in our export.
What I'm talking about in terms of the 'targeted' methodology is not the deployment of the disavow, but the decision-making process before the disavow file was compiled. We really made sure we got a very granular view of each and every link, with rows of metrics against each one, before deciding whether to keep or disavow it.
In almost all situations, once we reached deployment we used domain-level disavow directives. There were only 1-2 exceptions, where the client had good editorial pieces on a site yet also spammy banner / sidebar links from paid advertising. In such situations we used a mixture of disavow directives to try (as hard as we could) to let the good links through the net. That being said, very few people will be in that same situation. In the majority of cases, if you don't want one link from a domain - you don't want any!
-
This is really useful, thank you. We've reviewed our spammy backlinks and noticed we also have a load of links from the Globe network.
Looks like a few of the URLs we're seeing were not in your disavow list, so I've listed them below so you can update your disavow file if needed.
earth.firm.in
theglobe.shop
advertisewebpages.org
searchingweb.org
accent-rugs.search-web.us
search-web.us
theworld.gen.in
globe.clothing
the-seek.net
theglobe.ru
www.search-internet.net
theglobe.capital
theglobe.co.za
theglobe.insure
theglobe.rocks
the-internet.co
www.internet-advertising.us
www.internetads.us
advertise.country
advertise-web-pages.org
internet-seek.org
the-web.in
theworld.capital
advertise.loans
acne.search-web.us
ad-net.net
advertise.contractors
advertisewebpage.net
arizona-mortgages.search-web.us
globe.video
jitensha.seek-web.net
online-seek.com
seekinternet.net
submit-urls.org
theglobe.exchange
theglobesearch.com
the-globe.today
the-globe.tv
theworld.diamonds
theworlds.marketing
the-world.tv
web-advertisement.com
websearch.world
www.advertising.recipes
www.earth.shopping
www.web-page.org
www.websearch.cz
www.web-seek.net
advertise.cologne
theglobe.bid
web-seek.org
the-internet.in
theseek.org
advertise.cruises
kitsukekyoshitsu.seek-web.net
theglobe.education
advertising.shoes
advertise.condos
advertise-webpages.com
advertise-website.org
seek-internet.com
seek-web.org
theglobe.org.in
theglobe.yt
the-world.site
globe.ru.com
auto-insurance.search-web.us
theglobe.loans
globe.com.de
www.theglobe.ru
theworld.estate
advertise-web-page.net
globe.com.ar
globe.pe
theglobe.ee
worlds.games
searching-web.com
advertise.computer
theglobe.cn.com
add-urls.net
globe.br.com
theglobe.ae
theglobe.sk
web-advertising.net
netfind.eu
theglobe.international
theglobe.gr
theglobe.fi
advertise.jewelry
searchinginternet.net
search-pages.org
submit-page.com
submit-pages.com
submitwebpages.com
theglobe.bz
theglobe.cl
theglobe.email
theglobe.gallery
theglobe.my
the-globe.site

When you said you disavowed the links in a targeted way - was that done link by link, or grouped by domain, one by one in the disavow file?
Thanks again
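For anyone merging a list like the one above into an existing disavow file, a minimal sketch of deduplicating and emitting Google's documented `domain:` directive format (one directive per line, `#` for comments). The sample domains are a handful from the list; the contents of the existing file are an assumption for illustration:

```python
# A few domains from the list above (not the full set).
new_domains = [
    "theglobe.ru",
    "www.theglobe.ru",      # duplicate once the www. prefix is stripped
    "search-web.us",
    "acne.search-web.us",   # subdomain of the entry above, kept as its own row
]

def normalise(domain):
    """Lower-case and strip a leading www. so obvious duplicates collapse."""
    domain = domain.strip().lower()
    return domain[4:] if domain.startswith("www.") else domain

# Domains assumed to already be in the current disavow file.
existing = {"theglobe.ru"}

merged = sorted({normalise(d) for d in new_domains} - existing)

lines = ["# Added after reviewing the 'globe' network list"]
lines += [f"domain:{d}" for d in merged]
print("\n".join(lines))
```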
-
So firstly, remember that Google's rankings are a competitive environment. It might be that others are rising because the query-space has been identified as lucrative by a number of competitors, rather than that you are 'dropping'.
Another factor to consider is algorithmic devaluation. If you haven't had a message from Google within Search Console saying that you have a penalty of some kind, then no one at Google has manually adjusted your rankings to be lower than they were before.
When sites which previously gave you SEO authority are deemed 'manipulative' by Google, the pipe from their site to your site (which was previously sending across ranking power) is switched off, so you drop. No one has edited your rankings to be lower; it's just that previously 'suspect' links have been switched off by Google. From Google's POV those links should never have contributed to your rankings, so it's not an attack on you - it's Google rebalancing the table to 'how things should always have been'.
I recently wrote an in-depth post on this phenomenon, you can find it here as my primary answer to the asked question. I recommend you have a read of that one!
I can confirm that at our agency, from late summer last year to the end of the year (the autumn to winter period), we did notice an increase in negative SEO attacks. 2-3 of our clients' sites were hit, and on one client's website the attack actually worked and drained some of their ranking positions a little. We recovered from it pretty fast via accurate disavow work. The main offending network was this crappy one, which as you can see is just a series of spam domains linked together with billions of pages listed, in Google's least-favourite manipulative 'link-list' format.
For reference we purged a load of globe-related domains:
- https://d.pr/f/PLkscH.txt (list of globe-related domains we disavowed)
I'm giving you the above as our timelines somewhat converge on very similar issues. Actually, if you'd be open to it, I'd like to compare lists of disavowed spam domains to see if it was part of the same attack.
This list isn't exhaustive; we actually did a much more thorough job than just that. We fetched tens or hundreds of thousands of backlinks from all the relevant tools (SEMrush, Ahrefs, Moz Link Explorer, Majestic SEO, Google Search Console) and aggregated all the data. We then used Google Analytics (site-visit / session metrics) and URL Profiler (fetching metrics like Citation Flow, Trust Flow, Page Authority, Domain Authority, Ahrefs Rating - all from different data sources) and boiled each link down to a single 'SEO Authority' metric.
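To illustrate that boiling-down step, here's a hypothetical sketch. The metric names match the tools mentioned above, but the weights, threshold, and data rows are made-up assumptions for illustration, not our actual formula:

```python
# Example per-link rows; all values are invented for illustration.
LINKS = [
    {"url": "https://goodnews.example/article", "trust_flow": 40,
     "domain_authority": 55, "ahrefs_rating": 60, "sessions": 120},
    {"url": "http://theglobe.example/spam", "trust_flow": 2,
     "domain_authority": 4, "ahrefs_rating": 1, "sessions": 0},
]

# Assumed weights for collapsing the metrics into one score.
WEIGHTS = {"trust_flow": 0.3, "domain_authority": 0.3,
           "ahrefs_rating": 0.3, "sessions": 0.1}

def authority(link):
    """Weighted sum of metrics, each capped at 100 so no one metric dominates."""
    return sum(min(link[m], 100) * w for m, w in WEIGHTS.items())

# Low scorers surface first as disavow candidates (threshold is an assumption).
for link in sorted(LINKS, key=authority):
    verdict = "disavow?" if authority(link) < 10 else "keep"
    print(f"{authority(link):6.1f}  {verdict:8}  {link['url']}")
```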
Once we had that we began deciding which links were 'fake' or 'negative SEO' links, and we disavowed them in a very, very targeted way.
The problem is that when you get penalties or algorithmic devaluations, Google won't explicitly tell you which links are the problem. If you get too aggressive and do the disavow work in a non-data-led, non-targeted way, you can end up disavowing links which were giving you some SEO ranking power. That makes you dip down further.
Even with our solid tools and methodology, we _still_ usually experience slight dips from disavow work. But after it's done, the limiters on performance are removed and you can begin to see the site trend up again. Especially if you replace some of the bad links with good ones (or compensate for having less authority by introducing better content), you very quickly start to see the site recovering.
IMO it sounds like you have had:
- Spammy inbound links and / or negative SEO
- Which led to algorithmic devaluations, not a penalty
- Which was then compounded by low-quality disavow work
- Which then hit you harder than was necessary
- Which then nullified your content efforts
I'm not a gambling man, but if I had to roll some dice - that's what I'd say
This is the kind of lengths we were going to, in order to get an accurate disavow which killed negative links whilst preserving decent ones:
- https://d.pr/i/o4GM8p.png (screenshot of Excel)
This particular sheet has over 5,000 rows of data, but before we began our cull it had many more (into the tens or hundreds of thousands of rows of data, from memory)
A lot of the colouration is conditional formatting, designed to make things stand out. There were also rules like: if this link is already nofollow, it can't be a risk, so don't disavow it (basic logic).
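That nofollow rule is simple to express in code. A minimal sketch, with made-up rows and an assumed toxicity threshold:

```python
# Invented candidate rows; 'authority' is the composite score described earlier.
candidates = [
    {"domain": "theglobe.example",  "nofollow": False, "authority": 1.2},
    {"domain": "forum.example",     "nofollow": True,  "authority": 0.5},
    {"domain": "goodnews.example",  "nofollow": False, "authority": 48.0},
]

THRESHOLD = 5.0  # assumed cut-off below which a followed link looks toxic

# Nofollow links pass no ranking power, so they are excluded up front;
# only low-authority followed links remain as disavow candidates.
to_disavow = [c["domain"] for c in candidates
              if not c["nofollow"] and c["authority"] < THRESHOLD]
print(to_disavow)
```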
If this doesn't look like the lengths to which your agency or freelance partner went to (with very sensitive disavow work) then the work wasn't done right
Sorry that I haven't provided a clear-cut, out-of-the-box answer to your query. Hopefully the knowledge and resources I have shared here will be of some use to you on your **quest for restored rankings**.