The wrath of Google's Hummingbird: a big problem, but no quick solution?
-
One of our websites has been wrongfully tagged for penalty and has literally disappeared from Google. After lots of research, it seems the cause was a ton of spammy backlinks and irrelevant anchor text. I have disavowed the links, but the rankings are still not rebounding. Any idea how long the wrath of the Google gods will last?
-
Just checking in here. How have things been going? Just wanted to add some info:
- Did you actually receive a "manual action" from Google? If not, a reconsideration request is not needed; it's an algorithmic issue, not a manual penalty.
- Hummingbird will not do this. The "date" of Hummingbird is not really a specific point in time, and it's definitely not the day Google announced it. The Hummingbird algorithm didn't act like Panda, Penguin, or any other penalty.
- Disavowing links can take up to six months to have an effect. This info is straight from John Mueller at Google. Google has to actually crawl the websites listed in the disavow file before applying the disavow.
- It's not good enough to only disavow. Google wants to see some effort to remove links as well.
- You may not bounce back unfortunately (tough to say without seeing the site). But if you were artificially ranking too high with bad links, now you might be ranking where you should have all along.
Overall, the best thing to do is remove and disavow as much as possible, then work on moving forward as best you can: acquire new links and make the site better overall.
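For anyone who hasn't built one before, the disavow file uploaded through Google's Disavow Links tool is a plain-text file with one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal sketch (the domains below are placeholders, not real spam sources):

```text
# Spammy directory links; removal requests sent, no response
domain:spammy-directory.example.com
domain:paid-links.example.net

# Individual pages we couldn't get taken down
http://forum.example.org/profile/spam-account-123
```

Keeping dated comments about your removal attempts also doubles as the "proof of effort" record you'd want for a reconsideration request, should one ever be needed.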
-
Yiannis,
Good point. Actually, it seems that the rankings have disappeared since the Hummingbird update. So I am going through all the SEO checkpoints, and spammy backlinks seemed the likely reason for the site suffering in rankings. There is no notice that the site has been penalized, so it appears to be automatic, due to the algorithm update. It has been 2 months since clearing up the issues, but still no luck in the rankings rebounding.
-
Reconsideration requests apply to manual spam penalties, not algorithmic updates; this has been covered extensively by Matt Cutts in numerous videos.
We don't know whether the site has been manually penalised or not. If it has, follow the advice from the two gentlemen above.
If it hasn't, and you have removed all the spammy links, it will take a while for your rankings to come back. Some people say you have to wait for a big update, but I would give it a month and in the meantime try to gain traffic from other channels, i.e. social media, email marketing, etc.
-
Everything that Alex said, PLUS file a reconsideration request. Wait a couple of weeks for Google's response. Your request MUST explain all the steps you took to clean up your site and bring it into compliance with Google's TOS (with proof and all)!
-
Since you lost those backlinks, you probably won't be able to bounce back to exactly where the site left off before those penalties. Did you get a manual action or was it done by the crawlers? Also, did you get a partial or full site penalty?
First, double-check your Google Webmaster Tools and see what the status of the penalty is, and make sure it's removed. There's a Manual Actions area they added in the last few months which lets you see the status and details of actions taken against your site by Google. If everything is resolved there, start improving your content and get back to link building. Since a good number of your old links are gone, they'll need to be replaced with better-quality links.