Do you have to wait after disavowing before submitting a reconsideration request?
-
Hi all
We have a link penalty at the moment, it seems. I went through 40k links in various phases and have disavowed over a thousand domains that date back to old SEO work. I was barely able to get any links actually removed, as the majority are on directories etc. that no one looks after any more, and/or which are spammy and scraped anyway.
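(For context, the disavow file itself is just a plain text file uploaded through the Disavow Links tool - roughly along these lines, with placeholder domains rather than our real list:)

# contacted these directory owners in March, no response
domain:spammy-directory-example.com
domain:scraped-article-farm-example.net
# individual URLs can be listed too, though whole domains are safer for junk sites
http://random-scraper-example.org/old-links-page.html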
According to Link Research Tools' Link Detox tool, we now have a very low risk profile (I loaded the disavowed links into the tool for it to take into consideration when assessing our profile). I then submitted a reconsideration request on the same day as uploading the new disavow file (on the 26th of April). However, today (7th May) we got a message in Webmaster Central saying our link profile is still unnatural. Aaargh.
My question: is the disavow file taken into consideration when the reconsideration request is reviewed (i.e. is that information immediately available to the reviewer)? Or do we have to wait for the disavow file to flow through in the crawl stats? If so, how long do we have to wait?
I've checked a link that I disavowed last time and it's still showing up in the links I pull down from Webmaster Central; indeed, links that I disavowed at the start of April are still showing up in the downloadable list.
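In case it's useful to anyone doing the same cross-check, this is roughly what I've been doing to see which linking domains in the Webmaster Central download aren't covered by the disavow file yet - just a rough Python sketch, with made-up file names, and your CSV layout may differ:

import csv
from urllib.parse import urlparse

LINKS_CSV = "latest_links_download.csv"   # made-up name for the "links to your site" export
DISAVOW_FILE = "disavow.txt"              # made-up name for the file uploaded to Google

def root_domain(url):
    # Crude host extraction - good enough for a quick comparison.
    host = urlparse(url if "://" in url else "http://" + url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Domains already disavowed (ignores comments and individual-URL lines).
disavowed = set()
with open(DISAVOW_FILE) as f:
    for line in f:
        line = line.strip()
        if line.startswith("domain:"):
            disavowed.add(line[len("domain:"):].lower())

# Domains Google is still reporting as linking to us.
reported = set()
with open(LINKS_CSV, newline="") as f:
    for row in csv.reader(f):
        if row and row[0].startswith("http"):
            reported.add(root_domain(row[0]))

not_covered = sorted(reported - disavowed)
print(f"{len(not_covered)} linking domains are not yet in the disavow file")
for domain in not_covered:
    print(domain)

The caveat being that, as far as I can tell, Google never drops disavowed links from the download itself, so a comparison like this only tells you what you haven't covered yet, not what Google has actually discounted.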
Any help gratefully received. I'm pulling my hair out here, trying to undo the dodgy work of a few random people many months ago!
Cheers,
Will
-
You seem to have a good handle on the issue, but you might consider getting an experienced SEO in for at least a second opinion. We can only give very general help here on the Q&A, as we don't have access to your data.
They do say to wait at least a few weeks for results.
Cheers
S
-
Hi Stephen
I've been using the links downloaded from Webmaster Tools (as directed by Matt Cutts in one of his videos, IIRC) plus the data set from Link Research Tools. Is that insufficient? I've only got so many hours in the day, as my day job is running this company... I figured taking the links that Google gave me would surely be enough... but these days, who knows. G seems to want to make people jump through a lot of hoops...
-
Hey Marcus
Thanks for your input. Yeah, we have a lot of links, but then we've been around for 7 years, and weirdo scrapers and random replicants of DMOZ alone contribute a zillion links without us even having done anything. Not saying we didn't do link building back in the day (we did, just like everyone else, in what was at the time a white hat fashion but apparently no longer is), but we have had no permanent marketing team at all for the last two years, as we've focused on some B2B parts of our business. So frustrating that bad links just kept growing and we're supposed to be responsible for them!
Anyway, as you say, I'll need to go in a bit harder, I guess. E.g. I didn't previously remove a site just because it was PR0 - some random person with a no-marks blog who used our birthday balloon picture didn't deserve to be disavowed, as far as I was concerned. But, well, I can't take any chances now, so I'll just have to bin anything under PR1 and take another look at links from themed websites (e.g. should I disavow other blogs that have added us to their blogroll unsolicited, even if they're in our vertical? It's hard to tell. What about genuine flower directories? Who knows?).
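If I do end up making a blanket cut like that, I'll script it rather than eyeball a couple of thousand domains one by one - something like this rough Python sketch, assuming an export with one row per linking domain and whatever quality score your tool of choice gives you (file and column names here are invented):

import csv

EXPORT_CSV = "linking_domains.csv"   # invented file name
DOMAIN_COL = "domain"                # invented column names
SCORE_COL = "quality_score"
THRESHOLD = 2                        # bin anything scoring below this

keep, cut = set(), set()
with open(EXPORT_CSV, newline="") as f:
    for row in csv.DictReader(f):
        try:
            score = float(row[SCORE_COL])
        except (KeyError, TypeError, ValueError):
            continue  # skip rows without a usable score
        (cut if score < THRESHOLD else keep).add(row[DOMAIN_COL].strip().lower())

# Write the cut in Google's "domain:" syntax, ready to append to the disavow file.
with open("disavow_additions.txt", "w") as out:
    out.write(f"# blanket cut: {len(cut)} of {len(cut) + len(keep)} reviewed domains\n")
    for domain in sorted(cut):
        out.write(f"domain:{domain}\n")

print(f"Flagged {len(cut)} domains to disavow; kept {len(keep)}.")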
What's really frustrating is that the whole message from Matt Cutts is "you really shouldn't use this tool" (re: the disavow tool) as you could damage your site, but 1. barely anyone takes links down when requested, as far as I can tell, and 2. given the amount of junk that's been pointed at our site that we're not responsible for (though we are responsible for some), I think the contention that very few people would need to use it is a bit optimistic, and there's therefore a danger of people like me totally shooting themselves in the foot, given there are no clear rules on the grey areas I mention above.
PS understood that it's not some magic solution and we'll rank #1 for everything afterwards. I just want to get it cleared up and be able to get back to my day job. God knows how a smaller business than us would cope with something like this. Seems to me it pushes the advantage even further in the direction of bigger companies with the resources to manage a screw up like this.
Anyway, blah blah. Time to get the machete out.
-
In my experience, if you get this message again, it means you still have links they don't like. Disavowing 35% of linking domains is not a great deal, and as Stephen said, whilst Link Detox gives you a good starting place, you really do have to audit these links in a brutal fashion.
You have 15,000 external links from 2,000 sites - that's a hell of a lot of links for a semi-popular blog, let alone a site that does not really publish any content that would attract links.
If you are holding onto links because you think they are 'ok' or because they 'don't look too bad', then you may need to get a whole lot more aggressive with what you remove.
Also, even once the manual penalty is removed, don't expect things to be amazing afterwards.
An alternative to finding the bad links and getting them removed is to identify the good ones and consider getting them repointed to a new URL, starting again with a rebrand / new URL. It can be easier to get a response from the good sites than it is from the bad ones.
Failing that, get a whole lot more aggressive with what you remove.
Hope that helps!
Marcus
-
How sure are you that you have a full dataset of links? What did you use as your database of links to start cleaning from? (I would expect Ahrefs, GWT, SEOmoz + Majestic, etc.)
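If you haven't already, it's also worth merging all of those exports and de-duplicating by linking domain before auditing, so you're working from one master list rather than whichever tool you opened last. A quick sketch of the idea (assuming one backlink URL per line in each export; the file names are invented):

from urllib.parse import urlparse

# Invented file names - one export per data source, one backlink URL per line.
EXPORTS = ["gwt_links.txt", "ahrefs_links.txt", "majestic_links.txt", "opensiteexplorer_links.txt"]

def root_domain(url):
    host = urlparse(url if "://" in url else "http://" + url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

linking_domains = set()
for path in EXPORTS:
    with open(path) as f:
        linking_domains.update(root_domain(line.strip()) for line in f if line.strip())

print(f"{len(linking_domains)} unique linking domains across all sources")
for domain in sorted(linking_domains):
    print(domain)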
S
-
Well, I also went through all the links manually, which was the world's most boring task, then followed up with a health check. Gah.
We've disavowed about 35% of all linking domains now...
-
I doubt it's a time thing; it's more likely that they still see dirty links that you have not disavowed.
That's the problem with these jump-on-the-bandwagon tools like Link Detox et al. - they give you a nice score, but that doesn't mean anything.
404ing burnt pages and starting again may be a much quicker process than messing around with link disavowal.
How many domains were linking, and how many of those did you disavow?