Resubmitting disavow file after penalty removal
-
Hi,
We had a manual penalty for unnatural links revoked about a year ago. The disavow file we submitted was pretty extensive; we took the machete approach, as recommended by Matt Cutts.
Recently we looked over the file again and are firmly convinced that some of the domains are entirely legitimate and the links are not manipulative. We would like to resubmit the disavow file with these domains excluded so that Google counts the links again.
Does anyone have experience of this and, if so, what were the results?
Thanks
-
I've been through the disavow process on my site.
While missing out on a few possible links sucks, are they high enough quality to warrant the possible negative consequence of lost revenue if the penalty comes back?
Personally speaking, unless it's a link from the Wall Street Journal or CNN, I wouldn't poke the sleeping bear; it's just not worth it.
-
Thanks, and you make a very good point about the exact-match anchor text.
-
We were having exactly the same thought today: there may be valid links in the disavow file that were never actually a problem.
While doing some research we found that you can "undisavow" a link by removing it from the file and re-submitting. Bear in mind, though, that if your penalty was based on exact-match anchor text in those links, removing them from the disavow file without the links themselves being reconfigured may bring the problem back.
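If it helps, below is a minimal sketch of that remove-and-resubmit step in Python. It assumes a standard plain-text disavow file (full URLs or domain: lines, with # comments); the domain names and file paths are placeholders rather than recommendations, and the trimmed file still has to be uploaded manually through the disavow tool in Search Console.

```python
from urllib.parse import urlparse

# Domains you now consider legitimate and want to "undisavow" (placeholders only).
KEEP_AGAIN = {"legit-example.com", "another-legit-example.org"}

def entry_domain(entry: str) -> str:
    """Return the domain an entry refers to, for both 'domain:' lines and full URLs."""
    if entry.startswith("domain:"):
        return entry[len("domain:"):].strip().lower()
    return urlparse(entry).netloc.lower()

with open("disavow.txt", encoding="utf-8") as f:
    lines = f.read().splitlines()

cleaned = []
for line in lines:
    entry = line.strip()
    if not entry or entry.startswith("#"):
        cleaned.append(line)  # keep comments and blank lines untouched
    elif entry_domain(entry) in KEEP_AGAIN:
        continue  # drop the entry so the domain is no longer disavowed
    else:
        cleaned.append(line)

# Write the trimmed file out for re-upload.
with open("disavow-resubmit.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(cleaned) + "\n")
```

Worth remembering that each upload replaces the previous file entirely, so anything left out of the new file is effectively undisavowed.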
Related Questions
-
Recent 2017 Disavow Experience - How long is it taking?
Hello all, a client's site recently got hit with links from an XXX neighborhood. My client's site is on the periphery of adult entertainment (think Maxim Magazine), but not in the porn space. These links could be natural, or pushed by a competitor; we definitely did not solicit them. Regardless, dozens of links were established and then found by Google starting in February, and a few very important keyword rankings disappeared about two months later (after Google found more and more XXX links). The linked-to page is the only one that was really hit, and it's not a manual action; it seems completely algorithmic. We have disavowed everything we can put our finger on, but I'm trying to provide guidance as to how long it has taken others to see some type of recovery...?
White Hat / Black Hat SEO | | seoaustin0 -
Advice needed! How to clear a website of a WordPress Spam Link Injection Google penalty?
Hi guys, I am currently working on a website that has been penalised by Google for a spam link injection. The website was hacked and 17,000 hidden links were injected. All the links have been removed and the site has subsequently been redesigned and re-built. That was the easy part 🙂 The problem comes when I look in Webmaster Tools. Google is showing thousands of internal spam links to the homepage and other pages within the site. These pages do not actually exist, as they were cleared along with all the other spam links. I do believe, though, that this is causing problems with the website's rankings. Certain pages are not ranking on Google and the homepage keyword rankings are fluctuating massively. I have reviewed the website's external links and these are all fine. Does anyone have any experience of this, and can you provide any recommendations / advice for clearing the site of the Google penalty? Thanks, Duncan
White Hat / Black Hat SEO | | CayenneRed890 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll preface this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: “Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: “Our [Topic Area] training is easy to find in the [City, State] area.” followed by other content specific to the location, or “Find your [Topic Area] training course in [City, State] with ease.” followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area w/o getting dinged for spam or duplicate content? Oftentimes I ask myself, "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | | CSawatzky1 -
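For what it's worth, the "several standardized paragraphs, randomized" idea the engineer describes in the question above can be sketched in a few lines of Python; the template strings, field names, and venue codes below are made up for illustration and are not taken from the actual site.

```python
import random

# Hypothetical paragraph templates with venue-specific components (illustration only).
PARAGRAPH_TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def custom_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    """Pick a standardized template and fill in the venue-specific values.

    Seeding the choice with the venue code keeps each page's paragraph stable
    between crawls instead of reshuffling on every request.
    """
    template = random.Random(venue_code).choice(PARAGRAPH_TEMPLATES)
    return template.format(topic=topic, city=city, state=state)

print(custom_paragraph("NYC01", "SharePoint", "New York", "NY"))
```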
What do you say in your emails to horrible sites to remove your links?
Morning guys, I've the unenviable task of having to rectify poor link building (a previous company's work, not mine), which inevitably means emailing tons and tons of horrible directories with links to the client from as far back as 5/6 years ago. I'm sure many of you are in the same boat, so it begs the question: what have you said to these types of sites that is effective in getting them to remove the links? This could even be a two/three-parter: if you've had little joy in requesting removals, have you disavowed the links, and what (if any) effect did it have? Thanks, M.
White Hat / Black Hat SEO | | Martin_S0 -
Footer Link in International Parent Company Websites Causing Penalty?
Still waiting to look at the analytics for the timeframe, but we do know that the top keyword dropped on or about April 23, 2012 from the #1 ranking in Google, something they had held for years, and traffic dropped over 15% that month, with further slips since. I just looked at Google Webmaster Tools and see over 2.3MM backlinks from "sister" companies via their footers. One has over 700,000, the rest about 50,000 on average, all going to the home page, and all using the same anchor text, which is both a branded keyword and a generic keyword, the same one they ranked #1 for. They are all nofollows, but we are trying to confirm whether the nofollow was added before or after they got hit; regardless, Google has found them. To add, most of the links are from their international sites, so .de, .pl, .es, .nl and other European country extensions. Based on this, I would assume the footer links, and the timing, were a result of the Penguin update and spam. The one issue is that the other US "sister" companies listed in the same footer did not see a drop; in fact some had increased traffic. And one of them has the same issue with the brand name, where it is both a brand name and a generic keyword. The only note I will make about any of the other domains is that they do not drive the traffic this one used to. There is at least a 100,000+ visitor difference between the main site and these additional sister sites also listed in the footer. I think I'm on the right track with the footer links, even though the other sites that have the same footer links do not seem to be suffering as much, but I wanted to see if anyone else had a different opinion or theory. Thanks! Jen Davis
White Hat / Black Hat SEO | | LeverSEO0 -
How can I recover from an 'unnatural' link penalty?
Hi, I believe our site may have been penalised due to over-optimised anchor text links. Our site is http://rollerbannerscheap.co.uk. It seems we have been penalised for the keyword 'Roller Banner', as the over-optimised anchor text contains the keyword 'Roller Banner' or 'Roller Banners'. We have dropped completely off page 1 for 'Roller Banner'. How would I recover from this?
White Hat / Black Hat SEO | | SO_UK0 -
Where can I see examples of disavow files to adapt mine before sending it to Google?
Can I send a disavow file to Google as a CSV file? And where can I see examples of disavow files so I can adapt mine before sending it to Google?
White Hat / Black Hat SEO | | maestrosonrisas0 -
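For anyone looking for a concrete example: the disavow file Google accepts is a plain UTF-8 .txt file (not CSV) with one entry per line, either a full URL or a domain: line, and # for comments. Below is a minimal sketch that writes out a well-formed example file; the domains in it are placeholders only.

```python
# Write a minimal, well-formed disavow file as plain UTF-8 text.
# The domains below are placeholders, not real sites to disavow.
sample = (
    "# Pages we asked to have removed, without success\n"
    "http://spam-example.com/bad-directory/links.html\n"
    "# Disavow an entire domain\n"
    "domain:spam-example.net\n"
)

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write(sample)
```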
How to transform an Excel file into a txt file to send to Google Disavow
I have a disavow file built in Excel with lots of columns of information. I want to turn it into a txt file by saving it out of Excel, but the resulting file doesn't come out usable. Can someone help me transform an Excel file into the Google Disavow file format for the final import?
White Hat / Black Hat SEO | | maestrosonrisas0
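Here is a minimal sketch of that Excel-to-txt step, assuming the workbook has a column of bare domains named "domain" (adjust the file and column names to match your spreadsheet); it uses pandas, which needs openpyxl installed to read .xlsx files. Saving straight to CSV from Excel won't produce the format Google expects, so the script writes plain text with one domain: entry per line.

```python
import pandas as pd  # reading .xlsx files also requires openpyxl

# Hypothetical workbook and column names; adjust to match your spreadsheet.
df = pd.read_excel("links-to-disavow.xlsx")

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Generated from links-to-disavow.xlsx\n")
    for domain in df["domain"].dropna().unique():
        out.write(f"domain:{str(domain).strip()}\n")
```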