Manual Penalty Reconsideration Request Help
-
Hi All,
I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty.
So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups:
- Domains I'm keeping (good-quality, natural links).
- Domains I'm changing to nofollow (relevant, good-quality links that are good for the user but may be affiliated with my company, so I'm switching them to nofollow rather than removing them).
- Domains I'm getting rid of (poor-quality sites with optimised anchor text, directories, article sites, etc.).
One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I had planned to go through every domain that has previously linked (even if it's no longer linking to my site) and straight up disavow the domain if it's poor quality. But I first want to check whether this is completely necessary for a successful reconsideration request.
My concern is that it's extremely time-consuming: I'm checking each domain to avoid disavowing a good-quality domain that might link back to me in future, and the historical list is the largest list of them all. There is also some risk involved, as some good domains might get caught in the disavowing crossfire. So I only really want to carry this out if it's completely necessary for the success of the reconsideration request. Obviously I understand that reconsideration requests are meant to be time-consuming, as I'm repenting for previous SEO sin (and believe me, I've already spent weeks getting to the stage I'm at right now). But as an in-house Digital Marketer with many other digital avenues to look after for my company, I can't justify spending such a long time on something if it's not 100% necessary.
So overall: with a manual penalty reconsideration request, would you bother sifting through domains that either don't exist anymore or no longer link to your site and disavow them to be thorough? Is this necessary to get the penalty revoked, or is Google only interested in links that are currently or recently live?
All responses, thoughts, and ideas are appreciated.
Kind Regards
Sam
-
-
Thanks again for your response Gary.
With regards to how many referring domains and backlinks there are, it depends on how much I trust various bits of software (e.g. Majestic SEO) when they tell me whether a link is live or not.
In total there are about 3,200 referring domains historically, with over 350,000 backlinks (lots of spam). Looking at what's live today, that's about 600 domains and 30,000 backlinks or so.
So far I've audited all the live links into keeping, changing to nofollow, or removing. I've successfully reached out about all the nofollow changes, and I've justified in depth the list of domains I'm keeping. I'm now in the process of reaching out to the poor-quality links (first wave) and have covered about 200 referring domains.
The main question here is exactly what to do with the rest of the links that Majestic and GWT are telling me are no longer live (after checking some examples, some are actually live despite Majestic saying they aren't). Initially I was just going through them and throwing the poor-quality ones (even if they no longer link) straight into the disavow file to be safe. But since then, I've worked with my developer to create a script to check which of the 2,500 non-live domains are still live (cutting down my time considerably).
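For anyone curious, the core of that liveness check is nothing fancy; roughly something like this (a simplified sketch, not our actual script — the function and class names are mine, and the real thing adds the fetching, timeouts, and retries on top):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collects the host of every <a href> found on a page."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    host = urlparse(value).netloc.lower()
                    if host:  # relative links have no host, so skip them
                        self.hosts.add(host)

def page_links_to(html, my_domain):
    """Return True if the fetched page still contains a link to my_domain."""
    finder = LinkFinder()
    finder.feed(html)
    my_domain = my_domain.lower()
    # Match both example.com and subdomains like www.example.com
    return any(h == my_domain or h.endswith("." + my_domain)
               for h in finder.hosts)
```

You'd fetch each "dead" referring page, run it through a check like this, and only treat the domain as genuinely non-live if nothing turns up.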
So overall, I am confident in my approach to the links that are live (as this is the standard approach), and I am being as thorough as possible. But when I wrote this question initially, I was unsure whether I had to deal with the 'non-live' domains (mainly because I didn't know whether to fully trust Majestic when it says they're not live), and so I wanted to check whether it was something I needed to do, because it would be extremely time-consuming.
Hopefully you understand where I'm coming from with this?
Sam
-
Thanks for your response Richard.
This is, however, an extremely generic response to quite a specific question. I didn't ask what a reconsideration request does!
-
So sorry for the delay getting back to you; it's been a crazy week and I didn't notice the response.
"Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes."
OK, just to let you know: once they lift the manual penalty, you still need to wait for a Penguin refresh. My penalty was lifted in May 2013, but the vast majority of the crap links had not been crawled, and it took a very long time for Google to do so. For the disavow file to take effect, Google needs to crawl each of those pages with your disavow file in mind and treat the links as nofollow. Once a healthy amount is crawled, you will then be in good standing when the Penguin algorithm is run. If Penguin runs before you have an acceptable level of healthiness, you will not be released from Penguin and will have to wait for the next run. So it took us until Oct 17th 2014 to finally get released, and this was WITH John Mueller's help!
My advice is: don't be too picky with what you keep. Go through everything. Mine was 20,000 referring domains with 250k links! We had a 10-year history of business online and at one point were also attacked with negative SEO, so it was a big job.
"Providing I've given all possible evidence I can about the links being live or not to Google, do you think that disavowing all poor quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously for all links that are still live (as far as i can see) I have outreached to at least 3 times and disavowed if I can't get in touch."
Yes, create a report to show the work you have done: what's removed, who you have contacted, who did not respond. I did an Excel spreadsheet, one domain per line, with a few fields like last contacted, date, removed, etc.
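If it helps, that kind of tracking sheet is easy to generate programmatically; here's a rough sketch (the field names here are just examples, not the exact columns I used):

```python
import csv
import io

# Example field layout for an outreach-tracking sheet: one domain per row.
FIELDS = ["domain", "status", "last_contacted", "times_contacted", "removed"]

def write_tracking_sheet(rows):
    """Serialise outreach records to CSV text.

    In practice you would write to a real file and open it in Excel;
    a StringIO buffer keeps this sketch self-contained.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

One row per referring domain, updated every time you contact a webmaster, gives you the evidence trail Google wants to see in the reconsideration request.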
There are lots of programmes out there that help with this now. It was not so easy when you were the first and there were no tools for it!
Also, it's best to disavow at the domain level instead of individual links. How many links do you have pointing to your site?
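To illustrate: the disavow file format accepts both individual URLs (one per line) and whole domains via the `domain:` prefix, with `#` lines as comments. A small sketch of building a domain-level file (the helper and the domains are made up for the example):

```python
def build_disavow(domains, comment=None):
    """Build the body of a disavow file, one 'domain:' line per entry.

    Deduplicates and sorts so re-running the export gives a stable file
    that's easy to diff between clean-up waves.
    """
    lines = []
    if comment:
        lines.append("# " + comment)
    for d in sorted(set(domains)):
        lines.append("domain:" + d)
    return "\n".join(lines) + "\n"
```

Feeding it your "remove" bucket gives you a file ready to upload, and the domain-level lines catch any stray links on page 3, 4, 5 of a directory that a URL-level entry would miss.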
-
A good reconsideration request does three things:
- Explains the exact quality issue on your site.
- Describes the steps you’ve taken to fix the issue.
- Documents the outcome of your efforts.
-
Actually, I agree with you. What you're describing are sites that look like the link has been deleted, but where the link actually still exists. My answer was regarding sites where the link actually has been deleted and doesn't exist.
-
Thanks for your response Gary.
That does make sense and, to be honest, is something that worries me! I am putting faith in software here (i.e. I haven't gone through every single domain manually and checked that the link is still live), and it's the software telling me whether the link is live or not. If Google's software tells them otherwise when they review my reconsideration request, then all my other efforts are most likely wasted. I take it from this that you would advise addressing the non-active domains too?
Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes.
Provided I've given Google all possible evidence I can about the links being live or not, do you think that disavowing all poor-quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously, for all links that are still live (as far as I can see), I have reached out at least three times and disavowed if I couldn't get in touch.
cheers
Sam
-
Sorry, I have to disagree.
There are many sites, specifically directory sites, that list websites, and as more sites get listed they push your link to page 3, 4, 5. It looks like the link does not exist, but it does, on another page.
Some sites that are crappy also have poor connections/bandwidth, so they go up and down and get overloaded all the time. Just because a link is down now does not mean it's down later when Google crawls it.
When I did my now-famous link clean-up, these were both issues that came up when I got help from John Mueller at Google.
It sucks because it's just a hell of a lot of work, but given how long it takes for a Penguin update to come about, I would make sure you get it right FIRST TIME or you could wait more than a year to see returns.
Feel free to ask me anything.
Best of luck
Gary
-
Yes, I would be very surprised if Google wanted you to do anything with links that no longer exist.
-
Thanks for your response, Adam.
Would you say the same for domains that are still live but no longer contain links to your site?
Thanks
-
No, I would not spend time on links/domains that no longer exist. (I've never heard of that being necessary.)