Reconsideration Request a Success!
-
Hi all,
Well, I've finally been able to get the penalty removed, judging by this email:
"Dear site owner or webmaster of xxx,
We received a request from a site owner to reconsider xxx for compliance with Google's Webmaster Guidelines.
Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site.
Of course, there may be other issues with your site that could affect its ranking without a manual action by the webspam team. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If your site continues to have trouble in our search results, please see this article for help with diagnosing the issue.
Thank you for helping us to maintain the quality of our search results.
Sincerely,
Google Search Quality Team"
This was after a reconsideration request was sent prior to the disavow tool being released. In addition, I also applied a disavow to all the links I was unsuccessful in removing, without contacting Google, and let the original reconsideration request run its course.
I am making this post just to let everyone know that the hard work pays off and Google is just trying to make sure you are doing your best to remove the links. As 'Ryan Kent' always emphasizes, you must really be diligent and honest when trying to remove links. You also need to keep documentation: I recorded contact pages and email addresses, along with 1st, 2nd, 3rd, and even 4th attempt dates.
Now with the disavow tool, I believe that if you make a good-faith effort to remove the links and document it well, you can use the disavow tool after multiple attempts. Correlating the disavowed links with the spreadsheet sent to Google is, and should be, very important in a reconsideration request.
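For anyone who hasn't used it yet, the disavow file itself is just a plain UTF-8 text file with one URL or domain per line; lines starting with # are treated as comments, which is a handy place to note your removal attempts so the file lines up with the spreadsheet you send to Google. A minimal sketch (the domains below are made up for illustration):

```
# Directory owner emailed 09/03, 09/17 and 10/01 - no response
domain:spammy-directory-example.com
# Single page we could not get taken down
http://www.low-quality-blog-example.com/post-with-our-link.html
```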
Good luck!
Also, I received the message from WMT and am wondering: does anyone know how long 'some time' is before the site is reindexed? So far our organic traffic is still about the same as before, so I would like to hear what others' experiences are after a successful reconsideration.
Feel free to ask any questions!
-
I sent a request on Oct 14th and got a response on Oct 18th. I'd been removing links since around August.
Responses, whether denials or successes, seemed to take 1-2 weeks.
-
Interesting, but how long did you have to wait to be reconsidered?
-
Nice work! Hope you get back to the top quickly.
-
Yep. One of the things that surprised me was that previous reconsideration requests allowed me to communicate directly through email.
It was surprising to actually talk to someone from Google via email about a free service. The emails were pretty personal, and one time they even gave a link example.
-
That's great to hear - and since it was a manual action that was taken, it's nice to know these are actually handled by real people, as Matt Cutts states here.
Andy
-
Great news, William! Good job. I imagine the wait time is going to be at least whatever your normal crawl rate is.
Thanks for sharing your story.
-
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (think scrolling, for example). To stop bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling; over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. We had several questions about this:
1. Do these requests count towards crawl budgets?
2. If they do, and we'd want to prevent this from happening: what would be the preferred option, preventing the request in the front-end code or blocking the request with a robots.txt line? The question comes up because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The latter is slightly less convenient from a development perspective, as all the logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However, these requests do not change anything in the pages' behaviour; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | | rogier_slag
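If robots.txt ends up being the preferred route, a minimal sketch might look like the one below; the /api/telemetry path is purely hypothetical and would need to match whatever endpoint the POST requests actually hit:

```
# Keep compliant crawlers away from the telemetry endpoint so the POSTs are never fired
User-agent: *
Disallow: /api/telemetry
```

A URL disallowed this way is simply never fetched by compliant bots, while the pages themselves remain fully crawlable.
-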
Error: The URL you entered does not appear to be returning a page successfully
I keep getting the above error when I'm trying to look at the page optimization for a new blog page we have uploaded. It was posted over a week ago, so I assumed it would be indexed by now. Any thoughts on why it isn't working? The page is: http://www.esg.co.uk/blog/blog/2015/why-is-air-quality-testing-so-important/#.VmlBmLiLRpg Thanks
Technical SEO | | Charley_Tangerine
-
Manual Action - When requesting links be removed, how important to Google is the address you're sending the requests from?
We're starting a campaign to get rid of a bunch of links, and then submitting a disavow report to Google, to get rid of a manual action. My SEO vendor said he needs an email address on the domain of the website in question, @travelexinsurance.com, to send and receive emails from webmasters. He said Google won't consider the correspondence to and from webmasters if it is sent from a domain that is not the one with the manual action penalty. Due to company/compliance rules, I can't allow a vendor not in our building to have an email address like that. I've seen other people mention they just used a GMAIL.com account. Or we could use a similar domain such as @travelexinsurancefyi.com. My question: how critical is it that the correspondence with the webmasters come from the exact website domain?
Technical SEO | | Patrick_G
-
404 not found page appears as 200 success in Google Fetch. What to do to correct?
We have received messages in Google Webmaster Tools that there is an increase in soft 404 errors. When we check the URLs, they send to the 404 not found page. For example, http://www.geographics.com/images/01904_S.jpg redirects to http://www.geographics.com/404.shtml. When we used Fetch as Google, here is what we got:
#1 Server Response: http://www.geographics.com/404.shtml
HTTP/1.1 200 OK Date: Thu, 26 Sep 2013 14:26:59 GMT
What is wrong and what shall we do? The soft 404 errors are mainly for images that no longer exist on the server. Thanks!
Technical SEO | | Madlena
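The 200 response is what typically triggers the soft-404 report: the missing image URL redirects to /404.shtml, and that error page is served as an ordinary 200 page. One possible fix, sketched here on the assumption that the site runs on Apache, is to let the server answer the missing URL directly with a 404 status while still showing the custom error page. Using a local path matters: a full http:// URL in ErrorDocument makes Apache issue a redirect, which is exactly what produces the 200.

```
# .htaccess - show /404.shtml as the error page but keep the real 404 status code
ErrorDocument 404 /404.shtml
```
-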
What do you think of this reconsideration request?
Just about to send a reconsideration request to Google for my site: seoco.co.uk and would like your input. I was going to include information about each URL I found and the steps I have taken but there is not room. What do you think of this: “Hi guys, i got an unnatural links message from you back in February and since then my website rankings have fallen dramatically. I spoke to someone at SEOmoz and they said that my website probably got penalised for directory links so I have gone out and tried to get rid of all the low quality ones that I am responsible for and some that I am not. Altogether I was able to identify about 218 low quality directory links. I attempted to contact every one of the directory owners twice over a two week period and I was able to get about 68 removed. I have used the disavow tool to devalue the rest. Trying to get rid of all of those bad links was hard work and I have definitely learned my lesson. Rest assured I will not be submitting to anymore directories in the future. Please can you give me another chance? If my site still violates the guidelines please could you point out some of the bad links that are still there?” What do you think? Can you think of anything else I should say? Dave
Technical SEO | | Eavesy
-
When to re-submit for reconsideration?
Hi! We received a manual penalty notice. We had an SEO company a couple of years ago build some links for us on blogs. Currently we have only about 95 of these links which are pretty easily identifiable by the anchor text used and the blogs or directories they originate from. So far, we have seen about 35 of those removed and have made 2 contacts to each one via removeem.com. So, how many contacts do you think need to be made before submitting a reconsideration request? Is 2 enough? Also, should we use the disavow tool on these remaining 65 links? Every one of the remaining links is from either a filipino blog page or a random article directory. Finally, do you think we are still getting juice from these links? i.e. if we do remove or disavow these anchor text links are we actually going to see a negative impact? Thanks for your help and answers!! Craig
Technical SEO | | TheCraig
-
Removal request for entire catalog. Can be done without blocking in robots?
A bunch of thin content (catalog) pages were modified with "follow, noindex" a few weeks ago. The site has been completely re-crawled, and the related cache shows that these pages were not indexed again. So that's good, I suppose 🙂 But all of them are still in the main Google index and show up from time to time in SERPs. Will they eventually disappear, or do we need to submit a removal request? The problem is we really don't want to add these pages to robots.txt (they are passing link juice down to the product pages below). Thanks!
Technical SEO | | LocalLocal
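For reference, the "follow, noindex" directive mentioned above corresponds to the meta tag below. As long as Googlebot can still crawl the pages it will keep seeing the tag and drop them from the index over time, which is also why a robots.txt block would work against you: a blocked page cannot be recrawled, so the noindex is never seen.

```
<!-- keeps the page out of the index while still letting link equity flow to the product pages -->
<meta name="robots" content="noindex, follow">
```
-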
Pages not Indexed after a successful Google Fetch
I am trying to understand why Google isn't indexing key content on my site. www.BeyondTransition.com is indexed and new pages show up in a couple of hours. My key content is 6 pages of information for each of 3000 events (driven by MySQL on a WordPress platform). These pages are reached via a search page, with no direct navigation from the home page. When I link to an event page from an indexed page, it doesn't show up in search results. When I use Fetch in Webmaster Tools, the fetch is successful but the page is then not indexed - or if it does appear in results, it's directed to the internal search page. E.g. http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for BeyondTransition Ironman Cozumel I get these results.... So what have I done wrong and how do I go about fixing it? All thoughts and advice appreciated. Thanks Denis
Technical SEO | | beyondtransition
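One common aid for pages that are reachable only through an internal search form is an XML sitemap listing them, so Googlebot can discover the URLs without needing navigation links. A minimal sketch, reusing the race URL from the question purely as an illustration:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per event page, e.g. generated from the MySQL event data -->
  <url>
    <loc>http://www.beyondtransition.com/site/races/course/race110003/</loc>
  </url>
</urlset>
```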