Reconsideration Request a Success!
-
Hi all,
Well, I've finally been able to get the penalty removed, judging by this email:
"Dear site owner or webmaster of xxx,
We received a request from a site owner to reconsider xxx for compliance with Google's Webmaster Guidelines.
Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site.
Of course, there may be other issues with your site that could affect its ranking without a manual action by the webspam team. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If your site continues to have trouble in our search results, please see this article for help with diagnosing the issue.
Thank you for helping us to maintain the quality of our search results.
Sincerely,
Google Search Quality Team"
This was after a reconsideration request was sent prior to the disavow tool being released. In addition, I applied a disavow of all the links I was unsuccessful in removing, without contacting Google, and let the original reconsideration request run its course.
I am making this post just to let everyone know that the hard work pays off, and that Google is just trying to make sure you are doing your best to remove the links. As Ryan Kent always emphasizes, you must be diligent and honest when trying to remove links. You also need to keep documentation: I recorded contact pages and email addresses, along with the dates of my 1st, 2nd, 3rd, and even 4th removal attempts.
Now that the disavow tool is available, I believe that if you make a good-faith effort to remove the links, and document it well, you can use the disavow tool after multiple attempts. Correlating the disavowed links with the spreadsheet sent to Google is, and should be, very important in a reconsideration request.
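For anyone putting one together, the disavow file itself is just a plain-text list: one URL or domain: entry per line, with # lines treated as comments. A minimal sketch (the domains are placeholders, and the comments mirror the kind of attempt log I kept in my spreadsheet):

```
# Contacted webmaster Aug 1, Aug 15, Sep 3, Sep 20 - no response
domain:spammy-directory.example
# Single page; removal request bounced twice
http://low-quality-blog.example/paid-links-page.html
```

Matching each entry here to a row in the spreadsheet you send Google makes the correlation easy for the reviewer to verify.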
Good luck!
Also, since I received the message in WMT: does anyone know how long 'some time' is before the site is reindexed? So far our organic traffic is still about the same as it was before. I would like to hear what others' experiences have been after a successful reconsideration.
Feel free to ask any questions!
-
I sent a request on Oct 14th and got a response on Oct 18th. I had been removing links since around August.
Responses, whether denials or successes, seemed to take 1-2 weeks.
-
Interesting, but how long did you have to wait to be reconsidered?
-
Nice work! Hope you get back to the top quickly.
-
Yep. One of the things that surprised me was that previous reconsideration requests allowed me to communicate directly through email.
It was surprising to actually talk to someone from Google via email about a free service. The emails were pretty personal, and one even gave a link example.
-
That's great to hear - and since it was a manual action that was taken, it's nice to know these are actually handled by real people, as Matt Cutts states here.
Andy
-
Great news William! Good job. I imagine the wait time is going to be at least whatever your normal crawl rate is.
Thanks for sharing your story.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think scrolling, for example). To stop bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling. Over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. We had several questions about this:
1. Do these requests count towards crawl budget?
2. If they do, and we want to prevent this from happening, what would be the preferred option: preventing the request in the front-end code, or blocking the request with a robots.txt line? We ask because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. It is also less convenient from a development perspective, as that logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
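For what it's worth, Googlebot does check robots.txt for the URLs its rendered JavaScript tries to fetch, so a one-line Disallow is usually the simpler route. A minimal sketch, assuming a hypothetical /api/telemetry endpoint (adjust to the real path):

```
# robots.txt - block only the telemetry endpoint, not the pages themselves
User-agent: Googlebot
Disallow: /api/telemetry
```

Since this blocks a background data call rather than changing what the page shows, it shouldn't raise the cloaking concerns that user-agent branching in front-end code might.
-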
Error: The URL you entered does not appear to be returning a page successfully
I keep getting the above error when I'm trying to look at the page optimization for a new blog page we have uploaded. It was posted over a week ago, so I assumed it would be indexed by now. Any thoughts on why it isn't working? The page is: http://www.esg.co.uk/blog/blog/2015/why-is-air-quality-testing-so-important/#.VmlBmLiLRpg Thanks
Technical SEO | Charley_Tangerine
-
Manual Action - When requesting links be removed, how important to Google is the address you're sending the requests from?
We're starting a campaign to get rid of a bunch of links and then submitting a disavow report to Google, to get rid of a manual action. My SEO vendor said he needs an email address on the domain of the site in question (@travelexinsurance.com) to send and receive emails from webmasters. He said Google won't consider the correspondence to and from webmasters if it's sent from a domain that is not the one with the manual action penalty. Due to company/compliance rules, I can't allow a vendor not in our building to have an email address like that. I've seen other people mention they just used a GMAIL.com account. Or we could use a similar domain such as @travelexinsurancefyi.com. My question: how critical is it that the correspondence with the webmasters come from the exact domain of the penalized website?
Technical SEO | Patrick_G
-
Noindex Success?
Has anyone had success implementing noindex/follow on pages of a site that has been hit by a Panda penalty? Our site has a lot of duplicate content in its product descriptions, which we had permission to use from our distributor (who is also online). We went ahead and applied noindex/follow to those pages in the hope that Google will focus on the products we carry that do have original descriptions (about 1/3 of our products). We didn't want to just remove those products, since they are actually beneficial to our customers. Most of the duplicated content is in the form of ingredients lists.
Technical SEO | dustyabe
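For reference, the tag pattern described presumably looks like this in the head of each duplicate product page; noindex drops the page from the index, while follow lets link equity keep flowing through it:

```
<!-- keep the page crawlable and its links followed, but out of the index -->
<meta name="robots" content="noindex, follow">
```
-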
Has anyone seen direct improvement after April 23 by requesting reinclusion?
Using Open Site Explorer, I have figured out that my former SEO agency was buying link spam (mostly on Asian sites) for my main keywords, and did the same in a private network of blogs. I don't speak any Eastern languages, and the SEO Super Dude has left the planet. So... I don't really have much to report to the Google Webmaster folks. How much time, effort, and cash do I invest in removal requests vs. redoing the whole darn site and hoping for the best? All the best, Tom
Technical SEO | tvw130
-
Removal request for entire catalog. Can be done without blocking in robots?
A bunch of thin-content (catalog) pages were modified with "follow, noindex" a few weeks ago. The site has been completely re-crawled, and the related cache shows that these pages were not indexed again. So it's good, I suppose 🙂 But all of them are still in the main Google index and show up from time to time in SERPs. Will they eventually disappear, or do we need to submit a removal request? The problem is we really don't want to add these pages to robots.txt (they are passing link juice down to the product pages below). Thanks!
Technical SEO | LocalLocal
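For contrast, the robots.txt route the poster wants to avoid would look like the sketch below (the /catalog/ path is a placeholder). Unlike the noindex meta tag already in place, a Disallow stops Googlebot from crawling the pages at all, so no link juice would pass through them:

```
# robots.txt - blocks crawling entirely, which would cut off the link juice
User-agent: *
Disallow: /catalog/
```
-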
Ambiguous Response to Google Reconsideration Request
Hello, on 9/11/12 we submitted a reconsideration request to Google for http://macpokeronline.com; at the time we had penalties from both Penguin and a manual action. We have since worked on cleaning up our link profile, and got this response from Google: "We received a request from a site owner to reconsider how we index the following site: http://www.macpokeronline.com/. We've now reviewed your site. When we review a site, we check to see if it's in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site. If your site still doesn't appear in our search results, check our Help Center for steps you can take." I honestly don't know how to take this; we always showed up #1 when doing a site search, so that part is kind of irrelevant in our case. Is this them accepting our request? Thanks, Zach
Technical SEO | Zachary_Russell
-
Is the full URL necessary for successful Canonical Links?
Hi, my first question and hopefully an easy enough one to answer. Currently, in the head element of our pages, we have canonical references such as: (Yes, untidy URL... we are working on it!) I am just trying to find out whether this snippet of the full URL is adequate for canonicalization, or if the full domain is needed as well. My reason for asking is that the SEOmoz On-Page Optimization grading tool is 'failing' all our pages on the "Appropriate Use of Rel Canonical" value. I have been unable to find a definitive answer on this, although admittedly most examples do use the full URL. (I am not the site developer, so I cannot simply change this myself, but rather have to advise him in a weekly meeting.) So in short, presumably using the full URL is best practice, but is it essential to its effectiveness when being read by the search engines? Or could there be another reason why the "Appropriate Use of Rel Canonical" value is not being green-ticked? Thank you very much, I appreciate any advice you can give.
Technical SEO | rmkjersey
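For reference, the generally recommended form is a full absolute URL, along these lines (example.com is a placeholder):

```
<!-- absolute URL is the safest form for rel="canonical" -->
<link rel="canonical" href="http://www.example.com/category/product-name/" />
```

Relative paths can technically work, but absolute URLs are the widely advised practice, and a tool checking "Appropriate Use of Rel Canonical" may well expect them.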