When to file a Reconsideration Request
-
Hi all,
I don't have any manual penalties from Google, but I do have an unnatural links message from them back in 2012. We have removed some of the spammy links over the last two years, and we're now making a further effort; we'll use the disavow tool once we've done this.
Will this be enough once I submit the file, or should I (or can I) submit a Reconsideration Request as well? Do I have to have a manual penalty item in my Webmaster Tools account to be able to submit a request?
Thanks everyone!
-
Hi Oleg,
When I click that link, Google asks me to check for manual penalties, and when I do, it shows no penalties.
I guess I will disavow links anyway, so I will go ahead with that. Thanks for your help!
-
My guess is that the penalty has expired, then. If it were still active, there would be a notification under the "Manual Actions" tab. They expire after a year or so, I believe.
If that's the case, I wouldn't worry about it.
EDIT: try accessing the reconsideration request form for your site directly via https://www.google.com/webmasters/tools/reconsideration
-
Thanks for your reply. I can't see any items in the Manual Actions tab under Search Traffic, though, so how would I submit the RR?
-
Unnatural link messages = a manual penalty, so yes, you should submit an RR. I would go through all of your links, disavow all of the bad ones that you weren't able to remove, then submit the RR. If you do it right, you shouldn't have any problems getting the penalty lifted (it's been a while, and you seem to have done a lot to remedy the issue).
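For reference, the disavow file Google expects is a plain UTF-8 text file, one entry per line, with # for comments. A minimal sketch; the domains below are hypothetical examples, not from this thread:

```
# Directories that ignored two removal requests (hypothetical examples)
domain:spammy-directory.example
domain:cheap-links.example
# A single URL can also be disavowed on its own
http://link-farm.example/widgets/page123.html
```

Upload it through the disavow tool for the same site version (www or non-www) that received the unnatural links message, then file the RR.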
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. To do this we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think of scrolling, for example). To keep bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling; over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. However, we had several questions about this:
1. Do these requests count towards crawl budget?
2. If they do, and we'd want to prevent this from happening: what would be the preferred option, preventing the request in the frontend code, or blocking the request using a robots.txt line? The question arises because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The frontend option is also slightly less convenient from a development perspective, as the tracking logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
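If crawl budget is the worry, a robots.txt rule is the usual answer here: a disallowed URL is simply not fetched by compliant crawlers, and since robots.txt only affects bots, it sidesteps the cloaking question entirely. A minimal sketch, assuming the telemetry endpoint lives under a hypothetical path like /api/track:

```
# Keep compliant crawlers off the telemetry endpoint.
# /api/track is a hypothetical path; substitute the real one.
User-agent: *
Disallow: /api/track
```

-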
.htaccess file help
Hi, thanks for looking. I am trying (and failing) to write an .htaccess rule for the following scenario: http://www.gardening-services-edinburgh.com/index.html, http://www.gardening-services-edinburgh.com, and http://www.gardening-services-edinburgh.com/ so that all of these destinations go to the one resource. Any ideas? Thanks, Andy
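A sketch of one way to do this, assuming Apache with mod_rewrite enabled (test before deploying):

```apache
RewriteEngine On

# 301-redirect the bare domain to the www version.
RewriteCond %{HTTP_HOST} ^gardening-services-edinburgh\.com$ [NC]
RewriteRule ^(.*)$ http://www.gardening-services-edinburgh.com/$1 [R=301,L]

# 301-redirect /index.html to the root URL.
RewriteRule ^index\.html$ http://www.gardening-services-edinburgh.com/ [R=301,L]
```

The second and third URLs in the question resolve to the same request once the trailing slash is normalized, so these two rules cover all three variants.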
Technical SEO | McSEO
-
What do you think of this reconsideration request?
Just about to send a reconsideration request to Google for my site, seoco.co.uk, and I would like your input. I was going to include information about each URL I found and the steps I have taken, but there is not room. What do you think of this: “Hi guys, I got an unnatural links message from you back in February, and since then my website rankings have fallen dramatically. I spoke to someone at SEOmoz and they said that my website probably got penalised for directory links, so I have gone out and tried to get rid of all the low-quality ones that I am responsible for, and some that I am not. Altogether I was able to identify about 218 low-quality directory links. I attempted to contact every one of the directory owners twice over a two-week period, and I was able to get about 68 removed. I have used the disavow tool to devalue the rest. Trying to get rid of all of those bad links was hard work, and I have definitely learned my lesson. Rest assured I will not be submitting to any more directories in the future. Please can you give me another chance? If my site still violates the guidelines, please could you point out some of the bad links that are still there?” What do you think? Can you think of anything else I should say? Dave
Technical SEO | Eavesy
-
Request to SEOmoz staff
Hi SEOmoz staff, I have been a Pro member here in the community for a few months. You have all the Google country versions in the rank tracking tool and the keyword analysis tool except one: you don't have google.com.cy, and I was wondering whether it would be a problem to add it. If not, that's OK; I track my results on google.gr and google.com. But I would love it if you could add it. It's also a plus for me when I promote the software to my local clients. Thank you
Technical SEO | nyanainc
-
A week ago I asked how to remove duplicate files and duplicate titles
Three weeks ago we had a very large number of site errors revealed by crawl diagnostics. These errors related purely to the presence of both the http:// and http://www. versions of our pages. We used the rel canonical tag in the head of our index page to direct everything to the www preference, but saw no improvement. Matters got worse two weeks ago, and I checked with Google Webmaster Tools and found that Google had somehow lost our preference choice. A week ago I asked how to overcome this problem and received good advice about how to re-enter our www preference with Google. This we did, and it was accepted. We also submitted a new sitemap.xml, which was likewise acceptable to Google. Today, a week later, we find even more duplicate content (over 10,000 duplicate errors) showing up in the latest diagnostic crawl. Does anyone have any ideas? (Getting a bit desperate.)
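For what it's worth, a rel canonical tag on the index page alone won't consolidate the rest of the site; a site-wide 301 redirect from the non-www to the www hostname is the more robust fix. A generic sketch for an Apache .htaccess (assuming mod_rewrite; adapt to your server):

```apache
RewriteEngine On

# Send any request whose host lacks the www prefix to the www hostname.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```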
Technical SEO | FFTCOUK
-
Problem with indexed files before domain was purchased
Hello everybody, We bought this domain a few months back, and we're trying to figure out how to get rid of indexed pages that (I assume) existed before we bought it; the domain was registered in 2001 and has had a few owners. I attached 3 files from my Webmaster Tools. Can anyone tell me how to get rid of those "pages"? And, more important, aren't these kinds of "pages" the result of some kind of "sabotage"? Looking forward to hearing your thoughts on this. Thank you, Alex Picture-5.png Picture-6.png Picture-7.png
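One general-purpose remedy, independent of what the screenshots show: return 410 Gone for the inherited URLs so Google drops them faster than a plain 404, and pair that with the URL removal tool in Webmaster Tools. A sketch for an Apache .htaccess; the paths are hypothetical stand-ins for the previous owner's pages:

```apache
# Mark URLs inherited from the previous owner as permanently gone.
# These paths are hypothetical; substitute the ones from your reports.
Redirect gone /old-section/
Redirect gone /previous-owners-page.html
```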
Technical SEO | pwpaneuro
-
Why does the mobi version of a file come up higher on SERPs than the web version?
Hi, please see the URL http://news.oneindia.in/2011/10/22/tech-gmail-to-get-a-makeover-soon-google.html. The corresponding mobile version is http://news.oneindia.mobi/2011/10/22/886893.html. If we search for "Google video leaks; Gmail to get a make over soon" on Google, the mobi version comes up instead of the web version. One reason could be the browser title; we do use a meta title in the web version of the article. For the past few months the mobi version of our files has come up higher on SERPs than the web version. What could be the reason? Regards
Technical SEO | greyniumseo
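One common fix for separate mobile URLs, sketched under the assumption that the two pages map one-to-one: annotate both versions so Google treats the web page as canonical and the .mobi page as its mobile alternate.

```html
<!-- On the desktop page (news.oneindia.in/...) -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://news.oneindia.mobi/2011/10/22/886893.html">

<!-- On the mobile page (news.oneindia.mobi/...) -->
<link rel="canonical"
      href="http://news.oneindia.in/2011/10/22/tech-gmail-to-get-a-makeover-soon-google.html">
```

-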
Is having a sitemap.xml file still beneficial?
Hi, I'm pretty new to SEO, and something I've noticed is that a lot of things become relevant and irrelevant like the weather. I was just wondering if having a sitemap.xml file for Google's use is still a good idea. Logically, my websites would get crawled faster by having one. Cheers.
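A sitemap is still generally considered worthwhile: it doesn't guarantee faster crawling, but it helps Google discover URLs it might otherwise miss. For reference, a minimal valid sitemap.xml (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```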
Technical SEO | davieshussein