What do you think of this reconsideration request?
-
Just about to send a reconsideration request to Google for my site: seoco.co.uk and would like your input. I was going to include information about each URL I found and the steps I have taken but there is not room. What do you think of this:
“Hi guys, I got an unnatural links message from you back in February and since then my website rankings have fallen dramatically. I spoke to someone at SEOmoz and they said that my website probably got penalised for directory links, so I have gone out and tried to get rid of all the low quality ones that I am responsible for and some that I am not.
Altogether I was able to identify about 218 low quality directory links. I attempted to contact every one of the directory owners twice over a two week period and I was able to get about 68 removed. I have used the disavow tool to devalue the rest.
Trying to get rid of all of those bad links was hard work and I have definitely learned my lesson. Rest assured I will not be submitting to any more directories in the future.
Please can you give me another chance?
If my site still violates the guidelines please could you point out some of the bad links that are still there?”
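For reference, the disavow file mentioned above is a plain-text upload in the format Google documents: one `domain:` or URL entry per line, with lines starting with `#` treated as comments. A minimal example (the domains here are placeholders, not real entries):

```text
# Directories contacted twice with no response
domain:spammy-directory.example
domain:cheap-links.example
# Individual URLs where only one page links to us
http://another-directory.example/listing/123
```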
What do you think? Can you think of anything else I should say?
Dave
-
Honestly, just list out the domains in an Excel file (or google doc) and make separate columns:
One column should read "Contacted" and another should read "Link Removed."
**Note: The domain list should come from multiple sources, not just the GWT list; it is incomplete and untrustworthy. Use Open Site Explorer's list and maybe even Majestic or Ahrefs, or all of the above. Remove duplicates and sort your list, and it should be pretty obvious which are the spam links based on anchor text and domain URLs.
Put x's in boxes accordingly.
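The merge-and-dedupe step above can be sketched in a few lines. This is a minimal example, assuming you have pasted each tool's export into a simple list of linking domains (the domain names and filename here are hypothetical):

```python
import csv

# Hypothetical exports from each tool: one linking domain per line.
# In practice you would read these from the files each tool lets you download.
gwt = ["spammy-directory.com", "Example.org", "cheap-links.net"]
ose = ["spammy-directory.com", "another-directory.info"]

# Merge the sources, normalise case, and de-duplicate.
domains = sorted({d.strip().lower() for d in gwt + ose if d.strip()})

# One row per unique domain, with empty "Contacted" / "Link Removed"
# columns to mark with x's as you work through the list.
with open("link_audit.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["Domain", "Contacted", "Link Removed"])
    for domain in domains:
        writer.writerow([domain, "", ""])
```

Sorting the merged list makes the spam cluster together, since purchased directory domains tend to share obvious naming patterns.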
Then just go ahead and take all of the screenshots of email contacts you have and put them in a folder, zip it up, and attach it to the email.
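The zip step is one call with Python's standard library; the folder name here is an assumption for illustration:

```python
import os
import shutil

# Hypothetical folder holding the screenshots of your removal emails.
os.makedirs("screenshots", exist_ok=True)

# Bundle the whole folder into removal_evidence.zip, ready to attach.
archive = shutil.make_archive("removal_evidence", "zip", root_dir="screenshots")
```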
That's what I would do. Like I said, the more info you can include the better: as much tracking and data as you can possibly overwhelm them with. They may not look through all of it, but it will definitely indicate to them that you have figured out the problem and worked really hard to fix it.
Please keep us posted on how this works. As you may be able to tell, this is something I work on often, and I am very interested in the process. I and the whole community would be incredibly pleased if you kept us informed as to how it turns out.
Good luck!
-
Hi Jesse, thanks for the response, don't mind the harshness at all. I have all the data about URLs and the steps I took in a normal email but am not sure about how to attach it. What would be the best way and how do I do it?
-
Don't love it, honestly. You need to re-work this. And put a ton more documentation into it.
I'd leave out everything about SEOmoz. They don't care where you learned what you learned. All you need to do is explain that you understand how purchasing link packages is wrong and a terrible idea and that you are interested in removing all unnatural/spam links from your profile. Then go into how you determined which were spam links and what you tried to remove.
This is where you want to attach screenshots and Excel spreadsheets showing which links and owners you contacted and how many times. You need to provide as much data showing you did as much work as possible. Giving a blanket number simply isn't enough.
Now beyond this, don't beg. Be authoritative. Man up. Realize that you are talking to a webspam team who does not feel sorry for you. So talk to them like you're trying to get a job with them. Be cordial, formal, polite and mature. And don't ask them to "point out some other bad links" for you. They want to know that YOU KNOW what a bad link is, ensuring that this won't happen again. ---THIS LAST SENTENCE IS VERY IMPORTANT and should be the underlying theme of your entire email.
Sorry if I'm coming off harsh, I just want to help. There are a ton of blogs about this; I suggest checking them out. You can search through YouMoz for helpful posts about this.
Hope this helps... good luck!