What do you think of this reconsideration request?
-
Just about to send a reconsideration request to Google for my site, seoco.co.uk, and I would like your input. I was going to include information about each URL I found and the steps I have taken, but there isn't room. What do you think of this:
“Hi guys, I got an unnatural links message from you back in February, and since then my website rankings have fallen dramatically. I spoke to someone at SEOmoz and they said that my website probably got penalised for directory links, so I have gone out and tried to get rid of all the low-quality ones that I am responsible for, and some that I am not.
Altogether I was able to identify about 218 low-quality directory links. I attempted to contact every one of the directory owners twice over a two-week period and was able to get about 68 links removed. I have used the disavow tool to devalue the rest.
Trying to get rid of all of those bad links was hard work and I have definitely learned my lesson. Rest assured, I will not be submitting to any more directories in the future.
Please can you give me another chance?
If my site still violates the guidelines please could you point out some of the bad links that are still there?”
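For context, the disavow file I mentioned is just a plain text upload: one domain: directive or full URL per line, with # lines as comments. A cut-down sketch of the format (the domains below are placeholders, not the ones I actually submitted):

```
# Whole directories that ignored both of my removal requests (placeholder domains)
domain:spam-directory-example.com
domain:lowquality-links-example.net
# A single bad page can also be listed as a full URL
http://directory-example.org/listings/my-site-entry.html
```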
What do you think? Can you think of anything else I should say?
Dave
-
Honestly, just list out the domains in an Excel file (or Google Doc) and make separate columns:
One column should read "Contacted" and another should read "Link Removed."
**Note:** the domain list should come from multiple sources, not just the GWT list; that list is incomplete and untrustworthy. Use Open Site Explorer's list, and maybe even Majestic or Ahrefs, or all of the above. Remove duplicates and sort your list, and it should be pretty obvious which are the spam links based on anchor text and domain URLs.
Put x's in boxes accordingly.
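If merging and deduping those exports by hand gets tedious, a short script can build the sheet for you. A rough sketch, assuming each tool's export has been saved as a plain text file with one URL per line (the file names here are made up):

```python
# Merge link exports from GWT, Open Site Explorer, Majestic, etc.,
# dedupe by domain, and write a review sheet with the two tracking columns.
import csv
from urllib.parse import urlparse

EXPORTS = ["gwt_links.txt", "ose_links.txt", "majestic_links.txt"]

domains = set()
for path in EXPORTS:
    with open(path, encoding="utf-8") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            # urlparse only finds a host when a scheme is present;
            # fall back to the raw line to tolerate bare domains.
            host = urlparse(url).netloc or url
            domains.add(host.lower().removeprefix("www."))

with open("link_review.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["Domain", "Contacted", "Link Removed"])
    for domain in sorted(domains):
        writer.writerow([domain, "", ""])
```

Open the resulting link_review.csv in Excel or Google Docs and mark the two columns as you work through the list.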
Then just go ahead and take all of the screenshots of email contacts you have and put them in a folder, zip it up, and attach it to the email.
That's what I would do. Like I said, the more info you can include the better: as much tracking and data as you can possibly overwhelm them with. They may not look through all of it, but it will definitely show them that you have figured out the problem and worked really hard to fix it.
Please keep us posted on how this works out. As you may be able to tell, this is something I work on often, and I'm very interested in the process and the outcome. The whole community would be incredibly pleased if you kept us informed as to how it turns out.
Good luck!
-
Hi Jesse, thanks for the response; I don't mind the harshness at all. I have all the data about the URLs and the steps I took in a normal email, but I'm not sure how to attach it. What would be the best way, and how do I do it?
-
Don't love it, honestly. You need to rework this, and put a ton more documentation into it.
I'd leave out everything about SEOmoz. They don't care where you learned what you learned. All you need to do is explain that you understand why purchasing link packages is wrong and a terrible idea, and that you are committed to removing all unnatural/spam links from your profile. Then go into how you determined which links were spam and what you did to try to remove them.
This is where you want to attach screenshots and Excel spreadsheets showing which links you found, who you contacted and how many times. You need to provide as much data as possible showing you did the work. Giving a blanket number simply isn't enough.
Now, beyond this: don't beg. Be authoritative. Man up. Realize that you are talking to a webspam team who does not feel sorry for you. So talk to them like you're trying to get a job with them: be cordial, formal, polite and mature. And don't ask them to "point out some other bad links" for you. They want to know that YOU know what a bad link is, ensuring that this won't happen again. THIS LAST SENTENCE IS VERY IMPORTANT and should be the underlying theme of your entire email.
Sorry if I'm coming off harsh; I just want to help. There are a ton of blogs about this, and I suggest checking them out. You can search through YouMoz for helpful posts on the subject.
Hope this helps... good luck!
Related Questions
-
Blocking Google from telemetry requests
Technical SEO | rogier_slag
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (think scrolling, for example). To stop bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling; over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyway, and it is quite a number, we considered reducing this for bots. We have several questions about this:
1. Do these requests count towards crawl budgets?
2. If they do, and we want to prevent this from happening, what would be the preferred option: preventing the request in the frontend code, or blocking the request with a robots.txt line? We ask because an in-app block for the request could lead to different behaviour for users and bots, and Google could perhaps penalize that as cloaking; it is also slightly less convenient from a development perspective, as the logic is spread throughout the application. I'm aware one should not cloak or make pages appear different to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
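For reference, the robots.txt version of the block is a single rule. A minimal sketch, assuming the telemetry endpoint lives at a hypothetical path such as /api/track:

```
# Keep all crawlers away from the telemetry endpoint.
# /api/track is a placeholder; substitute the real path.
User-agent: *
Disallow: /api/track
```

Since robots.txt governs which URLs a compliant bot will fetch, including fetches triggered during JavaScript rendering, this should stop the POSTs without serving bots different page content.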
-
"Removed by request" error in Search Console
Technical SEO | innovative.rohit
Hey there. I have a website which has been doing well on Google, but the last 2-3 times I've looked I am getting a "removed by request" error in Google Search Console. Here is a screenshot: http://prntscr.com/iva5y0. The thing is, I have not asked Google to remove my homepage URL, so: 1. Why am I getting this warning again and again? 2. Will it harm my site's current traffic and ranking status? 3. What steps do I need to take to stop the warning coming back? Thanks in advance, and please help out with this query.
-
Ranking Drop and Google Disavow Requests
Technical SEO | GratefulFred
My website, www.nile-cruises-4u.co.uk, has fallen dramatically for the top industry search terms (nile cruise, nile cruises) over the last 12 months, from previous page-one rankings down to page three, which has affected us very badly financially. Using Link Detox, I found that we had thousands of backlinks with unrelated anchor text, mainly porn terms, viagra, etc. I submitted a disavow file and reconsideration request about a week ago, and wondered firstly whether the enormous number of these links could have helped cause the drop to page three, and secondly whether the disavow request will eventually help the website return to better rankings? Thanks, Colin
-
I'm thinking I might need to canonicalize back to the home site and combine some content. What do you think?
Technical SEO | ThridHour
I have a site that is mostly just podcasts with transcripts, offering both audio and video versions of each podcast. I also contribute to a blog that links back to the video/transcript pages of these podcasts. That blog carries the exact same content (the podcast, both audio and video, but no transcript), split across separate audio and video posts. Each post has different text on it that is technically unique, but I'm not sure it's unique enough. So my question is: should I canonicalize the posts on that blog back to the original video/transcript page of each podcast, and then combine the video posts with the audio posts? Thanks!
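If canonicalization is the answer, it's a one-line tag in the head of each blog post pointing at the original podcast page. A minimal sketch with a placeholder URL:

```html
<!-- in the <head> of the blog's audio/video post -->
<link rel="canonical" href="https://example.com/podcasts/episode-42/" />
```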
-
Page crawl increase when requesting a Google sitelink demotion
Technical SEO | David-E-Carey
Hi there. Has anyone seen page crawling increase in Google Webmaster Tools after requesting a sitelink demotion? I did this around the 23rd of March; the next day, page crawling started to rise and rise, showing a very visible spike in activity, and to this day it is still relatively high. From memory, I asked about this in SEOmoz Q&A a couple of years ago and was told that page crawl activity is a good thing. OK, fine, no argument.
However, at nearly the same time I noticed that my primary keyword rank for my home page had dropped away to around the fourth page on Google US, where it has stayed since March. The exact same query on Google UK (using the SEOmoz Rank Checker) has remained in the same position, around 11th; it has barely moved. I requested an undemotion in GWT for this page link and the page crawling started to drop, but not to the level before March 23rd. The rank situation for this keyword has not changed and the content on our website has not changed, but something has come adrift with our US ranks. Using Open Site Explorer, not one competitor listed has a higher domain authority than our site (or page authority, domain links, you name it), yet they sit there on the first page.
Sorry, the above is a little bit of frustration; this question is not impulsive. I have sat for weeks analyzing causes and effects, but cannot see why this disparity is happening between the two countries' ranks when it has never happened for this length of time before. Ironically, we are still number one in the United States for a keyword phrase which I moved away from over a month ago and do not refer to at all on our index page! Bizarre.
Granted, sitelink demotion may have no correlation with the keyword ranking impact, but looking at the activities carried out on the site and the timing of the page crawling, it is the only sizable factor I can identify as a possible cause. Oh, and the SEOmoz On-Page Optimization Tool reports that the home page gets an 'A' for this keyword term. This week I have commented out the canonical tag in the index page header to see if that has any effect; it was another (if minor) change I had made to get the site to an 'A' grade with the tool. Any ideas or help appreciated as to what could be causing the rank differences. One final note: the North American ranks were initially high, around 11th-12th, but then dropped away to the fourth page, while the UK rankings saw no impact. And one last thing: the US rank is my statistical outlier. Using Google Analytics, I have an average rank position of about 3 across all countries where our company appears for this term; include the US and it pushes the average to 8th/9th. Thanks, David
-
What can I do if my reconsideration request is rejected?
Technical SEO | SteveBrumpton
Last week I received an unnatural link warning from Google. Sad times. I followed the guidelines and reviewed all of my inbound links for the last 3 months. All 5,000 of them! Along with several genuine ones from trusted sites like the BBC, Guardian and Telegraph, there was a load of spam; about 2,800 of them were junk. As we don't employ an SEO agency and don't buy links (we don't even buy AdWords!), I know that all of this spam was generated by spam bots and site scrapers copying our content. As the bad links were not created by us, and there are 2,800 of them, I cannot hope to get them removed. There are no 'contact us' pages on these Russian spam directories and Indian scraper sites, and as for the 'adult bookmarking website' that has linked to us over 1,000 times, well, I couldn't even contact that site in company time if I wanted to! So I did my manual review all day, made a list of 2,800 bad links and disavowed them. I followed this up with a reconsideration request to tell Google what I'd done, but a week later it has been rejected: "We've reviewed your site and we still see links to your site that violate our quality guidelines." As these links are beyond my control and I've tried to disavow them, is there anything more to be done? Cheers, Steve
-
I think Google thinks I have two sites when I only have one
Technical SEO | ClaireH-184886
Hi, I am a bit puzzled. I have just used http://www.opensiteexplorer.org/anchors?site=in2town.co.uk to check my anchor text and forgot to put in the www, and the information came up totally different from when I put the www in. It shows a few links for in2town.co.uk, but when I put in www.in2town.co.uk it gives me different information. Is this a problem, and if so, how do I solve it?
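A common cause of this is that the site resolves at both hostnames with no redirect, so tools (and Google) treat them as two separate sites. The usual fix is a sitewide 301 from the bare domain to the www version (or vice versa). A sketch for Apache, assuming mod_rewrite is available:

```
# .htaccess: 301 the bare domain to www so both hostnames
# consolidate into one site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^in2town\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.in2town.co.uk/$1 [R=301,L]
```

The same can be done in nginx or via a host's control panel; the point is to pick one canonical hostname and redirect the other to it.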
-
Reconsideration Request
Technical SEO | DanHill
I've been cleaning up the backlink profile for a certain page on our site. Once I'm happy with the new link profile and I want to submit the URL for reconsideration, can I submit just one URL, or will Google look through the entire site?