What can I do if my reconsideration request is rejected?
-
Last week I received an unnatural link warning from Google. Sad times.
I followed the guidelines and reviewed all my inbound links from the last 3 months. All 5,000 of them! Along with several genuine ones from trusted sites like the BBC, the Guardian and the Telegraph, there was a load of spam: about 2,800 of them were junk. As we don't employ an SEO agency and don't buy links (we don't even buy AdWords!), I know that all of this spam was generated by spam bots and site scrapers copying our content.
As the bad links were not created by us and there are 2,800 of them, I cannot hope to get them all removed. There are no 'contact us' pages on these Russian spam directories and Indian scraper sites. And as for the 'adult bookmarking website' that has linked to us over 1,000 times, well, I couldn't even contact that site in company time if I wanted to! As a result I did my manual review all day, made a list of 2,800 bad links and disavowed them.
I followed this up with a reconsideration request to tell Google what I'd done, but a week later it was rejected: "We've reviewed your site and we still see links to your site that violate our quality guidelines." As these links are beyond my control and I've tried to disavow them, is there anything more to be done?
Cheers
Steve
-
Tom has given you good advice. I'll put in my 2 cents' worth as well.
There are 3 main reasons for a site to fail at reconsideration:
1. The site owner didn't identify enough of the links as unnatural.
2. Not enough effort was put into removing links and documenting that effort for Google.
3. Improper use of the disavow tool.
In most cases #1 is the main cause. Almost every time I do a reconsideration request, my client is surprised at what kinds of links are considered unnatural. From what I have seen, Google is usually pretty good at figuring out whether you have been manually trying to manipulate the SERPs or whether the links are just spam-bot-type links.
Here are a few things to consider:
Are you being COMPLETELY honest with yourself about the spammy links you are seeing? How did Russian and porn sites end up linking to you? Most sites don't just get those by accident. Sometimes this can happen when sites use link-building companies that rely on automated methods to build links. Even so, do all you can to address those links; then, for the ones you can't get removed, document your efforts, show Google, and disavow them.
Even if these are foreign-language sites, many of them will have contact emails in their WHOIS records that you can use.
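If you're working through hundreds of WHOIS records, pulling the contact addresses out programmatically saves a lot of copy-and-paste. Here's a minimal sketch of extracting email addresses from raw WHOIS output; the sample record and domain below are made up for illustration, and in practice you'd feed in the output of a `whois` lookup for each linking domain:

```python
import re

def extract_emails(whois_text):
    """Pull unique email addresses out of raw WHOIS output, preserving order."""
    pattern = r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"
    seen = []
    for email in re.findall(pattern, whois_text):
        # De-duplicate case-insensitively
        if email.lower() not in (s.lower() for s in seen):
            seen.append(email)
    return seen

# Made-up WHOIS snippet for illustration
sample = """
Registrant Name: Domain Admin
Registrant Email: admin@spam-directory.example
Tech Email: hostmaster@spam-directory.example
"""
print(extract_emails(sample))
# → ['admin@spam-directory.example', 'hostmaster@spam-directory.example']
```

Bear in mind many registrars mask registrant emails behind privacy services, so this won't always surface a usable address; note that in your documentation too.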
Are you ABSOLUTELY sure that your good links are truly natural? Just because they come from news sources doesn't make them natural. Have you read all the Interflora coverage recently? They had a pile of links from advertorials (amongst other things) that now need to be cleaned up.
-
Hi Steve
If Google is saying there are still a few more links, then it might be an idea to manually review a few others that you haven't disavowed. I find the LinkDetox tool very useful for this. It's free with a tweet and will tell you if a link from a site is toxic (the site is deindexed) or if it's suspicious (and why it's suspicious). You still need to use your own judgement on these, but it might help you to find the extra links you're talking about.
However, there is a chance you have gone and disavowed every bad link but still got the rejection. In this case, I'd keep trying but make your reconsideration request more detailed. Create an Excel sheet listing the bad URLs and/or domains, and give a reason explaining why you think each is a bad link. Then provide information on how you found their contact details. If there is no 'contact us' page, check the registrant email in the WHOIS record. After that, say when you contacted them (include a sample of your letter too) and whether they replied, along with a follow-up date if you got silence. If there are no details in the WHOIS record, explicitly mention that there are no contact details and so you have proceeded straight to disavowing.
Then list the URLs you've disavowed (upload the .txt file with your reconsideration request). You've now told Google that you've found the bad links, why you think they're bad (also include how you discovered them), that you've contacted the webmasters on numerous occasions and that, where no removal was made, you've disavowed as a last resort. This is a very thorough process and uses the disavow tool the way Google wants us to: as a last resort for an unresponsive or anonymous webmaster.
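For reference, the disavow file itself is just plain text: one URL or `domain:` entry per line, with `#` lines as comments, which are a handy place to record your outreach notes. Here's a small sketch (hypothetical helper name and made-up domains) of collapsing a long list of bad URLs into domain-level disavow entries, which is usually what you want when a spam site links to you thousands of times:

```python
from urllib.parse import urlparse

def build_disavow(bad_urls):
    """Collapse a list of bad URLs into domain-level disavow lines."""
    domains = []
    for url in bad_urls:
        host = urlparse(url).netloc.lower()
        # Keep one entry per domain, in first-seen order
        if host and host not in domains:
            domains.append(host)
    lines = ["# Contacted webmasters on the dates noted in my spreadsheet; no response, disavowing as a last resort."]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines)

bad_urls = [
    "http://spam-directory.example/widgets/page1.html",
    "http://spam-directory.example/widgets/page2.html",
    "http://adult-bookmarks.example/links/42",
]
print(build_disavow(bad_urls))
```

Upload the resulting .txt through the disavow tool in Webmaster Tools, and keep the same file alongside your reconsideration documentation so the reviewer can match entries to your outreach log.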
Please forgive me if you've already done all this and it seems like repetition. I only mention it because I've found it's best to be as thorough as possible with Google in these situations. Remember, a reconsideration request is manual and if they see that you've gone through all this effort to be reinstated, you've got a better chance of being approved.
Keep trying, mate. It can be disheartening, but if you think it's worth the time and effort, then keep going for it. I would bear in mind the alternatives, however, such as starting fresh on a new domain. If you find yourself going round the bend with endless reconsiderations, sometimes your time, effort and expertise can be better put elsewhere.
All the best!