What can I do if my reconsideration request is rejected?
-
Last week I received an unnatural link warning from Google. Sad times.
I followed the guidelines and reviewed all my inbound links for the last 3 months. All 5000 of them! Along with several genuine ones from trusted sites like the BBC, the Guardian, and the Telegraph, there was a load of spam. About 2800 of them were junk. As we don't employ any SEO agency and don't buy links (we don't even buy AdWords!), I know that all of this spam is generated by spam bots and site scrapers copying our content.
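In case it helps anyone else facing a similar pile: grouping the links by domain first is what makes 5000 of them reviewable in a day, since most come from a handful of domains. A rough sketch of the idea, if you want to script it - the CSV filename and the one-URL-per-row layout are assumptions about your own links export (e.g. the "Links to Your Site" download from Webmaster Tools):

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Assumed export format: one linking URL per row, first column.
counts = Counter()
with open("inbound_links.csv", newline="") as f:
    for row in csv.reader(f):
        if row:
            domain = urlparse(row[0].strip()).netloc.lower()
            if domain:
                counts[domain] += 1

# Domains with hundreds of links (like the bookmarking site below)
# surface immediately, so the manual review starts where it matters.
for domain, n in counts.most_common(25):
    print(f"{n:5d}  {domain}")
```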
As the bad links have not been created by us and there are 2800 of them, I cannot hope to get them removed. There are no 'contact us' pages on these Russian spam directories and Indian scraper sites. And as for the 'adult bookmarking website' that has linked to us over 1000 times, well, I couldn't even contact that site in company time if I wanted to! As a result, I did my manual review all day, made a list of 2800 bad links, and disavowed them.
I followed this up with a reconsideration request to tell Google what I'd done, but a week later it was rejected: "We've reviewed your site and we still see links to your site that violate our quality guidelines." As these links are beyond my control and I've tried to disavow them, is there anything more to be done?
Cheers
Steve
-
Tom has given you good advice. I'll put in my 2 cents' worth as well.
There are 3 main reasons for a site to fail at reconsideration:
1. The site owner didn't identify enough of the links as unnatural.
2. Not enough effort was put into removing links and documenting that to Google.
3. Improper use of the disavow tool.
In most cases, #1 is the main cause. Almost every time I do a reconsideration request, my client is surprised at what kinds of links are considered unnatural. From what I have seen, Google is usually pretty good at figuring out whether you have been manually trying to manipulate the SERPs or whether the links are just spam-bot noise.
Here are a few things to consider:
Are you being COMPLETELY honest with yourself about the spammy links you are seeing? How did Russian and porn sites end up linking to you? Most sites don't just get those by accident. Sometimes this can happen when sites use link-building companies that rely on automated methods to build links. Even so, do all you can to address those links; then, for the ones that you can't get removed, document your efforts, show Google, and disavow them.
Even if these are foreign-language sites, many of them will have contact emails in their WHOIS records.
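If there are hundreds of domains to check, the lookup is easy to script. A minimal sketch, assuming the standard whois command-line utility is installed - and bearing in mind that many registrants hide their addresses behind privacy services, so expect gaps:

```python
import re
import subprocess

def whois_emails(domain):
    """Return any contact emails found in a domain's WHOIS record.
    Assumes the standard `whois` command-line tool is installed."""
    try:
        out = subprocess.run(
            ["whois", domain], capture_output=True, text=True, timeout=30
        ).stdout
    except (OSError, subprocess.TimeoutExpired):
        return []
    return sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", out)))

print(whois_emails("example.com"))  # a spammy linking domain goes here
```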
Are you ABSOLUTELY sure that your good links are truly natural? Just because they are from news sources is not a good enough reason. Have you read all the Interflora stuff recently? They had a pile of links from advertorials (amongst other things) that now need to be cleaned up.
-
Hi Steve
If Google is saying there are still a few more links, then it might be an idea to manually review a few others that you haven't disavowed. I find the LinkDetox tool very useful for this. It's free with a tweet and will tell you if a link from a site is toxic (the site is deindexed) or if it's suspicious (and why it's suspicious). You still need to use your own judgement on these, but it might help you to find the extra links you're talking about.
However, there is a chance you have gone and disavowed every bad link, but still got the rejection. In this case, I'd keep trying but make your reconsideration request more detailed. Create an Excel sheet listing the bad URLs and/or domains and give a reason explaining why you think they're bad links. Then provide information on how you found their contact details. If there are no 'contact us' pages, check the registrant's email in the WHOIS record. After that, say when you contacted them (give a sample of your letter to them too) and whether they replied, along with a follow-up date if you got silence. If there are no details in the WHOIS record, explicitly mention that there are no contact details and so you have proceeded straight to disavowing.
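One way to lay out that log, sketched in Python with invented example rows - any spreadsheet works just as well, the point is the columns:

```python
import csv

# Assumed columns for the outreach log described above.
fields = ["url", "reason", "contact_found", "contacted_on",
          "follow_up_on", "response", "action_taken"]

rows = [  # invented examples for illustration
    {"url": "http://spam-directory.example.ru/widgets",
     "reason": "auto-generated directory", "contact_found": "none in WHOIS",
     "contacted_on": "", "follow_up_on": "", "response": "",
     "action_taken": "disavowed (no contact details)"},
    {"url": "http://scraper.example.in/copied-post",
     "reason": "scraped our content",
     "contact_found": "admin@scraper.example.in",
     "contacted_on": "2013-03-01", "follow_up_on": "2013-03-08",
     "response": "none", "action_taken": "disavowed after silence"},
]

with open("link_cleanup_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
```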
Then list the URLs you've disavowed (upload the .txt file with your reconsideration email). You've now told Google that you've found bad links, why you think they're bad (also include how you discovered them), that you've contacted the webmaster on numerous occasions and, if no removal was made, that you've disavowed as a last resort. This is a very thorough process and uses the disavow tool in the way that Google wants us to - as a last resort for an unresponsive or anonymous webmaster.
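For reference, the .txt file itself is plain text in the format the disavow tool expects: one domain: or URL entry per line, with # lines as comments (Google ignores them, but they keep your own notes straight). The domains below are invented for illustration:

```
# Contacted owner via WHOIS email 2013-03-01 and 2013-03-08 - no response
domain:spam-directory.example.ru

# Adult bookmarking site, no contact details anywhere
domain:adult-bookmarks.example

# A single bad URL rather than a whole domain
http://scraper.example.in/copied-post
```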
Please forgive me if you've already done all this and it seems like repetition. I only mention it because I've found it's best to be as thorough as possible with Google in these situations. Remember, a reconsideration request is manual and if they see that you've gone through all this effort to be reinstated, you've got a better chance of being approved.
Keep trying, mate. It can be disheartening, but if you think it's worth the time and effort, then keep going for it. I would bear in mind the alternatives, however, such as starting fresh on a new domain. If you find yourself going round the bend with endless reconsiderations, sometimes your time, effort and expertise can be better put elsewhere.
All the best!
Related Questions
-
Why does a site that is worse than mine by every objective measure I can find keep outranking me in search?
I've been working on educating myself about SEO all day, again. I run All-Star Telescope up in Canada. We have a competitor that consistently ranks #1 and I don't get it. Their site is full of duplicate content (straight copy and paste from the manufacturer site). They don't have any meaningful blog or video content to add relevance or value to their site. We have higher page authority and higher domain authority, and the keyword analyzer in Moz says that our page is higher quality than the competitor's page. Our site is slow, but theirs is slower. I can't find a single metric in any tool (Ubersuggest, Moz, Ahrefs, Semrush) that says Telescopes Canada is a better site, or has a better NexStar 8SE product page (a popular telescope).
Here's the link to Telescopes Canada's page for their Celestron 8SE: https://telescopescanada.ca/products/celestron-nexstar-8se-computerized-telescope-11069?_pos=1&_sid=f0aa91cc2&_ss=r
Here's a link to the Celestron 8SE page from the manufacturer website: https://www.celestron.com/products/nexstar-8se-computerized-telescope?_pos=1&_sid=56abdabd4&_ss=r#description
Telescopes Canada has just copied and pasted. There is no original content aside from adding the shipping and return policy to a tab, and having some options for selecting accessories on the page. Here is our page: https://all-startelescope.com/products/celestron-nexstar-8se
Our titles are good, and our metadata is good (but I don't think that's been a serious ranking factor for about ten years). The text is original, it's relevant, and we have healthy internal links to the page. We have invested in some excellent blog content, and we're adding new products to the website so that we rank for more keywords. All of those things are helping, but I fundamentally don't understand why Telescopes Canada is #1 almost across the board on every key product in our market. There is something that I'm not seeing here, something that isn't being captured by the tools that I have. Is it simply the fact that they get more traffic? Is that why some people go and buy traffic? Can you see any metric, any tool in your toolbox, that indicates why they rank at the top, or even higher than we do, for these search terms specific to that product:
- Celestron NexStar 8SE
- NexStar 8SE
- Celestron NexStar 8SE Canada
- NexStar 8SE Canada
We've worked with two highly ranked SEOs to try and figure this out, one in Canada and one in the USA. I haven't seen a confidence-inspiring answer from either of them. Posting on a forum is a bit of an act of desperation; I'll continue to work the problem, but it's discouraging to see the leader in my industry look like he's just phoning it in with his website.
Technical SEO | | nkennett
-
How can you promote a sub-domain ahead of a domain on the SERPs?
I have a new client that wants to promote their subdomain uk.imagemcs.com and have their main domain imagemcs.com fall off the SERPs. Objective? Get uk.imagemcs.com to rank first for UK 'brand' searches. Do a search for 'imagem creative services' and you should see the issue (it looks like rules have been applied to the robots.txt on the main domain to exclude any bots from crawling, but since the pages have been indexed previously I need to take action, as it doesn't look great!). I think I can do this by applying a permanent redirect from the main domain to the subdomain at domain level, no-indexing the main site, and then resubmitting the sitemap. My slight concern is that no-indexing the main domain may impact the visibility of the other subdomains (I'm dealing with uk.imagemcs.com, but there are also us.imagemcs.com and de.imagemcs.com), and I was looking for some assurance that this would not be the case. My understanding is that subdomains are completely distinct from domains, and as such this action should have no impact on the subdomains. I asked the question on the Webmasters Forum but haven't really got anywhere: https://productforums.google.com/forum/#!msg/webmasters/1Avupy3Uw_o/hu6oLQntCAAJ
Can anyone suggest a course of action? Many thanks, Nathan
Technical SEO | | nathangdavidson2
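A sketch of the domain-level redirect described above, assuming the main domain runs on Apache with mod_rewrite - once every imagemcs.com URL 301s to the subdomain, the old pages should drop out of the index on their own:

```
# .htaccess on imagemcs.com - a sketch, not a tested config
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?imagemcs\.com$ [NC]
RewriteRule ^(.*)$ http://uk.imagemcs.com/$1 [R=301,L]
```

One caveat: a robots.txt that blocks all bots on the main domain would stop Googlebot from ever seeing these 301s, so that block may need lifting for the redirects to be processed.
-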
Tools/Software that can crawl all image URLs in a site
Excluding Screaming Frog, what other tools or software can crawl all the image URLs on a site? Screaming Frog doesn't crawl image URLs that sit outside the site's own domain. Example of an image URL outside the client site: http://cdn.shopify.com/images/this-is-just-a-sample.png. If the client is http://www.example.com, Screaming Frog only crawls images under it, like http://www.example.com/images/this-is-just-a-sample.png.
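Short of a dedicated tool, this is easy to script for a single page. A minimal sketch, assuming the requests and beautifulsoup4 packages are installed, that collects every <img> URL on a page, including off-domain CDN ones; crawling a whole site would mean repeating this over each internal URL:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def image_urls(page_url):
    """Collect every <img src> on one page, including off-domain
    URLs (e.g. CDN images) that a same-domain crawl would skip."""
    html = requests.get(page_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return sorted({urljoin(page_url, img["src"])
                   for img in soup.find_all("img", src=True)})

for url in image_urls("http://www.example.com/"):  # hypothetical client site
    print(url)
```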
Technical SEO | | jayoliverwright
-
Can spiders crawl jQuery Fancy Box scripts?
Hi Everyone - I'm not a technical person at all. I have some content that will be hidden until a user clicks "learn more", whereupon it will be displayed via a jQuery Fancy Box script. The content behind the 'learn more' JavaScript is important and I need it to be crawled by search engine spiders. Does anyone know if there will be a problem with this script?
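A distinction that usually settles this: if the hidden copy is already in the page's raw HTML and the Fancy Box script merely toggles its visibility, spiders that don't execute JavaScript can still read it; if the script fetches the copy with Ajax only after the click, they can't. A minimal way to check - the URL and text snippet below are hypothetical placeholders:

```python
import requests

page_url = "http://www.example.com/page-with-learn-more"  # hypothetical
snippet = "first sentence of the hidden content"          # hypothetical

# Fetch the page without executing any JavaScript, as a basic spider would.
html = requests.get(page_url, timeout=30).text
print("in source - crawlable" if snippet in html
      else "not in source - loaded via JavaScript")
```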
Technical SEO | | Santaur
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed, but later down the road if I want to utilize that URL again, can I just remove the 410 error code and put new content on that page and have it indexed again?
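Generally yes: a 410 is just the status code the server sends for that URL, not a permanent property of it. On Apache, for instance, it can be a single directive you later delete - a sketch, assuming mod_alias and a hypothetical path:

```
# .htaccess: serve 410 Gone for the retired URL ...
Redirect gone /old-page/
# ... then remove the line above and publish new content at /old-page/
# so the URL returns 200 again and can be recrawled and reindexed.
```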
Technical SEO | | WebServiceConsulting.com
-
Error: Missing Meta Description Tag on pages I can't find in order to correct them
This seems silly, but I have errors on blog URLs in our WordPress site that I don't know how to access, because they are not in our Dashboard. We are using All in One SEO. The errors are for blog archive dates, authors, and simply 'blog'. Here are samples:
http://www.fateyes.com/2012/10/
http://www.fateyes.com/author/gina-fiedel/
http://www.fateyes.com/blog/
Does anyone know how to input descriptions for pages like these? Thanks!!
Technical SEO | | gfiedel
-
Oh no, Googlebot cannot access my robots.txt file
I just received an error message from Google Webmaster Tools. I wonder if it has something to do with the Yoast plugin. Could somebody help me with troubleshooting this? Here's the original message:
Over the last 24 hours, Googlebot encountered 189 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
Recommended action. If the site error rate is 100%:
1. Using a web browser, attempt to access http://www.soobumimphotography.com//robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
2. If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
3. If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so, attempt to diagnose the cause of the failure.
If the site error rate is less than 100%: using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors. The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
After you think you've fixed the problem, use Fetch as Google to fetch http://www.soobumimphotography.com//robots.txt to verify that Googlebot can properly access your site.
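Before touching the server config, it can help to reproduce the fetch yourself. A minimal sketch that requests the file with Googlebot's user-agent string - though note that a firewall blocking by Google's IP ranges rather than by user-agent won't be caught this way:

```python
import requests

url = "http://www.soobumimphotography.com/robots.txt"
headers = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                         "+http://www.google.com/bot.html)"}

resp = requests.get(url, headers=headers, timeout=30)
# 200 means reachable; 403/5xx or a timeout points at the firewall or host.
print(resp.status_code, len(resp.text), "bytes")
```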
Technical SEO | | BistosAmerica
-
Can local SEO harm national rankings?
Today I met with a firm called Localeze that provides local directory submissions. I understand the importance of this service if your site is competing locally, but I'm not sure what the effects of local SEO are for a national brand. Our firm gets most of its traffic from across the country, not just one location, and our business is scattered (which is a good thing). We rank for service-related keywords that are not tied to a location. We do not show up in local results, so our business in our immediate location is weak. We would like to increase our local presence in search engines, but I want to make sure this will not take away from our national presence. Will optimizing a site for local search negatively affect general rankings? Thanks
Technical SEO | | KevinBloom