Proper Way To Submit A Reconsideration Request To Google
-
Hello,
In previous posts, I spoke about how we were penalized by Google for unnatural links. Basically, 50,000 out of our 58,000 links were coming from 4-5 sites with the same exact anchor text and img alt tags. This was obviously causing our issues. Needless to say, I went through the complete link profile to determine that all of the links besides these were of natural origin.
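For anyone wanting to run this kind of audit themselves, a minimal Python sketch along these lines would do it. It assumes a backlink export CSV with source_url and anchor_text columns (hypothetical column names; adjust to whatever your backlink tool actually exports):

```python
# Sketch: flag (domain, anchor text) pairs that dominate a backlink profile.
# Assumes a CSV export with "source_url" and "anchor_text" columns
# (hypothetical names - rename to match your backlink tool's export).
import csv
from collections import Counter
from urllib.parse import urlparse

def audit_backlinks(csv_path, threshold=0.05):
    pairs = Counter()
    total = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["source_url"]).netloc
            pairs[(domain, row["anchor_text"].strip().lower())] += 1
            total += 1
    # Report any (domain, anchor) pair holding more than `threshold` of all links.
    for (domain, anchor), count in pairs.most_common():
        share = count / total
        if share < threshold:
            break
        print(f"{domain!r} / {anchor!r}: {count} links ({share:.1%})")

audit_backlinks("backlinks.csv")
```

In a profile like mine, the offending pairs jump straight to the top of the output.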
My question here is: what is the accepted protocol for submitting a reinclusion request? For example, how long should it be? Should I disclose that I was in fact using paid links, and that I have now removed (or at least nofollowed) them? I want to make sure the request is as good as it should be so I can get our rankings up in a timely manner.
Also, how long until the request is typically acknowledged?
Thanks
-
Hi Daniel,
I hope that by now your penalty situation has been resolved.
I was just a little concerned when I read this thread that there might be some confusion over the need to remove those paid links from your site.
I just wanted to clear this up for any new people reading Q&A who might take away the wrong idea from the thread.
If the paid links are now nofollowed, you don't need to remove them (unless they are serving no other worthwhile purpose).
Matt Cutts talked about this in the webmaster help video "Why do paid links violate Google's guidelines while other ads don't?"
As Matt explains in the video, there is nothing wrong with having paid advertising on your site, as long as there is disclosure (usually in the form of a nofollow tag).
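If, like Daniel, you are verifying that a network has nofollowed hundreds of placements, a small script can do the checking for you. A rough Python sketch, assuming the requests and beautifulsoup4 packages are installed (MY_DOMAIN and the page URLs are hypothetical placeholders):

```python
# Sketch: confirm that inbound paid links to your domain carry rel="nofollow".
# MY_DOMAIN and the page list are hypothetical examples - substitute your own.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

MY_DOMAIN = "example-my-site.com"

def check_nofollow(hosting_page):
    """Print any link to MY_DOMAIN on hosting_page that is not nofollowed."""
    html = requests.get(hosting_page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc.removeprefix("www.")
        # bs4 returns rel as a list of values, or None when absent.
        if host == MY_DOMAIN and "nofollow" not in (a.get("rel") or []):
            print(f"Still followed on {hosting_page}: {a['href']}")

for page in ["https://www.example-ad-site.com/page1",
             "https://www.example-ad-site.com/page2"]:
    check_nofollow(page)
```

Spot-checking by hand is fine for a few pages, but a pass like this scales to the several hundred Daniel mentioned.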
For those who might be reading this thread because they too are wondering what to include in a reconsideration request, we have created a checklist on our site that might help: http://www.rmoov.com/google-reconsideration-request-checklist.php
Hope the penalty is now far behind you,
Sha
-
I would definitely mention the fact that you are under contract. I can't see any harm in it, and it will probably look like a measure of good faith and honesty to Google that you are telling them everything.
-
Thank you everyone for your responses; they are all very helpful. Just to inform you guys, all of my problems were stemming from these paid links. The rest of my link profile is pretty healthy. The reason I cannot get the links removed altogether for now is that the ad network I used is holding me to a contract until the end of September; after that, I'm obviously canning them. Do you think I should state that in the request as well - that the links are still there (but nofollowed) because of a contract I committed to, but that they will be completely removed at the end of September? Or should I just say I had them nofollowed? I am lucky in that all my paid links came from one network, and they cleaned up all of them in a few days (I verified on several hundred pages). Basically, 15,000 links were spread over 4 sites with macpokeronline.com as the anchor text, and 35,000 links were on macobserver.com with macpokeronline as the image link alt text, which hit us with Penguin as well as the manual paid link penalty.
Thanks again, and I'll let you guys know how this pans out.
-
Good responses. Here is what I would do if I were in your shoes.
I would start off with a humble sentence: "Hi Google team. Thank you for receiving my reconsideration request. I realize that some of my links were not in line with the Google Quality Guidelines. In this document I have outlined what I have done to rectify that."
Then, I would list my links. So I'd say, for example,
"There are 10,000 links that come from example1.com. These were links that I previously paid for. I now realize that this is against the quality guidelines and I have had those links removed (or nofollowed)."
I'd do the same thing for all of my bad links.
If I had any where I couldn't get the links removed or nofollowed I would say this:
"There are xxxx links from example2.com. I have contacted the webmaster by sending an email to **admin@example1.com. **Here is a copy of my email:....
...I also found an email address in the WHOIS data for the site and sent an email, which you can see here....
...I heard no response, so I used a contact form on the site and have yet to hear back from them. "
I would make sure that I had an explanation for every single link pointing to my site. I wouldn't try to hide anything.
I would end with something like this:
"Thanks again for considering my request. From this point on I am committed to following the Quality Guidelines."
In the past it would take ages to get a response (and some webmasters never got one), but Google has improved its response time. Most requests now get a response within 3-14 days.
Good luck!
-
Daniel,
One thing that I know in terms of timeliness is that it will depend as much on what you submit as it does on Google. First, you have to show what you have done to address the issue. If there are communications between you and a paid directory where you are not getting any response or action, show that communication.
Avoid the temptation to try to 'cosmetically' fix this by changing alt text, anchor text, etc. Remember, this will be looked at by humans. With it being four to five sites, your work should be easier so long as you have a lot of communication with the linking sites showing your efforts to end or ameliorate the links. Alan is very correct when he says, "Don't think you can keep sending them in because, from what I've read, they will eventually stop listening."
Get every duck in a row.
Make sure a bunch are gone; if they are going to the trouble to nofollow, why not eliminate them altogether?
No point saying, "I asked them all but no one complied"...
Only when there is massive evidence of your effort to fix it would I ask for reconsideration.
Good luck,
Robert
-
Honesty pays, I think. Google knows pretty much what you did. The engine guesses, but when a live body checks on your request, it's not likely you will fool them. They know which sites sell paid links. If someone else also paid for links from that site, already filed a reinclusion request, and said they bought links from that site, you paint a target on yourself if you don't own up to it - unless you really didn't buy the links.
You could wait up to a month for any response, going on past history. Be sure you're disclosing everything you know about in one request. Don't think you can keep sending them in because, from what I've read, they will eventually stop listening.
I can't help you with protocol, other than to say spell it all out because they can't read your mind and expecting them to guess probably won't work in your favor.