Ambiguous Response to Google Reconsideration Request
-
Hello,
On 9/11/12, we submitted a reconsideration request to Google for http://macpokeronline.com; at the time, we had received both a Penguin penalty and a manual action. We have since worked on cleaning up our link profile, and got this response from Google:
We received a request from a site owner to reconsider how we index the following site: http://www.macpokeronline.com/.
We've now reviewed your site. When we review a site, we check to see if it's in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site. If your site still doesn't appear in our search results, check our Help Center for steps you can take.
I honestly don't even know how to take this. We always showed up #1 when doing a site: search, so that last part is kind of irrelevant to us in this case.
Is this the reply of them accepting our request?
Thanks
Zach
-
For those wondering, Matt Cutts answers the question in this video: it basically means that your site is in a grey zone, not completely clean but not completely useless either. Unfortunately for http://macpokeronline.com, it seems that the issues are not that easy to resolve: http://suite.searchmetrics.com/en/research/domains/organic?url=MacPokerOnline.com&cc=US
-
We received exactly the same response following a recent reconsideration request, after several failed attempts where they reiterated that we still had bad links. We continued removing questionable links over the following six weeks while I waited to see if they would follow it up with something a little more enlightening, but unfortunately nothing was forthcoming.
So I submitted another reconsideration request asking if they could possibly elaborate slightly, and after just two days received exactly the same response again. I'm still a little puzzled.
-
You are correct, that's why I'm doing a huge link building campaign focusing on .edu TLDs. The .edus should help us recover from what we were losing before, especially since they are high-quality domains with high DA.
-
Don't forget that the links you removed may also have been helping you; the same links you were penalized for were passing value.
Thus, if you remove those, you will not have as many links.
-
Irving,
Thank you for your response. I understand what you're saying. MacPokerOnline.com isn't an online betting site; it's simply a site where users can get reviews and discount codes for online casinos, specializing in poker on Mac OS.
Please be aware that when you search Google (in an incognito window) for "mac poker" (broad match), our site is #1 there, with Google Authorship. We never lost rankings on our main pages; it is just the long-tail phrases.
Just thought I'd let you guys know this.
-
This is a good sign, because you didn't get the email that says they found it is still in violation. But it doesn't mean your site will automatically bounce back by any means; it means they are considering it. Given that it's an online betting site, and sites like that are illegal in the US, you're fighting an uphill battle.
Keep tracking your rankings and see if they are coming back. Keep looking at backlinks and on-page areas of improvement. Make a new reconsideration request every two weeks, stating the specific improvements you have made, with very specific examples.
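Documenting those specific examples is easier if you diff backlink exports between requests. A minimal sketch, assuming CSV exports with a `URL` column (the column name, file layout, and sample data are assumptions for illustration, not from this thread):

```python
# Sketch: diff two backlink exports to list links removed between
# reconsideration requests. Assumes CSV text with a "URL" column.
import csv
from io import StringIO

def removed_links(before_csv, after_csv, url_field="URL"):
    """Return links present in the earlier export but gone from the later one."""
    read = lambda text: {row[url_field] for row in csv.DictReader(StringIO(text))}
    return sorted(read(before_csv) - read(after_csv))

# Made-up sample exports for illustration.
before = "URL\nhttp://spammy-directory.example/site\nhttp://legit-blog.example/review\n"
after = "URL\nhttp://legit-blog.example/review\n"

for url in removed_links(before, after):
    print("Removed:", url)
```

Pasting that per-request list of removed URLs into the reconsideration request is exactly the kind of specific evidence reviewers respond to.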
-
I'm laughing at some of the responses.
My suggestion is to just send off another reconsideration request. And under explanation, tell them you were unsure what the previous reconsideration reply meant. I'm sure that should work out, and it couldn't hurt.
It's not like they would say, "Hey, we are indexing your site again, but for the smart--- reply we will deindex your site," or vice versa.
-
Hmm, it's a tough one, but it seems like they are saying you have no further problems; else, they would certainly state that the site was still in violation.
The standard type of message you see if they still detect problems is more like this:
"Dear site owner or webmaster of http://www.site.com/,

We received a request from a site owner to reconsider http://www.site.com/ for compliance with Google's Webmaster Guidelines.

**We've reviewed your site and we still see links to your site that violate our quality guidelines.**

Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.

If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support."
Now, your message does not have that, but it is still very strangely worded, so I would imagine you still have some kind of problem.
What I am seeing with a few clients I have picked up is a cluster of problems. So, they may have a manual penalty, then some penguin issues and some panda issues on top of that.
Additionally, whilst your manual penalty may have been lifted, in my experience you could still have a period where you remain penalised, but that penalty should, at some point, time out.
But... depending on when you picked up this manual penalty, who knows what other penalties you have picked up in the meantime (Panda/Penguin, etc.).
Ultimately, it's impossible to give a generic answer here. You need to review the site and attempt to identify any remaining algorithmic penalties; there are some fairly good guides available for doing that now.
Here are some links that may help:
General content improvement
http://www.seomoz.org/blog/fat-pandas-and-thin-content
Anchor text ratios
http://www.bowlerhat.co.uk/blog/seo/anchor-text-ratios-and-link-building/
Penguin
http://www.distilled.net/blog/seo/penguin-strategies/ - complicated
http://www.bowlerhat.co.uk/blog/penguin-diagnosis-and-recovery-strategy/ - a little simpler
Panda
http://www.distilled.net/blog/seo/beating-the-panda-diagnosing-and-rescuing-a-clients-traffic/
http://www.bowlerhat.co.uk/blog/google-panda-problems-and-solutions/
I have had a quick dig into the link profile, and it seems you have cleaned up a lot of what is reported in Open Site Explorer, so maybe it is just a case of waiting things out a little longer and seeing if a Penguin refresh sorts things out a bit. It's hard to get a real gauge of your link profile, anchor text, relevance of linking sites, etc. based on the current data returned by Open Site Explorer, but some general rules are coming out of the research done by several sites (the microsites and anchor text articles above make for a good read):
- Have at least 50% of your links from topically relevant sites
- Keep keyword anchors below 30%
- Branded URL anchors should make up about 70% of your link profile
- Ensure that at least two of the top five anchors are branded (really, post-penalty, aim for three+)
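As a rough illustration, those anchor-text thresholds can be sanity-checked against an exported anchor list. A sketch with made-up sample data; the brand terms and the "branded" matching rule are assumptions, and topical relevance still needs a human eye:

```python
# Sketch: summarise anchor-text distribution and compare it against the
# conservative thresholds above. Brand terms and sample data are made up.
from collections import Counter

BRAND_TERMS = ("macpokeronline", "mac poker online")  # assumed brand variants

def anchor_report(anchors):
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    # Treat anchors containing a brand term (or a bare URL) as branded.
    branded = sum(n for a, n in counts.items()
                  if any(term in a for term in BRAND_TERMS) or a.startswith("http"))
    return {
        "branded_pct": round(100 * branded / total, 1),
        "keyword_pct": round(100 * (total - branded) / total, 1),
        "top5": [a for a, _ in counts.most_common(5)],
    }

# Toy profile: 7 branded anchors, 2 keyword anchors, 1 long-tail keyword anchor.
sample = ["macpokeronline.com"] * 7 + ["mac poker"] * 2 + ["online poker bonus"]
report = anchor_report(sample)
print(report["branded_pct"], report["keyword_pct"])  # 70.0 30.0
```

On this toy profile, keyword anchors sit right at the 30% ceiling, so anything more aggressive would breach the guideline.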
When you have got to those kinds of conservative levels with your link profile, you can be a little more confident that you will be able to bounce back, but again, it may take time.
It is a frustrating situation, and any attempt at sorting this out should be given 110%, else you are really going to struggle. I am not really sure what another agency could do to help you at this point, other than more of what you are doing and maybe a solid on-site review, but you are probably in a position where you just have to wait and see (however frustrating that may be).
Let me know how you get on!
Marcus
-
What if we were to increase the number of "good/great" links that we have coming to the site? Wouldn't that assist in creating a more "natural" link profile?
-
You should focus on removing those nasty backlinks from the blog networks... or at least provide a list of them in your reconsideration request with an appropriate explanation. Usually it would take around 1-2 years for these backlinks to evaporate by themselves. Why? Because there is no reason whatsoever to renew those de-indexed domains and keep covering the hosting costs.
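If the bad domains cannot be removed, listing them in a disavow file is the fallback. A minimal sketch of generating one (the file format, `#` comment lines plus `domain:` entries, is Google's documented disavow format; the sample URLs and note text are made up):

```python
# Sketch: build a disavow file ("#" comments and "domain:example.com" lines)
# from a list of bad backlink URLs. Sample URLs are illustrative.
from urllib.parse import urlparse

def disavow_file(bad_urls, note="De-indexed blog-network domains; removal requests failed"):
    # Deduplicate at the domain level, stripping any leading "www.".
    domains = sorted({urlparse(u).netloc.removeprefix("www.") for u in bad_urls})
    lines = ["# " + note] + ["domain:" + d for d in domains]
    return "\n".join(lines) + "\n"

bad = [
    "http://www.spam-network-1.example/post/123",
    "http://spam-network-1.example/post/456",
    "http://spam-network-2.example/article",
]
print(disavow_file(bad))
```

Disavowing at the domain level covers every URL on the network site, which is usually what you want for a de-indexed blog network.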
-
Okay, I'll see what I can do with that. We can't do PPC since we're promoting gambling, unfortunately. Maybe we'll wait it out a few weeks to see what happens. Our site isn't deindexed, and we have seen traffic increase 5-15% week over week for the past few weeks. I think this is Penguin penalties getting lifted as Google detects more of the links we've removed.
-
Find someone who can make Google look bad in public for deindexing your site. That always seems to work.
All jokes aside, I'd just wait it out. See what Google says and go from there. Perhaps invest in Adwords for a while and then beg your account manager to look at your reconsideration request again.
-
I'm actually working in-house for the owner of this website; a larger marketing firm screwed him over from a Penguin and link perspective. I live in Philadelphia and know some employees at Seer. Do you think I should ask them if this would be something Seer could take on?
-
I don't know what you should do.
If this were my site, I would either find a pro who knows how to redeem sin... or start working on sites in a different niche... or go to Vegas and get a job as a dealer.
-
So what should I do now?
-
"We've now reviewed your site."
They are being weasels. They are saying "wait and see if we have forgiven you".
I think you are screwed.