Why will Google not remove a manual penalty against us?
-
Our site was placed under a manual penalty last year, in June 2012, after Penguin rolled out. We were advised by Google that we had unnatural links pointing to our site. We fought for months, running backlink checks and contacting webmasters where Google's WMT was showing the sites which had links. We have submitted numerous reconsideration requests with proof of our efforts in the form of huge, well-labeled spreadsheets, emails, and screenshots of online forms requesting link removal.
When the disavow tool came out we thought it was a godsend, and added all the sites which had either ignored us or refused to take down the links to the disavow.txt with the domain: tag. Then we submitted another reconsideration request, but to no avail.
We have since had email correspondence with a member of the Google Search Quality Team who, after reviewing the evidence from all our previous reconsideration requests and our disavow.txt, still advised us to make a genuine effort, and listed sites with inorganic links pointing to our site which were already included in the disavow.txt.
Google has stated: "In order for your site to have a successful reconsideration request, we will need to see a substantial, good-faith effort to remove the links, and this effort should result in a significant decrease in the number of bad links that we see."
We have truly done everything we can, and proven it too! Especially with all the sites in the disavow.txt, there must be a decrease in links pointing to our site. What more can we do? Please help!
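For reference, the disavow file we submitted follows Google's documented format: one directive per line, `#` for comments, and `domain:` to cover every URL on a host. The entries below are placeholders, not our actual list:

```text
# Sites that ignored or refused our removal requests
# (submitted alongside our reconsideration request evidence)
domain:spammy-directory.example
domain:article-farm.example
# Individual URLs can also be listed directly:
http://blog.example/paid-links-page.html
```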
-
It is something we have considered, but it is a very good source of income.
Obviously this is only when it is ranking. It has a whole lot of history (granted, this is why it is not ranking), and the domain is our brand as well. Basically we would rather keep it if we can.
Thank you kindly for your advice, Matt.
-
Hmmm, I understand. It can be frustrating.
The problem is that even if you are trying (and trying hard) to sort the issue out, if Google still sees bad links they won't lift any manual penalty.
I'm not saying that it is worthwhile in your case, but sometimes it can make sense to throw away a domain and start again. Does it generate many sales?
Just remember that if you start again with a new domain, do it properly and don't outsource to people who can't be trusted.
-
Yes, the penalty is referring to rankings; while we are still being indexed, we remain under a manual penalty.
We know we have a good site, and have seen natural links since we dropped last year, but it is the old spammed backlinks which obviously spurred Google on to hitting us with this penalty.
-
Yes, you are right!
About a year ago we outsourced our SEO work to a company in India, and they were extremely aggressive with these industry-specific keywords.
Needless to say when penguin rolled out we pretty much disappeared.
I cannot say we have disavowed all of them, but we have been working on removing these unnatural links for the past 9-10 months or so. Everything we found which we couldn't get taken down or no-followed, we disavowed. As you can imagine, the disavow.txt was longer than my arm!
The point is we have done everything we can and shown evidence of it, but Google fails to recognize this.
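To keep a disavow file that long reviewable, it helps to generate it from the removal-tracking spreadsheet rather than edit it by hand. A minimal sketch of that step (the function name and input list are illustrative, not part of any Google tooling):

```python
# Sketch: build disavow.txt contents from the list of domains that
# ignored or refused link-removal requests. De-duplicates and sorts
# so each reconsideration request ships a clean, consistent file.

def build_disavow(domains, note="Sites that ignored or refused removal requests"):
    """Return disavow.txt text using the domain: directive."""
    unique = sorted({d.strip().lower() for d in domains if d.strip()})
    lines = ["# " + note] + ["domain:" + d for d in unique]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    refused = ["Spam-Directory.example", "article-farm.example",
               "spam-directory.example", ""]
    print(build_disavow(refused))
```

Keeping the generated file under version control alongside the spreadsheet also gives you a dated paper trail for the next reconsideration request.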
-
Looks like your site is getting indexed. Is the penalty referring to rankings?
If so, some sites pre-Penguin enjoyed high rankings because of their backlinks. If the "value" of some of those backlinks was pulled (spammy links), your rankings fell.
Even if some of those spammy links were successfully taken down, that doesn't necessarily mean you get back the organic positions you lost after Penguin.
This may or may not apply to your case, but generally speaking it does happen.
-
I see a problem already, although it could be that these links have already been disavowed (there is no way a tool like Open Site Explorer can tell us this).
Check out the image attached. You have used some very aggressive link building tactics which target exact match anchor text links - and lots of them! Very unnatural indeed.
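One rough way to quantify that pattern from a backlink export (e.g. an Open Site Explorer CSV) is to tally the anchor-text distribution; a couple of exact-match phrases dominating the profile is the unnatural signature described above. The column name here is an assumption about your export, not a fixed API:

```python
# Sketch: measure anchor-text concentration in a backlink export.
# A natural profile spreads across brand names, bare URLs, and
# generic anchors; exact-match keywords dominating is a red flag.
from collections import Counter

def anchor_distribution(rows, anchor_field="anchor_text"):
    """Return (anchor, count, share) tuples, most common first."""
    counts = Counter(
        r[anchor_field].strip().lower()
        for r in rows
        if r.get(anchor_field, "").strip()
    )
    total = sum(counts.values()) or 1
    return [(a, c, c / total) for a, c in counts.most_common()]
```

Feeding this the rows of a `csv.DictReader` over the export gives a quick ranked list; anything where one commercial phrase holds a large share of all anchors is worth cross-checking against the disavow file.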
Can you confirm that these have been disavowed?
Matt
-
No problem, thanks for your time. www.accidentsdirect.com
-
Do you have a link to your site please? In order to help, I think we would need to have a good look around your link profile.
Matt