Contradictory communications from Google about a manual action being applied
-
Hello,
We received a manual action (partial match) for pure spam on one of our sites. We can't be sure of the exact date, because we never received any notification by email or in the Google Webmaster Tools dashboard; all we can say for certain is that we noticed the manual actions page was no longer empty on 10/03/2013.

Some context: our Google traffic took a big hit on 07/20/2013, losing around 60% of 250k visits per day. At first we thought it was an algorithmic penalisation related to a Panda update. That had already happened a few times in the past: we would lose part of our Google traffic and get it back a couple of months later, often even better than before.
We were really surprised at first to be deemed pure spam, given that the domain has been ours since it was created 7 years ago, that we have never employed black-hat techniques, and that our efforts have always gone into building valuable pages for users rather than using spam techniques to deceive them. But after noticing the manual action, we naturally assumed this was the actual reason for our sudden traffic drop.
So, starting from the 4 URLs that Google reported as examples of pages affected by pure spam, we tried to figure out what issues on our site could have been misinterpreted as pure spam. We also checked all the webmaster guidelines and fixed the issues where we thought we might not be fully compliant. The whole process took 3 months, after which we submitted our reconsideration request on 12/16/2013.
On 01/07/2014 we got the following answer:

We've reviewed your site and found no manual actions by the webspam team that would directly affect your site's ranking in Google's search results. You can use the Manual Actions page in Webmaster Tools to view actions currently applied to your site.
Of course, there may be other issues with your site that could affect its ranking. Google determines the order of search results using a series of computer programs known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking will happen from time to time as we make updates to present the best results to our users.
If your site isn't appearing in Google search results, or if it's performing more poorly than it once did, check out our Help Center to identify and fix potential causes of the problem.

Now we are really puzzled, because Google is telling us two opposite things: that we still have a pure spam manual action (according to Webmaster Tools), and that we don't have a manual action (according to their latest response to our reconsideration request).
We found a few cases online that are somewhat similar to ours, with Google apparently giving contradictory communications about manual actions, but none of them helped us build a clear explanation.

I don't want to get into the merits of the penalisation, or whether it was deserved; rather, I'd like to know whether anyone has had the same experience or has any guess as to what happened.
The only thing we could think of is some bug or syncing problem between different Google systems, but even after several days the manual action notice is still there in Google Webmaster Tools, and nothing has changed in our traffic.

We are now thinking about sending a second reconsideration request, asking Google to update our Webmaster Tools manual actions page to reflect our current actual status.
What do you think?

Thank you very much.
-
Sorry, I'm completely confused - you said that there's no notice of a manual action in Google Webmaster Tools, but the manual action page isn't empty and Google is giving you examples of URLs? If the page isn't empty, then you have a notice of manual action. I think I'm missing something. Did the message come and go in GWT?
-
Sounds like it is a bug/mistake on their part. I agree with sending a second RR and mentioning the discrepancy between your GWT message and the previous RR response. If they send back the same thing (no penalty, yet GWT still shows the penalty), I would post to the Google Webmaster Forums with screenshots and hope a Google rep responds.
However, there is probably something going on with your site if traffic dropped so precipitously and you received a message in GWT. I would review the links in the GWT report as well as in Open Site Explorer (OSE) and look for any possible issues.
-
That story sounds a lot like an algorithmic penalty and not a manual one. I've heard of people receiving manual action messages by mistake, or with stories similar to yours; maybe this is a Google slip-up?
I'd take a good, hard look at your website from a neutral perspective. Take your bias out of it, regardless of whether you think you're a company that has only ever used white-hat SEO techniques. Are your pages "really" meant for users? Is there information you could remove that's "unnecessary" or "duplicate"?
If you have to, hire someone to do this in a consultant role. It'd probably pay off in the long run. Good luck!
Related Questions
-
Sitemap Migration - Google Guidelines
Hi all. I saw the following text on support.google.com: "Create and save the Sitemap and lists of links: a Sitemap file containing the new URL mapping; a Sitemap file containing the old URLs to map; a list of sites with links to your current content." I would like to better understand the part about "a list of sites with links to your current content".
Question 1: Do I need three sitemaps simultaneously?
Question 2: If yes, should I put this sitemap in the Search Console of the new website?
Question 3: Or is Google just giving context on how to handle the migration, and will I really only need sitemaps for the new site? What exactly is Google talking about? Thanks for any advice.
Intermediate & Advanced SEO | mobic
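For reference, here is a minimal sketch (Python; the URLs and file names are invented, not from the question) of what the two "mapping" sitemaps quoted above could look like: given an old-URL to new-URL mapping, it writes one sitemap listing the old URLs and one listing the new URLs in the standard sitemaps.org XML format. It does not answer whether a third file is required; it only illustrates the two files the quoted passage mentions.

```python
# Hypothetical sketch: URLs and file names are placeholders, not from the
# question. Writes two sitemaps in the sitemaps.org format from one mapping.
URL_MAP = {
    "https://www.example.com/old-page-1": "https://www.example.com/new-page-1",
    "https://www.example.com/old-page-2": "https://www.example.com/new-page-2",
}

def write_sitemap(path, urls):
    """Write a minimal sitemap file containing the given URLs."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{url}</loc></url>\n")
        f.write("</urlset>\n")

if __name__ == "__main__":
    write_sitemap("sitemap-old-urls.xml", URL_MAP.keys())    # the old URLs to map
    write_sitemap("sitemap-new-urls.xml", URL_MAP.values())  # the new URL mapping
```
-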
Number of indexed pages dropped. No manual action though?
I have a client who had their WordPress site hacked. At that point there was no message from Google in Webmaster Tools, and the search results for their pages still looked normal. They paid SiteLock to fix the site. This was all about a month ago. Logging into Webmaster Tools now, there are still no messages from Google nor anything on the manual actions page. Their organic traffic is essentially gone. Looking at the submitted sitemap, only 3 of their 121 submitted pages are indexed. Before this, all of them were in the index. Looking at the index status report, I can see that the number of indexed pages dropped completely off the map. We are sure that the site is free of malware. This client has not engaged in any fishy SEO practices. What can be done?
Intermediate & Advanced SEO | connectiveWeb
-
Manual Penalty Reconsideration Request Help
Hi All, I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty. So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups: domains that I'm keeping (good quality, natural links); domains that I'm changing to nofollow (relevant, good-quality links that are good for the user but may be affiliated with my company, so I'm changing the links to nofollow rather than removing them); and domains that I'm getting rid of (poor-quality sites with optimised anchor text, directories, article sites etc.).

One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I have planned to go through every domain that has previously linked (even if it no longer links to my site) and straight up disavow the domain if it's poor quality. But I first want to check whether this is completely necessary for a successful reconsideration request. My concerns are that it's extremely time-consuming (as I'm going through the domains to avoid disavowing a good-quality domain that might link back to me in future, and also because the historical list is the largest list of them all!) and there is also some risk involved, as some good domains might get caught in the disavowing crossfire; therefore I only really want to carry this out if it's completely necessary for the success of the reconsideration request. Obviously I understand that reconsideration requests are meant to be time-consuming, as I'm repenting for previous SEO sins (and believe me, I've already spent weeks getting to the stage I'm at right now)... But as an in-house digital marketer with many other digital avenues to look after for my company too, I can't justify spending such a long time on something if it's not 100% necessary.

So overall - with a manual penalty request, would you bother sifting through domains that either don't exist anymore or no longer link to your site and disavow them for a thorough reconsideration request? Is this a necessary requirement to revoke the penalty, or is Google only interested in links that are currently or recently live? All responses, thoughts and ideas are appreciated 🙂 Kind Regards Sam
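If you do end up disavowing the dead domains, here is a minimal sketch (Python; the file names, input format, and manual-audit assumption are hypothetical, not from the question) that turns a plain-text list of already-audited low-quality domains into Google's disavow file syntax, which accepts "domain:example.com" lines and "#" comments:

```python
# Hypothetical sketch: assumes "bad_domains.txt" holds one domain or URL per
# line, already audited by hand. Only the output syntax ("domain:..." lines,
# "#" comments) follows Google's documented disavow-file format.
from urllib.parse import urlparse

def to_domain(entry):
    """Accept a bare domain or a full URL and return just the host name."""
    entry = entry.strip().lower()
    host = urlparse(entry).netloc if "://" in entry else entry.split("/")[0]
    return host[4:] if host.startswith("www.") else host

def build_disavow(in_path, out_path):
    """Deduplicate the audited domains and write them as disavow rules."""
    with open(in_path, encoding="utf-8") as f:
        domains = sorted({to_domain(line) for line in f if line.strip()})
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# Historical low-quality domains, audited manually first\n")
        for domain in domains:
            out.write(f"domain:{domain}\n")

if __name__ == "__main__":
    build_disavow("bad_domains.txt", "disavow.txt")
```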
Intermediate & Advanced SEO | Sandicliffe
-
Google Ranking Wrong Page
The company I work for started with a website targeting one city. Soon after I started SEO for them, they expanded to two cities. Optimization was challenging, but we managed to rank highly in both cities for our keywords. A year or so later, the company expanded to two new locations, so now 4 total. At that point, we realized it was going to be tough to rank any one page for four different cities, so our new SEO strategy was to break the website into 5 sections or minisites: 4 city-targeted sites, plus our original site, which would now be branded as more of a national website. Our URL structures now look something like this:
www.company.com
www.company.com/city-1
www.company.com/city-2
www.company.com/city-3
www.company.com/city-4

Now, at present, all is going well except for our original targeted city. The problem is that Google keeps ranking our original site (which is now national) instead of the new city-specific site we created. I realize that this is probably due to all of the past SEO we did optimizing for that city. My thought is that Google is confused as to which page to actually rank for this city's keyword terms, and I was wondering if canonical tags would be a possible solution here, since the pages are about 95% identical. Anyone have any insight? I'd really appreciate it!
Intermediate & Advanced SEO | cpapciak
-
Got a site in Google news, now what do I do?
I have been working on a site for 7 months, publishing articles each day. The site is dedicated to the niche that I am in. It was accepted into Google News 1 week ago, and I am now getting 3 times the amount of traffic. It's great news, but I am now wondering how beneficial it is for me. I sell business management advice and really need to generate leads; that's how I grow my business. Any ideas on how I could use this Google News inclusion to do that?
Intermediate & Advanced SEO | JohnPeters
-
How to Block Google Preview?
Hi, Our site works very well for JavaScript-on users; however, many pages are loaded via AJAX and are inaccessible with JS off. I'm looking to make this content available with JS off so search engines can access it, but we don't have the dev time to make these pages 'pretty' for JS-off users. The idea is to make them accessible with JS off, but when requested by a user with JS on, the user is forwarded to the 'pretty' AJAX version. The content (text, images, links, videos etc.) is exactly the same, but it's an enormous amount of effort to make the JS-off version 'pretty' and I can't justify the development time to do this. The problem is that Googlebot will index this page and show a preview of the ugly JS-off page in their results, which isn't good for the brand. Is there a way or meta code that can be used to stop the preview but still have the page cached? My current options are to use meta noarchive or "Cache-Control" content="no-cache" to ask Google to stop caching the page completely, but I wanted to know if there was a better way of doing this. Any ideas, guys and girls? Thanks, FashionLux
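For context, here is a minimal sketch (plain Python/WSGI, with the server setup invented for illustration) of the two options mentioned above expressed as HTTP response headers rather than meta tags: X-Robots-Tag is the header equivalent of the robots meta tag, and Cache-Control is the standard caching header. It only shows where these directives can be set if editing the page templates is impractical; it does not resolve the preview-versus-cache trade-off the question asks about.

```python
# Hypothetical sketch: a bare WSGI app sending the directives mentioned in
# the question as HTTP headers instead of <meta> tags.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        ("X-Robots-Tag", "noarchive"),  # header form of meta robots "noarchive"
        ("Cache-Control", "no-cache"),  # standard HTTP caching directive
    ]
    start_response("200 OK", headers)
    return [b"<html><body>JS-off fallback content here</body></html>"]

if __name__ == "__main__":
    make_server("127.0.0.1", 8000, app).serve_forever()
```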
Intermediate & Advanced SEO | FashionLux
-
SERP Experience After You Resubmit Your Site to Google
Hello Everyone, We suddenly noticed that our keywords fell off the map and discovered that porn had been placed on our site (via .htaccess redirects and masking). The porn links caused Google to drop us. We scrubbed our .htaccess file and asked Google to reindex our site 3 weeks ago. Does anyone have experience with reindexing? If so, how long were you down, and did your keyword positions return eventually? Thanks, Bob
Intermediate & Advanced SEO | impressem
-
Have we suffered a Google penalty?
Hello, In January, we started a new blog to supplement our core ecommerce website. The URL of the website is www.footballshirtblog.co.uk and the idea behind it was that we would write articles related to our industry to build a community, which would ultimately boost our sales. We would add several posts per day, a mix of shorter news stories of around 150 words and more detailed content pages of around 500 words. Everything was going well: we were making slow but sure progress on the main generic keywords, but were already receiving several thousand visitors a day, mostly finding the posts themselves on Google. The surge in traffic meant we needed to move server, which we did around 6 weeks ago. When we did this, we had a few teething problems with file permissions, etc., which meant we were temporarily unable to add new posts. As our developers were tied up with other issues, this continued for a 7-10 day period, with no new content being added. In this period, the site completely dropped from Google, losing all its rankings and traffic, to the extent that it now doesn't even rank for its own name. This is very frustrating, as we have put a huge amount of work and content into developing this site. We have added a few posts since, but not a huge amount, as it is frustrating to do so with no return and with the concern that the site has been banned forever. I cannot think of any logical reason why this penalty has occurred, as we haven't been link spamming, etc. Does anyone have any feedback or suggestions as to how we can get back on track? Regards,
David
Intermediate & Advanced SEO | ukss1984