Can someone who has had (or seen) a Manual Action in WMT tell me.....
-
This is a repost of this question http://moz.com/community/q/manual-action-found-in-wmts-no-email-no-message-in-wmts
But I'm sure there is someone on the Moz forums who has had/seen a manual action.
Someone I know said that they were looking through their WMT and, under Manual Actions, they found they had a partial penalty. There is no date against it, they never got an email, and there are no messages in WMT for it. I haven't personally dealt with a manual penalty before, but I would have expected there to be a message in WMT for it (an email might have been missed because of a spam filter, etc.). Could it be a very old penalty?
-
Thanks for confirming that.
As it's an old partial penalty, and from a quick look at their analytics I can't see any main pages that have been penalised, is it best to leave it alone (in the short term at least)?
-
Hey Paddy Displays!
Sounds like an older penalty to me. I would highly suggest creating an email filter for @google.com so that future emails from WMT or other Google-related services won't go into spam. Just to play it safe!
-
Hi there
I think your inkling is correct; this would be an old penalty.
The manual action tab is relatively new, coming up to just a year old this month. Both myself and some agency buddies have seen sites that have had penalties in there when previously the site owner (and sometimes even the agency!) did not think there was a penalty.
If you have Webmaster Tools set up, you should always receive a message whenever a manual action has been taken (barring a system error). However, in the past you may not have had WMT set up when a penalty was applied, or you may not have had access to the account at the time. In that case the message would not be there for you to read, either because it was never sent or because it was since deleted. The Manual Actions tab, however, will show any penalty the site currently has, regardless of when it was applied, when WMT was set up, or whether a message was ever sent.
Therefore, if you're seeing a penalty notice in the tab but no message was received, I believe it's an old penalty, as a new one would also have triggered a message.
Hope that all makes sense.
Related Questions
-
Can a page that's 301 redirected get indexed / show in search results?
Hey folks, I have searched around and haven't been able to find an answer to this question. I've got a client who has very different search results when including his middle initial. His bio page on his company's website has the slug /people/john-smith; I'm wondering, if we set up a duplicate bio page with his middle initial (e.g. /people/john-b-smith) and then 301 redirect it to the existing bio page, whether the latter page would get indexed by Google and show in search results for queries that use the middle initial (e.g. "john b smith"). I've already got the metadata based on the middle-initial version, but I know the slug is a ranking signal, and since it's a direct match to one of his higher-volume branded queries, I thought it might help get his bio page ranking more highly. Would that work, or does the 301'd page effectively cease to exist in Google's eyes?
Technical SEO | Greentarget
-
Can you use Screaming Frog to find all instances of relative or absolute linking?
My client wants to pull every instance of an absolute URL on their site so that they can update them for an upcoming migration to HTTPS (the majority of the site uses relative linking). Is there a way to use the extraction tool in Screaming Frog to crawl one page at a time and extract every occurrence of href="http://"? I have gone back and forth between using an XPath extractor as well as a regex and have had no luck with either. Ex. XPath: //*[starts-with(@href, "http://")][1] Ex. Regex: href="//
Technical SEO | Merkle-Impaqt
-
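One likely fix for the XPath above is to select the attribute itself, e.g. //a[starts-with(@href, 'http://')]/@href, rather than the whole element. As a rough cross-check outside Screaming Frog, here is a stdlib-only Python sketch (the sample HTML is made up for illustration) that pulls the same absolute links from a page:

```python
from html.parser import HTMLParser

class AbsoluteLinkFinder(HTMLParser):
    """Collect href/src values that are absolute http:// URLs."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        for name, value in attrs:
            if name in ("href", "src") and value and value.startswith("http://"):
                self.found.append((tag, value))

sample = """
<a href="http://example.com/page">absolute</a>
<a href="/relative/page">relative</a>
<img src="http://example.com/img.png">
"""
finder = AbsoluteLinkFinder()
finder.feed(sample)
print(finder.found)
# [('a', 'http://example.com/page'), ('img', 'http://example.com/img.png')]
```

Run against each saved page, this lists every absolute http:// link that would need updating for the HTTPS migration, while relative links are skipped.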
Can we use our existing site content on new site?
We added thousands of pages of unique content to our site, and soon after, Google released Penguin and we lost our rankings for major keywords. After months of effort, we decided to start a new site. If we use all of the existing site's content on the new domain, is Google going to penalize the site for duplicate content, or will it be treated as unique? Thanks
Technical SEO | mozfreak
-
How can i check the speed from section to section on my site
Hi, I have a real problem with speed on my site www.in2town.co.uk, and one of the problems is going from section to section, for example from the home page to the travel section. I would like to know how I can time how long it takes to go from section to section, and what tools I need. I urgently need to sort out the delay and make my site much faster; any help would be great.
Technical SEO | ClaireH-184886
-
How can I tell Google, that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic; half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help, Cord
Technical SEO | bimp
-
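On the 304 question: at its core, serving 304 Not Modified is just a date comparison between the page's Last-Modified time and the If-Modified-Since header the client sends back. A minimal Python sketch of that decision (the function name and sample dates are illustrative, not from the original post):

```python
from email.utils import parsedate_to_datetime

def response_status(last_modified, if_modified_since):
    """Return 304 if the client's cached copy is still fresh, else 200.

    Both arguments are HTTP-date strings, e.g.
    'Wed, 21 Oct 2015 07:28:00 GMT'.
    """
    if not if_modified_since:
        return 200  # no conditional header: send the full response
    try:
        cached = parsedate_to_datetime(if_modified_since)
        changed = parsedate_to_datetime(last_modified)
    except (TypeError, ValueError):
        return 200  # unparseable date: fall back to a full response
    # Page unchanged since the client's copy -> 304, no body needed
    return 304 if changed <= cached else 200

page_last_modified = "Wed, 21 Oct 2015 07:28:00 GMT"
print(response_status(page_last_modified, "Wed, 21 Oct 2015 07:28:00 GMT"))  # 304
print(response_status(page_last_modified, "Mon, 19 Oct 2015 00:00:00 GMT"))  # 200
print(response_status(page_last_modified, None))                             # 200
```

The win is that a 304 carries no body, so even if the crawler still requests the URL, the bandwidth cost drops to almost nothing for pages that never change.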
Can I Disallow Faceted Nav URLs - Robots.txt
I have been disallowing /*? so I know that works without affecting crawling. I am wondering if I can disallow the faceted nav URLs. So:
Disallow: /category.html/*?
Disallow: /category2.html/*?
Disallow: /category3.html/*?
This is to prevent the price-faceted URLs from being cached: /category.html?price=1%2C1000 and /category.html?price=1%2C1000&product_material=88 Thanks!
Technical SEO | tylerfraser
-
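One thing worth checking here: in Googlebot-style robots.txt matching, a pattern like /category.html/*? requires a literal slash after .html, so it would not match /category.html?price=1%2C1000, whereas /category.html*? (or the existing /*?) would. A simplified Python sketch of that wildcard matching (an illustration only, not a full robots.txt parser) shows the difference:

```python
import re

def robots_rule_matches(rule, path):
    """Googlebot-style path matching: '*' matches any run of characters,
    a trailing '$' anchors the end, and rules otherwise match as prefixes.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape everything except '*', which becomes '.*'
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    pattern = "^" + pattern + ("$" if anchored else "")
    return re.match(pattern, path) is not None

url = "/category.html?price=1%2C1000"
print(robots_rule_matches("/*?", url))               # True
print(robots_rule_matches("/category.html*?", url))  # True
print(robots_rule_matches("/category.html/*?", url)) # False (extra slash)
```

Google Search Console's robots.txt tester is the authoritative way to verify this against real Googlebot behaviour; the sketch is just for reasoning about which rule form you need.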
Rel canonical = can it hurt your SEO
I have a site that has been developed to default to the non-www version. However, each page also has a rel canonical pointing to the non-www version. Could having this in place on all pages hurt the site in terms of search engines? Thanks, Steve
Technical SEO | stevecounsell
-
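A self-referencing canonical on every page is normal practice and shouldn't hurt on its own; what you want to verify is that each page's canonical really does point at the preferred non-www host. A rough Python sketch of that check (regex-based and illustrative only; a real crawler should use a proper HTML parser, and the sample page is made up):

```python
import re
from urllib.parse import urlsplit

def canonical_href(html):
    """Pull the rel=canonical href out of a page's HTML, or None.

    Simplified: assumes rel comes before href in the tag.
    """
    m = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return m.group(1) if m else None

page = '<head><link rel="canonical" href="http://example.com/page"></head>'
href = canonical_href(page)
print(href)                                       # http://example.com/page
print(urlsplit(href).netloc.startswith("www."))   # False -> non-www, as intended
```

If every page's canonical host matches the version the site actually serves, the tags are reinforcing the preferred URLs rather than working against them.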
Grr . . . Just can't seem to get there
mrswitch.com.au is one site that we are consistently struggling with. It has a PageRank of 3, which beats most of the competitors, but when it comes to Google AU searches such as "Sydney Electrician" and "Electrician Sydney", we just can't seem to get there, and the rankings keep dropping. We backlink and update the pages on a regular basis. Any ideas? Could it be the custom CMS system?
Technical SEO | kayweb