Manual Webspam Error? Same Penalty on All Sites in My Webmaster Tools Account
-
My URL is: www.ebuzznet.com

Today, when I checked Webmaster Tools under the manual spam section, I saw a manual action with the reason "Thin content with little or no added value." I then checked the other sites in the same Webmaster Tools account; there are 11 sites, and all of them received the same manual action. I never received any email, and there was no notification in the site messages section about this manual action. I just need confirmation: is this some error in Webmaster Tools, or did all of the sites really receive manual spam actions? Most of the articles on the sites are over 500 words of quality content (not spun or copied). Looking for suggestions and answers.
-
As per your example above: I have enabled auto-syndication to all my social networks, so only an excerpt is syndicated, along with a link back to the original content. If you open any of those 50 links, it will be a Facebook post. Sharing content that way is not against the Webmaster Guidelines.
I have already started editing and rewriting posts. It will take more than a week.
-
This is very strange. They penalized this page with no content too: http://ivishalverma.blogspot.com/ ("Thin content"). Another strange thing: I added 4 of my sites to another Webmaster Tools account with a different email ID, so the same 4 sites are on both accounts, and I am the owner on both. They penalized all the blogs there too. In total, two Webmaster Tools accounts were penalized, with around 18 sites (5 on Blogspot).
-
I Googled text from a few of your articles and found tons of duplicates. See, for example, this search. Are you actually writing your own content, or is it sourced from someone else? I also noticed a lot of non-standard and incorrect grammar across multiple articles, which isn't going to help with a Panda-type penalty.
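A quick way to run that duplicate spot-check yourself is to take distinctive sentences from the middle of your posts and search them as exact, quoted phrases. A minimal sketch that just builds the search URLs (the sample sentence is a placeholder; substitute text from your own articles):

```python
# Hedged sketch: build exact-phrase Google search URLs for sample
# sentences pulled from your articles, to spot-check for scraped copies.
from urllib.parse import quote_plus

def exact_match_queries(sentences):
    """Return Google search URLs that quote each sentence verbatim."""
    base = "https://www.google.com/search?q="
    return [base + quote_plus('"%s"' % s) for s in sentences]

# Placeholder input: swap in distinctive sentences from your own posts.
samples = ["any distinctive sentence pulled from the middle of a post"]
for url in exact_match_queries(samples):
    print(url)
```

If an exact phrase returns hits on domains you don't control, that page is a duplication candidate worth investigating.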
Google is getting better at measuring users' reactions to your content, and they've even begun to dig into factors that cause users to leave a page and seek another result. Whether or not Google can determine the quality of language, users can, and they don't react well to language that doesn't flow.
Here's what I'd do:
- Make sure your content is unique. If you're sourcing it, make sure it hasn't been republished in part or in full.
- Ensure your content is written by someone who can really engage the audience and sound like an expert writing in English (or whatever your sites' languages are).
- Really try to hook the user right away. Make the posts as visual as possible, avoid large first paragraphs, and make sure the user knows why they should care right away. You need to elicit an emotion early on in your audience: fear, anger, amusement, interest, surprise, etc.
- Use tags sparingly, and only when it makes sense. Avoid tags with only one tagged post.
- Add credibility: include dates and authors on articles, backed by full bios.
The way to get out of this penalty is to really add value on each of your pages.
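On the tag point above: if the sites run WordPress, you can find thin tag archives programmatically instead of clicking through them. A minimal sketch, assuming tag data in the shape the WordPress REST API returns from `GET /wp-json/wp/v2/tags` (objects with `name` and `count`); the sample tag names below are made up:

```python
# Hedged sketch: flag tags with fewer than a minimum number of tagged
# posts, so near-empty tag archives can be merged or removed.

def thin_tags(tags, min_posts=2):
    """Return names of tags with fewer than min_posts tagged posts."""
    return [t["name"] for t in tags if t["count"] < min_posts]

# Made-up sample in the WP REST API's shape ("name", "count" fields).
sample = [
    {"name": "seo", "count": 14},
    {"name": "random-one-off", "count": 1},
    {"name": "unused", "count": 0},
]
print(thin_tags(sample))
```

Tags that come back from this are the one-post archives worth consolidating.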
-
This is definitely a unique situation in my experience, so keep that in mind. But after making sure nothing on-page or off-page could have caused the manual action, I'd consider submitting all 11 reconsideration requests at the same time, telling them everything you looked into: the tests you ran and the fact that your entire account received manual actions. See how they respond to that.
-
Manual actions on one or two sites would seem legit, but on every site listed in the Webmaster Tools account it looks suspicious. They even penalized a new Blogspot site with no content at all. I created it just for testing, and no article was ever posted; it is blank, yet it still received a thin-content warning.
-
I do see a red flag right away in the page source: when I viewed the source code, I saw some spammy-looking links right before the footer. This could be a sign of a hack. If all of the websites in your Webmaster Tools account are on the same server, or your login for all of them is the same, or something similar, it's possible all of your sites were hacked. If you put those links there yourself, remove them.
Manual action messages are very vague, so sometimes it takes some digging to identify the root of the problem. The above is an idea of where to begin.
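One way to start that digging is to dump every outbound link from a saved copy of a page, so anything injected near the footer stands out at a glance. A minimal sketch using only the Python standard library (`ebuzznet.com` and the sample HTML below are placeholders; run it against your own saved page source):

```python
# Hedged sketch: collect all <a href> values from a page's HTML and
# keep only links pointing off your own domain, where injected spam
# links would show up.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href") or "")

def external_links(html, own_domain):
    """Return hrefs whose host is set and is not on own_domain."""
    parser = LinkCollector()
    parser.feed(html)
    out = []
    for href in parser.links:
        host = urlparse(href).netloc
        if host and own_domain not in host:
            out.append(href)
    return out

# Placeholder HTML standing in for a saved copy of the page source.
sample = (
    '<p>post body</p>'
    '<a href="https://www.ebuzznet.com/about">about</a>'
    '<a href="http://spam.example/pills">cheap pills</a>'
)
print(external_links(sample, "ebuzznet.com"))
```

Any link in that output you didn't place yourself is worth checking against a possible injection.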