Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Pure spam Manual Action by Google
-
Hello Everyone,
We have a website, http://www.webstarttoday.com. Recently we received a manual action from Google that says: "Pages on this site appear to use aggressive spam techniques such as automatically generated gibberish, cloaking, scraping content from other websites, and/or repeated or egregious violations of Google's Webmaster Guidelines." Google gave http://smoothblog.webstarttoday.com/ as an example. The nature of the business of http://www.webstarttoday.com is creating sub-domains (it is a website builder): anyone can register and create a sub-domain.
My questions are:
- What are the best practices when someone creates a sub-domain on webstarttoday.com?
- How can I get my website released from this penalty?
- What should I do with the hundreds of other sub-domains already created by third parties, such as http://smoothblog.webstarttoday.com?
- Why don't these types of issues come up with WordPress or Weebly?
Regards,
Ruchi
-
That's great news that you got the penalty revoked.
It can often take a few days for the manual spam actions viewer to show that there is no longer a penalty, so keep an eye on it. I've seen a number of sites lately that got a pure spam penalty revoked and then, a few days or weeks later, got either a thin content penalty or an unnatural links penalty. Hopefully that's not the case for you!
-
It could be that the message will only disappear tomorrow.
Note that the message from Google doesn't say the penalty is revoked outright, but that it has been "revoked or adjusted where appropriate". It's possible that the penalty is now applied only to the specific subdomain rather than to the site as a whole. Is it still the original message that is shown under Manual actions?
I would update the terms & conditions anyway, so that you can react quickly if you see other actions appearing. Try to scan the subdomains from time to time to make sure they are not violating Google's guidelines.
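Scanning subdomains from time to time can be partly automated. Below is a minimal sketch of the kind of heuristic check such a scan might run against each subdomain's homepage HTML; the signals and thresholds are my own illustrative assumptions, not anything Google publishes.

```python
import re

def looks_spammy(html: str) -> bool:
    """Flag pages that resemble auto-generated spam (illustrative heuristics only)."""
    text = re.sub(r"<[^>]+>", " ", html)            # strip tags
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return True                                 # empty or fully cloaked page
    links = len(re.findall(r"<a\s", html, re.I))
    # Signal 1: far more links than words of content (link farm).
    if links > len(words):
        return True
    # Signal 2: heavy repetition of a single word (gibberish / keyword stuffing).
    top = max(words.count(w) for w in set(words))
    return top / len(words) > 0.3
```

Pages flagged this way would still need a human review before a warning or deletion, but a sweep like this narrows hundreds of subdomains down to a short list.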
Regards,
Dirk
-
Thanks Dirk,
You have nicely answered all of my questions. I will keep your points in mind when creating sub-domains. Also, I received this message from Google after filing the reconsideration request:
Dear Webmaster of http://www.webstarttoday.com/
We have processed the reconsideration request from a site owner for http://www.webstarttoday.com/. The site has been reviewed for violations of our quality guidelines. Any manual spam actions applied to the site have been revoked or adjusted where appropriate.
According to the message, the penalty on my website should have been revoked, but it is still showing under "Manual actions".
Thanks,
Ruchi
-
Thanks for your quick response. Much appreciated.
-
^ VERY nice, Dirk!
-
Hi,
I'll try to answer your questions point by point:
1. You could add to your terms & conditions that created sites must follow Google's Webmaster Guidelines, and that you can delete the subdomain if they don't.
2. Revoking the penalty is only possible by cleaning the site and removing the contested content. Whether you can force whoever manages this blog to clean it up depends on your current terms & conditions.
3. Same as above: if your terms & conditions didn't stipulate that violating Google's guidelines is forbidden, there is not much you can do at this point.
4. WordPress hosts blogs on wordpress.com (the main site is wordpress.org). Weebly has terms & conditions that forbid spam/SEO sites (wordpress.com probably has this as well, but it's stated very clearly on weebly.com).
Update your terms & conditions if necessary, send warnings to offending blog users, and delete them if needed.
Hope this helps,
Dirk
-
Hi there
1. Here are a couple of resources: Moz and HotDesign.
2. Pure Spam: What Are Google Penalties & What to Do to Recover, from Search Engine Watch, and this Q&A thread from Moz.
3. I would go through your subdomains, find the ones that are blatant spam or thin on content, and remove them. I would then make sure they are blocked via robots.txt on each subdomain.
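One caveat worth noting: robots.txt is per-host, so the file at www.webstarttoday.com cannot block a subdomain; each spam subdomain needs its own file at its root (e.g. a hypothetical http://spammyblog.webstarttoday.com/robots.txt) until it is removed. A minimal blocking file looks like this:

```
User-agent: *
Disallow: /
```

Keep in mind this only stops crawling; pages already indexed are removed faster by serving a 404/410 or a noindex directive.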
4. I would say it's because WordPress is the most-used CMS in the world and a lot of reputable websites use it. I would really work on the anti-spam features of your product: look for IPs that continually create websites, thin content, cloaking, off-topic websites, link farms, etc. It's your duty as a CMS provider to watch how your users use the product. Not only will it keep your product's reputation clean, it will also show that you are taking steps to run a product with integrity.
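The "IPs that continually create websites" check can be sketched as a simple sliding-window throttle. This is an illustrative assumption about how such a monitor could work, not a known implementation from any website builder; the class name and limits are made up:

```python
from collections import defaultdict, deque
import time

class SignupMonitor:
    """Flag IPs that create many subdomains within a short window."""

    def __init__(self, max_signups=3, window_seconds=86400):
        self.max_signups = max_signups
        self.window = window_seconds
        self.events = defaultdict(deque)          # ip -> signup timestamps

    def record(self, ip, now=None):
        """Record a signup; return True if this IP is now over the limit."""
        now = time.time() if now is None else now
        q = self.events[ip]
        q.append(now)
        # Drop events that have aged out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_signups
```

An over-limit IP could be queued for manual review rather than blocked outright, since shared IPs (offices, universities) can legitimately create several sites.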
Hope this all helps - good luck!
Related Questions
-
How to get Google reviews on search results?
Hi, We have good Google reviews (4.8). Can we get these rating stars on our organic search results too? Best, Remco
Intermediate & Advanced SEO | | remcoz0 -
Changed all external links to 'NoFollow' to fix manual action penalty. How do we get back?
I have a blog that received a Webmaster Tools message about a guidelines violation because of "unnatural outbound links" back in August. We added a plugin to make all external links 'NoFollow' links and Google removed the penalty fairly quickly. My question, how do we start changing links to 'follow' again? Or at least being able to add 'follow' links in posts going forward? I'm confused by the penalty because the blog has literally never done anything SEO-related, they have done everything via social and email. I only started working with them recently to help with their organic presence. We don't want them to hurt themselves at all, but 'follow' links are more NATURAL than having everything as 'NoFollow' links, and it helps with their own SEO by having clean external 'follow' links. Not sure if there is a perfect answer to this question because it is Google we're dealing with here, but I'm hoping someone else has some tips that I may not have thought about. Thanks!
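For context on what the plugin changed: the only difference between a followed and a nofollowed link is the rel attribute on each anchor, so moving back is a per-link editorial decision rather than a site-wide switch. The URLs below are placeholders:

```
<!-- blanket plugin behaviour: every external link nofollowed -->
<a href="https://example.com/partner" rel="nofollow">Partner</a>

<!-- a selectively restored followed link to a trusted,
     editorially chosen destination -->
<a href="https://example.com/reference">Reference</a>
```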
Intermediate & Advanced SEO | | HashtagJeff0 -
How to rank my website in Google UK?
Hi guys, I own a London-based rubbish removal company but don't have enough jobs. I know for sure that some of my competitors get most of their jobs through Google searches. I also have a website, but don't receive any calls from it at all. Can you please tell me how to rank my website for keywords like "rubbish removal london", "waste clearance london", "junk collection london", and other similar keywords? I know that for a person like me (without much experience in online marketing) optimizing the website will be a difficult task, but at least I need some advice on where to start. I'm also thinking of hiring an SEO, but I'm not sure where to find a trusted company. Most importantly, I have no idea how much I should pay to expect good results. What is too much and what is too low? I will appreciate all advice.
Intermediate & Advanced SEO | | gorubbishgo0 -
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people who set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we have pure articles starting 1st June 2012! Therefore:
- My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
- Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals. The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools. Should I submit a deleted-items sitemap using the expires tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think that is for Custom Search Engines only, not for the generic Google search engine. The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk-remove all the junk from the Google index relatively fast?
Intermediate & Advanced SEO | | ioannisa0 -
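The cutoff logic described above (a custom 404 plus noindex for pre-June-2012 articles) boils down to a small routing decision. A sketch of that decision as a standalone function; the function name, return shape, and the use of an `X-Robots-Tag` header alongside the meta tag are my own illustrative assumptions:

```python
from datetime import date

CUTOFF = date(2012, 6, 1)

def response_for(release_date):
    """Decide status and extra headers for a docid URL by release date."""
    if release_date < CUTOFF:
        # 404 status; the X-Robots-Tag header reinforces the meta noindex
        return 404, {"X-Robots-Tag": "noindex"}
    return 200, {}
```

Served consistently, this lets crawlers discover at their own pace that the junk URLs are gone, without any per-URL removal requests.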
Wrong country sites being shown in google
Hi, I am having some issues with country targeting on our sites. To give a brief background of our setup and web domains: we use Magento and have 7 connected ecommerce sites on that Magento installation:
1. www.tidy-books.co.uk (UK) - main site
2. www.tidy-books.com (US) - variations in copy but basically a duplicate of the UK site
3. www.tidy-books.it (Italy) - fully translated by a native speaker; its own country-based social media; content regularly updated/created
4. www.tidy-books.fr (France) - fully translated by a native speaker; its own country-based social media; content regularly updated/created
5. www.tidy-books.de (Germany) - fully translated by a native speaker; its own country-based social media; content regularly updated/created
6. www.tidy-books.com.au (Australia) - duplicate of the UK site
7. www.tidy-books.eu (rest of Europe) - duplicate of the UK site
I've added the country and language hreflang tags to all sites, we use cross-domain canonical URLs, and I've set the correct country where appropriate under International Targeting in Google Webmaster Tools. Even so, we are getting a number of issues which are driving me crazy. The major one: if you search google.it from an Italian IP for our brand name, Tidy Books, the .com site is shown first, then .co.uk, then all the other sites, and only on page 3 the correct site, www.tidy-books.it. The Italian site is the most extreme example, but the French and German sites also appear below the .com site. Surely this shouldn't be the case? The same problem occurs with the .co.uk and .com sites: when searching google.co.uk for our keywords, the .com often comes up before the .co.uk. So it seems we have sites competing against each other, which again can't be right or good. The other problem is that in the International Targeting section of Google Webmaster Tools, all sites are reporting "no return tags" errors. Any advice or help would be very much appreciated. I've added some screenshots to help illustrate and am happy to provide extra details. Thanks
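The "no return tags" error usually means the hreflang annotations are not reciprocal: every page must list itself plus all of its alternates, and each alternate must list it back. A sketch for the home pages (paths and hreflang values are assumptions based on the domains listed above, trimmed to three sites for brevity):

```
<!-- on https://www.tidy-books.co.uk/ -->
<link rel="alternate" hreflang="en-gb" href="https://www.tidy-books.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.tidy-books.com/" />
<link rel="alternate" hreflang="it-it" href="https://www.tidy-books.it/" />

<!-- the same three lines must also appear on the .com and .it home
     pages; if any page omits the set, Google reports "no return tags" -->
```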
Intermediate & Advanced SEO | | tidybooks1 -
Will Google View Using Google Translate As Duplicate?
If I have a page in English which exists on 100 other websites, my website has duplicate content. If I use Google Translate to translate the page from English to Japanese, and mine is the only website doing this translation, will my page get credit for producing original content? Or will Google view my page as duplicate content, because Google can tell it is translated from an original English page that runs on 100+ different websites, since Google Translate is Google's own software?
Intermediate & Advanced SEO | | khi50 -
Google and Product Description Tabs
How does Google process a product page with description tabs? For example, let's say the product page has tabs for Overview, Specifications, What's In the Box, and so on. Wouldn't that content be better served in one main product description with the tab names used as headings (h-tags) or highlighted paragraph separators? Or does all that content get crawled as a single page regardless of the tabs?
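For what it's worth, the common tab implementation keeps every panel in the same HTML document and only toggles visibility with CSS/JS, in which case a crawler sees it all as one page. A hypothetical markup sketch:

```
<!-- all tab panels live in one document; the script only toggles
     visibility, so the full text is crawlable as a single page -->
<div class="tabs">
  <section id="overview"><h2>Overview</h2> ...</section>
  <section id="specs"><h2>Specifications</h2> ...</section>
  <section id="in-the-box"><h2>What's In the Box</h2> ...</section>
</div>
```

The answer changes if each tab is loaded from a separate URL on click, which is worth checking in your own implementation.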
Intermediate & Advanced SEO | | AWCthreads0 -
How does Google know if a backlink is good or not?
Hi, What does Google look at when assessing a backlink? How important is it to get a backlink from a website with relevant content? For example:
1. Domain/Page Authority 80; the website is not relevant and does not use any of the words in your target term anywhere on the site.
2. Domain/Page Authority 40; the website is relevant and uses the words in your target term multiple times across the site.
Which example would benefit your SERPs more if you gained a backlink from it? (And if you can say, how much more: low, medium, or high?)
Intermediate & Advanced SEO | | activitysuper0