Pure spam Manual Action by Google
-
Hello Everyone,
We have a website, http://www.webstarttoday.com. Recently, we received a manual action from Google that says: "Pages on this site appear to use aggressive spam techniques such as automatically generated gibberish, cloaking, scraping content from other websites, and/or repeated or egregious violations of Google's Webmaster Guidelines." Google has given an example: http://smoothblog.webstarttoday.com/. The nature of the business of http://www.webstarttoday.com is creating sub-domains (it is a website builder). Anyone can register and create a sub-domain.
My questions are:
- What are the best practices when someone creates a sub-domain on webstarttoday.com?
- How can I get this penalty on my website revoked?
- What should I do with the hundreds of other sub-domains that have already been created by third parties, like http://smoothblog.webstarttoday.com?
- Why don't these types of issues come up with WordPress or Weebly?
Regards,
Ruchi
-
That's great news that you got the penalty revoked.
It can often take a few days for the manual spam actions viewer to show that there is no longer a penalty, so keep an eye on it. I've seen a number of sites lately that got a pure spam penalty revoked and then, a few days or weeks later, got either a thin content penalty or an unnatural links penalty. Hopefully that's not the case for you, though!
-
It could be that the message will only disappear tomorrow.
The message from Google, however, doesn't say that the penalty has been revoked outright, but that it has been "revoked or adjusted where appropriate". It's possible that the penalty is now only applied to the specific subdomain rather than to the site as a whole. Is it still the original message that is shown under Manual actions?
I would update the terms & conditions anyway, so that you can react quickly if you see other actions appearing. Try to scan the subdomains from time to time to make sure they are not violating the Google guidelines - a rough sketch of what such a scan could look like is below.
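As an illustration of what such a periodic scan could look like, here's a minimal sketch in Python. It assumes you can export the list of subdomains your platform has created; the file name, thresholds and spam signals are placeholder assumptions, not a finished checker:

```
# A minimal sketch of a periodic subdomain audit. The subdomains.txt export
# and the thresholds below are hypothetical - tune them to your own data.
import re
import urllib.request
from html.parser import HTMLParser


class _TextAndLinks(HTMLParser):
    """Collects page text and counts outbound links on a page."""
    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.external_links = 0

    def handle_data(self, data):
        self.text_parts.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.startswith("http") and "webstarttoday.com" not in href:
                self.external_links += 1


def audit_subdomain(host):
    """Return a list of rough spam signals found on the subdomain's homepage."""
    try:
        with urllib.request.urlopen(f"http://{host}/", timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception as exc:
        return [f"unreachable ({exc})"]

    parser = _TextAndLinks()
    parser.feed(html)
    words = re.findall(r"[a-zA-Z]+", " ".join(parser.text_parts))

    signals = []
    if len(words) < 100:
        signals.append("very thin content")
    if parser.external_links > 50:
        signals.append("unusually high number of external links")
    return signals


if __name__ == "__main__":
    # subdomains.txt is a hypothetical export, one hostname per line.
    with open("subdomains.txt") as f:
        hosts = [line.strip() for line in f if line.strip()]

    for host in hosts:
        findings = audit_subdomain(host)
        if findings:
            print(f"{host}: {', '.join(findings)}")
```

Anything that gets flagged is then a candidate for a manual look and, if needed, a warning or deletion.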
Regards,
Dirk
-
Thanks Dirk,
You have nicely answered all of my questions. I will keep your points in mind for the sub-domains. Also, I received this message from Google after filing the reconsideration request:
Dear Webmaster of http://www.webstarttoday.com/
We have processed the reconsideration request from a site owner for http://www.webstarttoday.com/. The site has been reviewed for violations of our quality guidelines. Any manual spam actions applied to the site have been revoked or adjusted where appropriate.
As per the message, the penalty on my website should have been revoked, but it is still showing under "Manual actions".
Thanks,
Ruchi
-
Thanks for your quick response. Much appreciated.
-
^ VERY nice, Dirk!
-
Hi,
I'll try to answer your questions point by point:
1. You could add to your terms & conditions that the sites created need to follow the Google Webmaster Guidelines - and that if they are not followed, you can delete the subdomain.
2. Revoking the penalty is only possible by cleaning the site and removing the contested content. Whether you can force the person managing this blog to clean it up depends on your current terms & conditions.
3. Same as above - if your terms & conditions don't stipulate that violating the Google guidelines is forbidden, there is not much you can do at this point.
4. WordPress hosts the blogs on wordpress.com - the main site is wordpress.org. Weebly has terms & conditions that forbid spam/SEO sites (wordpress.com probably has this as well, but it's stated very clearly on weebly.com).
Update your terms & conditions if necessary - send warnings to offending blog users and delete their sites if necessary. A rough sketch of that workflow is below.
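To illustrate the warn-then-delete workflow, here's a minimal sketch. The grace period, the flags file and the mailer/deletion calls are placeholders for whatever your platform already provides:

```
# A minimal sketch of a warn-then-delete workflow for offending subdomains.
# The flags file, grace period, mailer and deletion call are hypothetical
# placeholders - wire them up to your own user database and tooling.
import json
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(days=14)  # hypothetical grace period before deletion
FLAGS_FILE = "flagged_subdomains.json"  # {"host": "2024-01-31T12:00:00+00:00", ...}


def send_warning(host):
    # Placeholder: plug in your real mailer here.
    print(f"[warning sent] {host}: please bring your site in line with the "
          f"Google Webmaster Guidelines or it will be removed.")


def delete_subdomain(host):
    # Placeholder: plug in your real provisioning/deletion API here.
    print(f"[deleted] {host}")


def process_flags(now=None):
    now = now or datetime.now(timezone.utc)
    with open(FLAGS_FILE) as f:
        flags = json.load(f)

    for host, flagged_at in list(flags.items()):
        flagged = datetime.fromisoformat(flagged_at)
        if now - flagged >= GRACE_PERIOD:
            delete_subdomain(host)   # grace period expired, remove the site
            del flags[host]
        else:
            send_warning(host)       # still within the grace period, remind the owner

    with open(FLAGS_FILE, "w") as f:
        json.dump(flags, f, indent=2)


if __name__ == "__main__":
    process_flags()
```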
Hope this helps,
Dirk
-
Hi there
1. Here are a couple of resources: Moz and HotDesign
2. Pure Spam: What Are Google Penalties & What to Do to Recover from Search Engine Watch, and this Q&A thread from Moz
3. I would go through your subdomains - find the ones that are blatant spam or thin on content and remove them. I would then make sure they are blocked from crawling via robots.txt (note that robots.txt is resolved per host, so the Disallow rules need to be served on each subdomain you want to block, not just on www.webstarttoday.com).
4. I would say it's because WordPress is the most-used CMS in the world and a lot of reputable websites use it. I would really work on the spam-prevention features for your product - looking for IPs that continually create websites, thin content, cloaking, off-topic websites, link farms, etc. (there's a rough sketch of an IP check right below). It's your duty as a CMS provider to watch how your users use the product. Not only will it keep your product's reputation clean, it will also show that you are taking steps to run a product with integrity.
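As an example of the kind of IP check mentioned in point 4, here's a minimal sketch that flags addresses creating an unusual number of sites within a short window. The threshold and window are made-up numbers, so tune them against your real signup data:

```
# A minimal sketch of an IP-based signup throttle: flag addresses that create
# an unusual number of subdomains in a short window. The threshold and window
# below are hypothetical assumptions, not recommended values.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 24 * 60 * 60   # look at the last 24 hours
MAX_SITES_PER_WINDOW = 3        # hypothetical: more than 3 new sites/day is suspicious

_creations = defaultdict(deque)  # ip -> timestamps of recent site creations


def record_site_creation(ip, now=None):
    """Record a new subdomain creation and return True if the IP looks abusive."""
    now = now if now is not None else time.time()
    history = _creations[ip]
    history.append(now)

    # Drop creations that fall outside the rolling window.
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()

    return len(history) > MAX_SITES_PER_WINDOW


if __name__ == "__main__":
    # Simulate one IP creating five sites in quick succession.
    for i in range(5):
        flagged = record_site_creation("203.0.113.7")
        print(f"creation {i + 1}: flagged={flagged}")
```

Flagged IPs could then feed into a manual review queue rather than an automatic block, so legitimate agencies creating several client sites aren't cut off by mistake.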
Hope this all helps - good luck!