Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
How to check if a site is doing blackhat SEO?
-
Thanks in advance!
-
It really depends on what you define as blackhat. On-page trickery (cloaking, redirects for search engine bots, etc.) can be discovered by browsing as a search bot, digging into the code, viewing caches, etc. Danny Sullivan and Rand uncovered a large amount of cloaked (and stolen) content on stage at SMX Sydney a few years ago. It was quite entertaining at the time.
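The "browse as a search bot" check can be rough-scripted. Here's a minimal sketch using only Python's standard library; the URL, the 0.7 threshold and the helper names are all invented for illustration, and it won't catch cloakers that verify Googlebot by reverse DNS / IP rather than by User-Agent string:

```python
import re
import urllib.request

def visible_text(html):
    """Strip scripts, styles and tags to approximate what a reader sees."""
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

def cloaking_ratio(html_as_bot, html_as_browser):
    """Rough similarity score: shared words / all words. Low = suspicious."""
    bot_words = set(visible_text(html_as_bot).lower().split())
    browser_words = set(visible_text(html_as_browser).lower().split())
    if not bot_words and not browser_words:
        return 1.0
    return len(bot_words & browser_words) / len(bot_words | browser_words)

def fetch(url, user_agent):
    """Fetch a page pretending to be the given user agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Local demo with canned HTML; for a live check, replace these two
# variables with fetch(url, googlebot_ua) and fetch(url, browser_ua).
bot_html = "<html><body><h1>Cheap widgets</h1><p>keyword keyword keyword</p></body></html>"
browser_html = "<html><body><h1>Welcome</h1><p>Our lovely widget shop</p></body></html>"
score = cloaking_ratio(bot_html, browser_html)
print(f"similarity: {score:.2f}" + ("  <- possible cloaking" if score < 0.7 else ""))
```

The word-set comparison is deliberately crude: it ignores layout and link targets, which is often where the trickery lives, but a very low score on the same URL fetched with two different user agents is a strong hint worth investigating by hand.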
Some people are basic enough to use tactics like hidden, white-on-white text, as Martijn says. I've yet to see that tactic actually work post-2004, though.
If it's links they're using, the easiest way is to use a tool like Open Site Explorer, Ahrefs or similar to check the links out. Sneaky people can block the OSE / Ahrefs / MajesticSEO bots from crawling the sources of their blackhat links if they have access to the linking sites. You can block the bots either in robots.txt or by rejecting their visits server-side, which stops the bots from ever noting that the links exist. That way, the backlink analysis tools will never see that blackhatsite.com links to rankingsite.com, and so forth. It takes a big network under the spammer's control to hide every link you build from the link research tools' bots, however, so this isn't too common.
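You can check for that kind of bot-blocking yourself by reading the linking site's robots.txt. A rough sketch using Python's standard library; the bot names below are the crawlers used by Moz, Ahrefs and Majestic, but the sample robots.txt is invented, and a real check would fetch the file from the suspect linking site first:

```python
from urllib import robotparser

# Crawlers used by the major link-research tools (Moz, Ahrefs, Majestic)
LINK_RESEARCH_BOTS = ["rogerbot", "AhrefsBot", "MJ12bot"]

def blocked_bots(robots_txt):
    """Return the link-research bots that a robots.txt disallows from '/'."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [bot for bot in LINK_RESEARCH_BOTS if not rp.can_fetch(bot, "/")]

# Invented example: a site hiding its outbound links from link tools
sample = """
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /
"""
print(blocked_bots(sample))
```

A robots.txt that singles out the link-research crawlers while letting Googlebot through is not proof of anything by itself, but it's exactly the pattern you'd expect from someone hiding a link network.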
Whether all big brands / well-ranked sites are using blackhat tactics pretty much depends on your definition of blackhat, but it's certainly true that it is very hard, if not impossible, to rank top 3 for competitive terms (car insurance, poker, credit cards) without spending money that results in links being built. This doesn't mean that they're all buying links, but they're definitely investing in marketing that results in links, and the whitest of the whitehats will say that this is technically not organic, natural link development. It is, however, what we do - marketing.
-
Why does it matter?
-
An even easier way is to check their rankings - if they're top 3 for big money terms in their niche, they're probably using some blackhat tactics. Even the whitest of whitehats are still using some blackhat tactics in the background, despite people not wanting to admit it.
-
I can't agree more with Gary; we probably need some more information to know what kind of black hat you're possibly dealing with. The first things I tend to look at when trying to work out whether a site is using black hat tactics are:
- The backlink profile: if the quality of the links is low, or the ratio of follow to nofollow links is unusual, it can be a sign.
- Fetch the site with Googlebot as your user agent and see if it shows different information than it does to a real user.
- Just do a select-all on the page to see if they hide any content (yup, that still happens).
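That select-all check can be rough-automated by scanning a page's inline styles for the usual hiding tricks. A crude sketch; the pattern list and sample HTML are illustrative only, and this will miss hiding done in external CSS files or JavaScript:

```python
import re

# Inline-style patterns that often indicate hidden text; illustrative, not exhaustive
HIDING_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"text-indent\s*:\s*-\d{3,}px",   # e.g. text-indent:-9999px
    r"font-size\s*:\s*0",
    r"color\s*:\s*#fff\b",            # white text, suspicious on white pages
]

def hidden_text_hits(html):
    """Return the inline style snippets that match a hiding pattern."""
    hits = []
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.IGNORECASE):
        for pattern in HIDING_PATTERNS:
            if re.search(pattern, style, re.IGNORECASE):
                hits.append(style)
                break
    return hits

sample = '<div style="color:#fff">buy cheap widgets</div><p style="margin:0">Hello</p>'
print(hidden_text_hits(sample))
```

Treat any hit as a prompt for a manual look, not a verdict: plenty of legitimate markup (dropdown menus, accessibility text) uses the same properties.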
-
Your question is a bit too open-ended; what do you want to achieve by knowing this information?
Does a site rank better than you?
Are they doing negative SEO to other people?
Do they steal content from people?
Are they building dofollow links from places they should not?
There are too many questions to ask before such a vague question can be answered.