How to run SEO tests you don't want to be associated with
-
A client has a competitor who is ranking above them for a highly competitive term they shouldn't really be able to rank for.
I think I know how the site got there, and I think I can replicate it myself with a quick test, but it's definitely grey hat if not black hat to do so.
I do not want my own sites and company to be damaged by the test, but I'd like to let the client know for sure, and I'd also love to know myself.
The test should take about a week to run; there's no hacking or password stealing involved, and nothing damaging to anyone else.
How would you do such a test? I'm dubious about using my own server / site for it, but would a week really matter?
Tom
-
That's very helpful, I'd forgotten about the hidden WHOIS option!
-
I'd need to know more about the test. Many people point to sites with a load of cheap links and say they shouldn't be ranking so well, but in many cases those sites have a few really good links among them that earn them their place.
I wouldn't be so quick to suggest they don't deserve their ranking.
-
The first option might not be entirely ethical: try your tactics on a website you are associated with that is already an established domain. As a personal example, I created a Magento extension for administering category attributes. For a case study I wanted to run, I pointed links at the Magento Connect page that hosts the plugin rather than at a blog post or store page I had built. Some people probably have ethical problems with this. One downside is that you don't have access to Analytics, but tracking keywords shouldn't be tough.
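As a rough illustration of the keyword tracking, here is a minimal Python sketch against a hypothetical SERP API. The endpoint, parameters, and response shape are all assumptions; substitute whichever rank-tracking service you actually use.

```python
import requests

# Hypothetical SERP API endpoint and response shape, for illustration only.
SERP_API_URL = "https://api.serp-provider.example/search"

def rank_for(domain: str, keyword: str, api_key: str) -> int | None:
    """Return the 1-based position of `domain` for `keyword`, or None if unranked."""
    resp = requests.get(SERP_API_URL,
                        params={"q": keyword, "key": api_key},
                        timeout=10)
    resp.raise_for_status()
    # Assumed response shape: {"results": [{"url": ...}, ...]} in ranked order.
    for position, result in enumerate(resp.json()["results"], start=1):
        if domain in result["url"]:
            return position
    return None

print(rank_for("example.com", "magento category attributes", "YOUR_KEY"))
```

Run it once a day per keyword and you have a rank history without needing Analytics access.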
The second option is to buy up cheap domains and start testing away. This is how I got started in SEO around six years ago. Add WhoisGuard to keep your registration details hidden if you are worried; personally, I wouldn't be. After all, who really cares that you are testing SEO on a domain that no one cares about? (Although I will say, your guinea pig websites often graduate from testing status and become worthwhile websites without you intending it.)
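If you do go the WhoisGuard route and want to confirm the privacy actually took effect, here is a quick heuristic check. It assumes the standard whois CLI is installed on the machine; registrar output formats vary, so the markers are only a rough test.

```python
import subprocess

# Sanity-check that WHOIS privacy is active on a test domain.
# Assumes the system `whois` command is available on PATH.
def whois_looks_private(domain: str) -> bool:
    out = subprocess.run(["whois", domain],
                         capture_output=True, text=True).stdout.lower()
    # Common markers registrars use when details are masked.
    return any(marker in out for marker in ("redacted", "privacy", "whoisguard"))

print(whois_looks_private("example.com"))
```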
Hope that helps. Maybe others have more ideas.
-
Related Questions
-
What sort of knowledge differentiates a good SEO from a great SEO?
This is definitely more of a discussion than a clear-cut answer.
White Hat / Black Hat SEO | Edward_Sturm
-
New Service/Product SEO and rankings
Hello, fellow Mozzers. We are a web design company that offered SEO as a secondary service for years. Due to changes in the company, we started pushing SEO as one of our main services about six months ago. We have a separate page targeting that service, as well as case studies, supporting information pages, and even an SEO Center, which is essentially a blog devoted to SEO. We are not using black hat SEO or keyword stuffing; we do honest link earning and building, everything by the book.

I understand that SEO takes time, especially for a company whose footprint is that of a web design company rather than an SEO company. We rank very well for web design related keyphrases, but we see no improvement for SEO related keywords; we have always sat, and still sit, between positions 25 and 30 in the SERPs. At the same time, the competitors ranking on the first page for SEO related phrases look pretty bad, both design-wise and black-hat-SEO-wise: everything is keyword stuffed, the UX is horrible, and the prices are ridiculous.

So, do you have any thoughts or advice on why we are not seeing results, and how we can?

Links:
Google search result: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=seo%20houston
Competitors: www.seohouston.com, www.graphicsbycindy.com
Our pages: https://www.hyperlinksmedia.com/seo-houston.php, https://www.hyperlinksmedia.com/seo-houston/
White Hat / Black Hat SEO | seomozinator
-
Are businesses still hiring SEO that use strategies that could lead to a Google penalty?
Is anyone worried that businesses know so little about SEO that they keep hiring SEO consultants whose strategies could land their websites with a Google penalty? I ask because we did some research with businesses and found the results worrying: blog farms, over-optimised anchor text. We will be releasing the data later this week, but I wondered whether this is something for the SEO community to worry about, and what can be done about it.
White Hat / Black Hat SEO | williamgoodseoagency.com
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that I need a solution which:

1) is centrally managed for all sites (per-site administration takes too much time);
2) takes total server load into account instead of only one site's traffic; and
3) controls overall bot traffic instead of controlling traffic for one bot at a time.

In my opinion, user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so that doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. None of these solve all three problems.

So I came up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. Which portion, and for which bots, can be calculated at runtime from the total server load at that moment. If a bot makes too many requests within a certain period (or breaks whatever other rule I code up), some of its requests are answered with a 503 while the rest get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will delay indexing, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious to hear the experts' opinions...
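For what it's worth, the dynamic-503 idea is easy to prototype. A minimal sketch as Python WSGI middleware, assuming a Unix host for os.getloadavg(); the bot list and load threshold are placeholder assumptions to tune per server:

```python
import os

# Sketch of the dynamic-503 approach described above: shed bot traffic
# when the 1-minute load average crosses a threshold.
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")
LOAD_THRESHOLD = 4.0  # placeholder: tune to core count / acceptable load

class BotThrottle:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        is_bot = any(token in ua for token in BOT_TOKENS)
        if is_bot and os.getloadavg()[0] > LOAD_THRESHOLD:
            # 503 plus Retry-After tells well-behaved crawlers to back
            # off without implying the content is gone for good.
            start_response("503 Service Unavailable",
                           [("Retry-After", "3600"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy, please retry later.\n"]
        return self.app(environ, start_response)
```

Because the check runs per request against whole-server load, it is centrally managed, load-aware, and bot-agnostic, which is what points 1) through 3) ask for.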
White Hat / Black Hat SEO | internetwerkNU
-
Negative SEO to inner page: remove page or disavow links?
Someone decided to run a negative-SEO campaign, hitting one of the inner pages on my blog 😞 I noticed the links starting to pile up yesterday, but I assume there will be more to come over the next few days. The targeted page is of little value to my blog, so the question is: should I remove the affected page (hoping the links won't affect the entire site) or submit a disavow request? I'm not concerned about what happens to the affected page, but I want to make sure the entire site isn't affected as a result of the negative SEO. Thanks in advance. Howard
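If the disavow route wins out, the file format Google Search Console accepts is simple. A minimal sketch that builds one from a list of spam referring domains; the domain list here is placeholder data standing in for an export from a backlink tool:

```python
# Build a disavow file in the format Google Search Console accepts:
# "#" comment lines plus one "domain:example.com" line per domain.
spam_domains = ["cheap-links.example", "link-farm.example"]  # placeholder data

with open("disavow.txt", "w") as f:
    f.write("# Links from a negative-SEO campaign against one inner page\n")
    for domain in sorted(set(spam_domains)):
        f.write(f"domain:{domain}\n")
```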
White Hat / Black Hat SEO | howardd
-
Which SEO companies offer Penalty analysis?
I'm having a hard time finding a (good) SEO company that specializes in penalty analysis. Any recommendations? I've only found Bruce Clay, but they charge $8,000 :)...
White Hat / Black Hat SEO | wellnesswooz
-
Recovering From Black Hat SEO Tactics
A client recently engaged my services to deliver foundational white hat SEO. Upon auditing the site, I discovered a tremendous number of black hat tactics employed by their former SEO company. I'm concerned that the old company's efforts, including forum spamming, irrelevant backlink development, exploiting code vulnerabilities on bulletin boards, and other messy practices, could negatively influence the target site's campaigns for years to come.

The site owner handed over hundreds of pages of paperwork from the old company detailing their black hat SEO efforts. The sheer amount of data is insurmountable. I took just one week of reports and traced the links back to find that 10% of the accounts were banned, 20% were tagged as abusive, some of the sites had been shut down completely, and there were WOT reports of abusive practices and mentions in bulletin board control programs of the site being blacklisted.

My question is simple: how does one mitigate the negative effects of old black hat SEO efforts and move forward with white hat solutions when faced with hundreds of hours of black gunk to clean up? Is there a clean way to eliminate the old efforts without contacting every site administrator and requesting removal of content/profiles? This seems daunting, but my client is a wonderful person who got in over her head, paying for a service she did not understand. I'd really like to help her succeed. Craig Cook
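One way to make that kind of cleanup tractable is to triage before any outreach: check which of the old spam pages still resolve, so effort goes only where links actually remain. A minimal Python sketch, assuming the old link URLs have been exported to a text file (the filename is an assumption):

```python
import requests

# Triage an exported list of old spam-link URLs: keep only the pages
# that still resolve, since dead hosts need no cleanup or disavow work.
with open("old_backlinks.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

still_live = []
for url in urls:
    try:
        r = requests.head(url, timeout=5, allow_redirects=True)
        if r.status_code < 400:
            still_live.append(url)
    except requests.RequestException:
        pass  # host gone: nothing left to clean up there

print(f"{len(still_live)} of {len(urls)} old link pages still resolve")
```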
White Hat / Black Hat SEO | SEOptPro
http://seoptimization.pro
info@seoptimization.pro