I am being black hat SEO'd by another company. What should I do?
-
Hi There,
I found out about 6 months ago that I have been getting black hat SEO'd by another company.
There are around 350 spammy domains pointing to my home page and product page.
I have disavowed a lot of them. Is there anything else I can do?
http://bareblends.com.au/the-optimum-9400-blender
Thanks!
-
Disavow those bad links and work to increase the number of good, reputable links you have. I checked out the URL - you have a great looking site!
I bet you could fairly easily get some influential health food bloggers or magazine editors to write about your products. Food is relatively easy to earn links for because you can send out thousands of samples to influencers at very little cost. This is a time-honored PR method that still applies in the digital age. If people like the food and the message, they will often write about it. I'd suggest approaching those types of influencers, telling them the story of your brand, and potentially sampling them. If they like your product, chances are they will write about it and you might get some great backlinks out of it. The key is that you cannot pay for those links; they must be freely given because the editor/blogger/writer likes your product and/or brand. Best of luck!
-
I've been following the "Negative SEO" world for years - http://www.canuckseo.com/index.php?s=%22negative+seo%22 - that search leads to our own blog posts on the subject.
Suffice it to say, please do read how other experts say to fight this off. The best tool you have is Google's disavow tool - and don't forget that Bing has one too, eh!
Hang in there. Stay on top of your backlink profile with a great, very functional IBL (inbound link) tool like Majestic, and keep on the disavow trail. That's the best advice I can offer today.
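For reference, both the Google and Bing disavow tools accept the same simple format: a plain UTF-8 text file with one entry per line, where a `domain:` entry covers every URL on that domain and `#` lines are comments. A minimal sketch (the domain names below are placeholders, not real offenders):

```
# Spammy domains found in backlink audit
domain:spammy-links-example1.com
domain:spammy-links-example2.net
# A single bad page rather than a whole domain
http://example-directory.org/bad-page.html
```

With ~350 spammy domains, `domain:` entries are usually the safer choice, since they also catch any new pages those domains publish later.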
-
Hi Oscar,
Are you noticing anything negative in terms of your phrases slipping? The reason I ask is that Google is pretty smart when it comes to links, and someone just building spammy links usually isn't enough to draw a penalty on its own. If Google sees a few hundred spammy links appear overnight (figuratively speaking), they are likely to just ignore them.
By all means disavow these bad links, but focus more on getting new links - be careful how you do this, though. I have clients who have thousands upon thousands of low quality / spammy links, but because I keep their link profiles topped up with high quality links, penalties stay away.
-Andy
-
You're pretty much doing what you can. Continue to disavow and continue to build high quality links. The stronger your link profile, the less likely you are to receive a penalty.
Related Questions
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and any thoughts on possible next steps or items I may have overlooked.
To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so you have this duplicate content scenario across all the microsites. This situation is being addressed in the medium term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.
The issue: about 6 weeks ago we noticed a drop off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com). Over a period of 2-3 weeks pretty much all our terms dropped out of the rankings and search visibility dropped to essentially 0. I can see that pages from the websites are still indexed, but oddly it is the duplicate content pages: bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx is still indexed, and similarly on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites.
Logging into webmaster tools I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in technicalseo.com's robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility - a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. Also, the error message in GSC persisted.
Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to the BU1/BU2 microsites have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue we saw some people had similar problems when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client, and I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.
So it seems to me that the issue could be something server side, but I'm at a bit of a loss as to next steps to take. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
-
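The question above mentions adding "a basic robots" to the root directory. A minimal permissive sketch, so the missing-robots.txt error can't recur (the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://bu1.corewebsite.com/sitemap.xml
```

Note that a robots.txt request that times out (rather than cleanly returning 404) can cause Google to stop crawling the site entirely, which would be consistent with the timeout observed when mimicking Googlebot.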
How necessary is it to disavow links in 2017? Doesn't Google's algorithm take care of determining what it will count or not?
Hi All, this is an obvious question by now. We can see sudden falls or rises in rankings - heavy fluctuations - even while new backlinks are contributing enough. Google claims it will take care of any low quality backlinks without passing PageRank to the website. On the other hand, we can see many scenarios where websites improved rankings and got out of penalties using the disavow tool. Google's statement and the disavow tool seem like opposite concepts. So when unknown low quality backlinks are pointing at a website, and keep increasing, what's the ideal measure to take?
Intermediate & Advanced SEO | vtmoz
-
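Whatever one decides about necessity, mechanically producing the disavow file is straightforward. A hypothetical Python sketch (the function name and comment text are my own, not from any Moz or Google tool) that deduplicates a list of referring domains into the `domain:` format the tool accepts:

```python
def build_disavow(domains, comment="Low-quality referring domains"):
    """Build disavow-file text from an iterable of spammy domains."""
    lines = [f"# {comment}"]
    # Deduplicate, normalize case, and sort so the file is stable across runs.
    for domain in sorted(set(d.strip().lower() for d in domains)):
        lines.append(f"domain:{domain}")
    return "\n".join(lines) + "\n"

print(build_disavow(["Spam1.example", "spam2.example", "spam1.example"]))
```

Keeping the file under version control also gives you a record of what was disavowed and when, which helps when reconciling later ranking changes.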
List of SEO "to do's" to increase organic rankings
We are looking for a complete list of all white hat SEO "to do's" that an SEO firm should do in order to help increase Google/Bing/Yahoo organic rankings. We would like to use this list to be sure that the SEO company/individual we choose uses all these white hat items as part of an overall SEO strategy to increase organic rankings. Can anyone please point me in the right direction as to where we can obtain this complete list? If this is not the best approach, please let me know what is, as I am not an SEO person. Thank you kindly in advance
Intermediate & Advanced SEO | RetractableAwnings.com
-
Why is /home used in this company's home URL?
Just working with a company that has chosen a home URL with /home tacked on - very strange indeed. Has anybody else come across this kind of homepage URL "decision" in the past? I can't see why on earth anybody would do this! Perhaps simply a logic-defying decision?
Intermediate & Advanced SEO | McTaggart
-
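If the /home URL ever needs to be retired, the usual fix is a permanent redirect to the root. An Apache sketch (this assumes an Apache server with mod_rewrite enabled; other servers have equivalents):

```apache
RewriteEngine On
# Send /home (and /home/) permanently to the site root
RewriteRule ^home/?$ / [R=301,L]
```

The 301 consolidates any link equity the /home URL has accumulated onto the canonical root URL.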
SEO Behind a paywall.
Good Morning! Does anybody have experience with SEO behind a paywall? If we have a portion of a website that is going to be locked, will Google still be able to access all of that content without paying? If not, is there any way to work around that? Any thoughts are greatly appreciated! MOZel Tov!
Intermediate & Advanced SEO | HashtagHustler
-
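Google's published guidance for paywalled content is to serve Googlebot the full text and mark the locked portion up with structured data so the paywall isn't treated as cloaking. A sketch of the paywalled-content JSON-LD (the CSS class name is a placeholder; check Google's current documentation for the exact requirements):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywalled-section"
  }
}
```

Googlebot can be distinguished from ordinary visitors via its user agent plus a reverse DNS check, so the full content is exposed to the crawler without opening the paywall to anyone spoofing a user-agent string.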
SEO and Internal Pages
Howdy Moz Fans (quoting Rand), I have a weird issue. I have a site dedicated to criminal defense. When you Google some crimes, the homepage comes up INSTEAD of the internal page directly related to that type of crime. However, for other crimes, the more relevant internal page appears. Obviously, I want the internal page to appear when a particular crime is Googled, NOT the homepage. Does anyone have an explanation for why this happens? FYI: I recently moved to WordPress and used a sitemap plugin that values the internal pages at 60% (unlike Weebly, whose automatic sitemap didn't do that). Could that be it? I have repeatedly submitted the internal pages via GWT, but nothing happens. Thanks.
Intermediate & Advanced SEO | mrodriguez1440
-
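The sitemap detail is worth ruling out, though `priority` is only a hint. The plugin's 60% setting would emit entries like the following (the URLs are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example-defense-firm.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example-defense-firm.com/dui-defense/</loc>
    <priority>0.6</priority>
  </url>
</urlset>
```

Google has said it largely ignores the `priority` field, so a homepage outranking its internal pages is more likely a matter of internal linking and on-page relevance than of sitemap settings.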
Multiple, Partial Redirecting URLs from old SEO company
Received quite a surprise when I gained access to the Google webmaster account and saw 4 domains linking to my client's domain, with the number of links from each domain ranging between 10,000 and 90,000. Come to find out, this was a result of their former agency. The business is very locally focused; I will use the example of a burger place. The main site is burgers.com, and burger places are listed by city and state. Their former agency bought several domains like californiaburgers.com and duplicated the listings for each state on those domains. You can view certain pages of the second domain, but the home page is redirected, as are most of the city pages, with 301s to the main burgers.com domain. However, there are pages on the additional domains that do not redirect, as they are not duplicated on the main domain, so there is nowhere to redirect them. Google has only found four of the domains, but it looks like there could be at least 50. Pages that are not redirected are indexed by the engines - but not ranking (at least not well). There is a duplicate content issue, although a "limited" one in the sense that it really is just the name of the business, address, and phone number - there is not much to these listings. What is the best approach to overcome this? Right now GWT is showing over 300,000 links, and at least 150,000 to 200,000 of that is from these domains.
Intermediate & Advanced SEO | LeverSEO
-
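For the pages on the extra domains that currently have no redirect, one common approach is a catch-all 301 to the main domain rather than leaving them live. An Apache sketch using the question's example domains (assumes mod_rewrite; this is one possible approach, not the only one):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?californiaburgers\.com$ [NC]
# Send every path, duplicated or not, to the main domain
RewriteRule ^(.*)$ https://burgers.com/$1 [R=301,L]
```

Paths with no equivalent on burgers.com would then 404 there, which at least consolidates all the duplicate-content and link signals onto a single domain instead of 50.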
SEO Recommendations
For about 3 years our website was number one in Google.co.uk for our trade's main keyphrase, which resulted in excellent sales. About 12 months ago that position started to slide downwards, and for that keyphrase we are now number 10. We are still at numbers 1, 2 and 3 for several other keyphrases, but ones that result in fewer daily Google searches and fewer resultant sales. I have always added unique content to the site, but admit that my blog posts became less than daily over the past 12 months. However, I am adding posts as often as I can now, of good length and with unique content. As well as tweaking all our on-site SEO factors, I'm trying to add good backlinks as often as possible. I wonder if anyone has been in a similar position, and what they did to try to regain their previous position? Colin
Intermediate & Advanced SEO | NileCruises