What is the best way to remove and fight back backlink spam?
-
Removing low quality and spam backlinks. What is the most effective clean-up process?
-
Hey Matti
Glad it helped buddy. Check out those tools, they won't do the job for you but they will certainly help out with some of the manual labour aspects.
Marcus
-
Thank you for the response, Marcus. So far it's not really that bad. I discovered some pretty bizarre links that contributed to the decline in our rankings, but the profile doesn't quite reach 90% spam links. Starting on a new domain is a far bigger step than I thought. I'll probably look at the tools you suggested and see what's out there.
Regards,
Matti
-
Hey Matti
In a nutshell, if it is really bad, then start again on a new domain.
What I am seeing with a few people I am helping is that where a site has had historical results but is now penalised, attempting a clean-up when the backlink profile is pretty rotten (90%+ placed links) is a pretty tough gig.
There are some tools out there that are proving useful and the pick of the bunch would be:
- rmoov
- Link Cleanup and Contact
- Remove’em
These all have pros and cons so you will likely want to use all of them.
Additionally, you will want to make sure the site is worth saving and likely invest some time and effort in generating some honest links through some solid content marketing. Maybe build some kind of free report or something specific to the site that you can use to do some outreach based link building. Do some blogging, invest some time and effort in the quality of the site.
Additionally, if you have a penalty, be prepared to put in a few reconsideration requests, and if you intend to disavow, be thorough.
With some experience here, you also have to ask yourself - what are you trying to save? If the answer to that question is that you are trying to save some spam links that still seem to be working at the moment, then, seriously, start again.
Without a link and some research it is hard to make a call, but know this: it is a tough job to remove bad links. Unless you have a link profile with something worth saving, a new domain is likely the fastest way to sort out this mess and to make sure you don't get hit again when the link penalties are tightened up down the road (you know it's going to happen).
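On the "be thorough" point about disavowing: the disavow file Google accepts is a plain-text list, one entry per line, with `domain:` entries covering a whole host. A minimal sketch (the domains and URL below are placeholders, not real spam sources):

```text
# disavow.txt - lines starting with # are comments
# Disavow everything from these hosts
domain:spammy-directory-example.com
domain:paid-links-example.net
# Disavow a single URL
http://low-quality-blog-example.org/widget-page.html
```

Using `domain:` entries rather than individual URLs is usually the thorough option, since placed links tend to appear on many pages of the same site.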
There really is no generic answer here and every situation is different but be sure to know what you are getting yourself into before you undertake this and do an honest review of the site, the content and the links to make sure this is a battle you can win.
This is a good read:
http://cyrusshepard.com/penalty-lifted/
Hope that helps buddy
Marcus
Related Questions
-
Exposure from backlinks for job posting URL. Will soon expire, how best to keep the backlink juice?
Hi All, First post and apologies if this seems obvious. I run a niche jobs board and recently one of our openings was shared online quite heavily after a press release. The exposure has been great but my problem is the URL generated for the job post will soon expire. I was wondering the best way to keep the "link juice" as I can't extend the post indefinitely as the job has been filled. Would a 301 redirect work best in this case? Thanks in advance for the info!
Technical SEO | MartinAndrew
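A 301 is indeed the standard way to pass link equity from an expired posting to a page that will stay live. A minimal sketch for Apache, assuming hypothetical paths (`/jobs/role-123` for the filled posting, redirecting to the jobs index):

```apacheconf
# .htaccess - permanently redirect the expired job posting
# Paths are placeholders for illustration
Redirect 301 /jobs/role-123 /jobs/
```

Redirecting to the most relevant surviving page (the jobs category, not the homepage) gives the best chance of the equity being counted.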
What is the best way to handle product URLs which pre-populate options?
We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
1. Sessions and parameters. On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag pointing to the page without parameters.
2. Text-based parameters. Make the parameters into text-based URLs (product/blue/100cm/) and still use the "noindex, follow" meta tag plus a canonical tag pointing to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters. After thinking more today, I am considering that the best way may be the simplest: use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag to the page without parameters.
What do you think the best way to handle this would be?
Technical SEO | moturner
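All three options above rely on the same two tags in the `<head>` of the parameterised page. A minimal sketch, with the product URL as a placeholder:

```html
<!-- On product?colour=blue&size=100cm (or /product/blue/100cm/) -->
<!-- Keep the variant out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
<!-- Consolidate ranking signals to the clean product URL -->
<link rel="canonical" href="https://www.example.com/product">
```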
The best way to do interstitials (ads)
Hello, I want to ask you guys what's the best way to do interstitials without a penalty? Feel free to give me examples from other major websites. Thanks!
Technical SEO | JohnPalmer
Best way to deal with these URLs?
Found overly dynamic URLs in the crawl report: http://www.trespass.co.uk/camping/festivals-friendly/clothing?Product_sort=PriceDesc&utm_campaign=banner&utm_medium=blog&utm_source=Roslyn What's the best way to deal with these? Cheers guys
Technical SEO | Trespass
Finding Broken Backlinks
Hello there, I am new here but really want to mend my broken website by myself as I enjoy a challenge! I used to have great rankings but have moved websites a few times (same domain), and the last move was to WordPress. I now have loads of broken links in the SERPs and wondered if there was an easy way to flush them out of Google, as they are getting lots of 404 errors? There really are too many to 301 them all (I have done the main pages). Also, how do I crawl my website for internal broken links? Does SEOmoz have something, or is there an external program you would recommend? Thanks, Victoria
Technical SEO | vcasebourne
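Crawl tools will surface internal 404s for you, but the first step of such a crawl is easy to sketch yourself. A minimal stdlib-only example (the HTML fragment and domain are made up) that extracts the internal links a crawler would follow from a page's HTML; each returned URL could then be fetched to check for a 404:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Return absolute URLs on the same host as base_url."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, href) for href in parser.links)
    return [url for url in absolute if urlparse(url).netloc == host]

# Hypothetical page fragment: one internal link, one external
page = '<a href="/about">About</a> <a href="http://other.example/x">Ext</a>'
print(internal_links(page, "http://www.example.com/"))
```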
What are the best tools for backlinks?
I am new to SEO; please help me choose the right tools for backlinks. I am thinking of buying Ultimate Demon. Should I buy it or not? I have a range of YouTube videos to rank.
Technical SEO | Sajiali
What are the best techniques for sub-menus?
Which techniques are "SEO-friendly" for creating a sub-menu when you have to hover over a menu to reveal it? Best regards, Jonathan
Technical SEO | JonathanLeplang
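The usual crawler-friendly pattern is to keep the sub-menu as plain HTML links in the markup and handle the hover reveal with CSS, so the links exist whether or not the menu is open. A minimal sketch with placeholder URLs:

```html
<!-- Sub-menu links are ordinary <a> tags, present in the HTML for crawlers -->
<nav>
  <ul>
    <li>
      <a href="/services/">Services</a>
      <!-- Hidden until hover via CSS, e.g. li:hover > ul { display: block; } -->
      <ul>
        <li><a href="/services/seo/">SEO</a></li>
        <li><a href="/services/ppc/">PPC</a></li>
      </ul>
    </li>
  </ul>
</nav>
```

Menus built entirely in JavaScript, with links injected only on hover, are the pattern to avoid.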
Best Dynamic Sitemap Generator
Hello Mozers, could you please share the best dynamic sitemap generator you are using? I have found this one: http://www.seotools.kreationstudio.com/xml-sitemap-generator/free_dynamic_xml_sitemap_generator.php Thanks in advance for your help.
Technical SEO | SEOPractices
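If none of the hosted generators fit, a dynamic sitemap is also simple to build yourself, since the sitemaps.org protocol is just XML. A minimal stdlib sketch, with placeholder URLs, that produces a sitemap from a list of pages (in practice the list would come from your CMS or database):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Return sitemaps.org-protocol XML for the given absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for url in urls:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
    return tostring(urlset, encoding="unicode")

# Placeholder page list for illustration
pages = ["https://www.example.com/", "https://www.example.com/about"]
print(build_sitemap(pages))
```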