How to remove bad links to your site?
-
Hello,
Our website www.footballshirtblog.co.uk recently suffered a major Google penalty, wiping out 6 months of hard work. We went from getting 6000-10000 hits a day to absolutely nothing from Google. We have been baffled by the penalty as we couldn't think of anything we've done wrong.
After some analysis in Open Site Explorer, it seems I may have found the answer. There are a ton of bad links pointing to us. A few example domains are:
This has nothing to do with us, so I can only assume some competitor has done it. As we were only about 4-5 months old, I guess Google has punished us.
What do we do now? This is not a situation I have experienced before and would really appreciate your expert advice.
-
The best way forward is to continue building quality links. The links you've 'kindly' acquired from a competitor have clearly made your link profile look worse than it should. The fact that you've got nearly 450k links coming from fewer than 100 domains will look more than a tad suspicious to Google.
If you continue to build quality links, Google will see that you're not getting your links from spammy sources, and the links you've been given could easily turn from having a negative effect to having a positive one.
As it's a new site, this is the time when your link profile is watched most carefully, so although it's unfortunate to have picked up these links, it's not the end of the world by any means. Try to get some links from sites with a high MozRank using white hat tactics.
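To put a number on that suspicion, a quick pass over a backlink export shows the links-per-domain ratio. This is a rough sketch, assuming a Python setup and a list of source URLs pulled from your export; the sample field name and data are my assumptions, not an Open Site Explorer format:

```python
# Gauge how concentrated a link profile is from a backlink export.
# Feed it the source URLs of your backlinks (e.g. a column from the
# exported CSV) and compare total links against unique referring domains.
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(source_urls):
    """Count backlinks per referring domain."""
    return Counter(urlparse(url).netloc.lower() for url in source_urls)

def profile_summary(counts):
    """Return (total links, unique domains, links-per-domain ratio)."""
    total_links = sum(counts.values())
    total_domains = len(counts)
    ratio = total_links / total_domains if total_domains else 0.0
    return total_links, total_domains, ratio
```

With roughly 450k links from under 100 domains, the ratio here comes out in the thousands; a natural profile is usually orders of magnitude lower, which is exactly what makes this one look suspicious.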
-
If it was the competitor's bad links that triggered a penalty, you can try a Google Reconsideration Request and explain what happened.
http://www.google.com/support/webmasters/bin/answer.py?answer=35843
-
Assuming that you didn't use any questionable linking practices to get your site where it was before the penalty, I would submit the site for a manual review.
If you were using some gray hat tactics, you'll have a much better chance of getting rid of the links you built yourself than of removing the ones that were so generously pointed your way. So your time would be better spent cleaning up the links you wouldn't want Google to see (the ones you actually built) than trying to eliminate the ones that were likely sent by a competitor. The other alternative is to blame all of the bad links on the competitor... again, assuming that you built some links you don't want Google to see.
The other option, although more painful, is to wait it out. I've seen several sites get clobbered by competitors' nasty link building tactics, and after a few months they all of a sudden popped back into the SERPs higher than they were beforehand.
The other thing I would suggest is to make sure those links really are the source of your problem. How's your anchor text? Is it over-optimized? How about on-site factors? And how could we forget good ole Panda and its many, many revisions. Are you certain the site didn't get hit in the 2.14x update? (I'm kidding about the numbers... it just seems like Panda is never-ending.)
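If you want to check the anchor text angle concretely, a simple tally can flag a skewed distribution. A minimal sketch, assuming Python and an anchor-text column from your backlink export; the 30% threshold is purely an illustrative assumption, not any official cutoff:

```python
# Tally anchor texts from a backlink export and flag profiles where a
# single (likely exact-match) anchor dominates. The threshold is a
# made-up heuristic -- tune it against profiles you trust.
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor's share of all backlink anchor texts."""
    counts = Counter(a.strip().lower() for a in anchors if a.strip())
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.most_common()}

def looks_over_optimized(anchors, threshold=0.30):
    """True if any one anchor accounts for at least `threshold` of all links."""
    return any(share >= threshold for share in anchor_distribution(anchors).values())
```

A profile where "brand name" variations dominate usually reads as natural; one where a single money keyword takes the lion's share is the kind of pattern worth worrying about.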
-
Since you have no control over those other websites, you can't really do anything to get them to remove the links. But that's exactly why incoming links shouldn't penalize your site: if they did, you're right, a competitor could get sites penalized on purpose by pointing lots of bad links at them.
What else could have caused your site to get penalized? Could one of the Panda updates have caused it? Check out the time line here and see if it lines up to when your site lost its traffic: http://www.seomoz.org/google-algorithm-change.
Related Questions
-
What's the best way of crawling my entire site to get a list of NoFollow links?
Hi all, hope somebody can help. I want to crawl my site to export an audit showing:
- All nofollow links (which links, from which pages)
- All external links, broken down by follow/nofollow
I had thought Moz would do it, but that's not in Crawl info. So I thought Screaming Frog would do it, but unless I'm not looking in the right place, it only seems to provide this information if you manually click down each link and view the "Inlinks" details. Surely this must be easy?! Hope someone can nudge me in the right direction... Thanks.
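Here's the shape of that audit as a minimal, stdlib-only Python sketch (my own assumption of an approach, not Moz or Screaming Frog functionality); wrap it in a fetch loop over your page list and you get source page, target, internal/external, and follow/nofollow per link:

```python
# Parse one page's HTML and record every <a href> as
# (target_url, internal/external, follow/nofollow).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkAuditor(HTMLParser):
    """Collect (target, scope, rel) for each link found on one page."""

    def __init__(self, page_url, site_host):
        super().__init__()
        self.page_url, self.site_host = page_url, site_host
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return
        target = urljoin(self.page_url, attrs["href"])
        scope = "internal" if urlparse(target).netloc == self.site_host else "external"
        rel = "nofollow" if "nofollow" in (attrs.get("rel") or "").lower() else "follow"
        self.links.append((target, scope, rel))

def audit_page(page_url, html, site_host):
    """Return the link audit rows for a single page's HTML."""
    parser = LinkAuditor(page_url, site_host)
    parser.feed(html)
    return parser.links
```

That said, before writing code it may be worth checking Screaming Frog's bulk outlink export, which I believe can produce similar data in one go.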
Intermediate & Advanced SEO | rl_uk0 -
What link building techniques would you recommend for a dating site?
I am working on adding more content to the site (content marketing, trying to attract natural links), and this includes a blog. On-site optimization will be done based on good keyword research, and after that I will be working on link building for the site. I will pull backlink data for the best-performing competing dating websites in Google and try to get some links from the same sources. What other link building strategies or techniques could be good for this? Thanks.
Intermediate & Advanced SEO | blrs120 -
If we remove all of the content for a branch office in one city from a web site, will it harm rankings for the other branches?
We have a client with a large, multi-city home services business. The service offerings vary from city to city, so each branch has its own section on a fairly large (~6,000 pages) web site. Each branch drives a significant amount of revenue from organic searches specific to its geographic location (ex: Houston plumbers or Fort Worth landscaping). Recently, one of the larger branches has decided that it wants its own web site on a new domain because they have been convinced by an SEO firm that they can get better results with a standalone site. That branch wants us to remove all of its content (700-800 pages) from the current site and has said we can 301 all inbound links to the removed content to other pages on the existing site to mitigate any loss to domain authority. The other branch managers want to know if removing this city-specific content could negatively impact search rankings for their cities. On the surface it seems that as long as we have proper redirects in place, the other branches should be okay. Am I missing something?
Intermediate & Advanced SEO | monkeeboy0 -
Using both dofollow & nofollow links within the same blog site (but different post).
Hi all, I have been actively pursuing bloggers for my site in order to build page rank. My website sells women's undergarments that are more on the exotic end. I noticed a large number of prospective bloggers demand product samples. As already confirmed, bloggers who are given "free" samples should use a rel=nofollow attribute on their links. Unfortunately this does not build my page rank or pass link juice. My question is this: is it advisable for them to also publish additional posts that include dofollow links? The idea is for the blogger to use a nofollow when posting about the sample and a regular link in a secondary post at a later time. What are your thoughts on this?
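For clarity, the two treatments described differ only in the link's rel attribute; a hypothetical pair of posts (the URLs here are placeholders) would mark up their links like this:

```html
<!-- Post reviewing the free sample: nofollowed, as disclosure guidelines expect -->
<a href="http://example.com/product" rel="nofollow">the sample I was sent</a>

<!-- Separate editorial post: an ordinary, followed link -->
<a href="http://example.com/">a shop I like</a>
```

Whether arranging the second, followed link as a condition of the sample still counts as a paid link is exactly the judgment call the question is asking about.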
Intermediate & Advanced SEO | 90miLLA0 -
Depth of Links on Ecommerce Site
Hi, In my sitemap, I have the preferred entrance pages and URLs of categories and subcategories. But I would like to know more about how Googlebot and other spiders see a site - e.g. what counts as a deep link? I am using the Screaming Frog SEO Spider, and it has a metric called "level" - this represents how deep, or how many clicks away, a piece of content is. But I don't know if that is how Googlebot would see it - from what the Screaming Frog software shows, each move horizontally across from the navigation is another level, which visually doesn't make sense to me. Also, in my sitemap I list the URLs of all the products, and there are no levels within the sitemap. Should I be concerned about this? Thanks, B
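One way to make "level" concrete: model the site as a link graph and take each page's depth as the minimum number of clicks from the homepage, which is a breadth-first search. A small sketch, assuming Python and a hand-built graph standing in for real crawl data:

```python
# Compute click depth: each page's level is the minimum number of
# clicks needed to reach it from the start page, via BFS over the
# internal link graph ({page: [pages it links to]}).
from collections import deque

def click_depth(graph, start):
    """Return {page: minimum clicks from start}."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depth:
                depth[linked] = depth[page] + 1
                queue.append(linked)
    return depth
```

By this measure, anything linked from the main navigation is one click deep no matter where it sits horizontally, and a flat sitemap shouldn't be a concern on its own - the sitemap is a discovery aid, while depth comes from internal linking.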
Intermediate & Advanced SEO | bjs20100 -
Our Site's Content on a Third Party Site--Best Practices?
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content. I know standard best practice is to canonicalize their pages to our pages, but then they wouldn't get any benefit, since a canonical tag will effectively de-index the content from their site. Our thoughts so far: add a paragraph of original content to our content, and link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties). What are your thoughts on this? Do you think adding a paragraph of original content will matter much? Do you think our site will be safe from a penalty since we were the first place to publish the content and there will be a link back to our site? They are really pushing against using a canonical, so this isn't an option. What would you do?
Intermediate & Advanced SEO | nicole.healthline1 -
Directories - Bad or Good for Link Building (Discussion on Penguin)
Hello, I would like to hear everybody's opinion on directories for link building now that Penguin is out. Here's a good background post: http://www.seomoz.org/blog/web-directory-submission-danger Do you think they're out? How do you still use them? Which ones do you stick to?
Intermediate & Advanced SEO | BobGW0 -
A Client Changed the Link Structure for Their Site... Not Just Once, but Twice
I have a client who's experiencing a number of crawl errors, which I've gotten down to 9,000 from 18,000. One of the challenges they face is that they've modified their URL structure a couple of times.
First it was: site.com/year/month/day/post-name
Then it was: site.com/category/post-name
Now it's: site.com/post-name
I'm not sure of the time elapsed between these changes, but enough time has passed that the URLs for the previous two structures have been indexed and now spit out 404s. What's the best, clean way to address this issue? I'm not going to create 9k redirect rules, obviously, but there's got to be a way to address this and resolve it moving forward.
Intermediate & Advanced SEO | digisavvy
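Since both legacy structures end in the same post slug, a couple of pattern-based rules can cover all 9k URLs at once. A sketch for Apache's mod_rewrite in .htaccess, assuming the slug really is the final path segment everywhere; test carefully on staging, because the second rule will also match any legitimate one-level section paths the site still uses:

```apache
RewriteEngine On

# Oldest structure: /2012/05/04/post-name -> /post-name
RewriteRule ^[0-9]{4}/[0-9]{2}/[0-9]{2}/([^/]+)/?$ /$1 [R=301,L]

# Middle structure: /category/post-name -> /post-name
# (add RewriteCond exclusions for any current two-segment paths first)
RewriteRule ^[^/]+/([^/]+)/?$ /$1 [R=301,L]
```

Two rules instead of 9,000 entries, and any slug that genuinely no longer exists will still 404 at its final destination, which is the correct signal to send.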