Spam Backlinks to My Website
-
Today I created an inbound link report using the Link Research & Analysis tool, and I found a large number of spam inbound links to my website from many blogs and other sites, with anchor text that is not relevant to my site. Some of the anchor text contains abusive words, such as "viagra expiration date" and others.
I want to remove these irrelevant backlinks. As there are roughly 9,000 of them, it is almost impossible to remove the links manually. Is there any way to remove or restrict those backlinks?
What steps are required to protect my website from any negative effect?
Please advise as soon as possible.
-
At least they're earning revenue, as opposed to the 95% of do-gooders that earn nothing... fact!
-
I have never used any blackhat techniques or any software for link building.
-
Blackhats build spammy links all of the time.
-
No matter what anyone says, you can't be penalized for links pointing at your site... imagine if I wanted to sabotage a competitor's rankings in Google: I could order a load of spammy links, couldn't I? I could even use proxies or some other technique to really hide my mischievous work.
To be honest, I have never heard of someone discovering spammy links to their own site. Are you sure you didn't order a link-building service that has gone sour? There is nothing to be ashamed of if you did...
Good Luck!
-
Dear Saurabh,
There is a very low chance that those links will have a negative impact on your website, because the anchor text doesn't match your website's subject at all, so I think Google will simply ignore these links. There is always the option of emailing each webmaster to ask for a link to be removed, but that is not practical in your case with 9,000 links.
The best thing you can do here is earn some valuable links from trusted websites with relevant information; that way you will build some credibility with Google.
Hope this makes sense to you,
Cheers,
Russel
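If the links do start to hurt, Google Search Console's disavow tool accepts a plain-text file with one entry per line: `#` lines are comments, `domain:` entries disavow a whole domain, and a bare URL disavows just that page. A minimal sketch with hypothetical domains:

```text
# Spammy comment networks found in the link report
domain:spam-blog-network.example
domain:cheap-pharma-links.example
# A single URL can also be disavowed on its own
http://another-spammer.example/blog/comment-page-9
```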
Related Questions
-
Website affected by Penguin / Panda
Technical SEO | omverma
Dear All,
We have several websites. How can we check whether a site has been affected by Penguin or Panda? Over the last few days we have observed that impressions are going down and keyword rankings are dropping too. Any tools or steps to detect this would help us.
Thanks,
Om
-
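One rough way to quantify the drop described above, assuming you export daily impressions from Webmaster Tools as (date, impressions) pairs and know the date of a suspected algorithm update (the data below is hypothetical):

```python
from datetime import date

def impression_shift(rows, pivot):
    """Compare average daily impressions before vs. on/after a pivot date.

    rows: iterable of (ISO date string, impressions) pairs,
    e.g. parsed from a Webmaster Tools CSV export.
    """
    before, after = [], []
    for day, impressions in rows:
        bucket = before if date.fromisoformat(day) < pivot else after
        bucket.append(int(impressions))
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(before), avg(after)
```

A sharp fall in the "after" average that coincides with a known update date is circumstantial evidence, not proof; spot-checking rankings for key terms tells the rest of the story.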
Duplicate Pages on GWT when redesigning website
Technical SEO | Essentia
Hi, we recently redesigned our online shop. We set up 301 redirects from all old product pages to the new URLs (and went live about 1.5 weeks ago), but GWT reports the old product URL and the new product URL as two different pages with the same meta title tag (duplication), when in fact the old URL 301-redirects to the new URL when visited. I found this article on the Google forum: https://productforums.google.com/forum/#!topic/webmasters/CvCjeNOxOUw
It says we should either just wait for Google to re-crawl, or use the fetch URL function on the OLD URLs. My question is: after I fetch the OLD URL to tell Google that it is being redirected, should I click the 'Submit to index' button or not? (See screengrab; please note that it was the OLD URL being fetched, not the NEW URL.) If I click this button, is it telling Google: a. 'This old URL has been redirected, so please index the new URL', or
b. 'Please keep this old URL in your index'? What's your view on this? Thanks
-
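A quick way to confirm what Google should eventually see is to request the old URL without following redirects and inspect the status code and Location header yourself. A minimal sketch using only the standard library (the URL you pass in would be one of your old product URLs):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 301 itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # fall through, surfacing the 3xx as an HTTPError

def check_redirect(url):
    """Return (status_code, Location header or None) for a single GET."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None  # 2xx: no redirect in place
    except urllib.error.HTTPError as err:  # 3xx (and 4xx/5xx) land here
        return err.code, err.headers.get("Location")
```

A healthy migrated page should report `301` plus the new URL; a `200` on the old URL would explain the duplicate-title report.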
UK rankings disappeared after US website launch
Technical SEO | KarlBantleman
Hi all, I have a client that recently released a US version of their UK website and put it live without informing me first! Once I saw it (about 3-4 days later) I immediately asked them to add the rel=alternate tags to both websites. In the meantime, however, our UK rankings have all gone, and it seems as if Google has simply kicked out the UK website. How long will it take for our rankings to return to normal? Thanks for the advice, Karl
-
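For reference, the rel=alternate annotation mentioned above is the hreflang markup; a minimal sketch with hypothetical domains, placed in the head of both the UK and US versions:

```html
<!-- Both pages list every language/region variant, including themselves -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
```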
Hundreds of Thousands of Spammy Backlinks Overnight
Technical SEO | JDLitchfield
Hello,
I have a client who unfortunately got breached (not sure how), and as a result six HTML files promoting Gucci bags and Louis Vuitton bags were put in the site root. I found the files within a week of them being put there, but what I didn't realise (and only found yesterday when looking at the backlink profile) was that there are literally hundreds if not thousands of spammy domains pointing at these files. Some of the linking sites 404, but some are comments on bloggers' sites that auto-accept comments, and they total 10,000 links, so they are impossible to remove. My questions: Will Google understand what has happened and ignore these links (especially because the pages no longer exist on the server)? Should I use the Disavow tool to block these 1,000-odd domains (can it do any harm?), and since more links are being found every day, do I just keep doing it? Is there another way to explain to Google what has happened? Your help would be greatly appreciated. Thanks
James
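For the "10,000 links, impossible to remove" part: the disavow list can be generated rather than typed. A sketch, assuming the backlink URLs have been exported one per line (the URLs below are made up):

```python
from urllib.parse import urlparse

def disavow_entries(backlink_urls):
    """Reduce a list of spammy backlink URLs to unique domain: entries,
    the coarse-grained form accepted by Google's disavow tool."""
    domains = {urlparse(url).hostname for url in backlink_urls}
    return ["domain:" + d for d in sorted(d for d in domains if d)]

# Example with hypothetical spam URLs:
spam = [
    "http://autoaccept-blog.example/post?comment=991",
    "http://autoaccept-blog.example/other-post",
    "http://linkfarm.example/gucci-bags.html",
]
print("\n".join(disavow_entries(spam)))
```

New domains found later can simply be appended to the file; uploading a new disavow file replaces the previous one.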
How to display the full structure of a website on Google SERPs
Technical SEO | ngupta1
I have been searching around but have been unable to find information on how we can control or list the top pages of a website on Google's first page. For example, if we type 'seomoz' into Google, we see the main listing with six sublistings linking to Blog, SEO Tools, Beginner's Guide to SEO, Learn SEO, Pricing & Plans, and Login. My question is: can we control these listings, i.e. what is displayed and what is not? And if yes, how can we get this kind of visibility on the first page: by using HTML or XML sitemaps, or is there something most websites are missing? This type of listing appears for very few websites; most show up with a single URL. c43Ki.jpg
-
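On the sitemap part of the question: those sublistings are Google sitelinks, which are chosen algorithmically and cannot be requested directly; an XML sitemap only helps Google discover the site's structure, which may indirectly support sitelinks. A minimal sitemap sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/blog/</loc></url>
  <url><loc>http://www.example.com/pricing/</loc></url>
</urlset>
```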
Should I promote each section of my website?
Technical SEO | ClaireH-184886
Hi, I have a magazine website and I have been heavily promoting the main page of the site, thinking that all the work I am doing for the main page, which includes links and so on, would pass on to the rest of my site, but I have a feeling this is not correct. Can anyone let me know whether I should be concentrating on each section of the site as well? And for my articles, should I promote them individually or let the search engines pick them up? I already use Facebook and Twitter to promote new articles, but I would like to know if I should be doing more than this.
-
403 forbidden error website
Technical SEO | MaartenvandenBos
Hi Mozzers, I have a question about a new website for a new customer, http://www.eindexamensite.nl/. It returns a 403 Forbidden error, and I can't find what the problem is. I have checked it on http://gsitecrawler.com/tools/Server-Status.aspx
Result: URL=http://www.eindexamensite.nl/ Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server I get a 200 OK :-). So the problem is in the .htaccess. The .htaccess code:

ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php

# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]

# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$

# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache

# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET

# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$

# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]

# End static file caching
DirectoryIndex index.html

The CMS is TYPO3. Any ideas? Thanks!
Maarten
Removing pages from website
Technical SEO | RuudHeijnen
Hello all, I am fairly new to the SEOmoz community, but I work for a company which organizes exhibitions, events, and training in Holland. A lot of these events are only held once or twice and then we no longer organize them because they are no longer relevant. Every event has its own few webpages which provide information about the event and are indexed by Google. In the past we did not remove any of these events. I was looking in the CMS and saw a lot of events from 2008 and older which are still indexed. To clean up the website and the CMS, I am thinking of removing these pages for old events. The risk is that these pages have some links pointing to them and get some traffic, so if I remove them there is a risk of losing traffic and rankings. What would be the wise thing to do? Make an archive folder or something? Regards, Ruud
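One middle ground between deleting outright and keeping stale pages, assuming an Apache setup (the paths below are hypothetical), is to 301-redirect each retired event page to an archive or category page, so the links and traffic the old pages have earned are not simply dropped:

```apache
# .htaccess: retired event pages pass their link value to an archive page
Redirect 301 /events/2008/autumn-exhibition /events/archive/
Redirect 301 /events/2008/winter-training /events/archive/
```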