Is there a tool to figure out bad backlinks
-
With the new changes to the Google algorithm, I'm trying to figure out which links Google may think are hurting my site. Any thoughts? Thanks
-
I see.
I would presume that these guys have been penalised then. It may be that they are now trying to get their site reindexed and are still appearing in the listings, but much lower now.
There isn't really a definite way of knowing if they have been banned, but as good practice, I would try to get my URL removed from any website I had doubts about.
You could maybe try contacting the Google Webmaster team to confirm this, but I don't know how useful they will be. Then again, it's worth a try at least.
Matt.
-
Hi Matt,
The sites I believe might have a problem are still showing up in the results like that. However, where they might have ranked on the first page before, some of the blog sites that we link with are not in the top 100 results anymore. Strange...
-
Hi Morgan,
The website will show in Google if you do a site:www.domain.com search.
If you search for 'domain' (replace this with their website name, e.g. 'seo moz') and they don't show up in the top few listings, then you can be pretty sure that they have been banned.
If that is the case, I would either contact the webmaster or manually delete the link if you can.
Good luck!
Matt.
-
Hi Matt,
Thanks for the help. If a site that links to you is still showing page results, does that mean it is not banned? I see a few links I have on blog rolls, but that site still shows 1,500 results in Google.
-
Hi Morgan,
Unfortunately, there isn't a quick way to do this.
What I have done is use Open Site Explorer and downloaded the .csv file of all the linking domains to my website.
Now that I have them in a spreadsheet, it is a bit easier to filter through them. I have been drilling down on links from blogs with a particularly low Page Authority (PA) or Domain Authority (DA). Then it's just the hard task of checking whether each one has individually been banned by Google. You can do this by searching Google for their domain to see if any of their pages appear.
This is a slow process, but better safe than sorry, eh?
Hope this helps.
Matt.
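For anyone who wants to speed up Matt's filtering step, here is a minimal Python sketch. It assumes the exported CSV has 'URL' and 'Domain Authority' columns and uses a DA threshold of 20 — both are assumptions, so check the column names and pick a threshold that fits your own export:

```python
import csv
import io

def low_authority_urls(csv_file, max_da=20.0):
    """Yield linking URLs whose Domain Authority is at or below max_da.

    csv_file is any file-like object (e.g. an open .csv export).
    The 'URL' and 'Domain Authority' column names are assumptions;
    adjust them to match your actual export.
    """
    for row in csv.DictReader(csv_file):
        try:
            da = float(row["Domain Authority"])
        except (KeyError, TypeError, ValueError):
            continue  # skip rows without a usable DA value
        if da <= max_da:
            yield row["URL"]

# Tiny demo on an in-memory CSV:
sample = io.StringIO(
    "URL,Domain Authority\n"
    "http://spamblog.example/post,12\n"
    "http://goodsite.example/page,55\n"
)
print(list(low_authority_urls(sample, max_da=20)))
# ['http://spamblog.example/post']
```

From there you still have to eyeball each flagged domain manually, but at least the spreadsheet drudgery is automated.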
Related Questions
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?

# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
#   Used:    http://example.com/robots.txt
#   Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
#   http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
#   http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
Disallow: /?cat=
Disallow: /?price=
Disallow: /?flavor=
Disallow: /?dir=
Disallow: /?mode=
Disallow: /?list=
Disallow: /?limit=5
Disallow: /?limit=10
Disallow: /?limit=15
Disallow: /?limit=20
Disallow: /*?limit=250

Technical SEO | | vetofunk
-
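One quick sanity check for a robots.txt like this is Python's built-in urllib.robotparser. Bear in mind it uses first-match semantics rather than Google's longest-match rule, so treat it as a rough check only; the rules below are a small excerpt chosen for illustration, with the Allow line placed first so the parser sees it before the broader Disallow:

```python
import urllib.robotparser

# A few rules from the robots.txt above.
rules = """\
User-agent: *
Allow: /index.php/blog/
Disallow: /index.php/
Disallow: /app/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Blog paths are allowed; other /index.php/ and /app/ paths are blocked.
print(rp.can_fetch("*", "http://www.bestpricenutrition.com/index.php/blog/post"))  # True
print(rp.can_fetch("*", "http://www.bestpricenutrition.com/index.php/other"))      # False
print(rp.can_fetch("*", "http://www.bestpricenutrition.com/app/config"))           # False
```

Running your sitemap URLs through a check like this (or through Search Console's robots.txt tester) will show quickly whether the file itself is blocking them or whether the problem lies elsewhere, such as the http-to-https sitemap mismatch.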
Clarification on indexation of XML sitemaps within Webmaster Tools
Hi Mozzers, I have a large service based website, which seems to be losing pages within Google's index. Whilst working on the site, I noticed that there are a number of xml sitemaps for each of the services. So I submitted them to webmaster tools last Friday (14th) and when I left they were "pending". On returning to the office today, they all appear to have been successfully processed on either the 15th or 17th and I can see the following data: 13/08 - Submitted=0 Indexed=0
Technical SEO | | Silkstream
14/08 - Submitted=606,733 Indexed=122,243
15/08 - Submitted=606,733 Indexed=494,651
16/08 - Submitted=606,733 Indexed=517,527
17/08 - Submitted=606,733 Indexed=517,527
Question 1: The indexed figure of 122,243 on the 14th - is this how many pages were previously indexed, before Google processed the sitemaps, given they were not marked processed until the 15th and 17th?
Question 2: The indexed pages are already slipping. I'm working on fixing the site by reducing pages and improving internal structure and content, which I'm hoping will fix the crawling issue. But how often will Google crawl these XML sitemaps? Thanks in advance for any help.
-
Organizing A Backlink Authority Category Page
I work for a company that has many promotions throughout the year, some big, some HUGE. Typically they have created a landing page for this content. The issue is, when this promotion ends, we will kill the landing page, thus 404ing the backlinks and putting the page authority in purgatory. (1) What would be the best way to keep these pages organized? I was thinking about creating a main "Promotions" page with the current promotion on it (the previous ones linked at the bottom of the page). Then when the promotion ends I would copy those contents and add them to a new page and link to it from the original "Promotions" page. An issue I see with this is that the promotions page would always have the same title tag and vanity URL. (2) This could provide many links to the "Promotions" page over time to build its authority, but would constantly changing content hurt ranking factors?
Technical SEO | | nat88han
-
Google Disavow Tool
Some background: My rankings have been wildly fluctuating for the past few months for no apparent reason. When I inquired about this, many people said that even though I haven't received any penalty notice, I was probably affected by Penguin. (http://moz.com/community/q/ranking-fluctuations) I recently did a link detox with LinkRemovalTools and it gave me a list of all my links: 2% were toxic and 51% were suspicious. Should I simply disavow the 2%? Many of the sites have no contact info.
Technical SEO | | EcomLkwd
-
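For reference, a disavow file is plain UTF-8 text with one entry per line: either a full URL or a "domain:" entry, with "#" lines treated as comments. The domains below are placeholders for illustration, not real recommendations:

```text
# Toxic domains from the link detox export; site owners unreachable
domain:spammy-links-example1.com
domain:spammy-links-example2.net
# A single bad page on a site that otherwise looks fine
http://blog.example.org/low-quality-post/
```

Disavowing only the clearly toxic 2% (after attempting removal first) is the more conservative route; the "suspicious" bucket usually needs manual review before adding anything from it.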
Strange Webmaster Tools Crawl Report
Up until recently I had robots.txt blocking the indexing of my PDF files, which are all manuals for products we sell. I changed this last week to allow indexing of those files, and now my Webmaster Tools crawl report is listing all my PDFs as not found. What is really strange is that Webmaster Tools is listing an incorrect link structure: "domain.com/file.pdf" instead of "domain.com/manuals/file.pdf". Why is Google indexing these particular pages incorrectly? My robots.txt has nothing else in it besides a disallow for an entirely different folder on my server, and my htaccess is not redirecting anything in regards to my manuals folder either. Even in the case of outside links present in the crawl report supposedly linking to this 404 file, when I visit these third-party pages they have the correct link structure. Hope someone can help, because right now my not-founds are up in the 500s and that can't be good 🙂 Thanks in advance!
Technical SEO | | Virage
-
What is the best way to remove and fight back against backlink spam?
Removing low-quality and spam backlinks: what is the most effective clean-up process?
Technical SEO | | matti_wilson
-
Is there actual risk to having multiple URLs that frame in the main URL? Or is it just bad form and a waste of money?
Client has many URLs that just frame in the main site. It seems like a total waste of money, but if they are frames, is there an actual risk?
Technical SEO | | gravityseo
-
Anyone tested Open Site Explorer backlink numbers?
Hi, I am new here and I have noticed that Open Site Explorer reports about half of the links on my page. I am really interested in knowing your perspective on this. cheers, Vishal
Technical SEO | | vishalkhialani