Is Webmaster Tools Useless as a Broken Link Detector?
-
Good morning from "yes, we still have free parking" Wetherby, UK!
OK, when it comes to detecting broken links I'm getting really frustrated with Webmaster Tools. I'm probably going to end up with egg on my face with this one, but here is an example of Webmaster Tools reporting a broken link which I can't find: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/phantom-broken-links_zpsb74e1246.jpg
Having trawled through the code, I just can't see the knackered link. Is it a phantom report, or is something useful being detected here?
Thanks a lot,
David -
Hi David,
Check the date when the broken link was detected in Google Webmaster Tools; the reports are updated late in GWT. To make sure that all the internal links are working fine, use Xenu Link Sleuth. It is a free tool and is pretty good at detecting broken links.
Regards
-
I use a Chrome extension called Check My Links, and it is very reliable. I checked the page with it and it didn't report any broken links.
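If you'd rather verify a flagged page yourself than wait for GWT's delayed reports to refresh, a short script can do what these tools do: fetch the page, pull out its anchors, and test each target. Below is a minimal sketch in standard-library Python; the page URL in the usage comment is a placeholder, and nothing in it is specific to this thread.

```python
from html.parser import HTMLParser
from urllib.error import HTTPError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return all anchor targets on a page as absolute URLs."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if the request fails outright."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check/0.1"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # 4xx/5xx responses arrive here
    except Exception:
        return None     # DNS failure, timeout, refused connection, ...

# Usage (network required):
#   page = "http://www.example.com/flagged-page"   # placeholder URL
#   html = urlopen(page).read().decode("utf-8", errors="replace")
#   for link in extract_links(html, page):
#       status = check_link(link)
#       if status is None or status >= 400:
#           print("possibly broken:", link, status)
```

One caveat: some servers reject HEAD requests, so a 405 or None result is a candidate for re-checking with GET rather than proof of a broken link.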
Related Questions
-
Will noindex pages still get link equity?
We think we get link equity from some large travel domains to white-label versions of our main website. These pages are noindex because they have the same URLs and content as our main B2C website, and have canonicals to the pages we want indexed. Question is: is there REALLY link equity to pages on our domain which have "noindex,nofollow" on them? Secondly, we're looking to put all these white-label pages on a separate structure, to better protect our main indexed pages from duplicate-content risks. The best bet would be to put them in a subfolder rather than on a subdomain, yes? That way, even though the pages are still noindex, we'd get link equity from these big domains to www.ourdomain.com/subfolder, where we wouldn't to subdomain.ourdomain.com? Thank you!
Reporting & Analytics | HTXSEO
-
Cannot split a domain into different properties in Search Console (Webmaster Tools)
Dear Moz Community, I hope you can give me a hand with the following questions. I'm in charge of SEO for an ecommerce site in LATAM. Its service is available in several countries, so each country has its own subdirectory, e.g. /ar /pe /co /bo /cl /br, etc. (in the future we will move to different ccTLDs). I have been recommended to split the site into different Search Console (Webmaster Tools) properties, one per subdirectory, but when I create a new property for a subdirectory, let's say www.domain.com/ar, Webmaster Tools starts creating a property for www.domain.com/ar/ (NOTICE THE LAST SLASH), and it returns an error since that page doesn't exist. What do you recommend I do? Best wishes, Pablo Lòpez C
Reporting & Analytics | pablo_carrara
-
Domain Authority and Links drastically decreased in less than a month
I'm fairly new to SEO, and am left completely perplexed about why our two SEO clients' links and DA decreased so drastically in so little time. Looking at their stats on Moz, between May and June their DAs went down by 3-5 (and they were fairly low to begin with), and their external links were halved! This is rather concerning to me, as building their external links has been my focus for the past few months for both of them. There are a number of tweaks that I could make to site structure in terms of meta descriptions and page-title length (though I can't fix the latter, as the clients want to keep the titles as is), but there are no critical errors. I have also placed a lot of filters on both of their Google Analytics properties, as they were getting what I just now realized was a ton of spam, though I don't see how that would affect anything. I just can't imagine what led to all of those external links suddenly dropping. I imagine the drastic drop in those led to the DA decrease, but again, I'm very new, so I'm not certain. Any help at all would be greatly appreciated. I have been panicking quite a bit since I've seen this!
Reporting & Analytics | everestagency
-
Google Webmaster Tools - When will the links go away!?
About 9 months back we thought having an extremely reputable company build our client some local citations would be a good idea. You definitely know this citation company, but I'll leave names out. Regardless, it was our mistake to cut corners. Google Webmaster Tools quickly picked up these new citations and added them to the links section. One of these citations spawned a complete mess of about 60K+ links on their network of sites, through ridiculous subdomains for every state in the country and many other domain variations. We immediately went into removal mode and had the site's webmaster take down the bad links from their site. This outreach process took about a month. The bad links (60K+) have not been on the spam site for well over 6 months, but GWT still shows them in the "links to your site" section. Majestic, Bing, and OSE only displayed the bad links for a brief time. Why is Webmaster Tools still showing these links after 6+ months? We typically see GWT update about every 2 weeks, a month tops. Any ideas? Could a changed robots.txt on the bad site prevent Google from updating the links displayed in GWT? We have submitted a disavow, but Google replied with "no manual penalty". We even blasted the bad site with Fiverr links, in hopes that Google would re-crawl them. No luck with anything we do. We have patiently waited for way too long. The rankings for this site got crushed on Google after these citations. How do we fix this? Should we worry about this? Any advice would really help. Thanks so much in advance.
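As an aside on the disavow file itself (a general note, not specific to this case): it is a plain-text list, one entry per line, where `domain:` entries disavow every link from that host and bare URLs disavow a single page; lines starting with `#` are comments. A sketch with placeholder hostnames:

```
# Citation-network links we could not get removed
domain:spammy-citations.example
domain:ny.spammy-citations.example
http://spammy-citations.example/states/ny/listing.html
```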
Reporting & Analytics | zadro
-
'Search Queries Report' in Webmaster Tools Question
Hi, How much do you use the Search Queries report in Webmaster Tools to research current rankings/movements? It does look like a great tool, but the data doesn't seem to be spot on. For example, a keyword over a week might show flux in position: let's say 6.0, then 9.2 for 3 days, then back to 6.0. But I check the SERPs for this keyword every day and didn't see any movement?!?! Is this a good tool for you?
Reporting & Analytics | activitysuper
-
Why does Webmaster Tools show different avg. positions?
Hi everyone, I am new to SEOmoz; I love this amazing SEO school 🙂 My question is about Webmaster Tools. Webmaster Tools shows that my site's avg. position is 34, and it's getting better every week. I even see some new queries, and new avg. positions for those queries. That's great! But when I search, the avg. position is between 54-60 and I don't see any new queries. Shouldn't we rely on what Webmaster Tools tells us about avg. positions? Thanks a lot
Reporting & Analytics | EzgiGunyel-InfinPixels
-
Will JavaScript-generated links affect my bounce rate?
Hi all, I run a site called Applicable Jobs (http://www.applicablejobs.com), and from analysing my analytics I notice my bounce rate is unusually high, at around 85%. I'm keen to get this right down, as I've read recently that a high bounce rate is a metric Google uses in determining positioning in the SERPs. I honestly don't think it's the quality of my content, because I feel it's genuinely useful to my target audience, but I'm wondering if the way my jobs list is generated is causing an issue. At the moment my job listings are generated through JavaScript, so I can have nice effects and use a bit of Ajax, but if Google crawls the page, it obviously won't be able to see the listings. So I'm wondering: when a user comes to the site and clicks on one of the job listings, does the Google Analytics code recognise that click, given that the link is generated through JavaScript? Thanks
Reporting & Analytics | Benji87
-
Googlebot encountered extremely large numbers of links on your site??? How do I resolve this?
I am working on a site with over 30 million pages. Every time I get about one million indexed, I get a message in Google Webmaster Tools saying "Googlebot encountered extremely large numbers of links on your site". The indexing then starts dropping like a rock. I need to get the site indexed. Please help!
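One general mitigation for very large sites (a sketch under assumptions, not advice given in this thread) is to hand Google the URLs directly via segmented XML sitemaps instead of relying on crawl discovery through huge link hubs. The sitemap protocol caps each file at 50,000 URLs, with a sitemap index tying the files together. A minimal generator in Python; the domain names are placeholders:

```python
# Split a large URL list into 50,000-URL sitemap files plus a sitemap index.
# All domain names below are placeholders.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50_000  # maximum URLs per sitemap file (protocol limit)

def build_sitemap(urls):
    """Return one <urlset> document for up to SITEMAP_LIMIT URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n")

def build_sitemaps(urls, base="http://www.example.com/sitemaps"):
    """Return (filename, xml) pairs: one file per 50k chunk, then the index."""
    files = []
    for i in range(0, len(urls), SITEMAP_LIMIT):
        name = f"sitemap-{i // SITEMAP_LIMIT + 1}.xml"
        files.append((name, build_sitemap(urls[i:i + SITEMAP_LIMIT])))
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name, _ in files)
    index = ('<?xml version="1.0" encoding="UTF-8"?>\n'
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
             f"{index_entries}\n</sitemapindex>\n")
    files.append(("sitemap-index.xml", index))
    return files
```

You would then submit only the index file in Webmaster Tools, and the indexed-vs-submitted counts per segment help show which sections Google is actually dropping.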
Reporting & Analytics | GlobalFlex