4xx errors
-
Hi
I checked my campaign for errors on my site and got a report showing a lot of 404 (broken/dead link) errors. How can I view the source of each broken link in order to fix it?
Thank you!
-
I don't see any "drop down" on the broken links SEOmoz lists... I would also like to know where to find the source of each broken link! Why not show the source right inside the error report?
-
Google Webmaster Tools -- create an account there if you don't already have one -- is also a useful way to find 404 errors and track down their sources. Once your account is set up, go to Diagnostics > Crawl Errors > HTTP (this is the default tab for the "Crawl Errors" screen).
-
You want to see where the links to the broken page are coming from?
3 options:
-
Xenu - http://home.snafu.de/tilman/xenulink.html - run it on your site and it will tell you. It's not the prettiest solution, though.
-
Webmaster tools - Diagnostics > Crawl Errors - click on the page that is a 404 and it will tell you where the links are coming from.
-
SEOmoz - Set up a campaign in the Pro section and the crawl will give you your 4xx errors. Click on that, then in each broken link's drop-down there is an 'Explore Links' option. That will open up Open Site Explorer for that URL and show you where the links are coming from.
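If none of those fit your workflow, you can also script the check yourself. Below is a minimal sketch in Python (using the third-party requests and beautifulsoup4 packages) that crawls a site and records which page each broken link was found on. The start URL is a placeholder, and a real crawler should also respect robots.txt and rate limits:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder: your homepage
MAX_PAGES = 200                         # safety limit for the crawl

def find_broken_links(start_url):
    domain = urlparse(start_url).netloc
    queue = deque([start_url])
    crawled = {start_url}
    status_cache = {}   # url -> status code (None if the request failed)
    broken = []         # (source_page, broken_url, status)

    while queue and len(crawled) < MAX_PAGES:
        page = queue.popleft()
        try:
            resp = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            url = urljoin(page, a["href"]).split("#")[0]
            if urlparse(url).scheme not in ("http", "https"):
                continue
            if url not in status_cache:  # check each distinct URL only once
                try:
                    status_cache[url] = requests.head(
                        url, allow_redirects=True, timeout=10).status_code
                except requests.RequestException:
                    status_cache[url] = None
            status = status_cache[url]
            if status is None or status >= 400:
                broken.append((page, url, status))   # `page` is the source
            elif urlparse(url).netloc == domain and url not in crawled:
                crawled.add(url)
                queue.append(url)                    # follow internal links only
    return broken

if __name__ == "__main__":
    for source, url, status in find_broken_links(START_URL):
        print(f"{status}\t{url}\tlinked from {source}")
```

Each line of output is a broken URL together with the page that links to it, which is exactly the "source" you need in order to fix the link.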
-
Related Questions
-
Google Webmaster indicates robots.txt access error
Seems that Google has not been crawling due to an access issue with our robots.txt
Reporting & Analytics | jmueller0823
Late 2013 we migrated to a new host, WPEngine, so things might have changed; however, this issue appears to be recent. A quick test shows I can access the file. This is the Google Webmaster Tools message:

http://www.growth trac dot com/: Googlebot can't access your site. January 17, 2014. Over the last 24 hours, Googlebot encountered 62 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 8.8%.

Note the above message says 'over the last 24 hours', yet the date is Jan-17. This is the response from our host:

Thanks for contacting WP Engine support! I looked into the suggestions listed below and it doesn't appear that these scenarios are the cause of the errors. I looked into the server logs and I was only able to find 200 server responses on the /robots.txt. Secondly, I made sure that the server wasn't overloaded. The last suggestion doesn't apply to your setup on WP Engine. We do not have any leads as to why the errors occurred. If you have any other questions or concerns, please feel free to reach out to us.

Google is crawling the site -- should I be concerned? If so, is there a way to remedy this? By the way, our robots file is very lean, only a few lines, not a big deal. Thanks!
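Since the host only saw 200s but Googlebot reported an 8.8% error rate, the failures may be intermittent, and a single manual test won't catch that. A rough sketch like the one below (Python with the third-party requests package; the URL is a placeholder) polls the file and logs any failures. It won't reproduce Googlebot's network path, but a nonzero error rate here would point back at the server or network:

```python
import time
import requests

ROBOTS_URL = "https://www.example.com/robots.txt"  # replace with your domain

checks, failures = 0, 0
for _ in range(120):            # ~1 hour at a 30-second interval
    checks += 1
    try:
        resp = requests.get(ROBOTS_URL, timeout=10,
                            headers={"User-Agent": "robots-check/1.0"})
        if resp.status_code != 200:
            failures += 1
            print(time.strftime("%H:%M:%S"), "HTTP", resp.status_code)
    except requests.RequestException as exc:
        failures += 1
        print(time.strftime("%H:%M:%S"), "request failed:", exc)
    time.sleep(30)

print(f"{failures}/{checks} checks failed ({failures / checks:.1%} error rate)")
```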
Sitemap 404 error
I have generated a .xml sitemap for the site www.ihc.co.uk. The sitemap seems fine, but when submitting it to Webmaster Tools it returns a 404 error. Has anyone experienced this before? I have deleted and redone the process, tried different XML sitemap generators, and even cleared the cache along the way.
Reporting & Analytics | dentaldesign
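A common cause is submitting a URL that differs slightly from where the file actually lives (http vs https, www vs non-www, or a redirected path). As a quick sanity check, a sketch like this (Python with the third-party requests package; the exact sitemap path is an assumption) shows what the submitted URL really returns:

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.ihc.co.uk/sitemap.xml"  # use the exact URL you submitted

resp = requests.get(SITEMAP_URL, timeout=10, allow_redirects=False)
print("HTTP status:", resp.status_code)   # a 301/302 here can also confuse crawlers
if resp.status_code == 200:
    root = ET.fromstring(resp.content)    # raises ParseError on invalid XML
    print("Root tag:", root.tag)          # expect ...urlset or ...sitemapindex
    print("Entries:", len(root))
```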
Can 500 errors hurt rankings for an entire site or just the pages with the errors?
I'm working with a site that had over 700 server errors (HTTP 500) after a redesign in April. Most of them were fixed in June, but there are still about 200. Can 500 errors affect rankings sitewide, or just the pages with the errors? Thanks for reading!
Reporting & Analytics | DA2013
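On finding the remaining 200: errors concentrated on a handful of URLs usually point to one broken template or query, while errors spread everywhere suggest a server-level problem. A rough tally from the access log, as in the sketch below (Python standard library; the log path and the common/combined log format are assumptions about your server), can show which it is:

```python
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # adjust to your server

# common/combined log format: ... "GET /path HTTP/1.1" 500 1234 ...
line_re = re.compile(r'"[A-Z]+ (\S+) [^"]*" (\d{3})')

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.search(line)
        if m and m.group(2).startswith("5"):   # count only 5xx responses
            counts[m.group(1)] += 1

for path, n in counts.most_common(20):
    print(f"{n:6d}  {path}")
```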
Moz is showing different "errors" than Webmaster tools
I have set up my Moz campaign and the crawl errors are showing multiple duplicate content and duplicate page title issues; however, when I check my Webmaster Tools data, these errors do not show up. Is this normal, and which should I trust?
Reporting & Analytics | LabelMedia
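When two tools disagree, a direct check settles it: fetch the URLs Moz flags and compare their title tags yourself. A minimal sketch (Python with the third-party requests and beautifulsoup4 packages; the URL list is a placeholder):

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/page-a/",
    "https://www.example.com/page-b/",
]

by_title = defaultdict(list)
for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    by_title[title].append(url)

# any title shared by more than one URL is a genuine duplicate
for title, urls in by_title.items():
    if len(urls) > 1:
        print(f"duplicate title {title!r}:")
        for u in urls:
            print("   ", u)
```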
SEO Moz Errors
We have SEOmoz errors and warnings showing up, yet we have cleaned them up. The same errors were showing up in Google's Webmaster Tools, but after we corrected them they no longer appear as crawl errors there. Why is SEOmoz different, and why does it continue to show issues we have already corrected?
Reporting & Analytics | RNK
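Crawl-based tools only reflect their most recent crawl, so stale errors are common right after a fix. A quick way to confirm the corrections are live, independent of either tool, is to request the flagged URLs directly; a minimal sketch (Python with the third-party requests package; the URL list is a placeholder):

```python
import requests

FLAGGED_URLS = [
    "https://www.example.com/old-page/",
    "https://www.example.com/another-page/",
]

for url in FLAGGED_URLS:
    try:
        # allow_redirects=False shows the raw status the crawler would see
        resp = requests.get(url, timeout=10, allow_redirects=False)
        print(resp.status_code, url)
    except requests.RequestException as exc:
        print("ERR", url, exc)
```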
Google Webmasters DNS error
Hi, in my Webmaster Tools I have a yellow triangle stating that there is a DNS error preventing Google from crawling my site. The site is indexed, and 'Fetch as Google' seems OK, but the triangle is still there every time I check. The whois records all have the correct information and point to HostGator, who I am using. I have contacted them and they said everything seems OK. Should I just carry on as normal with my link building, since the site is indexed, or investigate further? Cheers, Stuart
Reporting & Analytics | stuart42
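DNS warnings in Webmaster Tools are often intermittent, so a single successful lookup doesn't rule the problem out. A rough sketch like this (Python standard library only; the hostname is a placeholder) repeats the lookup for a few minutes and counts failures; persistent failures would justify escalating with HostGator rather than carrying on:

```python
import socket
import time

HOST = "www.example.com"  # replace with your domain

failures = 0
for _ in range(60):       # ~5 minutes at a 5-second interval
    try:
        socket.getaddrinfo(HOST, 80)
    except socket.gaierror as exc:
        failures += 1
        print(time.strftime("%H:%M:%S"), "lookup failed:", exc)
    time.sleep(5)

print(f"{failures}/60 lookups failed")
```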
Solving link and duplicate content errors created by Wordpress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legitimate the way they are, but obviously I need to fix the problem, so...

Duplicate content error: this is a result of being able to browse the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Does anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped?

Too many links error: SEOmoz tells me my main blog pages have too many links (both url.com/blog/ and url.com/blog-2/); these pages show excerpts of the six most recent blog posts. I feel like this should not be an error... does anyone know of a solution that will keep the site from being penalized for these pages? Thanks!
Reporting & Analytics | RUNNERagency
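Before excluding the tag archives, it may help to measure how much they actually overlap. The sketch below (Python with the third-party requests and beautifulsoup4 packages; the URLs and the article selector are assumptions about the theme's markup) hashes each tag page's post content and flags exact duplicates:

```python
import hashlib

import requests
from bs4 import BeautifulSoup

TAG_URLS = [
    "https://www.example.com/blog/tag/seo/",
    "https://www.example.com/blog/tag/marketing/",
]

seen = {}
for url in TAG_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # concatenate the visible text of each post excerpt on the archive page
    text = " ".join(a.get_text(" ", strip=True) for a in soup.find_all("article"))
    digest = hashlib.sha1(text.encode()).hexdigest()
    if digest in seen:
        print("duplicate content:", url, "==", seen[digest])
    else:
        seen[digest] = url
```

If the archives do substantially duplicate each other, excluding them from crawling and the sitemap, as the question suggests, is the usual direction.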
500 errors and impact on google rankings
Since the launch of our newly designed website about six months ago, we have been experiencing a high number of 500 server errors (>2000). Attempts to resolve these errors have been unsuccessful to date, and we have just started to notice a consistent, sustained drop in rankings despite our hard-sought efforts to correct them. Two questions: can very high levels of 500 errors adversely affect our Google rankings? And if so, what type of specialist has the expertise to investigate and fix this issue? I should also mention that the sitemap goes down on a regular basis, which some have said is due to the size of the site (>500 pages). I don't know if that's part of the same problem. Thanks.
Reporting & Analytics | ahw