4xx errors
-
Hi
I checked my campaign to look for errors on my pages and got a report showing a lot of 404 (broken or dead link) errors. How can I view the source of each broken link in order to fix it?
Thank you!
-
I don't see any "drop down" on the broken links SEOmoz lists... I would also like to know where to find the source of each broken link! Why doesn't it show the source right inside the error report?
-
Google Webmaster Tools -- create an account there if you don't already have one -- is also a useful way to find 404 errors and track down their sources. Once your account is set up, go to Diagnostics > Crawl Errors > HTTP (this is the default tab on the "Crawl Errors" screen).
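Before chasing every entry in that report, it can be worth re-checking which flagged URLs still return a 404, since the Crawl Errors list is sometimes stale. Below is a minimal sketch using Python and the third-party requests package; the URLs in the list are placeholders for whatever you export from the report.

```python
# Minimal sketch: re-check URLs flagged as 404 in Webmaster Tools.
# The URLs below are placeholders for the list exported from Crawl Errors.
import requests

flagged_urls = [
    "http://www.example.com/old-page",
    "http://www.example.com/deleted-post",
]

for url in flagged_urls:
    try:
        # A HEAD request is usually enough to read the status code without
        # downloading the whole body (some servers reject HEAD; fall back to GET).
        response = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```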
-
You want to see where the links to the broken page are coming from?
3 options:
-
Xenu - http://home.snafu.de/tilman/xenulink.html - run that on your site and it will tell you. It's not the prettiest solution though.
-
Webmaster tools - Diagnostics > Crawl Errors - click on the page that is a 404 and it will tell you where the links are coming from.
-
SEOmoz - Set up a campaign in the Pro section and the crawl will flag 4xx errors. Click on that, then on each broken link's drop-down there is an 'Explore links' option. That will open OpenSiteExplorer for that link and show you where your links are coming from. (If you'd rather script a similar check yourself, see the sketch below.)
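Here is a rough do-it-yourself version of that check, sketched in Python. It assumes the third-party requests and beautifulsoup4 packages are installed, and the start URL is a placeholder for your own site. It crawls the site, flags links that return a 4xx status, and reports the page each broken link was found on. It is only an illustration, not a production crawler: there is no politeness delay, robots.txt handling, or limit on crawl depth.

```python
# Minimal sketch: crawl a site, flag links returning 4xx, and record the
# page each broken link was found on (roughly what Xenu does).
# "http://www.example.com/" is a placeholder for your own site.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"
DOMAIN = urlparse(START_URL).netloc

to_visit = [START_URL]
visited = set()
broken = []  # (source page, broken link, status code)

while to_visit:
    page = to_visit.pop()
    if page in visited:
        continue
    visited.add(page)

    try:
        response = requests.get(page, timeout=10)
    except requests.RequestException:
        continue

    soup = BeautifulSoup(response.text, "html.parser")
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page, anchor["href"])
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            continue
        if 400 <= status < 500:
            broken.append((page, link, status))
        elif urlparse(link).netloc == DOMAIN and link not in visited:
            to_visit.append(link)

for source, link, status in broken:
    print(f"{status}  {link}  (linked from {source})")
```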
-
Related Questions
-
UTM source errors in Google Search Console
Dear Friends, I need help with UTM source and UTM medium errors. There are 300 such errors on my site, which I think is affecting it. The URL appended at the end is utm_source=rss&utm_medium=rss&utm_campaign= How do I resolve this? Please help me with it. Thanks
Reporting & Analytics | marketing910
-
Crawl errors for pages that no longer exist
Hey folks, I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense. The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Alternatively, some of them list other old, removed pages in the "Linked From" list. Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about? Thanks!
Reporting & Analytics | BrianAlpert780
-
Can 500 errors hurt rankings for an entire site or just the pages with the errors?
I'm working with a site that had over 700 server (500) errors after a redesign in April. Most of them were fixed in June, but there are still about 200. Can 500 errors affect rankings sitewide, or just the pages with the errors? Thanks for reading!
Reporting & Analytics | DA20130
-
Spike in 404 Errors from a redirecting domain
All, The non www version of a site I own redirects to the www version. Recently, WMT began showing a big spike in 404 errors (1,000+) on the non www version of the site. Site traffic is off about 15% since the spike in 404 errors. There was also a brief period about two weeks ago, where the site went down due to an issue with the code that has now been resolved. Any ideas how WMT is showing 404 errors on redirected pages? Thanks, John
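For anyone debugging a similar situation, it can help to see exactly what the non-www hostname returns, hop by hop, rather than letting the browser follow the redirect silently. Below is a minimal sketch using Python's requests library; the domain and path are placeholders for your own URLs.

```python
# Minimal sketch: follow a redirect chain manually and print each hop, so you
# can see whether the non-www host really returns a 301 or serves a 404 first.
# "example.com" and the path are placeholders for your own domain and URLs.
from urllib.parse import urljoin

import requests

url = "http://example.com/some-old-page"

for _ in range(10):  # cap the number of hops in case of a redirect loop
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(response.status_code, url)
    location = response.headers.get("Location")
    if not location:
        break
    # The Location header may be relative, so resolve it against the current URL.
    url = urljoin(url, location)
```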
Reporting & Analytics | JSOC0
-
Not Found and Not Followed errors in Webmaster Tools
Just noticed in Webmaster Tools that we are showing 150 Not Found errors on our site; the majority appear to be from old blog posts that have been deleted. Is this damaging from an SEO perspective, on a scale from 1-10? Also, we have over 100,000 Not Followed errors throughout the same site; is this damaging from an SEO perspective, on a scale from 1-10? Thanks in advance, Andy
Reporting & Analytics | First-VehicleLeasing0
-
Solving link and duplicate content errors created by Wordpress blog and tags?
SEOmoz tells me my site's blog (a Wordpress site) has 2 big problems: a few pages with too many links and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, sooooo... Duplicate content error: error is a result of being able to search the blog by tags. Each blog post has mutliple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped? Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/) - these pages have excerpts of 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized by these pages? Thanks!
Reporting & Analytics | RUNNERagency0
-
How serious are the Duplicate Page Content and Tag errors?
I have a travel booking website which reserves flights, cars, hotels, vacation packages and cruises. I encounter a huge number of Duplicate Page Title and Duplicate Page Content errors. This is expected because of the nature of my website: if you look for flights between Washington DC and London Heathrow, you will get at least 60 different options with the same content and title tags. How can I go about reducing the harm, if any, of duplicate content and meta tags on my website, knowing that I will invariably have multiple pages with the same content and tags? I would appreciate your advice. S.H
Reporting & Analytics | sherohass0
-
Spider 404 errors linked to purchased domain
Hi, My client purchased a domain based on the seller "promising lots of traffic". Subsequent investigation showed it was a scam and that the seller had been creative in Photoshop with some GA reports. Nevertheless, my client had redirected the acquired domain to their primary domain (via the domain registrar). From the time the acquired domain was redirected to the point when we removed the redirect, the web log files showed a high volume of spider/bot 404 errors relating to an online pharmacy - viagra, pills etc. The account does not seem to have been hacked. No additional files are present and the rest of the logs seem normal. As soon as the redirect was removed, the spider 404 errors stopped. Aside from the advice about acquiring domains promising traffic, which I've already discussed with my client, does anybody have any ideas about how a redirect could cause the 404 errors? Thanks
Reporting & Analytics | bjalc20110