Can we retrieve all 404 pages of my site?
-
Hi,
Can we retrieve all 404 pages of my site?
Is there any syntax I can use in Google search to list just the pages that return a 404? Or is there a tool/site that can scan all the pages in Google's index and give me this report?
Thanks
-
The 404s in Webmaster Tools relate to crawl errors, so they will only appear if the pages are internally linked. The report is also limited to the top 1,000 pages with errors.
-
Set up a Webmaster Tools account for your site. You should be able to see all the 404 error URLs.
-
I wouldn't try to manually remove that number of URLs. Mass individual removals can cause their own problems.
If the pages are correctly returning a 404, they will be removed, but it is a slow process. For the number you are looking at, it will most likely take months. Google has to recrawl all of the URLs before it even knows that they are returning a 404 status. It will then likely wait a while and crawl them again before removing them. That's a painful truth, and there really isn't much you can do about it.
It might (and this is very arguable) be worth ensuring that there is a crawl path to the 404 content. For example, a link from a high-authority page to a "recently removed content" list that links to a selection of the removed URLs, with the list regularly refreshed. This will help that content get recrawled more quickly, but it also means you are linking to 404 pages, which might raise quality-signal issues. Something to weigh up.
What would work more quickly is a mass removal of particular directories (if you are lucky enough that some of your content fits that pattern). If you have a lot of URLs in mysite.com/olddirectory and there is definitely nothing you want to keep in that directory, then you can lose big swathes of URLs in one hit - see here: https://support.google.com/webmasters/answer/1663427?hl=en
Unfortunately that is only good for directories, not wildcards. However, it's very helpful when it is an option.
So, how do you find those URLs? (Your original question!)
Unfortunately there is no way to get them all back out of Google. Even if you do a search for site:www.mysite.com and save all of the results, it will not return anywhere near the number of results that you are looking for.
I tend to do this by looking for patterns and removing those to find more patterns. I'll try to explain:
- Search for site:www.mysite.com
- Scroll down the list until you start seeing a pattern (e.g. mysite.com/olddynamicpage-111.php, mysite.com/olddynamicpage-112.php, mysite.com/olddynamicpage-185.php, etc.).
- Note that pattern (return to it later to check that those URLs all return a 404).
- Now search again with that pattern excluded: site:www.mysite.com -inurl:olddynamicpage
- Return to step 2
Do this (a lot) and you start understanding the patterns that have been picked up. There are usually a few that account for a large number of the incorrectly indexed URLs. In a recent cleanup I did, they were almost all related to "faceted search gone wrong".
Once you know the patterns, you can check that the correct headers are being returned so that the URLs start dropping out of the index. If any are directory patterns, you can remove them in big hits through GWMT.
It's painful. It's slow, but it does work.
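The manual pattern hunt above can be partly automated. This is a rough sketch (not an official tool, and the file/function names are just illustrative): it assumes you have pasted the URLs from your site: search results into a list, collapses runs of digits so olddynamicpage-111.php and olddynamicpage-185.php count as one pattern, and tallies how many indexed URLs each pattern accounts for.

```python
import re
from collections import Counter
from urllib.parse import urlparse

def url_pattern(url):
    """Reduce a URL to a coarse pattern: keep the path, collapse runs of digits."""
    path = urlparse(url).path
    return re.sub(r"\d+", "#", path)

def top_patterns(urls, n=10):
    """Count URLs per pattern so the biggest offenders surface first."""
    counts = Counter(url_pattern(u) for u in urls)
    return counts.most_common(n)

# Example: URLs collected from scrolling through site: search results
urls = [
    "http://www.mysite.com/olddynamicpage-111.php",
    "http://www.mysite.com/olddynamicpage-185.php",
    "http://www.mysite.com/about.html",
]
for pattern, count in top_patterns(urls):
    print(f"{count:4d}  {pattern}")
```

Once a pattern accounts for a big chunk of the list, spot-check a few of its URLs for the 404 header and add it to your -inurl: exclusions.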
-
Yes - what I need is to know which of the Google-indexed pages are 404s.
Google does not remove the dead 404 pages for months, so I was thinking of manually adding them for removal in Webmaster Tools, but I first need to find all of the URLs that are indexed but return a 404.
-
OK - that is a bit of a different problem (and a rather familiar one). So the aim is to figure out what the 330 "phantom" pages are and then how to remove them?
Let me know if I have that right. If I have, I'll give you some tips based on doing the same with a few million URLs recently. I'll check first, though, as it might get long!
-
Thank you.
I will try explaining my query again; you can correct me if the above is still the solution:
1. My site has 70K pages.
2. Google has indexed 500K pages from the site (site:mysitename shows this).
3. We have noindexed most of them, which has got the count down to 300K.
Now I want to find which of those 300K indexed pages return a 404. Webmaster Tools shows a few hundred 404s, but I am sure there are many more. Can we scan the index, rather than the site, to find the URLs that Google has indexed but that return a 404?
-
As you say, on-site crawlers such as Xenu and Screaming Frog will only tell you when you are linking to 404 pages, not where other people are linking to your 404 pages.
There are a few ways you can get to this data:
Your server logs: All 404 errors will be recorded on your server. If someone links to a non-existent page and that link is ever followed by a single user or a crawler like Googlebot, it will be recorded in your server log files. You can access those directly (or pull 404s out of them on a regular, automatic basis). Alternatively, most hosting comes with some form of log analysis built in (AWStats being one of the most common), which will show you the 404 errors.
That isn't quite what you asked, as it doesn't mean that they have all been indexed; however, it will be an exhaustive list that you can then check against.
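As a rough illustration of pulling 404s out of a raw access log, here is a minimal sketch. It assumes the common/combined log format that Apache and nginx use by default; if your server logs a different layout, the regular expression will need adjusting.

```python
import re
from collections import Counter

# Matches the request and status fields of a common/combined-format log line, e.g.:
# 1.2.3.4 - - [10/Oct/2013:13:55:36 -0700] "GET /old-page.html HTTP/1.1" 404 209
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def paths_with_404(log_lines):
    """Return a Counter of request paths that were answered with a 404."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits

sample = [
    '1.2.3.4 - - [10/Oct/2013:13:55:36 -0700] "GET /old-page.html HTTP/1.1" 404 209',
    '1.2.3.4 - - [10/Oct/2013:13:55:40 -0700] "GET /index.html HTTP/1.1" 200 5120',
]
print(paths_with_404(sample))  # only /old-page.html is counted
```

In practice you would feed it the lines of your access log file; sorting the counter by hit count shows which dead URLs are still being requested most often.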
Check that backlinks resolve: Download all of your backlinks (OSE, Webmaster Tools, Ahrefs, Majestic), look at each target and see what header is returned. We use a custom-built tool called linkwatchman to do this on an automatic, regular basis. For an occasional check, though, you can download the links into Excel and use the excellent SEO Tools for Excel to do this for free ( http://nielsbosma.se/projects/seotools/ <- best SEO tool around).
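If you don't have SEO Tools for Excel handy, the same header check can be scripted. A minimal sketch, assuming you have exported the backlink target URLs to a plain list (the helper names here are illustrative, not from any of the tools mentioned):

```python
import urllib.request
import urllib.error

def status_of(url, timeout=10):
    """Return the final HTTP status code for a URL (urlopen follows redirects)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        return err.code  # 404, 410, 500, ... still give us a status code
    except urllib.error.URLError:
        return None      # DNS failure, timeout, refused connection

def classify(status):
    """Bucket a status code so broken backlink targets are easy to filter."""
    if status is None:
        return "unreachable"
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if status in (404, 410):
        return "gone"
    return "error"
```

Feed your deduplicated target URLs through status_of, keep everything classified as "gone", and you have the list of indexed-or-linked URLs that are 404ing.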
Analytics: As long as your error pages trigger the Google Analytics tracking code, you can get the data from here as well. It is most helpful when the error page either sets a custom variable or uses a virtual URL (404/requestedurl.html, for instance). Isolate those pages and look at where the traffic came from.
-
It will scan and list all results for you: 301 redirects, 200s, 404 errors, 403 errors. However, Screaming Frog can only spider up to 500 URLs in the free version.
If you have more, I suggest going with Xenu Link Sleuth. Download it, crawl your site, and get all pages, including 404 server errors, with no URL limit.
-
Thanks, but this would be scanning the pages on my site. How will I find the 404 pages that are indexed in Google?
-
Hey there
Screaming Frog is a great (and free!) tool that lets you do this. You can download it here
Simply insert your URL and it will spider all of the URLs it can find on your site. It will then serve up a ton of information about each page, including whether it is a 200, 404, 301 and so on. You can even export this information into Excel for easy filtering.
Hope this helps.
My target page is: www.letsflycheaper.com/business-class.php. With all my keywords I am page 2 and I have a real hard time getting on the first page, but if I look at my competitors like: www.wholesale-flights.com with a Domain Authority of 'just' 50, crappy backlinks and so on, they are all on the first page with almost all of my keywords that I want to target. What do I do wrong? Can you maybe give me a couple tips on where I should focus on more? Hopefully you guys can help me... Kind Regards, Ramon van Meer0