How to find 20 hidden 404s
-
Hello,
We have about twenty 404s left to find. How do you find them when:
1. They don't show up in Google Webmaster Tools.
2. They don't have any other internal or external pages linking to them.
3. They don't show up in site:domain.com (we have 9,000 pages and only 600 show up; I fixed the 404s among those 600).
4. They are probably causing high bounce rates.
5. They're not in the sitemap.
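One thing worth noting given those constraints: a crawler can only discover URLs that something still links to, while the server's access logs record every 404 it actually serves, orphaned or not. As a minimal sketch, assuming an Apache/Nginx-style "combined" log file (the file name and format here are assumptions, adjust for your server):

```python
import re
from collections import Counter

# Matches the request path and status fields of an Apache/Nginx "combined" log line.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def find_404s(log_path):
    """Tally every path that returned a 404, most frequent first."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and match.group("status") == "404":
                hits[match.group("path")] += 1
    return hits.most_common()

# Usage sketch:
#   for path, count in find_404s("access.log"):
#       print(count, path)
```

Sorting by hit count also tells you which of the twenty are worth fixing first, since the ones nobody requests barely matter.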
Thanks!
-
Once you buy the paid version, you should be able to crawl the entire site without even increasing the RAM allocation.
-
Right, the paid version won't stop! You can see on their website that the URL limit is removed once you buy a licence: http://www.screamingfrog.co.uk/seo-spider/licence/
It should catch everything, but you could contact their support just to make sure; they are very responsive over Twitter.
-
Yes, but if it only checked 397 URLs in the free version, is it going to stop there in the paid version as well? Just making sure.
Also, will it catch everything?
-
Yes, I just ran a crawl of 500,000 URLs.
-
The free version checked 397 URLs. Will the purchased version check all 9000?
-
You can try crawling the site with Screaming Frog; make sure you tick the “Check Links Outside Folder” box in the Spider configuration menu.
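Since a crawl still only reaches linked URLs, it can also help to re-check any historical URL list you have (an old sitemap, a CMS export, or a landing-page report from analytics) directly for dead pages. A rough sketch using only the Python standard library — the URL list itself is hypothetical:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(url, timeout=10):
    """Return the HTTP status code for a URL via a HEAD request."""
    request = Request(url, method="HEAD")
    try:
        with urlopen(request, timeout=timeout) as response:
            return response.status
    except HTTPError as err:
        return err.code    # 4xx/5xx responses raise, but carry the status code
    except URLError:
        return None        # DNS failure, timeout, refused connection, etc.

def find_dead(urls):
    """Filter a list of URLs down to the ones returning 404."""
    return [url for url in urls if check_status(url) == 404]

# Usage sketch:
#   dead = find_dead(line.strip() for line in open("old-sitemap-urls.txt"))
```

A HEAD request keeps the check light, though a few servers answer HEAD differently from GET, so spot-check anything surprising in a browser.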