Dismiss crawl diagnostics error
-
Hello everyone,
Is there a way to dismiss some errors in the Crawl Diagnostics tool so they don't appear again? Some of these errors are, by their nature, never going to be fixed. For example, 'Title too long' errors that point to threads on my forum - it doesn't make sense to change the title of a thread posted by a user just to make the error disappear from the Crawl Diagnostics tool.
Otherwise the Crawl Diagnostics interface gets cluttered with errors I will never fix anyway.
I wonder how others deal with this problem.
Thanks.
-
Ryan, I completely agree. A list, whatever it may be, is of little value unless you can tick off items on that list as completed.
However, overall the Crawl Diagnostics tool is very helpful.
-
I asked the help desk this exact question. In short, the answer was No.
The official explanation was that the tool is designed to err on the side of offering too much information rather than too little.
My reply: that's not helpful. A report is useful when it provides data that is either informative or actionable. The very first time I looked at a site crawl report and noticed the "title too long" errors, I investigated each one. I resolved many issues, but many others were simply acknowledged, and no action is going to be taken.
The site involved has a forum section. The forum software automatically appends the site's title to all titles, which is not a bad thing at all. Also, thread titles are controlled by users, and sometimes they choose lengthy titles. I could adjust scripts to truncate titles at 70 characters, but why? To satisfy a tool which is designed to assist me? That's an approach I decided against long ago.
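For anyone who does want to go the truncation route, here is a minimal sketch in Python. Everything in it is a hypothetical illustration (the `truncate_title` helper and the 70-character cutoff come from the warning being discussed, not from any forum package):

```python
# Hypothetical sketch (not part of any forum software): truncate
# user-supplied thread titles to the ~70-character limit that
# "title too long" warnings are based on, cutting at a word
# boundary where possible and appending an ellipsis.
def truncate_title(title: str, limit: int = 70) -> str:
    if len(title) <= limit:
        return title
    # Reserve one character for the ellipsis, then back up to the
    # last space so a word is not chopped mid-way.
    cut = title[:limit - 1].rsplit(" ", 1)[0]
    return cut + "…"
```

Of course, as noted above, whether satisfying the tool is worth altering user-generated content is a separate question entirely.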
I wholeheartedly agree. We should be able to say, "Hey Roger, thanks for letting me know. The first time was informative, but now you are being a pain in the ass." I still wish to be informed about NEW title-too-long errors, but I want to be able to disable the warnings for "errors" which have already been acknowledged.
Until the Moz team upgrades the tool, or someone creates a better crawl tool, we are stuck with this issue. I would be interested to hear if anyone has found a better tool elsewhere.
Related Questions
-
Unsolved: Blog archive pages in Crawl Error Report
Hi there, I'm new to Moz Pro and have a question. My scan shows archive pages as having crawl issues, but this is because Yoast is set up to block robots on these pages. Should I be allowing search engines to crawl these pages, or am I fine to leave them as I have them set up already? Any advice is greatly appreciated.
Moz Pro | mhenshall
-
Why is Link Count smaller than Internal Links in Crawl Test report?
We recently ran the crawl test report and for most of our pages we are getting 1150 internal links but 40-50 as the link count. Why is there such a big disparity?
Moz Pro | usdmseo
-
What is the best approach to handling 404 errors?
Hello All - I'm new here and working on the SEO for my site www.shoottokyo.com. When I find 4xx (client error) pages, what is the best way to deal with them? For example, I am finding an error like this: http://shoottokyo.com/2010/11/28/technology-and-karma/ This may have been caused when I updated my permalinks from shoottokyo.com/2011/09/postname to shoottokyo.com/postname. I was using the plugin Permalinks Moved Permanently to fix them. Sometimes I find something like http://shoottokyo.com/a-very-long-week/www.newscafe.jp and can tell that I simply have a bad link to News Cafe, so I can go to the post and correct it, but in the case of the first one I can't find where the crawler even found the problem. I'm using WordPress. Is it best to just use a plugin like Redirection for the rest of the errors where I cannot find the source of the issue? Thanks, Dave
Moz Pro | ShootTokyo
-
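Since the permalink change described above follows a predictable pattern (`/2011/09/postname` → `/postname`), a redirect plugin or server rewrite rule can 301 the old URLs in bulk rather than one at a time. Here is a hedged Python sketch of that mapping; the `redirect_target` helper and the regex are illustrations of the idea, not code from any actual plugin:

```python
import re
from typing import Optional

# Hypothetical sketch of the permalink change described above:
# old date-based paths like /2011/09/postname (day segment optional)
# map to slug-only paths like /postname. A redirect plugin or
# rewrite rule would apply this same pattern as a 301 redirect.
DATE_PERMALINK = re.compile(r"^/\d{4}/\d{2}(?:/\d{2})?/(?P<slug>[^/]+?)/?$")

def redirect_target(path: str) -> Optional[str]:
    """Return the new slug-only path for an old date-based path, else None."""
    m = DATE_PERMALINK.match(path)
    return f"/{m.group('slug')}" if m else None
```

A bad inbound link like the News Cafe one would not match the pattern and would still need a manual fix, which matches the split the poster describes.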
Functionality of SEOmoz crawl page reports
I am trying to find a way to ask SEOmoz staff this question because I think it is a functionality question, so I checked the SEOmoz Pro resources. I also had no responses to it in the forum, so here it is again. Thanks much for your consideration! Is it possible to configure the SEOmoz Rogerbot crawler (which generates the crawl diagnostic reports) to obey the instructions in individual page headers and in the http://client.com/robots.txt file? For example, there is a page at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007 that has <meta name="robots" content="noindex"> in the header. This themed Quote of the Day page is intentionally duplicated at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2004 and at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2010, but all three have <meta name="robots" content="noindex"> in them, so Google should not see them as duplicates - and in Webmaster Tools it does not. So the page should not be counted 3 times? But it seems to be. How do we generate a report of the actual pages shown as duplicates so we can check? We do not believe Google sees them as duplicate pages, but Roger appears to. Similarly, for http://truthbook.com/contemplative_prayer/ the http://truthbook.com/robots.txt file tells Google to stay clear. Yet we are showing thousands of duplicate page content errors while Google Webmaster Tools, configured as described, shows only a few hundred. Anyone? Jim
Moz Pro | jimmyzig
-
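The behavior being asked for hinges on the crawler reading each page's robots meta tag before counting it in duplicate reports. Here is a minimal sketch of such a check using Python's standard library; the `RobotsMetaParser` class and `is_noindex` helper are hypothetical illustrations of the idea, not how Rogerbot actually works:

```python
from html.parser import HTMLParser

# Hypothetical sketch (not Rogerbot's logic): detect a page-level
# <meta name="robots" content="noindex"> tag, so noindex pages
# could be skipped when reporting duplicate content.
class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            # attrs is a list of (name, value) pairs; value may be None.
            a = {k: (v or "") for k, v in attrs}
            if a.get("name", "").lower() == "robots" and \
               "noindex" in a.get("content", "").lower():
                self.noindex = True

def is_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

A crawler applying a filter like this before tallying duplicates would report the three Quote of the Day URLs the same way Google Webmaster Tools does, i.e. not at all.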
Crawl report - duplicate page title/content issue
When the crawl report is finished, it says there are duplicate content and duplicate page title issues. However, there is a correctly formatted canonical tag on the pages, so I just wondered if this is a bug or if anyone else is having the same issue? For example, I'm getting an error warning for this page: http://www.thegreatgiftcompany.com/categories/categories_travel?sort=name_asc&searchterm=&page=1&layout=table
Moz Pro | KarlBantleman
-
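A crawler that honored canonical tags would collapse parameter variants like the `?sort=...&page=...` URL above under their declared canonical before flagging duplicates. Here is a small sketch of that grouping step; the `group_by_canonical` helper and the example.com URLs are made up for illustration, and this is not a claim about how the Moz crawler is implemented:

```python
# Hypothetical sketch (not the Moz crawler's logic): group crawled
# URLs by the canonical URL each page declares in its <link> tag,
# so parameter variants collapse into one entry instead of being
# reported as separate duplicates.
def group_by_canonical(pages: dict) -> dict:
    """pages maps each crawled URL to its declared canonical URL."""
    groups = {}
    for url, canonical in pages.items():
        groups.setdefault(canonical, []).append(url)
    return groups
```

With grouping like this, a duplicate report would list one row per canonical URL, with the variants attached, rather than one error per variant.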
Order of urls in SEOMoz crawl report
Is there any rhyme or reason to the order of urls in the SEOMoz crawl report, or are the urls just listed in random order?
Moz Pro | LynnMarie
-
An error in the SeoMoz On page note?
Hello folks, Whenever I go to the On-Page link in SEOmoz, some of my links show an F grade. But when I click on one of them to see the page's detail, it shows an A grade. Have you seen the same problem? Which grade should I rely on? Thanks!!
Moz Pro | jgomes
-
Why is blocking the SEOmoz crawler considered a red "error?"
Please see the attached image (Y3Vay.png).
Moz Pro | vkernel