Crawl Diagnostics
-
My site was crawled last night and the crawl found 10,000 errors, due to a robots.txt change implemented last week, in between Moz crawls. This is obviously very bad, so we corrected it this morning. We do not want to wait until next Monday (6 days) to see if the fix has worked. How do we force a Moz crawl now?
Thanks
-
It's a dotnetblogengine.com blog. It's open source, but I'm not sure where to start.
-
Why so many duplicates? As it's a blog I suspect it's something to do with tags and/or categories.
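One quick way to confirm that is to export the affected URLs from the crawl report and count them by their first couple of path segments. Here is a rough Python sketch of the idea; the export file name and the URL structure are assumptions, not something taken from your report:

from collections import Counter
from urllib.parse import urlsplit

# Assumes the duplicate-content URLs have been exported to a plain text file,
# one URL per line (the file name here is made up).
with open("duplicate_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

counts = Counter()
for url in urls:
    # Group by the first two path segments, e.g. "Blog/category" or "Blog/tag".
    segments = urlsplit(url).path.strip("/").split("/")
    counts["/".join(segments[:2]) or "(root)"] += 1

for prefix, n in counts.most_common(10):
    print(f"{n:6d}  {prefix}")

If tag or category archives dominate the counts, that tells you which templates are generating the duplicates.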
Instead of trying to hide the problem using the robots.txt file, can you tackle the root cause directly?
-
Hi,
As Chris says, I don't think there is a way to force a refresh of your campaign crawls, but the crawl test tool should give you an indication of whether the relevant pages are still producing duplicate content issues, or whether the fix seems to be reducing them.
That being said, I don't think that robots.txt is the best way to approach duplicate content issues in general. Check out this guide for best practices. It is also worth noting that duplicate content issues can often be solved by simply removing or adjusting the differently formatted links that produce them in the first place (though this depends a lot on which CMS you are using and what the root cause of the duplicate content is).
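As a rough illustration of that last point, here is a minimal sketch (with hypothetical URLs and parameters) of how several differently formatted links often resolve to the same page once query strings, fragments and trailing slashes are stripped; those link variants, rather than the crawler, are usually the thing to fix:

from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    # Keep scheme, host and path only; drop query string and fragment,
    # and normalise the trailing slash.
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

# Hypothetical link variants that would each be crawled as a separate page.
variants = [
    "http://www.example.com/Blog/post/my-article",
    "http://www.example.com/Blog/post/my-article?tag=seo",
    "http://www.example.com/Blog/post/my-article/?utm_source=rss",
]

for v in variants:
    print(v, "->", canonicalize(v))

All three variants print the same normalised form, which is the URL that internal links (and a rel=canonical tag) should point at.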
-
Thanks
There are 9,000 duplicate content and duplicate page title errors, caused by my blog. To allow just the main site and the blog posts, I have added the following to the robots.txt:
User-agent: *
Allow: /Blog/post/
Disallow: /Blog
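For reference, here is a quick way to sanity-check those rules before the next crawl, using Python's standard-library robots.txt parser. The sample URLs are made up, and note that Google applies longest-match precedence rather than strict file order, though for rules this simple the result should be the same:

from urllib.robotparser import RobotFileParser

# The rules as added to robots.txt.
rules = """\
User-agent: *
Allow: /Blog/post/
Disallow: /Blog
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical URLs: the main site and blog posts should stay allowed,
# while category/tag pages under /Blog should come back blocked.
for url in [
    "http://www.example.com/",
    "http://www.example.com/Blog/post/my-article",
    "http://www.example.com/Blog/category/news",
    "http://www.example.com/Blog/?tag=seo",
]:
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")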
Is this a good way to fix it?
-
I'm pretty sure that you're not able to force a refresh of your campaign stats in between your normal weekly crawls. This tool will crawl the site, but it doesn't refresh your campaign. Specifically, what errors were found that you're trying to get rid of?
-
Hi,
I think this will do what you are after: http://pro.moz.com/tools/crawl-test. It is limited to 3,000 pages, but it should give you an idea of whether the fix is working as you expect.
-
Thanks but I require a Moz crawl first.
-
"Submit URL to Index," which allows you to submit new and updated URLs that Google themselves say they "will usually crawl that day"
-
Related Questions
-
SEOmoz crawl: 4XX (Client Error) - how to find where the errors are?
I got eight 404 errors with the SEOmoz crawl, but the report does not say where the 404 pages are linked from (like it does for duplicate content), or am I missing something? Thanks
-
Crawl report - duplicate page title/content issue
When the crawl report is finished, it says that there are duplicate content/page title issues. However, there is a canonical tag that is formatted correctly, so I just wondered if this was a bug or if anyone else was having the same issue? For example, I'm getting an error warning for this page: http://www.thegreatgiftcompany.com/categories/categories_travel?sort=name_asc&searchterm=&page=1&layout=table
-
Wiped from Google Top 50 and Need Diagnostic Help
Hey all, On the 4th of January, Webmaster Tools showed a massive drop in the number of impressions our site (http://spotcolorstudio.com) gets from Google. We went from over 500 to around 100 and we haven't recovered. That week's SEOmoz keyword report showed we were wiped from the top 50 for everything we were tracking except our branded terms. I've seen no indicators as to why this might have happened. Our Domain Authority hasn't changed. I haven't received any malware notices in Webmaster Tools. GetListed.org displayed our Google Places listing as not present, despite being able to click through and see our listing displaying as "active." Is it possible there's something wrong with the DNS that I'm missing? What could cause a complete wiping like this that wouldn't trigger an alert in Webmaster Tools? Any help, guidance or suggestions will be greatly appreciated! Craig
-
Campaigns - crawled
The new crawl shows Pages Crawled: 2. I have many 404 and other errors and wanted to start working on them tomorrow, but the new crawl only crawled two pages and doesn't show any errors. What's the problem and what can I do? Yoseph
-
Joined yesterday; today crawl errors (incorrectly) show as zero...
Hi. We set up our SEOmoz account yesterday, and the initial crawl showed a number of errors and warnings which we were in the process of looking at and resolving. I logged into SEOmoz today and it's showing 0 errors: Pages Crawled: 0 | Limit: 10,000, Last Crawl Completed: Nov. 27th, 2012, Next Crawl Starts: Dec. 4th, 2012. Errors, warnings and notices show as 0, and the issues found yesterday show only in the change indicators. Is there no way of getting to the results seen yesterday other than waiting a week? We were hoping to continue working through the found issues!
-
Ruling out subfolders in pro tool crawl
Is there a way to "rule out" a subfolder in the Pro dashboard site crawl? We're working on a site that has 500,000+ pages in the forums, but it's the CMS pages we're optimizing, and we don't want to spend the 10k limit on forum pages.
-
How do I get to know the pages crawled by SEOmoz?
My SEOmoz campaign says that "n" number of pages were crawled. How do I get access to the list of pages crawled by SEOmoz?