Crawl Diagnostics
-
My site was crawled last night and 10,000 errors were found, due to a robots.txt change implemented last week in between Moz crawls. This is obviously very bad, so we corrected it this morning. We do not want to wait until next Monday (6 days) to see whether the fix has worked. How do we force a Moz crawl now?
Thanks
-
It's a dotnetblogengine.com blog. It's open source, but I'm not sure where to start.
-
Why so many duplicates? As it's a blog, I suspect it's something to do with tags and/or categories.
Instead of trying to hide the problem with the robots.txt file, can you tackle the root cause directly?
-
Hi,
As Chris says, I don't think there is a way to force a refresh of your campaign crawls, but the crawl test tool should give you an indication of whether the relevant pages are still producing duplicate content issues or whether the fix is reducing them.
That being said, I don't think robots.txt is the best way to approach duplicate content issues in general. Check out this guide for best practices. It is also worth noting that duplicate content issues can often be solved by simply removing or adjusting the differently formatted links that produce them in the first place (though this depends a lot on which CMS you are using and on the root cause of the duplicate content).
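For example, when the same post is reachable at more than one URL, a rel="canonical" tag is a standard root-cause fix. A minimal sketch with a hypothetical URL (where it goes depends on your CMS's templates):

<!-- In the <head> of every alternate version of a post (date-based URL,
     tag-based URL, tracking-parameter URL, etc.), point search engines
     at the one preferred address. -->
<link rel="canonical" href="http://www.example.com/Blog/post/my-post.aspx" />

Search engines then consolidate those duplicates onto the canonical URL instead of treating each variant as a separate page.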
-
Thanks
9,000 duplicate content and duplicate page title errors, caused by my blog. To allow only the main site and the blog posts, I have added the following to robots.txt:
User-agent: *
Allow: /Blog/post/
Disallow: /Blog
Is this a good way to fix it?
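One caveat with the robots.txt approach: Disallow stops compliant crawlers from fetching those pages, but it does not remove URLs that are already indexed, and Moz's crawler will simply stop reporting on them rather than confirming the duplicates are resolved. If the duplicates are tag, category, or archive listings, a meta robots tag in those templates is a more direct fix. A rough sketch, assuming your blog engine lets you edit the listing templates:

<!-- In the <head> of tag/category/archive listing templates only:
     keep these pages out of the index, but let crawlers follow the
     links through to the individual posts. -->
<meta name="robots" content="noindex, follow" />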
-
I'm pretty sure you're not able to force a refresh of your campaign stats in between your normal weekly crawls. The crawl test tool will crawl the site, but it doesn't refresh your campaign. Specifically, what errors were found that you're trying to get rid of?
-
Hi,
I think this will do what you are after: http://pro.moz.com/tools/crawl-test. It's limited to 3,000 pages, but it should give you an idea of whether the fix is working as you expect.
-
Thanks, but I require a Moz crawl first.
-
"Submit URL to Index," which allows you to submit new and updated URLs that Google themselves say they "will usually crawl that day"
Related Questions
-
Lag time between MOZ crawl and report notification?
I did a lot of work to one of my sites last week and eagerly awaited this week's MOZ report to confirm that I had achieved what I was trying to do, but alas I still see the same errors and warnings in the latest report. This was supposedly generated five days AFTER I made the changes, so why are they not apparent in the new report? I am mainly referring to missing metadata, long page titles, duplicate content and duplicate title errors (due to crawl and URL issues). Why would the new crawl not have picked up that these have been corrected? Does it rely on some other crawl having updated (e.g. Google or Bing)?
Moz Pro | Gavin.Atkinson
-
Order of URLs in SEOMoz crawl report
Is there any rhyme or reason to the order of URLs in the SEOMoz crawl report, or are they just listed in random order?
Moz Pro | LynnMarie
-
Should I be running my crawl on our www address or our non-www address?
I currently run our crawl on oursitename.com, but am wondering if it should be run on www.oursitename.com instead.
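Whichever version you choose, the usual practice is to 301-redirect the other host to it so crawlers and links converge on a single canonical version, then point your Moz campaign at the one you kept. A sketch for Apache's .htaccess (an assumption about the stack; IIS would use a web.config rewrite rule instead):

RewriteEngine On
# Redirect bare-domain requests to the www host (hypothetical domain)
RewriteCond %{HTTP_HOST} ^oursitename\.com$ [NC]
RewriteRule ^(.*)$ http://www.oursitename.com/$1 [R=301,L]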
Moz Pro | THMCC
-
Slowing down SEOmoz Crawl Rate
Is there a way to slow down the SEOmoz crawl rate? My site is pretty huge, and I'm getting 10k pages crawled every week, which is great. However, I sometimes get multiple page requests in one second, which slows down my site a bit. If this feature exists, I couldn't find it; if it doesn't, it would be a great one to have, similar to what Googlebot offers. Thanks.
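If the problem is request frequency, the Crawl-delay directive in robots.txt may help; Moz's crawler (rogerbot) is generally reported to honor it, though that is worth confirming against the current documentation. A sketch:

User-agent: rogerbot
# Ask for a ~10-second pause between requests (assumption: rogerbot
# honors Crawl-delay; verify in Moz's docs before relying on it)
Crawl-delay: 10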
Moz Pro | corwin
-
How do I force a crawl?
In the campaign overview it reads that 0 pages were crawled. I also got an email saying that a comprehensive audit will be done in 7 days, but the 'crawl in progress' wheel disappeared. I think it stopped, and I need to submit that report to substantiate buying the tool! How do I force a crawl?
Moz Pro | ilhaam
-
Our Duplicate Content Crawled by SEOMoz Roger, but Not in Google Webmaster Tools
Hi Guys, We're new here and I couldn't find the answer to my question, so here it goes: we had SEOMoz's Roger crawl all of our pages, and he came up with quite a few errors (duplicate content, duplicate page titles, long URLs). Per our CTO, and using Google Webmaster Tools, we informed Google not to index those duplicate content pages. Our long URL errors are redirected to SEF URLs. What we would like to know is whether Roger is able to tell that we have instructed Google not to index these pages. My concerns: should we still be worried if Roger is still crawling those pages while the errors are not showing up in our Webmaster Tools, and is there a way we can let Roger know, so they don't come up as errors in our SEOMoz tools? Thanks so much, e
Moz Pro | RichSteel
-
Pages Crawled: 250 | Limit: 250
One of my campaigns says: Pages Crawled: 250 | Limit: 250. Is this because it's new, and the limit will go up to 10,000 after the crawl is complete? I have a pro account and 4 other campaigns running, and I should be allowed 50,000 pages in total.
Moz Pro | MirandaP
-
Question about when new crawls start
Hi everyone, I'm currently using the trial of SEOmoz and I absolutely love what I'm seeing. However, I have 2 different websites (one has over 10,000 pages and one has about 40 pages). I've noticed that the smaller website is crawled every few days, while the larger site hasn't been crawled in a few days. Although both campaigns state that the sites won't be crawled until next Monday, is there any way to get the crawl to start sooner on the large site? I ask because I've implemented some changes, based on the recommendations on this site, that will likely decrease the number of pages that are crawled. So, I'm excited to see the potential changes. Thanks, Brian
Moz Pro | beeneeb