Site crawler hasn't crawled my site in 6 days!
-
On 4/23 I requested a site crawl. My site only has about 550 pages, so how can we get faster crawls?
-
Crawl In Progress
Subdomain
www.taxproblem.org
Submitted 5:47pm GMT
Apr 23rd 2011
Is it a good idea to delete this (I can't see how in the tool)? I was thinking that maybe it got lost or something; it's now been 10 days and I wanted to make another request.
I was mistaken before: the site wasn't crawled, it was just a campaign update.
-
I listed my crawls in progress with the dates, so yes.
-
That one uses the exact same technology without the pretty report. Have you used it lately? It was upgraded a week ago, and I believe it's intended to replace the one in SEOmoz Labs.
-
Actually, that's the link to the worst one, which doesn't do anything. Yesterday my site was crawled, so it actually did take a week, but as you can see the results here are lagging...
Crawl In Progress
Subdomain
www.tax-audit.us
Submitted 1:36am GMT
Apr 25th 2011
Crawl In Progress
Subdomain
www.taxproblem.org
Submitted 5:47pm GMT
Apr 23rd 2011
-
That seems long to me. If you need a crawl, you can run it here instead and get results in 24 hours: up to 2 per day and 3,000 pages. It won't give you that nice red, blue, and yellow report, but the basic info is the same; you just need to analyze it manually. http://pro.seomoz.org/tools/crawl-test
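If you'd rather not wait on a queue at all, a site that size is also easy to check yourself. Here's a rough DIY sketch that crawls same-host pages and collects the URL, status code, and title for manual review. This is only an illustration, not the crawl-test tool's method: the start URL is a placeholder, and a real crawler should also respect robots.txt and throttle its requests.

```python
import re
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


def extract_title(html):
    """Return the <title> text from an HTML string, or '' if absent."""
    m = re.search(r'<title[^>]*>(.*?)</title>', html, re.I | re.S)
    return m.group(1).strip() if m else ''


def extract_links(html, base_url):
    """Return absolute links from an HTML string, same host only."""
    host = urlparse(base_url).netloc
    links = set()
    for href in re.findall(r'href=["\'](.*?)["\']', html, re.I):
        url = urljoin(base_url, href).split('#')[0]
        if urlparse(url).netloc == host:
            links.add(url)
    return links


def crawl(start_url, max_pages=550):
    """Breadth-first crawl; yields (url, status, title) per page."""
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) <= max_pages:
        url = queue.pop(0)
        try:
            resp = urlopen(url, timeout=10)
            html = resp.read().decode('utf-8', errors='replace')
        except Exception as exc:
            # HTTPError carries .code (e.g. 404); anything else -> 'error'
            yield url, getattr(exc, 'code', 'error'), ''
            continue
        yield url, resp.getcode(), extract_title(html)
        for link in extract_links(html, url) - seen:
            seen.add(link)
            queue.append(link)


# Example usage (prints one line per page; redirect to a file for review):
# for url, status, title in crawl('http://www.taxproblem.org'):
#     print(url, status, title, sep='\t')
```

That gives you the same raw material as the crawl report (status codes, missing titles, dead links); spotting the problems is then a spreadsheet exercise.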
-
I'm not sure, to be honest, but before it slowed down a little I tended to get a crawl once a week per site, whether the site was large or small.
-
Thanks. Once it's fixed, how long should we have to wait for crawls?
-
That's not actually that long, and there's a queue anyway. I had a small site wait a couple of weeks recently, but it did get crawled; it was just in the queue for a while first. Just give it a while longer and it'll be fine.