Only one page crawled - need help
-
I ran a website through SEOmoz which has many URLs, but the report shows Pages Crawled: 1. Why does this happen? My campaign limit is fine. What should I do to get all pages crawled in the SEOmoz report?
-
Hi,
Thanks for your answer.
Yes, it was. After a few hours I got my complete report from SEOmoz.
Thanks again
-
Hi,
Thanks for your reply.
I think that was due to the initial crawl. All my pages are now crawled and the report shows proper results for the website.
I hope that never happens again.
Thanks again
-
Hi lucidsoftech,
The following is from the Crawl Diagnostics section of the SEOmoz Help Hub and might help identify the issue:
Why didn't you crawl all my pages? I only got a one-page crawl. Looks like you missed a bunch!
If you suspect you didn't get a full crawl, or Rogerbot missed some of your pages, there could be several reasons why this happens.
- We only crawl a maximum of 400 links per page. If several pages of your site all have the same 400 links on each page, we may not discover all the pages on your site. Try optimizing your navigation to reduce the number of links.
- Does your navigation rely on JavaScript? Can visitors navigate your site with JavaScript disabled? SEOmoz doesn’t crawl JavaScript, so make sure your links work in all browsing environments.
- Does your site consist of multiple subdomains? Crawls are restricted to the subdomain you set your campaign up on. This means that in general, we don't crawl multiple subdomains. You can solve this by specifying a “Root Domain” crawl in the setup process. (This requires starting a new campaign.)
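As a rough self-check against the 400-links-per-page limit mentioned above, you can count the anchor tags on any of your pages with a few lines of standard-library Python. This is a minimal sketch, not an official SEOmoz tool, and the sample HTML is a placeholder; in practice you would fetch a page from your own site:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href are actual links a crawler can follow.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

if __name__ == "__main__":
    # Placeholder markup; fetch your own page's HTML here instead.
    sample = '<a href="/home">Home</a><a href="/about">About</a><p>text</p>'
    n = count_links(sample)
    print(f"{n} links found" + (" (over the 400-link limit)" if n > 400 else ""))
```

If a template (header, footer, mega-menu) alone pushes every page near 400 links, the crawler may never reach deeper pages.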
I have also run across a couple of single-page crawl issues in the past which related to secured pages and an edge case. These were identified and sorted out very quickly by the SEOmoz Help Team.
If the suggestions above don't identify the issue for you, it's best to go to the Help Hub and click the Contact Our Help Team button to get in touch with them directly.
Hope that helps,
Sha
-
If I'm not mistaken, that figure shows the pages that were crawled with errors on them. The 1 here indicates a client error (probably a 404).
It also looks like this could have been your initial crawl from Rogerbot. You'll get a full crawl within 7 days. But just in case, check your robots.txt file to make sure everything in there is normal.
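To check the robots.txt angle, Python's standard `urllib.robotparser` can tell you whether a given user agent (Rogerbot is SEOmoz's crawler, per the thread above) is permitted to fetch a URL. This is a minimal sketch with a hypothetical robots.txt; substitute your own site's file:

```python
from urllib.robotparser import RobotFileParser

def is_crawlable(robots_txt, user_agent, url):
    """Return True if the given robots.txt text permits user_agent to fetch url."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

if __name__ == "__main__":
    # Hypothetical robots.txt that blocks rogerbot but leaves other crawlers alone.
    robots_txt = "User-agent: rogerbot\nDisallow: /\n"
    print(is_crawlable(robots_txt, "rogerbot", "https://example.com/"))   # False
    print(is_crawlable(robots_txt, "Googlebot", "https://example.com/"))  # True
```

If the first call prints False for your own robots.txt, that by itself would explain a one-page crawl.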