Have a campaign, but Moz states only 1 page has been crawled by SEOmoz bots. What needs to be done to have all the pages crawled?
-
We have a campaign running for a client in SEOmoz, and according to SEOmoz's data only 1 page has been crawled. The site has many pages, plus a new blog with more articles posted each month, yet Moz is not crawling anything aside from, perhaps, the home page. The odd thing is that Moz is reporting data on all the other inner pages for errors, duplicate content, etc.
What should we do so that all the pages get crawled by Moz? I don't want to delete the campaign and start over, as we followed all the steps properly when setting it up.
Thank you for any tips here.
-
That's a major glitch, David, if that happened. I would suggest it's something that needs to be checked out. As you indicate, there was a fallback in place previously; is that not the case with the new update?
Oh, and it's a glitch that cost me about 6 minutes of my life responding to an issue I never assumed could have the cause you described in the solution.
-
Hello!
It looks like you set up a subdomain campaign but left out the subdomain prefix "www". A warning should have popped up when you attempted to submit the settings; let me know if it never appeared. Unfortunately, you will have to delete the campaign and start over.
Here's an example of how you could set it up: http://www.screencast.com/t/Buu05s3EkiU
This also applies to your other question for the other domain.
Hope this helps!
David Lee
Moz Help Team -
Ensure you have no pages or directories blocked by your robots.txt file, and check whether Google has indexed your pages yet.
Is it a WordPress website?
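To check the robots.txt point above, you can test specific URLs against your rules with Python's standard library before blaming the tool. This is only a sketch; the URLs and rules below are placeholders, though "rogerbot" is the real user agent Moz's crawler identifies as:

```python
# Check whether robots.txt rules would block a crawler from given pages.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Normally you'd call rp.set_url("http://www.example.com/robots.txt")
# followed by rp.read(); here we parse example rules inline.
rp.parse("""
User-agent: *
Disallow: /private/
""".splitlines())

print(rp.can_fetch("rogerbot", "http://www.example.com/blog/post"))   # True
print(rp.can_fetch("rogerbot", "http://www.example.com/private/x"))   # False
```

If `can_fetch` returns False for pages you expect to be crawled, the robots.txt rules are the problem.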
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites; I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc. I am stumped about the agency websites being really huge, and about what all the options are to get good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages; if I put www.epa.gov in a campaign, what happens with the site having so many more pages than the 50,000-page crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site vs. some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
Page Rank vs Page and Domain Authority - who wins?
A client has found another SEO agency promising various things to do with link building. Most of these promises are based on links from sites with allegedly high PageRank. So my questions: PageRank seems to be fading out; am I safe to stay with PA and DA metrics instead? I don't agree with link building tactics and feel it should be more of a networking activity that provides USEFUL links to users... am I being too white hat and missing opportunities? The other company has promised a long list of links, including 100 SEO-friendly web directory listings, 200 PR 8 backlinks from Pinterest (which I thought was nofollow), and 10 long-lasting, high-quality mini websites (with three pages/posts, video and pictures). Am I right that this all sounds a little spammy, or is this really what I should be doing for my clients?
Moz Pro | SoundinTheory
-
Is SeoMOZ Crawl Diagnostics wrong here?
We've been getting a ton of critical errors (about 80,000) in SEOmoz's Crawl Diagnostics saying we have duplicate content on our client's e-commerce site. Some of the errors are correct, but a lot of the pages are variations like: www.example.com/productlist?page=1 www.example.com/productlist?page=2 However, in our source code we have used rel="prev" and rel="next", so in my opinion we should be all right. Would love to hear from you whether we have made a mistake or it is an error in SEOmoz. Here's a full paste of the script:
Moz Pro | Webdannmark
-
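For reference, rel="prev"/rel="next" pagination markup is declared in the `<head>` of each page in the series. A sketch of what page 2 of the productlist URLs mentioned above would typically contain (the page=3 URL is a made-up continuation for illustration):

```html
<!-- In the <head> of http://www.example.com/productlist?page=2 -->
<link rel="prev" href="http://www.example.com/productlist?page=1">
<link rel="next" href="http://www.example.com/productlist?page=3">
```

Note that this markup signals the pagination relationship to search engines, but a crawler that doesn't honor it can still report the pages as duplicates.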
Https subdomain campaign
Hi! How can I set up a campaign for a subdomain like this: https://sub.domain.com? Is it possible to use the https protocol? Thanks a lot! Jesus
Moz Pro | adesis
-
Need to find all pages that link to list of pages/pdf's
I know I can do this in OSE page by page, but is there a way to do it in a large batch? There are 200+ PDFs for which I need to figure out which pages (if any) link to them. I'd rather not do this page by page, but instead copy-paste the entire list of pages I'm looking for. Are there any tools you know of that can do this?
Moz Pro | ryanwats
-
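No batch tool is named in this thread, but if you can fetch the HTML of the candidate pages yourself, the matching step is simple to script. A minimal sketch using only Python's standard library; the page URLs, PDF path, and function name below are all made up for illustration:

```python
# Given a mapping of page URL -> HTML, report which pages link to
# any PDF from a target list.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def pages_linking_to(pages, target_pdfs):
    """Map each target PDF to the list of pages that link to it."""
    targets = set(target_pdfs)
    result = {pdf: [] for pdf in targets}
    for url, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            if href in targets:
                result[href].append(url)
    return result

# Example with made-up pages:
pages = {
    "http://example.com/a": '<a href="/files/report.pdf">Report</a>',
    "http://example.com/b": '<a href="/about">About</a>',
}
print(pages_linking_to(pages, ["/files/report.pdf"]))
# → {'/files/report.pdf': ['http://example.com/a']}
```

In practice you would fetch each page's HTML first (and normalize relative vs. absolute URLs), but the batch matching itself is just a loop like this.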
On-page links tool here at SEOmoz
Hi SEOmoz - first of all, thanks for the best SEO tools I have ever worked with (this is my first question in this forum, and I just subscribed as a paying customer after the 30-day trial you guys offer). My question: after working for several weeks on getting the number of links in our forum on www.texaspoker.dk down, we are somewhat surprised to see that we didn't succeed in lowering the numbers. For instance, this page: http://www.texaspoker.dk/forum/aktuelle-konkurrencer/coaching-projekt-bliver-du-den-udvalgte has (according to the SEOmoz SEO tool) 239 on-page links. Can this really be true? We can't find these links, and we actually did a lot to lower the number of links; for instance, forum members' pictures used to be links, and there was also a "go to top" link in each post in the forum. Thanks a lot.
Moz Pro | MPO
-
How do I archive a campaign?
Hi, my account says I have reached my limit (5) and I want to archive some of the current campaigns I have within those 5 (they were trial accounts). Can anyone advise on the best way to do this? At this stage I am not in a position to upgrade my account, so I am hoping this is still possible with my current membership; the Q&A, beyond saying it is easy, does not explain how to do it 🙂 HELP PLEASE?
Moz Pro | SueCook_TAOS
-
Status 404-pages
Hi all, One of my websites was crawled by SEOmoz this week. The crawl showed me 3 errors: 1 missing title and 2 client errors (4XX). One of these client errors is the 404 page itself! What's your suggestion about this error? Should a 404 page have the 404 HTTP status? I'd like to hear your opinion on this one! Thanks all!
Moz Pro | Partouter
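For what it's worth: yes, a custom "page not found" page should be served with a real 404 status code. If it returns 200, search engines see a "soft 404" and may index it. A minimal sketch using Python's standard library (the handler and page text are hypothetical, not anything Moz-specific):

```python
# Serve a custom error page with a genuine 404 status code.
from http.server import BaseHTTPRequestHandler, HTTPServer

class NotFoundHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The custom "not found" page is sent with status 404, not 200,
        # so crawlers recognize the page as missing rather than indexable.
        body = b"<h1>Sorry, that page does not exist.</h1>"
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# To try it out:
# HTTPServer(("127.0.0.1", 8000), NotFoundHandler).serve_forever()
```

The crawler flagging the 404 page as a client error is therefore expected behavior, not a problem to fix.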