Crawling issue
-
Hi,
I have to set up a campaign for a webshop. This webshop is a subdomain itself.
First question: The two subfolders I need to track are /nl_BE and /fr_BE. What is the best way to handle this? Should I set up a separate campaign for each subfolder, or should I just make one campaign and add tags to keywords?
**Second question:** it seems like Moz can't crawl enough pages. There are no disallows in the robots.txt. Should I try putting the following at the top of my robots.txt?
User-agent: rogerbot
Disallow:
Or is it because I want to crawl only a subdomain that it doesn't work?
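To be concrete, this is the whole file I have in mind (assuming rogerbot is the correct user-agent token for Moz's crawler; an empty Disallow line blocks nothing for that agent):

```
# Explicitly allow Moz's crawler everywhere
User-agent: rogerbot
Disallow:

# All other crawlers: also unrestricted
User-agent: *
Disallow:
```

If the existing robots.txt already allows everything, this shouldn't change crawl behavior; it would just make the permission for rogerbot explicit.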
Thanks
-
Hey,
Thanks for reaching out to us!
You can create a Campaign solely for a subdomain or subfolder by selecting the +Advanced setting in the Campaign set-up; just click the check box there and it will limit our Campaign audit to the pages on that specific subdomain or subfolder. From there, you can see in your Campaign Setting if you've set it up for just that chunk of your site, or for the entire root domain.
I've got a guide to this process that I think may help.
With regard to your second question, feel free to reach out to help@moz.com so that we can take a closer look.
Looking forward to hearing from you,
Eli
Related Questions
-
Unsolved What would the exact text be for robots.txt to stop Moz crawling a subdomain?
I need Moz to stop crawling a subdomain of my site, and I'm just checking what the exact text in the file should be to do this. I assume it would be:
User-agent: Moz
Disallow: /
But I'm just checking so I can tell the agency who will apply it, to avoid paying for their time on incorrect text! Many thanks.
Getting Started | | Simon-Plan0 -
Why does Moz only seem to be crawling a snapshot of the site I am working with?
I was wondering if anyone can help? I am using Moz to improve the SEO of a website I am working with. The website contains thousands of pages, yet for some reason Moz only seems to be crawling a small snapshot of it. I know there are particular pages that I added a couple of weeks ago, about 300 in total, and none of these were showing on the first crawl, so I ran another on-demand crawl and some of them showed up then. Despite this, it says it crawled around 700 pages, but there are close to 20-30 thousand live pages on the site. Any thoughts or guidance as to why the crawling may be stopping?
Getting Started | | dsmith8020200 -
Keyword List issue
Hi, I have created a keyword list with 500 phrases, and your software has been stuck on "Gathering Data" for hours now. I have closed, re-opened, and refreshed multiple times, and still nothing. Does it really take that long, or is there some sort of issue?
Getting Started | | Crowleymjc0 -
Attempts to fix Moz-recommended issues resulted in a drastic ranking drop.
I have a website built in Squarespace. https://www.ruffhaus.com/ I recently started working with Moz to track and improve organic SEO. After my initial site crawl, search visibility was reported at a whopping 2.38%. Moz showed several critical crawler issues; most were redirect, 4xx, and long-URL issues. So I started working on fixing the redirect and 4xx issues first. I thought this was a good thing, but now my already sad search visibility has dropped to 0.07% (-95.97%!). I also went from #1 on the keyword "brand implementation plan" (and 3 variations) to #27. What? Wondering where I went wrong and how to remedy it? This is all new to me, so I am sure I'm not providing all the info you need to answer my question. Hoping that providing the site URL will help. Fire away!
Getting Started | | RuffHaus0 -
Moz Not Crawling Angular SPA
I have a client that just launched a redesigned website using Angular as a single-page app. Google appears to be able to crawl the site just fine, but the Moz crawl is only finding one page. We have updated the .htaccess to allow Rogerbot and Dotbot, but it is still unable to crawl any pages other than the home page. Does anyone have experience with this, or ideas of why it won't crawl all pages and how to allow Moz to crawl them? There is a sitemap with approx. 390 pages. Thanks!
Getting Started | | PIN_Celler1 -
Mozbot Can Not Crawl Entire Domain
I'm trying to crawl Redken.com in Moz Analytics, and Search Diagnostics is only crawling 4 pages. The domain uses a "select your country" page the first time you visit, and it seems as though the bot is not getting beyond that (i.e., not clicking on "USA") and is therefore not crawling the rest of the domain. There is no country-specific URL other than redken.com. I've tried entering both "redken.com" and "www.redken.com" as the URL, but no luck. Any tips?
Getting Started | | LabeliumUSA0 -
A lot of duplicate content issues - does Moz understand canonical URLs?
Hi, since I subscribed to Moz, my Magento store has shown a lot of duplicate content issues. However, I did have a problem with canonical URLs at the time. That has been fixed for a couple of weeks now, and although I had 302 redirects before, I configured Magento to use 301s today. Moz has been flagging duplicate content for exactly the same Magento pages but with endings like store=us, store=aus, etc. (since I have several store views enabled). So I am wondering: does the canonical URL actually help Google skip these versions of the duplicate pages, does Moz also understand it, and will it reduce the number of duplicate content errors once the 301 redirects and canonical URLs have been properly set for a week or so? Thanks!
Getting Started | | speedbird12290 -
Daily crawl reports, are they wasting my time?
I am relatively new here; I have 5 campaigns. I get new crawl-complete reports almost every day for all of them. Wow, great, except when I check the reports nothing has changed. Even if I have gone in and changed things or fixed errors, the same ones are still there, and it takes 4-7 days for that work to show up. Every time I get one of these reports, I open it up and go through it without seeing the changes I implemented the previous days. I'll spend 20-30 minutes going over these and checking details. So the question is: are these reports wasting my time? Are they actually new reports, or am I just getting spammed with repeat notices every day?
Getting Started | | RandyFriesen0