How do I exclude my blog subfolder from being crawled along with my main domain (www.) folder?
-
I am trying to set up two separate campaigns, one for my blog and one for my main site.
While it is easy enough to do through the wizard, the results I am getting for my main site still include pages that are in my blog subfolder.
Please advise!
-
Hi,
I would like to know if this idea, of excluding a subfolder from a campaign, is available now? If so, how do I set it up?
We have a forum in a subfolder, and all the data from it obscures what I'm actually interested in when using SEOmoz for this website.
-
Hey,
Now that the new Moz Analytics platform is here, has this been implemented yet, or is it still on the roadmap? I would like to exclude a forum that lives in a subfolder, since working on SEO for the forum is not our main focus. It has so many URLs that it eats up our Moz crawl budget.
Thanks!
-
Hey Henry,
Thanks for writing in. Unfortunately, there isn't a way to exclude the blog from the crawl for the main site at this time. If you exclude our crawler from those pages, then we wouldn't be able to crawl for the separate blog campaign either. We do plan to allow you to ignore or remove certain pages from your crawl when we switch to the new Moz Analytics platform, but it may be some time before that is available for all of our users. I'm sorry for the inconvenience that causes in the meantime.
Please let me know if you have any other questions.
Chiaryn
Help Team Ninja
-
I was just thinking: you might be able to add a rule to your robots.txt telling the SEOmoz bot not to crawl that folder; I'm not sure if that's possible. But make sure you target only SEOmoz's bot (you don't want to tell Google not to crawl it).
That would stop the subfolder being crawled completely, though, which I don't think you want.
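As a sketch of that idea (assuming Moz's crawler still identifies itself as rogerbot, and using /forum/ as a stand-in for the subfolder), a user-agent-specific robots.txt rule can be checked with Python's standard-library parser before deploying it:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks only Moz's crawler (rogerbot)
# from the forum subfolder, leaving all other crawlers unrestricted.
rules = """
User-agent: rogerbot
Disallow: /forum/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# rogerbot may not fetch forum URLs...
print(rp.can_fetch("rogerbot", "http://example.com/forum/thread-1"))   # False
# ...but Googlebot (and any other agent) still can,
print(rp.can_fetch("Googlebot", "http://example.com/forum/thread-1"))  # True
# and rogerbot can still reach the rest of the site.
print(rp.can_fetch("rogerbot", "http://example.com/blog/post-1"))      # True
```

The key point is that the Disallow applies only to the group whose User-agent line matches, so a rule scoped to rogerbot does not affect Google.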
-
Yes, I am referring to the SEOmoz bot, and I understand that Google sees the two subfolders as one domain.
I would like them to be separate just for my own workflow, so I can manage each individually with no spillover.
But if it can't be done, it can't be done.
Thank you!
-
I'm guessing you mean the SEOmoz bot crawling your site? If so, why do you want to crawl them separately? Google won't; they are on the same domain.
To answer your question, I don't think it can be done.
Related Questions
-
Moz and HubSpot SSL - crawl error?
I'm getting an error message when Moz tries to crawl my site; however, when I check in Google Search Console, it returns no errors. Our site is hosted on HubSpot. Is Moz still having trouble crawling HubSpot sites that have enabled SSL? I read an article saying this should have been corrected in early 2017, but I'm still getting an error.
Moz Pro | jennygriffin
-
Moz Starter Crawl Not Working
Hello, I just added a new subdomain as one of my campaigns in Moz. The starter crawl report keeps coming back with just one page crawled (it should crawl up to 250 pages). I've deleted and re-added this subdomain three times, and it continues to present the same problem. I've even waited a week for the full crawl report, but that also showed just one page crawled. Does anybody know why this is happening? Thanks!
Moz Pro | jampaper
-
Functionality of SEOmoz crawl page reports
I am trying to find a way to ask SEOmoz staff this question because I think it is a functionality question, so I checked the SEOmoz Pro resources. I have also had no responses to it in the forum, so here it is again. Thanks much for your consideration! Is it possible to configure the SEOmoz Rogerbot error-finding bot (which makes the crawl diagnostic reports) to obey the instructions in individual page headers and in the http://client.com/robots.txt file? For example, the page at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007 has <meta name="robots" content="noindex"> in its header. This themed Quote of the Day page is intentionally duplicated at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2004 and at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2010, but all of them carry the noindex meta tag, so Google should not see them as duplicates, and in Webmaster Tools it does not. So why does the page seem to be counted three times? How do we generate a report of the actual pages shown as duplicates so we can check? We do not believe Google sees them as duplicate pages, but Roger appears to. Similarly, for http://truthbook.com/contemplative_prayer/, the http://truthbook.com/robots.txt file tells Google to stay clear, yet we are showing thousands of duplicate page content errors where Google Webmaster Tools, configured as described, shows only a few hundred. Anyone? Jim
Moz Pro | jimmyzig
-
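One quick way to double-check which of those URLs actually carry the noindex directive is to parse each page's meta robots tags. A small sketch using only the standard library (the hard-coded HTML string here is a stand-in for a fetched quote page):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.directives.append((attrs.get("content") or "").lower())

# Stand-in for the HTML of one of the quote pages.
html = ('<html><head><meta name="robots" content="noindex"></head>'
        '<body>Quote of the Day</body></html>')

parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)                           # ['noindex']
print("noindex" in ",".join(parser.directives))    # True
```

Running this over each URL variant would confirm whether every duplicate really does declare noindex, independent of what any particular crawler reports.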
Only one page crawled. Need help
I have run a website through SEOmoz which has many URLs, but the SEOmoz report shows Pages Crawled: 1. Why is this happening? My campaign limit is OK. Tell me what to do to get all pages crawled in the SEOmoz report.
Moz Pro | lucidsoftech
-
Crawl Diagnostics - Canonical Question
On one of my sites I have 61 notices for Rel Canonical. Is it bad to have these, or is this just informative?
Moz Pro | kadesmith
-
Can I exclude a sub-domain from SEOMoz campaigns?
We have recently implemented a white label site that is on a subdomain. The site employs noindex on most of its pages, I imagine due to duplicate content concerns on other white label versions of the site. It has led to a spike of over 14 thousand notices in our report. Is there a way to exclude a subdomain from the SEOmoz scans and reports?
Moz Pro | TSDigital
-
Which of these is the best guest blogging site?
Which of these is best: http://www.guestblogit.com/, http://www.bloggerlinkup.com/, http://www.Myblogguest.com, or https://www.helpareporter.com/? I like the look of MyBlogGuest.com so far. We want to guest-write articles to be published on quality sites in exchange for one or two links back (link building). Also, how do you choose article topics? So far my strategy is to look at the industry's biggest sites and to use the "Top Pages" tab in OSE to look for hot topics. Thanks!
Moz Pro | BobGW
-
Errors on my Crawl Diagnostics
I have 51 errors in my Crawl Diagnostics tool. 46 are 4xx Client Errors. Those 4xx errors are links to products (or categories) that we are not selling anymore, so they are inactive on the website, but Google still has the links. How can I tell Google not to index them? Could those errors (and warnings) be harming my rankings? (They went down from position 1 to 4 for the most important keywords.) Thanks,
Moz Pro | cardif