How do I exclude my blog subfolder from being crawled along with my main (www.) domain?
-
I am trying to set up two separate campaigns, one for my blog and one for my main site.
While it is easy enough to do through the wizard, the results I am getting for my main site still include pages that are in my blog subfolder.
Please advise!
-
Hi,
I would like to know whether this feature, excluding a subfolder from a campaign, is available now. If so, how do I set it up?
We have a forum in a subfolder, and all the data from it blurs what I'm really interested in when using SEOmoz for this website.
-
Hey,
now with the new Moz Analytics platform, has this been implemented yet, or is it still on the roadmap? I would like to exclude a forum located in a subfolder, as the forum is not a focus of our SEO work. It has so many URLs that it eats up our Moz crawl budget.
Thanks!
-
Hey Henry,
Thanks for writing in. Unfortunately, there isn't a way to exclude the blog from the main site's crawl at this time. If you excluded our crawler from those pages, we wouldn't be able to crawl them for the separate blog campaign either. We do plan to allow you to ignore or remove certain pages from your crawl when we switch to the new Moz Analytics platform, but it may be some time before that is available to all of our users. I'm sorry for the inconvenience that causes in the meantime.
Please let me know if you have any other questions.
Chiaryn
Help Team Ninja
-
I was just thinking: you might be able to use a robots.txt rule telling the SEOmoz bot not to crawl that folder. I'm not sure if that's possible, but make sure you target only SEOmoz's bot (you don't want to tell Google not to crawl it). Something like the sketch below.
That would stop the subfolder from being crawled completely, though, which I don't think is what you want.
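A minimal sketch of what that could look like, assuming the Moz/SEOmoz crawler identifies itself as "rogerbot" (worth confirming the current user-agent string in Moz's documentation) and using a placeholder example.com domain with a /forum/ subfolder. Crawlers only read robots.txt at the domain root, so the rule lives in the root file rather than inside the subfolder:

```python
# Hypothetical robots.txt at http://www.example.com/robots.txt -- it blocks only
# the assumed Moz/SEOmoz user-agent ("rogerbot") from the /forum/ subfolder and
# leaves every other crawler, including Googlebot, untouched:
#
#   User-agent: rogerbot
#   Disallow: /forum/
#
# Quick check of what that rule does, using Python's standard-library parser:
import urllib.robotparser

rules = """\
User-agent: rogerbot
Disallow: /forum/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The Moz crawler is kept out of the subfolder...
print(rp.can_fetch("rogerbot", "http://www.example.com/forum/some-thread"))   # False
# ...while Googlebot and the rest of the site are unaffected.
print(rp.can_fetch("Googlebot", "http://www.example.com/forum/some-thread"))  # True
print(rp.can_fetch("rogerbot", "http://www.example.com/blog/some-post"))      # True
```

As Chiaryn points out above, though, the same crawler serves every campaign, so a rule like this would also stop the subfolder from being crawled in a separate campaign set up just for it.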
-
Yes, I am referring to the seomoz bot, and I understand that Google sees the two subfolders as one domain.
I would like them to be separate just for my own workflow and my ability to manage each individually with no spillover.
But if it can't be done, it can't be done.
Thank you!
-
I'm guessing you mean the seomoz bot crawling your site? If so, why do you want to crawl them separately? Google won't; they are on the same domain.
To answer your question, I don't think it can be done.
Related Questions
-
Why would only 1 page of an 18-page site be crawled?
I signed up yesterday and added 4 sites. The one I really need data on now has only had 1 page crawled. The other sites have had almost all pages crawled (over 40 each) in one day. What is wrong? The main domain has a 301 redirect to another domain name; is that the problem? Is there something wrong in Google Analytics? I don't manage this site and I'm picking up behind another firm. Where should I start my discovery? Thanks so much!
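One quick way to see whether that 301 is what the crawler runs into is to request the main domain without following redirects and look at the status and Location header. This is only a sketch with a placeholder hostname, not the actual site:

```python
import http.client

# Placeholder hostname -- substitute the main domain used in the campaign.
conn = http.client.HTTPConnection("www.example.com")
conn.request("HEAD", "/")
resp = conn.getresponse()

# A 301/302 status with a Location header pointing at a different domain means
# the crawler is being redirected away from the campaign's root URL.
print(resp.status, resp.reason)
print(resp.getheader("Location"))
conn.close()
```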
Moz Pro | moreidea0
-
How to use Crawl Report for a test directory
I have a client's new site set up in a folder that is not linked to from the main site. When setting up the Crawl Report, I put in the starting URL for the new folder, http://oldsite.com/new/start.php. The Crawl Report came back with a crawl of the current site instead. How do folks run the Crawl Report to test sites before they are public? Thanks!
Moz Pro | SWDDM0
-
Crawl Diagnostics - unexpected results
I received my first Crawl Diagnostics report last night for my dynamic ecommerce site. It showed errors on generated URLs which simply are not produced anywhere on my live site, only on my local development server. It appears that the crawler doesn't think it's running on the live site. For example, http://www.nordichouse.co.uk/candlestick-centrepiece-p-1140.html goes to a Product Not Found page, and therefore Duplicate Content errors are produced. Running http://www.nhlocal.co.uk/candlestick-centrepiece-p-1140.html produces the correct product page, not a Product Not Found page. Any thoughts?
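A quick check of what the crawler actually sees for that URL is its HTTP status code. If the Product Not Found page is served with a 200 rather than a real 404, the crawler would treat each of those URLs as an ordinary (and near-identical) page, which could explain the duplicate content errors. A sketch using the URL from the question:

```python
import urllib.error
import urllib.request

# URL taken from the question above.
url = "http://www.nordichouse.co.uk/candlestick-centrepiece-p-1140.html"

try:
    resp = urllib.request.urlopen(url)
    # A 200 here means the "Product Not Found" page is a soft 404: crawlers see
    # an ordinary page at every such URL and compare them as duplicate content.
    print("Status:", resp.status)
except urllib.error.HTTPError as e:
    # A real 404 (or 410) tells crawlers the page does not exist.
    print("Status:", e.code)
```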
Moz Pro | nordichouse0
-
Crawl Diagnostics Error Spike
With the last crawl update to one of my sites there was a huge spike in errors reported. The errors jumped by 16,659, the majority of which fall under the duplicate title and duplicate content categories. When I look at the specific issues, it seems that the crawler is crawling a ton of blank pages on the site's blog through pagination. The odd thing is that the site has not been updated in a while, and prior to this crawl on Jun 4th there were no reports of these blank pages. Could this be an error on the crawler's side? Any suggestions on next steps would be greatly appreciated. I'm attaching an image of the error spike.
Moz Pro | VanadiumInteractive1
-
Is it possible to exclude pages from Crawl Diagnostic?
I like Crawl Diagnostics, but it shows many errors due to a forum that I have. I don't care about the SEO value of this forum and would like to exclude any pages in the /forum/ directory. Is it possible to add exclusions to the Crawl Diagnostics tool?
Moz Pro | wfernley2
-
How do you get Mozbot to crawl your website?
I'm trying to get Mozbot to crawl my site so I can get new crawl diagnostics info. Does anyone know how this can be done?
Moz Pro | Romancing0
-
Can I change the crawl day?
Hi all, I hope there is a simple solution to this: we have a number of campaigns set up which are all crawled, and therefore updated, on different days of the week. We review these weekly, and it would be much easier if they were all crawled on the same day. Is it possible to change the crawl day for some campaigns? Thanks, Roy
Moz Pro | bluelogic0