Can I exclude pages from my Crawl Diagnostics?
-
Right now my Crawl Diagnostics data is being skewed because it includes the onsite search pages from my website. Is there a way to exclude certain pages, like search results, from the errors and warnings in Crawl Diagnostics? My search pages are coming up with:
- Long URL
- Title Element Too Long
- Missing Meta Description
- Blocked by meta-robots (which is how I want it)
- Rel Canonical
Here is what Crawl Diagnostics thinks my page URL looks like:
website.com/search/gutter%2525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252Bcleaning/
Thank you,
Jonathan
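P.S. From what I can tell, that URL looks like a search slug (probably "gutter+cleaning") that has been URL-encoded over and over: "+" encodes to "%2B", and each extra encoding pass turns the "%" into "%25", which is why the runs of "25" pile up. A quick Python sketch of the effect (just an illustration; I'm assuming the original slug was "gutter+cleaning"):

from urllib.parse import quote

slug = "gutter+cleaning"
encoded = quote(slug, safe="")   # first pass: '+' becomes '%2B'
for _ in range(3):               # every later pass re-encodes '%' as '%25'
    encoded = quote(encoded, safe="")
print(encoded)  # gutter%2525252Bcleaning -- the runs of 25 grow with each pass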
-
You can do so in robots.txt. The SEOmoz bot is called rogerbot, so you can disallow it from your search pages there.
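A minimal sketch of what that could look like, assuming all of your search results live under /search/ (adjust the path to whatever your site actually uses):

# tell SEOmoz's crawler to skip onsite search results
User-agent: rogerbot
Disallow: /search/

This only changes what rogerbot crawls for your campaign; the meta-robots block you already have is still what keeps those pages out of the search engines.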
-
Thanks! That was easy.
Related Questions
-
On page grader (unsolved)
All of my keywords score a 53 using the On Page Grader. When I look at the notes, it indicates I don't have the keyword in question anywhere on the page, which is true in some cases but not always. Does anyone have a similar experience?
Moz Pro | josayoun0
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better, such as starting them all on the same day, making sure I've set each to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I'll get a complete picture of this air-focused sub-folder... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true -- I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site vs. some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad0
-
Can't figure out why some of my pages are duplicate content
Within the Crawl Diagnostics area I'm getting duplicate page content issues on several pages. I don't know why; would anyone be able to tell me how these links are duplicates so I can fix them?
http://www.sagenews.ca/Column.asp?id=3010
http://www.sagenews.ca/Column.asp?id=2808
http://www.sagenews.ca/Column.asp?id=2998
http://www.sagenews.ca/Column.asp?id=2837
http://www.sagenews.ca/Column.asp?id=2981
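In case it helps with diagnosing, here is a rough way to compare two of those pages directly (a Python sketch; it assumes the pages are publicly fetchable, and I've read that the crawler flags pages whose code/content is roughly 90% similar):

import urllib.request
from difflib import SequenceMatcher

urls = [
    "http://www.sagenews.ca/Column.asp?id=3010",
    "http://www.sagenews.ca/Column.asp?id=2808",
]
# fetch both pages and measure how similar the raw HTML is
pages = [urllib.request.urlopen(u).read().decode("utf-8", errors="replace") for u in urls]
ratio = SequenceMatcher(None, pages[0], pages[1]).ratio()
print(f"raw HTML similarity: {ratio:.0%}")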
Moz Pro | INMCA0
-
Why is my domain not being crawled anymore?
I just noticed that right around 12/1/2012, SEOmoz stopped crawling all but two pages out of the 400 or so on my website at www.TrustworthyCare.com. I speculate that this is probably due to some dumb mistake I made at that time, but I can't for the life of me figure out what that mistake was. Before that, the weekly crawls included all 400 or so pages. I wonder whether it's something that changed in our .htaccess file. Here's how that file looks now; can anyone see what is wrong there, or perhaps offer other suggestions if it doesn't look like anything is wrong in it? Thanks! Tim
PS - I'm a small business owner, not an SEO or software engineer.
PPS - I found and read this page, but I've pretty much tried the things described there (I think): https://seomoz.zendesk.com/entries/409821-why-isn-t-my-site-being-crawled-you-re-not-crawling-all-my-pages
=================================
RewriteCond %{HTTP_HOST} ^aservantsheartcare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartcare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantsheartcaremanagement.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartcaremanagement.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantsheartgeriatriccare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartgeriatriccare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantsheartgeriatriccaremanagement.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartgeriatriccaremanagement.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantshearthomecare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantshearthomecare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantsheartseniorcare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartseniorcare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^aservantsheartservices.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.aservantsheartservices.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^careforparents.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.careforparents.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^eldercareradio.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.eldercareradio.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^helpforyourparents.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.helpforyourparents.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^privatedutyseniorcare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.privatedutyseniorcare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^sandiegocaremanagement.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.sandiegocaremanagement.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^sandiegocaremanager.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.sandiegocaremanager.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^sandiegogeriatriccaremanagement.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.sandiegogeriatriccaremanagement.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^sandiegogeriatriccaremanager.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.sandiegogeriatriccaremanager.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^servantsheartcare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.servantsheartcare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^servantshearthomecare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.servantshearthomecare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^servantsheartseniorcare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.servantsheartseniorcare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^tlccare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.tlccare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^tlcseniorcenter.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.tlcseniorcenter.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^tlcseniorhomecare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.tlcseniorhomecare.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

RewriteCond %{HTTP_HOST} ^tlcseniorservices.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.tlcseniorservices.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

#php_value upload_max_filesize 8M
RewriteCond %{HTTP_HOST} ^trustworthycare.com$
RewriteRule ^(.*)$ "http://www.trustworthycare.com/$1" [R=301,L]

RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://blog.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://blog.trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://test.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://test.trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.blog.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.blog.trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.test.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.test.trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.trustworthycare.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.trustworthycare.com/images/files_for_service_inquiries/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.trustworthycare.com/images/files_for_service_inquiries$ [NC]
RewriteCond %{HTTP_REFERER} !^http://sandbox.trustworthycare.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://sandbox.trustworthycare.com$ [NC]
RewriteRule .*\.(jpg|jpeg|gif|png|bmp)$ - [F,NC]

RewriteCond %{HTTP_HOST} ^ashsc.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.ashsc.com$
RewriteRule ^/?$ "http://trustworthycare.com/" [R=301,L]

# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.5"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.5"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.5"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule . - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.trustworthycare.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.5) [NC]
RewriteCond "%{DOCUMENT_ROOT}/sitectrl/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/sitectrl/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress

RewriteCond %{HTTP_HOST} ^privatedutycare.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.privatedutycare.com$
RewriteRule ^/?$ "http://www.ageassistance.com" [R=301,L]
=================================
Moz Pro | tcolling0
-
Duplicate content pages
Crawl Diagnostics Summary shows around 15,000 duplicate content errors for one of my projects. It shows the list of pages with how many duplicate pages there are for each one, but I don't have a way of seeing what the duplicate page URLs are for a specific page without clicking on each page link and checking them manually, which is going to take forever to sort. When I export the list as CSV, the duplicate_page_content column doesn't show any data. Can anyone please advise on this? Thanks
Moz Pro | nam2
-
Settings to crawl entire site
Not sure what happened, but I started a third campaign yesterday and only 1 page was crawled. The other two campaigns have 472 and 10K pages respectively. What is the proper setting to choose at the beginning of campaign setup to have the entire site crawled? Not sure what I did differently; I must be reading the instructions incorrectly. Thanks, Don
Moz Pro | NicheGuy210
-
Can I give other accounts access?
I would like to be able to give limited access to members of our team so they can see SEO campaign results and print off reports without being able to edit the campaigns. Is this possible?
Moz Pro | wouldBseoKING0
-
For the "On Page Report Card", Can I see a list of all of the recommendations?
On the "On Page Repord Card", in the "Page Analysis Detail" section, I can click "Read More" for each factor and see a "Recommendation" on how to fix the page to meet that factors criteria. Are these recommendations the same everytime or will they vary depending on the page itself? Can get a copy of each recommendation that would appear if none of the criteria were met? Thanks, Daniel.
Moz Pro | iSenseWebSolutions1