Crawl Diagnostics Updates
-
I have several page types on my sites that I've blocked using the robots.txt file (e.g. emailafriend.asp, shoppingcart.asp, login.asp), but they are still showing up in crawl diagnostics as issues (e.g. duplicate page content, duplicate title tags). Is there a way to filter these issues out, or is there something I'm doing wrong that's causing them to show up?
- Ryan
-
Hi Ryan,
Try moving the sitemap to the end and leaving a blank line before it, something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp...
...
Disallow: /mailinglist_subscribe.asp
Disallow: /mailinglist_unsubscribe.asp
Disallow: /EmailaFriend.asp
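Sketched in full, the suggested layout puts the Sitemap directive last, after a blank line (the sitemap URL here is an assumption for illustration):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp
Disallow: /EmailaFriend.asp

Sitemap: http://www.naturalrugco.com/sitemap.xml
```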
I added the pages that it was suggesting to the robots.txt file:
http://www.naturalrugco.com/robots.txt
Most of the pages listed in the high-priority errors within Moz Analytics crawl diagnostics are the EmailaFriend.asp pages, which I've disallowed. Ex: http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent
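One way to sanity-check that the rules actually match those URLs is Python's built-in robots.txt parser; a quick sketch using the paths and example URL from this thread:

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly instead of fetching the live file
parser = RobotFileParser()
parser.parse("""
User-agent: *
Disallow: /ShoppingCart.asp
Disallow: /EmailaFriend.asp
""".splitlines())

blocked = "http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent"
allowed = "http://www.naturalrugco.com/products.asp"

# Disallow matches by path prefix, so the query-string variants are covered too
print(parser.can_fetch("*", blocked))  # False: a compliant crawler should skip it
print(parser.can_fetch("*", allowed))  # True: no rule matches this URL
```

If a URL still shows up in crawl diagnostics even though `can_fetch` returns False, the issue is on the crawler's side rather than in the file.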
-
Hi Ryan,
At the end of this page you will find several ways to block rogerbot from crawling pages: http://moz.com/help/pro/rogerbot-crawler
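One of the options that help page covers is a robots.txt rule aimed only at Moz's crawler, leaving other bots unaffected; a minimal sketch:

```
User-agent: rogerbot
Disallow: /EmailaFriend.asp
```

Note that a Disallow rule stops crawling rather than indexing; the linked page also covers meta-robots alternatives.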
I hope it helps,
Istvan
Related Questions
-
Planning to update my Volusion site to HTTPS protocol. Concerns.
I'm planning to update my Volusion commerce site to the https protocol. A config variable switch will convert URLs to secure https URLs. I'm looking into what additional steps I must take and what issues I may run into. I guess any relative links within my site will not be affected, but what about incoming backlinks? How do I address this? I'm also concerned about previous redirects. Do I have to create a new XML file incorporating the new protocol in the target, plus both secure and insecure versions of the source URLs? I figure a new sitemap has to be submitted to Google Search Console - should there be a secure https://www.example version as well as an https://example version of the site, along with the older similar http versions? Thanks Howard
On-Page Optimization | mrkingsley
-
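For the backlink concern in the question above, the usual approach is a site-wide 301 from each http URL to its https counterpart, so existing backlinks keep resolving; a hedged Apache .htaccess sketch (whether Volusion's hosting exposes this level of configuration is an assumption, and example.com is a placeholder):

```
# Send every http request to the same path on https with a permanent redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```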
Update old article or publish new content and redirect old post?
Hi all, I'm targeting a keyword we used to rank quite well for. Over the last couple of months, traffic for that keyword (and variations) has been dropping a bit. I wrote an extensive new post on the same topic, much more in depth, going from 600 to 1,800 words. Is it better to update the old article and mention that it was recently updated, or to publish a new post and redirect the old post to the new one?
On-Page Optimization | jorisbrabants
-
Disavowed links, updated website etc - still no ranking improvements
Hi, Could anyone take a look at www.artificialgrass4u.co.uk? A few years ago it used to rank highly for 'artificial grass', then when Google rolled out its algorithms punishing websites with poor links, it lost all its rankings. We've disavowed almost all of the bad links and have been adding new optimised content over the past few months, but rankings still haven't improved. Is there anything I'm missing? Thanks
On-Page Optimization | icansee
-
How to fix Medium Priority Issues in the Moz Pro crawl report?
How do I resolve these issues found in the Moz Pro crawl? Some of the medium-priority issues look like this: Missing Meta Description Tag: 2669; Title Element is Too Long: 523; Duplicate Page Title: 37. How do I add the missing meta description tag to these pages, and how do I shorten the title elements?
On-Page Optimization | renukishor
-
Massive increase in Moz crawl.
I have a subdomain which has just started to be crawled by Moz; previously this wasn't the case. The subdomain had 16,000+ issues. Why has Moz started to count subdomains as part of the main domain, and has Google started to do this as well?
On-Page Optimization | danwebman
-
HTML Improvements is not updated
Hi, I am using Google Webmaster Tools to check 'HTML Improvements' errors and found that I have many duplicate meta description errors. I think they occurred because I changed some old URLs to new URLs and then used 301 redirects. This led the meta descriptions of the old URLs to duplicate those of the new URLs. For example: old URL www.abc.com/this-is-a-url--1.html and new URL www.abc.com/this-is-a-url.html, with a 301 redirect from www.abc.com/this-is-a-url--1.html to www.abc.com/this-is-a-url.html. Has anyone faced this problem? Please help me fix it. Thanks
On-Page Optimization | JohnHuynh
-
Big problem with my new crawl report
I am the owner of a small OpenCart online store. I installed http://www.opencart.com/index.php?route=extension/extension/info&extension_id=6182&filter_search=seo. Today my new crawl report is awful: errors are up to 520 (30 before), warnings up to 1,000 (120 before), and notices up to 8,000 (1,000 before). I noticed that the problem is with search - there is a lot of duplicate content in search pages only. What should I do?
On-Page Optimization | ankali
-
Does the seomoz crawler that crawls for the on-page reports have a set IP?
I would like to test my site, but it's not launched yet and I don't want anybody to see it. I can allow myself and others to view the site if I have their IP addresses. So does the seomoz crawler have a static IP or a range? James
On-Page Optimization | BarefootJames