Crawl Diagnostics Updates
-
I have several page types on my sites that I have blocked using the robots.txt file (ex: emailafriend.asp, shoppingcart.asp, login.asp), but they are still showing up in crawl diagnostics as issues (ex: duplicate page content, duplicate title tags, etc.). Is there a way to filter these issues out, or is there something I'm doing wrong that's causing them to show up?
- Ryan
-
Hi Ryan,
Try moving the Sitemap line to the end of the file and leaving a blank line before it. Something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp...
...
Disallow: /mailinglist_subscribe.asp
Disallow: /mailinglist_unsubscribe.asp
Disallow: /EmailaFriend.asp
-
I added the pages that it was suggesting to the robots.txt file:
http://www.naturalrugco.com/robots.txt
Most of the pages listed in the high priority errors within Moz Analytics crawl diagnostics are the EmailaFriend.asp pages, which I've disallowed. Ex: http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent
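One way to sanity-check whether a Disallow rule actually covers a URL like that is Python's standard-library robots.txt parser. A minimal sketch, using rules mirroring the ones quoted above rather than the full live file:

```python
from urllib import robotparser

# Rules mirroring the robots.txt above (a sketch, not the full live file).
rules = """
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /EmailaFriend.asp
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallow rules are prefix matches, so query-string variants are covered too.
blocked = "http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent"
print(rp.can_fetch("*", blocked))  # False: a compliant crawler must skip it
```

Because matching is by path prefix, every `?ProductCode=...` variant of EmailaFriend.asp falls under the single rule; a URL can still appear in reports sourced from links or an old index, since robots.txt stops crawling, not necessarily reporting.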
-
Hi Ryan,
At the end of this page you will find several ways to block rogerbot from crawling pages: http://moz.com/help/pro/rogerbot-crawler
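One of the options there is a rogerbot-specific group in robots.txt, which lets you hide pages from Moz's crawler without affecting other bots. A minimal sketch of how the groups separate, checked with Python's standard-library parser (the rules here are illustrative, not Ryan's actual file):

```python
from urllib import robotparser

# Sketch: a rogerbot-only group alongside the catch-all group.
rules = """
User-agent: rogerbot
Disallow: /EmailaFriend.asp

User-agent: *
Disallow: /cgi-bin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# rogerbot gets its own group; every other crawler falls back to "*".
print(rp.can_fetch("rogerbot", "/EmailaFriend.asp?ProductCode=1"))   # False
print(rp.can_fetch("Googlebot", "/EmailaFriend.asp?ProductCode=1"))  # True
```

Note that a crawler that matches a specific user-agent group obeys only that group, so any rules you still want rogerbot to follow must be repeated inside its group.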
I hope it helps,
Istvan
Related Questions
-
How often is your domain authority updated?
I can't seem to figure out how often our domain authority is updated - it seems random, do you know typically when this happens? Thanks!
On-Page Optimization | regineraab
-
Moz Crawl Shows Duplicate Content Which Doesn't Seem To Appear In Google?
Morning All, First post, be gentle! Moz's crawl of our website returned 2,500 high-priority duplicate content issues; not good. However, if I just do a simple site:www.myurl.com in Google, I cannot see these duplicate pages... very odd. Here is an example:
http://goo.gl/GXTE0I
http://goo.gl/dcAqdU
So the same page has a different URL, and Moz brings this up as an issue; I would agree with that. However, if I google both URLs, they will both bring up the same page but with the original URL of http://goo.gl/zDzI7j; in other words, two different URLs bring up the same indexed page in Google... weird. I thought about using a wildcard in the robots.txt to disallow these duplicate pages with poor URLs, something like:
Disallow: /*display.php?product_id
However, I read various posts that it might not help our issues? I don't want to make things worse. On another note, my colleague paid for an "SEO service" and they just dumped thousands of back-links to our website; of course that's come back to bite us in the behind. Anyone have any recommendations for a good service to remove these back-links? Thanks in advance!
On-Page Optimization | scottiedog
-
I have more pages in my sitemap being blocked by the robots.txt file than I have being allowed to be crawled. Is Google going to hate me for this?
We're using some rules to block all pages which start with "copy-of" on our website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages I've blocked them in the robots.txt file, but of course they are still automatically generated in our sitemap. How bad is this?
On-Page Optimization | absoauto
-
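One way to stop the two files contradicting each other is to filter robots-blocked URLs out of the sitemap before it's written. A sketch with Python's standard library (the "copy-of" rule matches the question above; the URL list is hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt rule blocking the duplicated "copy-of" listings.
rules = """
User-agent: *
Disallow: /copy-of
"""
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Candidate sitemap URLs: keep only the ones crawlers may actually fetch.
urls = [
    "/widget-2000",
    "/copy-of-widget-2000",  # auto-generated refurbished duplicate
]
allowed = [u for u in urls if rp.can_fetch("*", u)]
print(allowed)  # ['/widget-2000']
```

Running a filter like this as the last step of sitemap generation keeps the sitemap an honest list of crawlable pages, rather than a list Google is told about and then forbidden from visiting.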
301 redirected Duplicate Content, still showing up as duplicate after new crawl.
We launched a site where key landing pages were not showing up in Google. After running the SEOmoz crawl, it returned a lot of duplicate pages, which may explain this. The actual URL of the page is /design, and it was telling me the following were dupes:
/design/family-garden-design
/design/small-garden-design
/design/large-rural-garden-design
/Design
All of these URLs were in fact pointing to the /design landing page. I 301 redirected all of the pages so they all now resolve to /design. After running another crawl the day after doing this, it's still showing up as duplicate content on SEOmoz. Does SEOmoz evaluate the new changes right away?
On-Page Optimization | iterate
-
Number of pages crawled is dropping from 4 to 2
The report on our campaign shows that only 2 pages are being crawled now, down from 4. However, our site has more pages than this. We recently inserted code to allow crawlers. What can we do to resolve this? Please assist.
On-Page Optimization | seoworx123
-
URL 404 errors after crawl? HELP!
I am getting crawl errors. It shows multiple pages as 404s. I know this is more of a technical question; however, I cannot find the answer anywhere. I'm using WordPress. www.mydomain.com/title-of-page/mydomain.com/contact WHAT IS THIS?!
On-Page Optimization | ChristineWeinbrecht
-
Pages crawled
I noticed there is a limit on the number of pages crawled on galena.org. Will this number increase over time?
On-Page Optimization | nskislak24
-
Does frequency of content updates affect likelihood outbound links will be indexed?
I have several pages on our website with low PR that themselves link to lots and lots of pages that are service/product specific. Since there are so many outbound links, I know that the small amount of PR will be spread thin as it is. My question is, if I were to supply fresh content to the top-level pages, and change it often, would that influence whether or not Google indexes the underlying pages? Also, if I supply fresh content to the underlying pages, once Google crawls them, would that guarantee that Google considers them 'important' enough to be indexed? I guess my real question is: can freshness of content and frequency of updates convince Google that the underlying pages are 'worthy of being indexed', and can producing fresh content on those pages 'keep Google's interest', so to speak, despite having little if any PageRank?
On-Page Optimization | ilyaelbert