Is there a way to take notes on a crawled URL?
-
I'm trying to figure out the best way to keep track of the different things I've done to work on a page (for example, adding a longer description, changing h2 wording, or adding a canonical URL). Is there a way to take notes for crawled URLs? If not, what do you use to accomplish this?
-
Hey! Dave here from the Help Team,
There are a couple of different things you can do to mark items you have completed. One of the newer features in Site Crawl is the ability to mark items as "Fixed", which is handy when you know you have resolved an issue but are still waiting for your next crawl update. Another option is to download your "all crawled pages" CSV and add a "notes" column to it. It won't live in the Moz dashboard, but at least you would have a good record! Hopefully those options help you out!
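If you go the CSV route, a small script can re-attach your notes to each fresh export so you don't have to retype them. A minimal Python sketch, assuming the export has a URL column and using hypothetical file names:

```python
import csv

# Hypothetical file names -- Moz's export will have its own name and columns.
EXPORT = "all_crawled_pages.csv"   # fresh CSV exported from Site Crawl
NOTES_FILE = "crawl_notes.csv"     # your own file with columns: URL, Notes

# Load any notes you have already written, keyed by URL.
try:
    with open(NOTES_FILE, newline="", encoding="utf-8") as f:
        notes = {row["URL"]: row["Notes"] for row in csv.DictReader(f)}
except FileNotFoundError:
    notes = {}

# Re-attach notes to the fresh export so they survive each new download.
with open(EXPORT, newline="", encoding="utf-8") as src, \
     open("crawled_pages_with_notes.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["Notes"])
    writer.writeheader()
    for row in reader:
        row["Notes"] = notes.get(row.get("URL", ""), "")
        writer.writerow(row)
```

Keep your notes in their own file keyed by URL and rerun the script after each new download; the dashboard still won't show them, but the merged CSV stays current.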
Related Questions
-
Server blocking crawl bot due to DOS protection and MOZ Help team not responding
First of all, has anyone else not received a response from the Help Team? I've sent four emails, the oldest of which is a month old, and one of our most-used features, the Moz On-Demand Crawl we use to find broken links, doesn't work, and it's really frustrating not to get a response when we're paying so much a month for a feature that doesn't work. OK, rant over, now onto the actual issue: on our crawls we're just getting 429 errors because our server has DOS protection and is blocking Moz's robot. I'm sure it will be as easy as whitelisting the robot's IP, but I can't get a response from Moz with the IP. Cheers, Fergus
Feature Requests | JamesDavison
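While waiting on support for the question above, a rough check is whether the 429s are keyed to the crawler's user agent rather than its IP. A sketch under assumptions: example.com is a placeholder, and although Moz's crawler identifies itself as rogerbot, confirm the exact user-agent string and IP ranges with Moz before whitelisting anything.

```python
import requests

URL = "https://www.example.com/"   # placeholder -- substitute your own site
CRAWLER_UA = "rogerbot"            # assumed identifier for Moz's crawler

for ua in ("Mozilla/5.0", CRAWLER_UA):
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{ua!r}: HTTP {resp.status_code}")
    # A 200 for the browser-like UA but 429 for the crawler UA suggests the
    # DOS protection keys on user agent or request rate, not only on IP.
```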
Why is my site not being crawled?
My error in the dashboard: **Moz was unable to crawl your site on Jul 23, 2020.** Our crawler was banned by a page on your site, either through your robots.txt, the X-Robots-Tag HTTP header, or the meta robots tag. Update these tags to allow your page and the rest of your site to be crawled. If this error is found on any page on your site, it prevents our crawler (and some search engines) from crawling the rest of your site. Typically errors like this should be investigated and fixed by the site webmaster. I think I need to edit robots.txt; how do I fix that?
Feature Requests | alixxf
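For an error like the one above, a quick first check is whether robots.txt itself blocks Moz's crawler (rogerbot). A minimal sketch with a placeholder domain; note it only tests robots.txt rules, not X-Robots-Tag headers or meta robots tags, which the error message also mentions.

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"   # placeholder -- substitute your own domain

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Check Moz's crawler (rogerbot) and the wildcard agent against the homepage.
for agent in ("rogerbot", "*"):
    allowed = parser.can_fetch(agent, SITE + "/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```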
Errors for URLs being too long and archive data showing as duplicate
I have hundreds of errors for the same three things. 1) The URL is too long. Currently the categories are domain.com/product-category/name main product/accessory product/. Do I somehow eliminate the product category? Not sure how to fix this. 2) It has all of my category pages listed as archives showing duplicates. I don't know why, as they are not blog posts; they hold products. I don't have an archived version of them. How do I fix this? 3) It is saying my page speed is slow. I am very careful to optimize all my photos in Photoshop, plus I have a tool on the site to further compress them. I just went with another host company that is supposed to be faster. Any ideas? I would so appreciate your help and guidance. All my best to everyone, be safe and healthy.
Feature Requests | PacificErgo
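For the "URL too long" part of the question above, it can help to list the longest URLs before deciding whether to drop /product-category/ from the path. A small sketch against an exported crawled-pages CSV; the file name, column name, and length threshold are assumptions, so match them to what Moz actually reports.

```python
import csv

MAX_LENGTH = 115  # assumed threshold -- use the length Moz flags in its report

# Assumes an "all crawled pages" style export with a URL column.
with open("all_crawled_pages.csv", newline="", encoding="utf-8") as f:
    long_urls = [row["URL"] for row in csv.DictReader(f) if len(row["URL"]) > MAX_LENGTH]

# Print the worst offenders first so you can see which path segments add the most.
for url in sorted(long_urls, key=len, reverse=True):
    print(len(url), url)
```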
Meta Noindex report: exclude URLs and clear all warnings at once
Dear community, I have a question about the "Site Crawl" > "Crawler Warnings" > "Meta Noindex" report. I see 3.6K errors, all with the same URL base: /register?url={{xxxxxxxx}}. The pages have the following robots meta tag: 1) Can I exclude some URLs from crawling by the Moz spider bot? If yes, how? 2) Can I use the 'mark as fixed' or 'ignore' functionality for all URLs at once? Right now I have to execute the action 36 times to 'ignore' all 3,600 errors. Hope someone can help me 🙂
Feature Requests | bettingfans.com
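Until bulk-ignore is an option, one workaround for the question above is to export the issue list and filter it yourself, so the /register pages don't drown out the warnings you actually care about. A minimal sketch with an assumed file name and column name:

```python
import csv

# Assumes a CSV export of the Meta Noindex issue list with a URL column;
# the file name and column name are hypothetical.
with open("meta_noindex_issues.csv", newline="", encoding="utf-8") as f:
    urls = [row["URL"] for row in csv.DictReader(f)]

register = [u for u in urls if "/register?url=" in u]
print(f"{len(register)} of {len(urls)} noindex warnings are /register?url= pages")
print(f"{len(urls) - len(register)} warnings would remain if those were excluded")
```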
Moz crawler is not able to crawl my website
Hello all, I'm facing an issue with the Moz crawler. Every time it crawls my website, there is an error message saying: "**Moz was unable to crawl your site on Sep 13, 2017.** Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster." We changed the robots.txt file and checked it, but the issue is still not resolved. URL: https://www.khadination.shop/robots.txt Do let me know what went wrong and what needs to be done. Any suggestion is appreciated. Thank you.
Feature Requests | Harini.M
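In the error above, the crawler is failing to reach robots.txt at all rather than being disallowed by it, so the first check is whether that file reliably returns a 200 without server errors or timeouts. A quick sketch using the URL from the question:

```python
import requests

ROBOTS_URL = "https://www.khadination.shop/robots.txt"

try:
    resp = requests.get(ROBOTS_URL, timeout=10)
    print("HTTP", resp.status_code)   # crawlers generally need a 200 here
    print(resp.text[:500])            # first few rules, if any were returned
except requests.RequestException as exc:
    print("Request failed:", exc)     # DNS, TLS, or connection-level problem
```

Running it a few times over a day can also catch intermittent 5xx responses that a single manual check in a browser would miss.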
Is there a way to turn off Meta Noindex warnings?
My site is WordPress, and indexing tags, categories, dates, etc. just ends up causing potential duplicate page content, so following the Yoast SEO plugin's advice, we have set those listing types to noindex. However, Moz now flags me with 1.7K+ noindex warnings. I know it is not hurting anything, but is there any way to disable that warning to clean up my crawl error report?
Feature Requests | OBIAnalytics
Is there a way to schedule automatic weekly .csv reports for the Tracked Keywords Overview?
Using the Custom Reports tool, I only managed to get PDF reports. It would be useful to automatically receive .csv reports by email. Any idea how?
Feature Requests | Digital-Sun
Is there a way to add custom dashboards into Moz's "Custom Reporting"?
I have dashboards set up in Google Analytics that track conversions and year-over-year traffic data, and I just wanted to see if there was a way to add those extra metrics to Moz's Custom Reports. Any help or suggestions would be much appreciated!
Feature Requests | Gruenagency