Unable to view crawl test
-
After running a crawl test I get a report to download.
It downloads as a CSV, but when I go to view it I either get a corruption error or just a load of gibberish characters.
Can I not view the report on the site itself?
-
Hi there,
Thanks for reaching out!
I just logged into your account and reviewed both crawl tests in there. They opened fine for me in Excel, Google Sheets, and Numbers, which leads me to believe the issue is with your spreadsheet program itself. I recommend reaching out to that program's support team to troubleshoot the issue with them. Sorry I can't be of more help here!
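If the export keeps rendering as gibberish, a quick look at the raw file can also help narrow things down before contacting the spreadsheet vendor. This is a minimal sketch in Python, assuming the downloaded report has been saved under the hypothetical name crawl_test.csv:

```python
# Minimal check of a downloaded crawl-test CSV that renders as "gibberish".
# Assumes Python 3; "crawl_test.csv" is a hypothetical name for the export.
import csv

path = "crawl_test.csv"

# Peek at the raw bytes: a UTF-8 BOM (b"\xef\xbb\xbf") or lots of null bytes
# usually means the spreadsheet program is guessing the wrong encoding.
with open(path, "rb") as f:
    print("First bytes:", f.read(40))

# Try reading it as UTF-8 (with an optional BOM), the usual encoding for CSV exports.
with open(path, encoding="utf-8-sig", newline="") as f:
    for i, row in enumerate(csv.reader(f)):
        print(row[:5])  # first few columns of the header and a couple of data rows
        if i >= 2:
            break
```

If the rows print cleanly here, the file itself is intact and it is the import settings (encoding or delimiter) in the spreadsheet program that need adjusting.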
Related Questions
-
Web Site Migration Testing and SEO-QA Automation?
Hey Mozzers, Are there any good Migration-SEO-QA tools out there? Given a prioritized list of URLs and a prioritized list of keywords, is there a tool that can compare basic SEO factors, old URL vs. new URL, and identify all the specific gaps that need to be fixed? Here is a basic SEO-QA acceptance checklist for porting any website. Until the porting work is completed we cannot accept the new website.
Givens:
1. A list of the top 100 URLs from the old site, prioritized by conversion rates, landing-page traffic, and inbound links.
2. A list of the planned 404-mapped URLs, old to new site, from the porting team.
3. A list of the current top 200 keywords, prioritized.
4. A good amount of SEO work has already been done, by several professionals, for the current (old) site.
How to evaluate if the new site will be acceptable to Google? Check ON-PAGE SEO factors... that is, the NEW site must be AS GOOD AS (or better than) the current (old) site, in the eyes of Google, to preserve the On-Page SEO work already done.
Criteria:
URLs ok? :: Is the URL mapping ok, old to new, best web page?
LINKS ok :: Are all internal LINKS and keyword Anchor Text ported?
TEXT ok :: On-page content, TEXT and keywords ok?
TITLE ok :: HTML Title and title keywords ok?
DESCRIPTION ok :: HTML Meta Description ok?
H1, H2 ok :: HTML H1, H2 and keywords ok?
IMG kwds :: HTML IMG and ALT keywords ok?
URL kwds :: Keywords in new URLs ok?
Potential porting defects:
Keywords in URL missing:
Keywords in HTML Title missing:
Keywords in Meta Description missing:
Any internal LINKS or link anchor text missing:
Keywords in page TEXT missing:
H1, H2 missing keywords:
HTML IMG alt text, IMG file URLs, any missing keywords:
Notes: Until the porting work is completed we cannot accept the new site, or set a target date for a potential cutover. There are eight (8) data items per URL, and about one hundred (100) URLs to be considered for SEO-QA before going live. We were expecting to cut over before the end of February, at the latest. There is no point in doing full QA acceptance tests until the porting work is completed. QA spot-checks have found far too many defects. About 60% of the landing-page traffic comes via the top 40 URLs. With over 100 URLs to look at, it can take more than a week or two just to do SEO-QA in detail, manually, item-by-item, page-by-page, side-by-side, old vs. new. Spot-checks indicate a business disaster would occur unless the porting defects are fixed before going live.
Any Migration-QA tools? Given a prioritized list of URLs and a prioritized list of keywords, is there a tool that can compare basic On-Page SEO factors, old URL vs. new URL, and identify most of the specific gaps that need to be fixed before going live with the new site?
Edit: Any comments on the SEO criteria, tools, or methods will be appreciated!
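There may not be an off-the-shelf tool that matches this checklist exactly, but the per-URL comparison itself is scriptable. The following is a minimal sketch in Python (assuming the requests and beautifulsoup4 packages; the URLs and keywords shown are placeholders, not from the actual sites) covering a few of the on-page factors listed above:

```python
# Sketch: compare a handful of on-page SEO factors for an old vs. new URL.
# Assumes requests and beautifulsoup4; URLs and keywords are placeholders.
import requests
from bs4 import BeautifulSoup

def on_page_factors(url):
    """Fetch a URL and extract the on-page elements from the checklist."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    desc = soup.find("meta", attrs={"name": "description"})
    return {
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "description": desc.get("content", "") if desc else "",
        "h1": [h.get_text(strip=True) for h in soup.find_all("h1")],
        "h2": [h.get_text(strip=True) for h in soup.find_all("h2")],
        "anchors": sorted({a.get_text(strip=True) for a in soup.find_all("a", href=True)}),
        "img_alt": [img.get("alt", "") for img in soup.find_all("img")],
        "text": soup.get_text(" ", strip=True).lower(),
    }

def compare(old_url, new_url, keywords):
    """Return a list of gaps where the new page falls short of the old one."""
    old, new = on_page_factors(old_url), on_page_factors(new_url)
    gaps = []
    for field in ("title", "description", "h1", "h2", "anchors", "img_alt"):
        if old[field] != new[field]:
            gaps.append(f"{field} differs between old and new page")
    for kw in keywords:
        if kw.lower() in old["text"] and kw.lower() not in new["text"]:
            gaps.append(f"keyword missing from new page text: {kw}")
    return gaps

# Placeholder usage: run this over the prioritized list of old/new URL pairs.
for gap in compare("https://old.example.com/page", "https://new.example.com/page",
                   ["blue widgets", "widget installation"]):
    print(gap)
```

Looped over the top 100 URL pairs and top keywords, a script along these lines will not replace manual review, but it can surface most of the title, description, heading, anchor-text, and keyword gaps in minutes rather than weeks.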
Moz Pro | | George.Fanucci
-
How do I exclude my blog subfolder from being crawled with my main domain (www.) folder?
I am trying to set up two separate campaigns, one for my blog and one for my main site. While it is easy enough to do through the wizard, the results I am getting for my main site still include pages that are in my blog subfolder. Please advise!
Moz Pro | | sameufemia0 -
Only One page crawled..Need help
I have run a website through SEOmoz that has many URLs, but the SEOmoz report shows Pages Crawled: 1. Why is this happening when my campaign limit is fine? What do I need to do to get all pages crawled in the SEOmoz report?
Moz Pro | | lucidsoftech0 -
Crawl Diagnostics - unexpected results
I received my first Crawl Diagnostics report last night for my dynamic ecommerce site. It showed errors on generated URLs which simply are not produced anywhere on my live site, only on my local development server. It appears that the crawler doesn't think it's running against the live site. For example, http://www.nordichouse.co.uk/candlestick-centrepiece-p-1140.html will go to a Product Not Found page, and therefore Duplicate Content errors are produced. Running http://www.nhlocal.co.uk/candlestick-centrepiece-p-1140.html produces the correct product page and not a Product Not Found page. Any thoughts?
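One quick way to test that theory is to check what HTTP status those generated URLs actually return: if the Product Not Found page is served with a 200 rather than a 404, a crawler will treat it as a real (and duplicated) page. A minimal sketch, assuming Python with the requests package and using the URL quoted above plus placeholders for the rest:

```python
# Sketch: check whether "Product Not Found" pages return a proper 404
# or a 200 (a soft 404), which would explain the duplicate-content errors.
# Assumes the requests package; extend the list with URLs from the crawl report.
import requests

urls = [
    "http://www.nordichouse.co.uk/candlestick-centrepiece-p-1140.html",
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=30)
    flag = " (not-found page served as 200?)" if resp.status_code == 200 else ""
    print(resp.status_code, resp.url, flag)
```

If those URLs come back as 200, configuring the store to return a real 404 (or a redirect to the correct product page) for missing products should clear the duplicate-content errors on the next crawl.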
Moz Pro | | nordichouse0 -
Can I force another crawl on my site to see if it recognizes my changes?
I had a problem with duplicate content and titles on my site. I fixed them immediately, and I'm wondering if I can run another crawl on my site to see if my changes were recognized. Thanks, Shaun
Moz Pro | | daugherty0 -
Lots of site errors after last crawl....
Something interesting happened on the last update for my site in the SEOmoz Pro tools. For the last month or so the errors on my site were very low, then on the last update I had a huge spike in errors, warnings, and notices. I'm not sure if I somehow made a change to my site (without knowing it) that caused all of these errors, or if it just took a few months to find all the errors on my site. My duplicate page content went from 0 to 45, my duplicate page titles went from 0 to 105, my 4xx (client error) count went from 0 to 4, and my title missing or empty count went from 0 to 3. In the warnings section, my missing meta description tag count went from a handful to 444 (most of these appear to be archive pages). Down in the notices I have over 2,000 that are blocked by meta robots, meta robots nofollow, and rel=canonical. I didn't have anywhere near this many prior to the last update of my site. I just wanted to see what I need to do to clean this up, and figure out if I did something to cause all the errors. I'm assuming the red errors are the first things I need to clean up. Any help you guys can provide would be greatly appreciated. Also, if you'd like me to post any additional information, please let me know and I'd be glad to.
Moz Pro | | NoahsDad0 -
SEOmoz crawl error questions
I just got my first SEOmoz crawl report and was shocked at all the errors it generated. I looked into it and saw 7,200 crawl errors, most of them duplicate page titles and duplicate page content. I clicked into the report and found that 97% of the errors come from one page. It lists http://legendzelda.net/forums/index.php/members/page__sort_key__joined__sort_order__asc__max_results__20, http://legendzelda.net/forums/index.php/members/page__sort_key__joined__sort_order__asc__max_results__20__quickjump__A__name_box__begins__name__A__quickjump__E, etc., with 20 pages of slight variations of this link. It is all my members list or a search of my members list, so it is not really duplicate content or anything. How can I get these errors to go away and make sure my site is not taking a hit? The forum software I use is IPB.
Moz Pro | | NoahGlaser780 -
How to Stop SEOMOZ from Crawling a Sub-domain without redoing the whole campaign?
I am using SEOmoz for a client to track their website's performance and fix any errors and issues. A few weeks ago, they created a subdomain (sub.example.com) to create a niche website for some of their specialized content. However, when SEOmoz re-crawled the main domain (example.com), it also reported errors for the subdomain. Is there any way to stop SEOmoz from crawling the subdomain and only crawl the main domain? I know it can be done by starting a new campaign, but is there any way to work around this with the existing campaign? I'm asking because we would like to avoid setting up the campaign again and losing the historical data. Any input would be greatly appreciated. Thanks!
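Short of rebuilding the campaign, one workaround that is often suggested is to block Moz's crawler, rogerbot, in the subdomain's own robots.txt: robots.txt is read per host, so the rule would not affect the main www campaign. This is a hedged sketch, assuming the subdomain's robots.txt disallows rogerbot entirely (the assumed rule is shown in the comments) and using placeholder hostnames, that verifies the setup with Python's standard library:

```python
# Sketch: confirm that Moz's crawler (rogerbot) is blocked on the subdomain
# but still allowed on the main site. Assumes sub.example.com/robots.txt
# contains something like:
#   User-agent: rogerbot
#   Disallow: /
# Hostnames are placeholders for the real domains.
from urllib import robotparser

def rogerbot_allowed(host):
    rp = robotparser.RobotFileParser(f"http://{host}/robots.txt")
    rp.read()
    return rp.can_fetch("rogerbot", f"http://{host}/")

print("main site crawlable:", rogerbot_allowed("www.example.com"))  # expected: True
print("subdomain crawlable:", rogerbot_allowed("sub.example.com"))  # expected: False
```

Because each hostname serves its own robots.txt, this should leave the existing campaign and its history untouched while the subdomain's pages drop out of the error reports on subsequent crawls.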
Moz Pro | | TheNorthernOffice790