Crawl Test CSV has lost its formatting??
-
All the columns/headings have merged into column A.
Anyone else noticed this over the past few days?
-
Great tip! Thanks
-
Thanks a lot. Had the same problem in Win 10
-
OK, worked out a solution for anyone updating to Win 10 and having trouble opening CSV files.
To change it:
- Settings - Time & Language
- Additional date, time & regional settings (at the bottom)
- Change date, time or number formats - Additional settings... (button)
- Change the List separator from the semicolon (;) to a comma (,)
There's also a scripted way to make the same change below.
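For anyone who prefers to script it, here's a minimal Python sketch using the standard winreg module. It assumes the setting lives at HKCU\Control Panel\International under the sList value (the value behind the List separator field) - worth double-checking on your own machine, and restart Excel afterwards so it picks up the change.

import winreg

# Assumed location of the "List separator" region setting:
# HKCU\Control Panel\International, value name "sList"
KEY_PATH = r"Control Panel\International"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    current, _ = winreg.QueryValueEx(key, "sList")
    print(f"Current list separator: {current!r}")
    if current != ",":
        # Write a comma so comma-delimited CSVs open into separate columns
        winreg.SetValueEx(key, "sList", 0, winreg.REG_SZ, ",")
        print("List separator changed to ','")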
-
Thanks for the extra information!
-
Pasting from Sheets into Excel works, so it would appear to be a Win 10 formatting change issue.
Thanks anyway.
-
Hmm..
That does seem to be the potential problem here. It's one we unfortunately won't be able to troubleshoot for a bit, though. The PC users at Moz have been told not to upgrade until a vulnerability has been patched, so we don't have an environment to test in. I know this isn't the most elegant solution, but I recommend sticking with Sheets for now. I'll be sure to pass this along, though, so we can get it looked at as soon as we can. Apologies I don't have a better answer for you.
Please let us know if there's anything else you need in the interim though.
-
I agree it opens fine with Google Sheets. Let's see if anyone else has issues with Win 10 and Excel.
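If it helps to rule out the file itself, here's a small sketch that uses Python's csv.Sniffer to report which delimiter a downloaded report actually uses (the filename crawl-test.csv is just a placeholder). If it detects a comma, the export is fine and the single-column behaviour is coming from Excel's regional List separator setting rather than from the CSV.

import csv

# "crawl-test.csv" is a placeholder for the downloaded crawl report
with open("crawl-test.csv", newline="", encoding="utf-8-sig") as f:
    sample = f.read(4096)
    dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
    print(f"Detected delimiter: {dialect.delimiter!r}")
    f.seek(0)
    first_row = next(csv.reader(f, dialect))
    print(f"Columns in the header row: {len(first_row)}")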
-
Have been using Office 2016 Excel for the past three years or so with no issues. The only recent change is that I have updated to Win 10 from 8.1.
Updated to the latest Office and still no joy.
This is a major issue for me.
-
Hey Eric!
I downloaded and reviewed a couple of Crawl Test reports in both your account and my own using Excel, Numbers and Google Sheets. Everything seemed to work OK for me. What program are you trying to use? Have you run the most recent updates for it?
-
Had a look at / downloaded all the past reports. All the data has merged into one column??
Related Questions
-
How to stop Moz crawl from prepending /blog/ to all our site URLs that it crawls
Hello, at some time in the past our WP site had URLs like this: www.oursite.com/blog/post-title-pretty-link. The site has not used that URL structure for quite some time, but the Moz crawl is still hitting every post with /blog/ prepended and as a result is generating thousands of 404s. When the /blog/ is removed from the URL, the URLs work fine. Where are those old URLs being stored, and how can we update them? How do we address this issue? Any assistance will be appreciated. Thanks!
Moz Bar | dbcooper
-
If we put the disavow links in Google, does Moz crawl the same links?
I have put bad or spam links in the disavow file, but they are still showing in Moz backlinks. So I want to know: why is Moz not removing the spam links from its system?
Moz Bar | insidewebanalytics
-
Crawl Test is now On-Demand Crawl!
If you've been with Moz a while, you may have used our old Crawl Test tool. A year ago we launched an all-new, campaign-based Site Crawl (with an entirely rebuilt crawl engine), but Crawl Test fell into disrepair and we haven't had a solid tool for crawling non-campaign domains. I'm happy to announce that we've just launched an all-new On-Demand Crawl, built on the new Site Crawl engine, with a UI that's focused on quick insights. Moz Pro Standard tier customers can run up to 5 crawls per month at 3,000 pages per crawl (crawls are saved for 90 days), with per-month limits increasing at higher tiers. Most On-Demand Crawls should run in a few minutes, making the tool perfect for getting quick insights for sales meetings, vetting prospects, or analyzing competitors. We've written up a sample case study, or logged-in customers can go directly to On-Demand Crawl. Try it out -- we'd love to hear your use cases (either here or in the blog post comments).
Moz Bar | Dr-Pete
-
Http:// to https:// Google Search Console crawl errors
How do I redirect http:// to https:// to get rid of 404 errors in Google Webmaster Search Console (http:// crawl errors)?
Moz Bar | O.D.
-
Crawl Test Takes a Long Time
Hi Moz, I have submitted our website for a crawl test. Usually it would only take a few hours to do the crawl. However, this time it is taking quite a long time and the result still shows in progress 😞 This is a small website which contains fewer than 10 pages. Just wondering if this is a setting issue on our website or a technical issue at your end? Many thanks in advance.
Moz Bar | russellbrown
-
How do you stop Moz crawling a page?
Hello, I have a contact form which generates thousands of duplicate crawl errors. I'm going to use to block Google indexing these pages. Will this also block Moz from crawling these pages and displaying the error? Thanks!
Moz Bar | Seaward-Group
-
Not getting foreign characters in crawl diagnostics .csv
The crawl diagnostics .csv file is showing high-ASCII characters instead of the correct language (foreign-language website), e.g. Vietnamese, Chinese (both kinds), etc. Is there a way to get this right?
Moz Bar | trainSEM
-
Moz "Crawl Diagnostics" doesn't respect robots.txt
Hello, I've just had a new website crawled by the Moz bot. It's come back with thousands of errors saying things like: Duplicate content Overly dynamic URLs Duplicate Page Titles The duplicate content & URLs it's found are all blocked in the robots.txt so why am I seeing these errors?
Moz Bar | | Vitalized
Here's an example of some of the robots.txt that blocks things like dynamic URLs and directories (which Moz bot ignored): Disallow: /?mode=
Disallow: /?limit=
Disallow: /?dir=
Disallow: /?p=*&
Disallow: /?SID=
Disallow: /reviews/
Disallow: /home/ Many thanks for any info on this issue.0
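If anyone wants to sanity-check rules like these, here's a minimal Python sketch using the standard urllib.robotparser module. The example domain, the test paths, and the "rogerbot" user agent string are assumptions for illustration, and note that the standard-library parser follows the original robots.txt spec, so wildcard rules such as /?p=*& may not be matched the way a commercial crawler matches them.

from urllib.robotparser import RobotFileParser

# Example domain and paths are placeholders; swap in your own site.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# "rogerbot" is assumed here as the crawler's user agent string.
for path in ["/?mode=list", "/reviews/some-product", "/category/shoes"]:
    url = "https://www.example.com" + path
    verdict = "allowed" if rp.can_fetch("rogerbot", url) else "disallowed"
    print(f"{path} -> {verdict}")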