Recommended log file analysis software for OS X?
-
Due to some questions over direct traffic and Googlebot behavior, I want to do some log file analysis. The catch is this is a Mac shop, so all our systems are on OS X. I have Windows 8 running in an emulator, but for the sake of simplicity I'd rather run all my software in OS X.
This post by Tim Resnik recommended Web Log Explorer, but it's for Windows only. I did discover Sawmill, which claims to run on any platform.
Any other suggestions? Bear in mind that our site is load balanced across three servers, so whatever we use needs to cope with logs from all three.
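For a sense of the job, here is a minimal sketch using the Python that ships with OS X (the log path is a placeholder, and the standard combined log format is an assumption): it merges the access logs from all three servers and counts which URLs Googlebot is requesting.

import glob
import re
from collections import Counter

# Placeholder path: one combined-format access log per load-balanced server.
LOG_GLOB = "/var/log/apache2/access_log.server-*"

hits = Counter()
for path in glob.glob(LOG_GLOB):
    with open(path, errors="replace") as f:
        for line in f:
            if "Googlebot" not in line:  # crude user-agent filter; anything can claim to be Googlebot
                continue
            # In combined log format the request line is the first quoted field,
            # e.g. "GET /some/page HTTP/1.1" -- capture the requested path.
            m = re.search(r'"(?:GET|POST|HEAD) (\S+)', line)
            if m:
                hits[m.group(1)] += 1

# The 25 URLs Googlebot requested most often across all three servers.
for url, count in hits.most_common(25):
    print(count, url)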
-
Hello! I just came across your question. Here is a recent Moz post of mine on how to use log files for technical SEO -- the example screenshots are from the SEO dashboard in Logz.io. Feel free to sign up at our site and test our public beta to see if it works for you.
Disclosure: Yes, I work for the company.
-
Hiya ufmedia,
Funny you bring this up: I just spent a few hours pulling my hair out trying to get a log parser to work on my Mac. Ultimately, I went back to my PC and ran Web Log Explorer. I have heard good things about Splunk, but it seems too robust for SEO purposes alone. Their free version allows up to 500 MB of data per day; if you are under that, it might be worth giving it a go. Sawmill looks like it could do the trick, but it may require a decent amount of setup. Thanks for the tip! I will check it out.
Thanks!
Tim
Related Questions
-
Confused about repeated occurrences of URL/essayorg/topic/ showing up as 404 errors in our site logs
Working on a WordPress website, https://thedoctorwithin.com. Scanning the site's 404 errors, I'm seeing a lot of requests for URL/essayorg/topic, coming from Bingbot as well as other spiders (Google, Open Site Explorer). We get at least 200 of these irrelevant requests per week. Each topic that follows /essayorg/ seems to be unique, and some include typos: /dissitation/. I haven't yet verified that the spiders are who they say they are. It almost seems like there are many links 'in the wild' intended for Essay.org that are being directed at the site I'm working on. I've considered redirecting any request for URL/essayorg/ to our sitemap, figuring that might encourage further spidering of actual site content. Is redirecting to our sitemap XML file a good idea, or might doing so have unintended consequences? I'm also interested in suggestions about why this might be occurring. Thank you.
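On the verification point, the check documented by both Google and Bing is a reverse DNS lookup on the requesting IP followed by a forward DNS confirmation. A minimal Python sketch (the expected hostname suffixes follow the engines' published guidance):

import socket

def verify_bot(ip, suffixes=(".googlebot.com", ".google.com", ".search.msn.com")):
    """Return True if the IP reverse-resolves to a known crawler hostname
    and that hostname forward-resolves back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-66-1.googlebot.com
        if not host.endswith(suffixes):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward-confirm
    except OSError:  # covers reverse/forward lookup failures
        return False

print(verify_bot("66.249.66.1"))  # an IP in Google's crawl range should print True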
Technical SEO | linkjuiced
-
Do old website files in public_html affect SEO?
My client has about a dozen old folders filled with old websites, including index files, robots.txt, and htaccess files. They are all located in separate folders within public_html. Does this affect them negatively?
Technical SEO | Renalynd
-
Robots.txt: how to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt file. I'm trying to block all PDF files in particular directories. The example below blocks all .gif files across the entire site:
Disallow: /*.gif$
Two questions: can I use this kind of line to target one particular directory in which I want to block PDF files, and will the line below be recognized by Googlebot?
Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$
Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files. Say I want to block PDF files in all three of these directories:
/fileadmin/directory1
/fileadmin/directory1/sub1
/fileadmin/directory1/sub1/pdf
Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories instead of repeating the line above for each one? For example:
Disallow: /fileadmin/directory1*/
Many thanks in advance for any insight you may have.
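For what it's worth, Google's robots.txt documentation says * matches any sequence of characters, including slashes, so a single pattern anchored at the shared parent directory should cover every subdirectory beneath it. A sketch, assuming /fileadmin/directory1 is that parent:

User-agent: *
Disallow: /fileadmin/directory1/*.pdf$

The trailing $ anchors the match to URLs that end in .pdf. Note that these wildcard extensions are honored by the major crawlers (Googlebot, Bingbot) but are not part of the original robots.txt standard.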
Technical SEO | LabeliumUSA
-
Domain change recommendations
We recently migrated one of our websites to a new domain. Obviously we were expecting a decrease in traffic initially, but it has actually gone down by 70% week-over-week since we made the switch. We set up a 301 redirect from the old domain to the new domain, changed all internal links to the new domain and changed all inbound links that we owned to the new domain. Our research suggested the best way to approach a domain change was by keeping it simple and not making too many changes at once. So my questions are: 1. Are these the kinds of results we should expect initially after a domain change? And if not, 2. What are the steps we should take from here? Thanks!
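For comparison, a domain-wide 301 usually looks something like the following. This is a sketch for Apache with mod_rewrite, and the domain names are placeholders:

RewriteEngine On
# Match the old domain, with or without www (NC = case-insensitive).
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
# Permanently redirect every path to the same path on the new domain.
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]

If a redirect only covers the homepage rather than every URL, deep pages lose their redirects, which by itself can explain a steep drop.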
Technical SEO | gouldtr
-
Can someone interpret this entry in my htaccess file into plain English so that I can understand it?
There are a number of entries in my htaccess file and I'd like to understand what they are doing, so that I can work out whether they need to be there or not. So, can someone tell me what this says, in plain English?
RewriteCond %{HTTP_HOST} ^legacytravel.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.legacytravel.com$
RewriteRule ^carrollton-travel-agent$ "http://www.legacytravel.com/carrollton-travel-agent" [R=301,L]
Thank you a million times in advance.
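As a rough plain-English reading, assuming standard Apache mod_rewrite behavior: if the requested hostname is exactly legacytravel.com or www.legacytravel.com (the unescaped dots technically match any single character), then any request for /carrollton-travel-agent is answered with a permanent (301) redirect to the absolute www URL. The same block, annotated:
# Condition 1: the Host header is legacytravel.com (no www)...
RewriteCond %{HTTP_HOST} ^legacytravel.com$ [OR]
# Condition 2, OR'd with the first: ...or it is www.legacytravel.com.
RewriteCond %{HTTP_HOST} ^www.legacytravel.com$
# If either condition matched, 301-redirect /carrollton-travel-agent to the
# absolute www URL; the L flag stops processing further rewrite rules.
RewriteRule ^carrollton-travel-agent$ "http://www.legacytravel.com/carrollton-travel-agent" [R=301,L]
Technical SEO | cathibanks
-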
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains?
Hi, I am performing an analysis of the total inbound links to my homepage, and I would like the totals categorized by linking root domain. For example, Open Site Explorer shows the Linking Root Domains for your page; when you click on the first linking root domain, it also shows the Top Linking Pages, meaning all the pages that link to your page from that particular root domain. I would like to export this data to a CSV file, but Open Site Explorer only exports the total list of linking root domains. Does anyone have a solution to this problem? Thank you very much for the help in advance!
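If the export stays flat, one workaround is to post-process the exported CSV yourself. A minimal Python sketch (the file name and a "URL" column are assumptions about the export format) that groups linking pages by root domain:

import csv
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical export: a CSV with a "URL" column for each linking page.
groups = defaultdict(list)
with open("inbound_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["URL"]).netloc.lower()
        # Naive root-domain extraction: keep the last two labels.
        # Good enough for .com; country TLDs like .co.uk need a real suffix list.
        root = ".".join(host.split(".")[-2:])
        groups[root].append(row["URL"])

with open("links_by_root_domain.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["root_domain", "linking_page"])
    for root in sorted(groups):
        for url in groups[root]:
            w.writerow([root, url])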
Technical SEO | Feweb
-
What is the recommended or "best practice" Permalink Structure?
I have always been under the impression that connecting pages to their parent pages, as described in a.) below, is best practice, and it makes sense to me.
a.) yoursite.com/category/sub-category/product/
b.) yoursite.com/product
But then I also understand the concern about link juice being spread across so many subpages, and that by using example b.) you keep the link juice intact. Your thoughts on this? Greg
Technical SEO | AndreVanKets
-
Can anyone recommend a good template club for Joomla?
Hi, I am looking for a good template club for Joomla, one that offers magazine and news-style templates that are optimised and not slow. The template I have been using has been reported as running slow and I want to change it. Any help would be great.
Technical SEO | ClaireH-184886