Recommended log file analysis software for OS X?
-
Due to some questions about direct traffic and Googlebot behavior, I want to do some log file analysis. The catch is that this is a Mac shop, so all our systems are on OS X. I have Windows 8 running in an emulator, but for the sake of simplicity I'd rather run all my software on OS X.
This post by Tim Resnik recommended Web Log Explorer, but it's for Windows only. I did discover Sawmill, which claims to run on any platform.
Any other suggestions? Bear in mind that our site is load balanced across three servers, so whatever tool we use needs to cope with logs from multiple machines.
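In the meantime, quick ad-hoc checks are possible with the Python that ships with OS X. Here is a minimal sketch, assuming Apache/nginx combined-format access logs copied locally from each server (the filenames are hypothetical):

```python
# Minimal sketch: merge combined-format access logs from three
# load-balanced servers and count the URLs Googlebot requested most.
# The filenames are hypothetical; copy each server's log locally first.
from collections import Counter

LOGS = ["server1.log", "server2.log", "server3.log"]

hits = Counter()
for name in LOGS:
    with open(name, encoding="utf-8", errors="replace") as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            parts = line.split()
            if len(parts) > 6:
                hits[parts[6]] += 1  # field 7 of combined format is the request path

for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```

Concatenating the per-server files first also sidesteps the load-balancing issue: once merged, any single-server tool sees one unified log.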
-
Hello! I just came across your question. Here is a recent Moz post of mine on how to use log files for technical SEO -- the example screenshots are from the SEO dashboard in Logz.io. Feel free to sign up on our site and test our public beta to see if it works for you.
Disclosure: Yes, I work for the company.
-
Hiya ufmedia,
Funny you bring this up: I just spent a few hours pulling my hair out trying to get a log parser to work on my Mac. Ultimately, I went back to my PC and ran Web Log Explorer. I have heard good things about Splunk, but it seems too robust for SEO purposes alone. Their free version allows for 500 MB per day; if you are under that, it might be worth giving it a go. Sawmill looks like it could do the trick, but it may require a decent amount of setup. Thanks for the tip! I will check it out.
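One more thought, since you mentioned Googlebot behavior: user-agent strings are trivially spoofed, so before drawing conclusions from the logs it is worth verifying hits the way Google documents, with a reverse-then-forward DNS check. A minimal sketch in Python (the sample IP is illustrative only):

```python
# Minimal sketch: confirm a "Googlebot" log entry is genuine via the
# reverse-then-forward DNS check that Google documents.
import socket

def is_real_googlebot(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    # Forward-confirm: the claimed hostname must resolve back to the IP.
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.gaierror:
        return False
    return ip in forward_ips

print(is_real_googlebot("66.249.66.1"))  # sample IP, illustrative only
```

Since DNS lookups are slow, run this only on the distinct IPs that claim to be Googlebot, not on every log line.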
Thanks!
Tim
Related Questions
-
Unsolved: Temporary redirect from 302 to 301 for PNG file?
#302HTTP #temporaryredirect
Technical SEO | Damian_Ed | 0
Hi everyone, I have recently run into a crawl issue with the media images on my website. For example, this URL https://intreface.com/wp-content/uploads/2022/12/Horion-screen-side-2.png returns a 302 HTTP status, and the recommendation is to change it to a 301. I have read the article on temporary redirects here:
https://moz.com/learn/seo/redirection
but it doesn't explain how to 301-redirect a single image URL rather than the landing page.
I messaged Moz Support, but they recommended asking the Moz Community! Could you assist me with this issue, please? I can reach the HTML of the page in question and change what I need for a permanent redirect, but first I need to understand how to do it correctly.
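Since the site is WordPress and most WordPress hosts run Apache, the usual place for a single-file redirect is the site's .htaccess rather than any page HTML; that is also why the article does not cover doing it from HTML. A minimal sketch (the destination URL is hypothetical; replace it with wherever the image should permanently live):

```
# Minimal .htaccess sketch: issue a permanent (301) redirect for one
# image URL only. The destination URL below is hypothetical.
Redirect 301 /wp-content/uploads/2022/12/Horion-screen-side-2.png https://intreface.com/wp-content/uploads/2022/12/horion-screen-side-2-new.png
```

On nginx the equivalent would be a location block with return 301. Either way, the first step is finding what currently issues the 302 (a plugin or CDN rule, for instance), or the new 301 may end up chained behind it.
-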
Duplicate content analysis
Hi all, We have some pages being flagged as duplicates by Google Search Console. However, we believe the content on these pages is distinctly different (for example, they return completely different search results, have different headings, etc.). An example of two pages Google finds to be duplicates is below, followed by a short sketch of one possible fix. If anyone can spot what might be causing the duplicate issue here, we would very much appreciate suggestions! Thanks in advance.
Technical SEO | Eric_S
Examples: https://www.vouchedfor.co.uk/IFA-financial-advisor-mortgage/harborne
https://www.vouchedfor.co.uk/accountant/harborne
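A common cause with templated listing pages is that the shared boilerplate outweighs the unique results, especially if the results are injected by JavaScript and Google effectively compares two near-identical shells. A self-referencing canonical on each page at least makes the intended indexing explicit; a minimal sketch for the first URL above:

```html
<!-- Minimal sketch: a self-referencing canonical in each page's <head>,
     declaring the URL as its own canonical version. -->
<link rel="canonical" href="https://www.vouchedfor.co.uk/IFA-financial-advisor-mortgage/harborne" />
```

It is also worth comparing the two pages' server-rendered HTML (View Source, not the rendered DOM) to confirm the differing content is actually visible to the crawler.
-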
Content available only on log-in/sign-up: how to optimise?
Hi Mozzers. I'm working on a dev brief for a site with no search visibility at all. You have to log in (well, sign up) to the site (via Facebook) to get any content. Usability issues of this aside, I am wondering what possible solutions there are for getting the content indexed. I feel there are two options:
1. Pinterest-style: give the user some visibility of the content before presenting a log-in overlay. I assume this also allows search engines to cache the content and follow the links.
2. Duplicate HTTP and HTTPS sites. I'm not sure whether this falls foul of the "showing one thing to search engines and another thing to users" guidelines. In my mind, you would block robots from the HTTPS site (and show it to the users where log-in etc. is required), but URLs would canonicalise to the HTTP version of the page, which you wouldn't present to the users but would show to the search engines. The actual content on the pages would be the same.
I wonder if anyone knows examples of large(ish) websites which do this well, or any options I haven't considered here. Many thanks.
Technical SEO | Pascale | 0
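For reference, the crawler-facing half of option 2 is just a restrictive robots.txt served only on the logged-in HTTPS host; a minimal sketch is below. This is not an endorsement: serving users and crawlers different experiences is exactly what the cloaking guidelines target, and crawlers blocked by this file would never see a canonical tag placed on the HTTPS pages, so the scheme depends entirely on keeping the two hosts cleanly separated.

```
# Minimal sketch: robots.txt served only on the HTTPS (login-required)
# host, keeping all crawlers on the public HTTP version.
User-agent: *
Disallow: /
```
-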
How to add specific Tumblr blogs into a disavow file?
Hi guys, I am about to send a reconsideration request and am still finalizing my disavow file. The disavow format is domain:badlink.com (stripped down to the root domain), but what about toxic links hosted on Tumblr, such as badlink.tumblr.com? The issue is that we also have good Tumblr links, so I don't want to add just tumblr.com. Do you think I will have issues submitting badlink.tumblr.com and not tumblr.com? Thank you!
Technical SEO | Ideas-Money-Art | 0
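For what it's worth, the disavow format treats each subdomain as its own host: a domain: line scoped to one Tumblr blog covers that blog (and its subdomains) without touching tumblr.com or sibling blogs. A minimal sketch with hypothetical hosts:

```
# Minimal disavow-file sketch (hypothetical hosts). A domain: line
# covers every URL on that host; plain URLs disavow single pages.
domain:badlink.com
domain:badlink.tumblr.com
https://anotherbadsite.com/specific-bad-page.html
```
-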
Is there any value in having a blank robots.txt file?
I've read an audit where the writer recommended creating and uploading a blank robots.txt file; there was no existing file in place. Is there any merit in having a blank robots.txt file? What is the minimum you would include in a basic robots.txt file?
Technical SEO | NicDale | 0
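A blank file and no file at all mean the same thing (crawl everything), but a blank file does stop the server returning a 404 for /robots.txt and stops audit tools flagging it as missing. A minimal sketch of an explicit version that behaves identically (the Sitemap line is an optional, hypothetical addition):

```
# Minimal robots.txt: an empty Disallow blocks nothing, so all
# crawlers may crawl everything.
User-agent: *
Disallow:

# Optional: advertise the sitemap location (hypothetical URL).
Sitemap: https://www.example.com/sitemap.xml
```
-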
Can anyone recommend a good UK service provider for Dedicated Servers?
Hi, we have just had a shocking experience moving from shared hosting to a dedicated server with a very large service provider. Our website slowed down and the server has been down more often than up, so we have moved back to shared hosting until we can find a supplier that can deliver. A nightmare! If anyone can recommend a good company, it would be appreciated. Thanks
Technical SEO | SGIMarketing | 1
-
Submitting Sitemap File vs Sitemap Index File
Is it better to manually submit to Google all of the sitemap files contained in a sitemap index file, or is it about the same as just submitting the master sitemap index file?
Technical SEO | AU-SEO | 0
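In practice the index file alone is enough: search engines fetch it and discover every child sitemap it lists, and Search Console reports on the children either way. A minimal sketch of a sitemap index (URLs hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap index: submitting this one file lets search
     engines discover both child sitemaps below (hypothetical URLs). -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```
-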
Keywords in file names vs folder names
We understand the value of a keyword phrase included in the URL. Is there more value in having that phrase in the folder name of the URL or in the file name, or does it not matter? Example: http://www.biztoolsone.com/website-design.php or http://www.biztoolsone.com/website-design/ Which is best? Thanks, Wick Smith
Technical SEO | wcksmith | 0
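Ranking-wise the two forms are generally treated the same, so the choice is mostly about readability and consistency. If the cleaner directory-style URL is preferred, it does not require moving files: a rewrite can serve the existing .php under the clean path. A minimal Apache .htaccess sketch (hypothetical, assuming mod_rewrite is enabled):

```
# Minimal sketch: serve the directory-style URL from the existing PHP
# file, with no visible redirect.
RewriteEngine On
RewriteRule ^website-design/?$ /website-design.php [L]
```

Whichever form is chosen, a 301 from the other form keeps all the keyword equity on one URL.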