Recommended log file analysis software for OS X?
-
Due to some questions over direct traffic and Googlebot behavior, I want to do some log file analysis. The catch is that this is a Mac shop, so all our systems are on OS X. I have Windows 8 running in an emulator, but for the sake of simplicity I'd rather run all my software in OS X.
This post by Tim Resnik recommended Web Log Explorer, but it's for Windows only. I did discover Sawmill, which claims to run on any platform.
Any other suggestions? Bear in mind that our site is load balanced across three servers, so the tool will need to handle logs from all three.
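For context, here is roughly what I need the tool to do, sketched in Python (the filenames and the Apache "combined" log format are assumptions about our setup):

    # Merge access logs pulled from all three load-balanced servers and
    # count Googlebot hits per URL. Filenames are assumptions.
    import glob
    import re
    from collections import Counter

    REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+"')

    hits = Counter()
    for path in glob.glob("access-server*.log"):
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                if "Googlebot" not in line:  # strictly, verify via reverse DNS
                    continue
                match = REQUEST_RE.search(line)
                if match:
                    hits[match.group(1)] += 1

    for url, count in hits.most_common(20):
        print(f"{count:6d}  {url}")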
-
Hello! I just came across your question. Here is a recent Moz post of mine on how to use log files for technical SEO -- the example screenshots are from the SEO dashboard in Logz.io. Feel free to sign up at our site and test our public beta to see if it works for you.
Disclosure: Yes, I work for the company.
-
Hiya ufmedia,
Funny you bring this up; I just spent a few hours pulling my hair out trying to get a log parser to work on my Mac. Ultimately, I went back to my PC and ran Web Log Explorer. I have heard good things about Splunk, but it seems too robust for SEO purposes alone. Their free version allows 500 MB per day; if you are under that, it might be worth giving it a go. Sawmill looks like it could do the trick, but may require a decent amount of setup. Thanks for the tip! I will check it out.
Thanks!
Tim
Related Questions
-
Robots File
For some reason the robots file on this site (http://rushhour.net.au/robots.txt) is causing Google to show this for www.rushhour.net.au/bootcamp.html: "A description for this result is not available because of this site's robots.txt". Can anyone tell me why, please? Thanks.
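For illustration only (this is a guess at what the live file might contain, not a copy of it):

    User-agent: *
    Disallow: /bootcamp.html

A Disallow rule like this stops Google from crawling the page, but if other pages link to it Google can still index the bare URL, and that is exactly when it shows the "description is not available" message.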
Technical SEO | SuitsAdmin
-
Personalization software and SEO
Hi guys, I'm just testing personalization software on our website, basically changing the "location" text depending on the user's IP. I can see in my software that when Googlebot comes to our site, the personalization software triggers an action that changes the location-based text to "California". Can this make Google understand that our website targets only users in California, and thereby hurt our rankings in other locations nationwide? I'd appreciate your opinions.
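If this turns out to be a problem, one option I'm considering is serving crawlers the default, non-personalized text. A rough sketch of the idea (Flask and the location_text() function are stand-ins for whatever our personalization stack actually does, not its real API):

    # Rough idea: show crawlers the default text instead of GeoIP-based text.
    from flask import Flask, request

    app = Flask(__name__)

    BOT_TOKENS = ("Googlebot", "bingbot")  # a real check should verify via reverse DNS
    DEFAULT_TEXT = "Serving customers nationwide"

    def location_text(ip):
        # placeholder for the GeoIP lookup the personalization software performs
        return "California"

    @app.route("/")
    def home():
        ua = request.headers.get("User-Agent", "")
        if any(token in ua for token in BOT_TOKENS):
            return DEFAULT_TEXT  # crawlers see the non-personalized version
        return f"Serving customers in {location_text(request.remote_addr)}"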
Technical SEO | anagentile
-
Is it good practice to update your disavow file after a penalty is removed?
I was wondering if you could keep adding to the disavow file, even after your site has recovered from a partial site penalty. As a recurring SEO procedure, we are always reviewing links pointing to our website and identifying those that are clearly of no value. To clean these up, would it be good practice to update the disavow file with more of these domains? Or is the disavow file only used for penalty issues, to alert Google of the work you have done? (We had a penalty in the past but are fine now.) Would this method help keep high-quality links to the fore by removing low-quality links from Google's eyes? I would welcome your comments.
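For reference, the disavow file is just a plain text file of URLs and domain: lines, one entry per line; the domains below are made up for illustration:

    # low-value domains found in our recurring link audit
    domain:spammy-directory.example
    domain:low-quality-links.example
    http://another-site.example/paid-links.html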
Technical SEO | podweb
-
Recommendation for an SEO plugin for WordPress
Dear Moz Community, could I pick your brains on SEO plugins for WordPress? Our web developer has installed an SEO plugin called Yoast, and I am not quite sure of its effectiveness. The problem we have at the moment is that the page title is not updating on Google the way we anticipated. To solve this issue we unchecked "force rewrite" under the title options, but it had no effect. For instance, our name on Google always appears as "Man Van London"; despite any amendments we make, the title always starts with "Man Van London" (website: www.manvanlondon.co.uk). If Yoast is the best SEO plugin for WordPress, is there a way to fix this issue? Or is anyone familiar with another plugin? Does anyone suggest not using plugins at all? Thank you for your time. Looking forward to your wisdom. Monica
Technical SEO | monicapopa
-
Disavow file and backlinks listed in Webmaster Tools
Hi guys, I've submitted a disavow file via Webmaster Tools. After that, should the backlinks from domains listed in that file disappear from the list of links to my website in Webmaster Tools? Or does Webmaster Tools keep showing all the links, whether I've submitted a disavow file or not?
Technical SEO | superseopl
-
Kill your htaccess file, take the risk to learn a little
Last week I was browsing Google's index with "site:www.mydomain.com" to scan over what Google had indexed for my site. I came across a URL that was mistakenly indexed. It went something like this: www.mydomain.com/link1/link2/link1/link4/link3

I didn't understand why Google had indexed a page like that, since the "link" pages were site-wide links on my main nav bar. The URL seemed to loop infinitely, over and over. I started checking how many of these Google had indexed and found about 20 pages. I went through the process of removing the URLs in Webmaster Tools, but then I wanted to know why it was happening.

It turned out I had mistakenly written the links in my header without a leading "/", e.g. href="link1/" instead of href="/link1/". If you know HTML you will realize that by not placing the "/" at the front of the link, you are telling the browser (and the crawler) to append that link to the URL of the page it is currently on. What this did was create an infinite loop of links, which is not good 🙂 Basically, when Google went to www.mydomain.com/link1/ it found the other links, appended them to the existing URL, and crawled on to something like www.mydomain.com/link1/link2/... The "/" refers to the root, so if you place it in front of the directory you are linking to, the link will always resolve from the root, with the rest of the URL following.

So what did I do? Even though I had found about 20 URLs using the "site:" search, there had to be more out there. I searched but could not find any more; still, I was not convinced. Then the light bulb went on. My .htaccess file contained many 301 redirects from my attempts to redirect those pages somewhere real, even though there were no truly relevant pages to redirect to. So how could I find out what Google had actually indexed, when Webmaster Tools only reports the top 1,000 links? I decided to kill my htaccess file. Knowing that Google is "forgiving" when major changes happen to a site, I knew it would not simply bury my site for removing the htaccess file temporarily.

I waited 3 days, then BOOM! Webmaster Tools reported a ton of 404s on my site. I looked at the Crawl Errors and there they were: all those infinite-loop links that I knew had to be out there. How many were there? Google found over 5,000 of them on the first crawl. OMG! Can you imagine the "low quality" score I was getting on those pages?

Seeing all those links, I was able to pick out about 4 patterns, for example:

www.mydomain.com/link1/link2/
www.mydomain.com/link1/link3/
www.mydomain.com/link1/link4/
www.mydomain.com/link1/link5/

I wanted to keep all the URLs pointing to www.mydomain.com/link1, but anything after that needed to go. So I went into my robots.txt file and added (paths in robots.txt are relative to the root, so the domain is omitted):

Disallow: /link1/link2/
Disallow: /link1/link3/
Disallow: /link1/link4/
Disallow: /link1/link5/

There were many more pages indexed deeper into those paths, but blocking at the second level cut the loop off at its start. With that I was able to deal with at least 5k links, if not more.

What did I learn from this? Kill your htaccess file for a few days and see what comes back in your reports. You might learn something 🙂 After doing this I simply put my htaccess file back, and I am on my way to removing a ton of "low quality" links I didn't even know I had.
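To make the mistake concrete, here is a hypothetical version of the header nav (the original markup was lost from the post, so this is a reconstruction):

    <!-- Broken: without a leading slash, each href resolves relative to the current page -->
    <a href="link1/">Link 1</a>
    <a href="link2/">Link 2</a>
    <a href="link3/">Link 3</a>

    <!-- Fixed: the leading slash anchors every link to the site root -->
    <a href="/link1/">Link 1</a>
    <a href="/link2/">Link 2</a>
    <a href="/link3/">Link 3</a>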
Technical SEO | cbielich
-
Submitting Sitemap File vs Sitemap Index File
Is it better to submit all the sitemap files contained in a sitemap index file manually to Google, or is it about the same as just submitting the master sitemap index file?
Technical SEO | AU-SEO
-
Should I move x-cart installation or 301 redirect?
We have an existing e-commerce site built on X-Cart. The default store location is www.site.com/store. The domain root, however, is just a static HTML page (currently mostly graphics) and a nav menu. Which would be the better option?

1. Move the install location to the root directory and get rid of the static HTML page. We would have to 301 redirect all the old pages to the new location (perhaps with a pattern rule rather than by hand; see the sketch below). Not sure if there are negative implications with that.
2. Just optimize the HTML landing page? It seems better to have products and categories as close to the root domain as possible...
3. 301 redirect the domain to www.site.com/store/ and optimize the homepage within the store. This option means we don't have to worry about 2,000 redirects or the hassle of moving the store.

Has anyone had experience with this, and any suggestions?
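If option 1 wins, the redirects probably don't need to be written out by hand. Assuming Apache with mod_rewrite (and ignoring any rewrite rules X-Cart itself ships with), a single pattern rule along these lines should cover the whole /store/ tree:

    # 301 every old /store/ URL to the same path at the domain root
    RewriteEngine On
    RewriteRule ^store/(.*)$ /$1 [R=301,L]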
Technical SEO | BlinkWeb