Four Ways to Analyze Logs Like a Pro!

Hi there! My name is Meredith and I love to eat logs (they’re big, they’re heavy, they’re wood) for Breakfast, Lunch, Dinner, and as an awesome 2am snack.

In this blog post I will show you how to analyze logs like a pro, using four of my favorite methods: manual review, Excel, the command line, and finally, Splunk.

Breakfast: Manually Reviewing Logs

For log files smaller than 100 lines, where I am hunting for small details in a small grouping of logs rather than using special tools to pull large counts, it isn't unheard of for me to print the logs out and review them on paper, as if I were reading three pages of a book.

Now, before you discount this immediately and move on to lunch, consider this: I always benefit from reading through 30 or 40 individual logs to understand their structure, so that whenever I come across them again, I have a better grasp of what they are and why they are organized the way they are. This also helps me immensely when I need to know which fields are which while working in Excel, Splunk, or the command line.

Lunch: Using Excel

Excel, or one of its equivalent counterparts, is a wonderful tool for parsing logs. When you open a CSV (comma separated values) or TSV (tab separated values) file, everything comes pre-separated into columns, and each column (hopefully) has a header to easily distinguish between the values.

Excel has built-in tools that make sorting easy, with buttons that help you visualize whatever you are trying to sort, count, or organize. When using Excel to parse logs, I like to think of it as a GUI for the command line. During the National Cyber League (NCL) Games, I will often use command line tools first and, when I need a bit of guidance, turn to Excel before moving on to more advanced (Splunk) measures.

Dinner: Command Line

The command line was the first way to parse logs, and arguably it is still the most efficient: no additional application is required, a GUI is optional, and most tasks can be accomplished in one or two commands. The efficiency is only as good as how you use the tools, though, and a few command line utilities will always come in handy and quickly become staples of day-to-day computer use. For example, grep, my favorite command line utility, is perfect for searching for specific strings, and I will pipe its output through sort and uniq -c to count the frequency of the items grep finds. Used together, these utilities produce quick results perfect for simple tasks, but they will not be able to efficiently gather all the information on their own.
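To make that concrete, here is a small sketch of the grep-then-count pattern. The log lines and file name below are made up for illustration; the pipeline itself is the standard grep | sort | uniq -c idiom.

```shell
# Hypothetical auth log sample; the field layout here is an assumption for illustration.
cat > sample.log <<'EOF'
Mar 01 10:02:11 host sshd[101]: Failed password for root from 203.0.113.5
Mar 01 10:02:14 host sshd[102]: Failed password for admin from 203.0.113.5
Mar 01 10:02:20 host sshd[103]: Accepted password for alice from 198.51.100.7
Mar 01 10:02:25 host sshd[104]: Failed password for root from 203.0.113.9
EOF

# How many failed logins? grep -c counts matching lines.
grep -c 'Failed password' sample.log

# Which usernames failed most often? Pull the field after "for",
# then let sort + uniq -c tally the frequencies.
grep 'Failed password' sample.log | awk '{print $9}' | sort | uniq -c | sort -rn
```

The sort before uniq -c matters: uniq only collapses adjacent duplicate lines, so unsorted input would undercount.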

The challenges in the NCL Games will often have very large files, so these command line tools are usually the best way to start a challenge: either to get rid of unnecessary data, or to find the first few things you're looking for and get yourself on the right track.

If you are using the command line and the data you're given is not in CSV or another format with pre-separated values, you can always use sort and cut to help you separate the data out and make it easier to search. Around 70% of the command line work I do when analyzing log files involves the utilities I have mentioned above: grep, sort, cut, and uniq.
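Here is a short sketch of cut and sort on delimited data. The file name and columns are hypothetical; the technique is just picking columns with cut -d/-f, then sorting or counting them.

```shell
# Hypothetical firewall export; the column layout is an assumption for illustration.
cat > traffic.csv <<'EOF'
timestamp,src_ip,dst_port,action
10:00:01,203.0.113.5,22,deny
10:00:02,198.51.100.7,443,allow
10:00:03,203.0.113.5,22,deny
10:00:04,203.0.113.9,80,allow
EOF

# cut -d ',' -f 2 picks the second comma-delimited column;
# tail -n +2 skips the header row; sort -u lists unique values.
cut -d ',' -f 2 traffic.csv | tail -n +2 | sort -u

# Combine cut with sort + uniq -c to count hits per destination port.
cut -d ',' -f 3 traffic.csv | tail -n +2 | sort | uniq -c | sort -rn
```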

Quick important tip: when you are just starting out, work on a copy of the file and save backups frequently, in case you run a command you did not intend to, or a command produces unexpected results and you want to go back to your last good version.
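In practice that tip is just a cp before you start cutting things up. The file name below is a placeholder:

```shell
# Keep the original pristine and operate on a working copy
# ("evidence.log" is a hypothetical file name).
printf 'line one\nline two\n' > evidence.log
cp evidence.log evidence.log.bak

# Suppose a careless command mangles the working file...
echo 'oops' > evidence.log

# ...you can restore it from the untouched backup.
cp evidence.log.bak evidence.log
```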

My 2am fun snack: Splunk

Analyzing logs with Splunk is sinfully fun and easy. I like to think of Splunk as that chocolate cake you realize you need at 2am. Splunk is a mixture of the best functions of Excel, the most useful command line utilities, and the benefit of reviewing logs manually, while showing you only the bits of information you are actually searching for.

When analyzing a log file, Splunk uses what are known as field extractions to pull out all data related to a particular field that you specify. You can give that field a name, then search by that name with specific modifiers to find exactly what you want. For more common types of logs, such as Windows event logs or pfSense firewall logs, Splunk and the software/hardware vendors make small applications called TAs, short for technology add-ons. These TAs perform the field extractions on their own, using the information stored in the TA, with minimal, and often no, required user configuration. The add-ons may also contain additional useful bits of data, such as an explanation of why data is behaving in a particular way, or metrics that answer some of the questions you didn't know you had.
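As a sketch, a search over those extracted fields might look like the following. The index and field names here are hypothetical; yours will depend on how your data was onboarded and which TA extracted the fields.

```
index=firewall action=blocked
| stats count by src_ip
| sort -count
| head 10
```

This counts blocked events per source IP and shows the ten noisiest offenders, which is exactly the grep/sort/uniq workflow from dinner, expressed in one Splunk search.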

One of the wonderful benefits of Splunk is its community forums. Any question you could've ever dreamed of asking is likely already on there. Think of it as Stack Overflow, but only for Splunk. The community is open to the public and is filled with data parsing questions galore. If I don't find the exact answer I'm looking for, I can usually find somebody with a similar enough question that, after reading some of the documentation, I am able to figure out how to solve my problem.

Splunk is also great for the large files used in the NCL Games, because it can quickly parse through large amounts of data with minimal processing power and provide an easy frontend with which to look at all of these logs. You can drill down further from each search you run, and Splunk saves your searches in the recent search history so that you can review them as many times as needed, to your heart's content. If you set up Splunk during the NCL Team Game and host it somewhere accessible to the rest of your teammates, you can even share links to search results, either to double-check answers as a sanity check or to have a team member help you through a challenge. Splunk is also great for tracking which challenges you have completed and which you have not, but that is a story for another time.

All of these options are feasible during the NCL Games; however, some log analysis routes may take more time than others, and some may be faster to learn. Don't let anybody judge you for doing what you are comfortable with, even if that is just reviewing the logs manually and reading them like a book. If that's where you are most comfortable, it is a great starting point, and I would like to personally challenge you to step outside your comfort zone once and solve one challenge using another tool, so you can begin learning something new.

Just remember: if you have a question, either search for the answer using your finely honed OSINT skills or ask the question on a forum, unless it is during the Games, of course. During the Games, that would be considered attempting to get outside help, which is considered cheating. If you have any questions about what is and is not permitted, check out the blog about cheating here. I strongly suggest that if you have questions during the Games, you save them for after and then post your question to the world! Somebody will be more than happy to answer and help you to the best of their ability.

And with these little bits of advice on logs and how to evaluate them, I bid you all adieu. Remember, as a very wise teenager who was struggling with many things once said, “Nobody’s Perfect, I gotta work it, again and again ‘till I get it right”.

