Searching In Splunk 101
Welcome To Splunk Search…A Journey Of Discovery
The Splunk query language is a powerful tool for interpreting, analyzing, and presenting your data. It can also be one of the main reasons people are put off using Splunk at all. Although it can be complex, you don’t need to be a power user to start getting useful information from the search bar.
Instead of tackling this as learning a new language, why not look at the Splunk Query Language as going on a “journey of discovery.”
1. Start With A Search Term, Eg; ERROR
Back in the day, if you didn’t know the name of the index you wanted to search against, your searches would return a blank green screen. Now all you need is a term to search for and you can start finding useful information. As an example, let’s use the term “ERROR”. This search will return any event that contains the term.
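If the text you want to find contains spaces, wrapping it in double quotes tells Splunk to treat it as a single phrase rather than separate terms. As a sketch (the phrase below is just an illustrative example, not from any particular log):

"connection timed out"

This would match only events containing that exact phrase, whereas the unquoted version would match events containing any of the three words.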
2. Wildcards To Expand Your Search Terms
If you are using a version of Splunk earlier than 6.3, you’ll still need to specify an index for the term to be searched against, but this is where you can start to use wildcards. For example, our original search for ERROR could now run as “index=* ERROR*”. This will search across all our indexes and look for any events containing the term ERROR, ERRORS, or even ERRORnotanothererror.
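If you do know which index your data lives in, naming it instead of using index=* keeps the search much faster, since Splunk only has to scan that one index. Assuming your data is in the default main index, the wildcard search could be scoped like this:

index=main ERROR*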
3. Narrow Down Your Searches With Date And Time Range
In our first two examples, we’d be searching across all the data that your Splunk environment has available in its indexes. This could result in a very long-running search or a huge amount of data in the resulting report. To narrow the search down, the search interface allows you to add a date or time range. These time and date values can be typed directly into the search bar, but for now, the drop-down list to the right of the search bar lets you select from preset time and date ranges, or define a custom range to narrow things down even further.
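The same time constraints can also be written inline using Splunk’s standard earliest and latest time modifiers. As a sketch, the following limits the wildcard search to roughly the last 24 hours, with the start snapped back to the hour:

index=* ERROR* earliest=-24h@h latest=now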
4. Look Through Your Extracted Fields
Just as we have preset values for date and time, Splunk will also try to provide useful information about your indexed data. When your data is indexed, Splunk performs field extraction across log files and returns useful information that can be used when searching and analyzing data. On the left of the search screen you will see a list of the more important and, hopefully, relevant fields extracted for your search. When you click on a field, Splunk will also give you a summary breakdown of its values, which can help narrow things down further.
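Any extracted field can also be used directly in the search as a key=value filter. For example, assuming the sidebar showed a host field with a value of webserver01 (a hypothetical host name for illustration), filtering on it would look like this:

index=* ERROR* host=webserver01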
5. Add Functions To Provide More Depth
Use functions to present and chart your data. The search bar provides hints on functions as you type, to help you complete them. Combine your search terms with functions to transform and present your data. The three main functions we can start with to help present our data are:
The stats function aggregates statistics over the search results we have already found. For example, if we wanted to see the number of errors by host, we would pipe our results into | stats count by host. The complete search would look similar to the one below:
index=* ERROR* | stats count by host
The chart function is considered a transforming command, as it returns its results in a table format. Much like stats, these values can then be used in a chart or visualization.
index=* ERROR* | chart values(process) by date_minute
The timechart function allows you to create a time series chart from the defined table of statistics. Even though you can create similar charts with the chart function, timechart is specifically designed to handle dates and times and the span of time each data point covers.
index=* ERROR* | timechart span=1h count by source
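Once you are comfortable with these three, another simple transforming command worth trying is top, which returns the most common values of a field along with their counts and percentages. For example, to see the five hosts generating the most errors:

index=* ERROR* | top limit=5 host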
About The Author
Vince has worked with Splunk for over 5 years, developing apps and reporting applications around Splunk, and now works hard to advocate for its success. He has worked as a systems engineer in big data companies and development departments, where he has regularly supported, built, and developed with Splunk. He has now published his first book via Packt Publishing — Learning Splunk Web Framework.