
Using the eval command

Search Command of the Week: eval

Written by: Ellis DeVaney | Originally Published: May 8, 2024 | Last Updated: May 8, 2024

Splunk’s Search Processing Language (SPL) empowers users to search, analyze, and visualize machine data effortlessly. The eval command allows you to apply a variety of operations for data manipulation, and mastering it enables you to build more meaningful and insightful searches. In this article, we discuss the benefits of using the eval command in your Splunk searches and provide real-world examples of how it can be used.

Understanding the eval Command

The eval command evaluates expressions and assigns the output to a field. It performs arithmetic operations, string manipulations, conditional logic, and more. With the eval command, you can create new fields or modify existing ones based on complex criteria. This enables you to customize your search results and extract valuable insights from your data.
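As a quick, hypothetical sketch (the index, sourcetype, and field names here are illustrative, not drawn from the examples below), one eval can create a brand-new field while another overwrites an existing field in place:

index=web sourcetype=access_combined
| eval response_time_sec=response_time_ms/1000
| eval uri=lower(uri)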

Benefits of Using eval

  • Performing calculations: With eval, you can perform mathematical operations on numeric fields, such as calculating averages, sums, or percentages, directly within your SPL queries.
  • String manipulation: String manipulation is another straightforward use of the eval command that requires no advanced functions. Any Splunker can create a field containing custom text, tailored using the content of other fields.
  • Handling multivalue data: Some eval functions are specifically designed to read, create, or modify fields that contain multiple values per event. These operations are crucial when meeting precise data presentation requirements.
  • Interpreting time data: Aggregating logs from different technologies presents challenges in both parsing and creating time data. Time functions used with the eval command provide the flexibility to interpret and present time data in Splunk events.
  • Applying conditional logic: The eval command supports conditional expressions, giving precise control over modifications to data based on the content of the existing dataset.
  • Field value assignment: Every eval operation assigns values to a new field or overwrites the values of an existing field. Several of these capabilities can be combined in a single eval command, as shown in the sketch after this list.
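As a combined sketch of several of these benefits (all names hypothetical), a single eval command can carry multiple comma-separated assignments:

index=web sourcetype=access_combined
| eval response_kb=bytes/1024, is_error=if(status>=400, "yes", "no")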

Proper Command Syntax

The basic syntax for the eval command is as follows:

index=<index>
| eval <new_field> = <expression>

This command accepts new or existing field names and uses combinations of strings, calculations, and eval functions to create expressions that modify existing data values.
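For instance, assuming a hypothetical index and a numeric bytes field, the expression can mix a calculation, eval functions, and a literal string:

index=main sourcetype=syslog
| eval size_label=tostring(round(bytes/1024, 1))." KB"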

Sample Use Cases

Example 1: Mathematical Calculations

Use Case: Gain further insight into website transaction data using the Splunk Tutorial Data.

index="tutorial" sourcetype=access_combined_wcookie
| stats sum(bytes) as bytes by action
| eventstats sum(bytes) as total_bytes
| eval percentage=(bytes/total_bytes)*100

This search generates summary statistics on the sum of bytes for each action. Using the eventstats command, we create an additional field (total_bytes) with the total sum of bytes in the data. A simple calculation with eval can then create a new field showing the percentage of all traffic volume for each action. 
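If the raw percentages carry long decimal tails, an optional follow-up eval with the round function (also used in the next example) tidies the output:

| eval percentage=round(percentage, 2)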

For more information on the eventstats command used in this example, see the previous Search Command of the Week article Using the eventstats Command.

Example 2: String Manipulation

Use Case: Create a detailed, dynamic event message using field values from the original dataset.

index=_internal component=LicenseUsage
| stats sum(b) as bytes by idx, h, st
| eval gb=round(bytes/1024/1024, 2)
| eval message="Splunk has ingested "+tostring(gb)+" GB of data with a sourcetype of "+st+
    " into the index "+idx+". This data can be found by searching: index="+idx+" sourcetype="+st+" host="+h

Any Splunk instance can use this search with internal Splunk log data to show a breakdown of ingest-based license usage. The initial stats command produces a summarized table, where an eval command performs a calculation. This calculation also uses the round function for data readability. Then, another eval command combines a user-defined string with inserted data for each unique combination of index, sourcetype, and host to create the desired output of a custom event message.
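One caution: in eval, the + operator can return null when it mixes a number with a string, which is why the numeric gb field is wrapped in tostring() above. An alternative sketch uses SPL's period (.) concatenation operator, which concatenates numbers and strings directly:

| eval message="Splunk has ingested ".gb." GB of data with a sourcetype of ".st." into the index ".idx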

Example 3: Handle Multivalue Data

Use Case 1: Use a multivalue list of hosts to create a field for supplemental reporting context.

index=_internal log_level=ERROR
| stats values(host) as host_list by component
| eval host_count=mvcount(host_list)

The query above searches Splunk’s internal logs for ERROR messages and produces a table listing all hosts with errors for each Splunk logging component. The multivalue eval function mvcount creates an additional field (host_count) indicating the number of hosts listed for each logging component. This gives administrators additional context to prioritize potentially widespread issues in the environment.
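If a report also needs those hosts on a single line, the mvjoin function is an optional addition that renders the multivalue list as one comma-separated string:

| eval host_csv=mvjoin(host_list, ", ")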

Use Case 2: Use a multivalue field to normalize hostnames.

index=tutorial
| eval host_group=(random() % 10) + 1
| eval domain=case(host_group<4, ".org.com", host_group>=4 AND host_group<8, ".local", host_group>=8, "")
| eval hostname=case(host_group<6, upper(host), host_group>=6, lower(host))
| eval host=hostname+domain
| eval split_host=split(host, ".")
| eval shortname=mvindex(split_host, 0)
| eval host=upper(shortname)
| stats count by host

This query introduces some additional concepts for simulating various hostname formats. Later in this article, we’ll revisit the type of conditional logic demonstrated with case to further illustrate this common use of the eval command. The scenario provides varying domains (and lack thereof), as well as varying case of hostname, for the hosts in Splunk Tutorial Data. This logic produces 20 unique hostname patterns for what could be uniquely identified as 5 hosts.

Using multivalue eval functions in this scenario starts by splitting each hostname on the “.” character with split. The query then selects the first segment of the hostname with mvindex, reading index 0 of the “split_host” field and writing it to the field “shortname”. The upper function normalizes case, making the data consistent and sortable. Finally, this value overwrites the “host” field, showing uppercase versions of the 5 original Tutorial Data hosts.
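As a small illustrative extension, mvindex also accepts negative indexes that count from the end of a multivalue field, which would grab the final segment (the domain, when one exists) instead:

| eval last_segment=mvindex(split_host, -1)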

Example 4: Interpret Time Data

Use Case 1: Modify an epoch timestamp to use a chosen time format.

| tstats latest(_time) as latest_event where index=_internal earliest=-7d latest=now() by host
| eval latest_event=strftime(latest_event, "%Y-%m-%dT%H:%M:%S.%Q")

The search identifies the latest event in the _internal index for each Splunk server and forwarder. It’s helpful for spotting missing forwarders, but tstats returns an epoch timestamp in the “latest_event” field. Using the strftime function with eval parses and formats the value into a user-friendly, ISO 8601-style timestamp.
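Because strftime overwrites latest_event with a string, any arithmetic on the epoch value must happen first. As a sketch, inserting an eval like this before the strftime line would add a hypothetical staleness field in seconds:

| eval seconds_since=now()-latest_event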

Use Case 2: Parse an additional timestamp field for use in SPL logic.

index=tutorial sourcetype=vendor_sales
| rex field=_raw "\[(?<timestamp>\d{1,2}/\w+/\d{4}:\d{2}:\d{2}:\d{2})\]"
| eval timestamp=strptime(timestamp, "%d/%b/%Y:%H:%M:%S")
| eval ingestion_delay=_indextime-timestamp
| stats avg(ingestion_delay) as avg_delay

This example search sets aside Splunk’s automatic timestamp extraction for the Tutorial Data to focus on troubleshooting ingestion latency. Starting from the “vendor_sales” sourcetype, a timestamp is extracted as a string into a new field, “timestamp”. This is done with the rex command, which is explored further in the Search Command of the Week article Using the rex Command.

The strptime function is used with the eval command to read this string as a valid timestamp. Splunk can then perform a simple calculation showing the ingestion latency between the origination of the event and the time it was indexed. Because the Tutorial Data is a static dataset, a significant difference will appear, illustrating the kind of issue this search can surface with live Splunk Universal Forwarder data streams.
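As an optional final step, the tostring function with its "duration" option presents the average delay as HH:MM:SS rather than raw seconds:

| eval readable_delay=tostring(round(avg_delay, 0), "duration")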

Example 5: Applying Conditional Logic

Use Case: Create a field identifying priority events.

index=tutorial sourcetype=access_combined_wcookie
| eval priority_event=if(action=="purchase" AND status>=400, 1, 0)
| where priority_event=1

Like the case function seen in an earlier example, the if function evaluates a condition to apply values to new or existing fields. This scenario uses an action of “purchase” and a status code in the error range (400 and above) to identify events of interest.

If the only objective in this scenario were to filter data, this SPL would deviate from the best-practice approach of filtering as early in the search as possible (see the sketch below). However, using this conditional logic with the eval command provides a discrete field that can assist with producing more detailed visualizations and statistics.
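For comparison, the filter-early version would push both conditions into the initial search and skip eval entirely:

index=tutorial sourcetype=access_combined_wcookie action=purchase status>=400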

The search below shows a modification to produce a chart emphasizing that less than 2% of events meet the chosen criteria to raise concern.

index=tutorial sourcetype=access_combined_wcookie
| eval event_category=if(action=="purchase" AND status>=400, "concern", "deprioritized")
| stats count by event_category

Conclusion

Incorporating the eval command into your Splunk searches greatly expands your ability to extract meaningful information and make data-driven decisions. Additionally, as you continue to explore its capabilities, you’ll find endless possibilities for transforming your data and gaining valuable insights.

In summary, the eval command in Splunk SPL is a powerful tool for manipulating and deriving fields, enabling you to unlock deeper insights from your data. Remember:

  • The eval command allows you to perform calculations, manipulations, and conditional logic on fields.
  • It enables you to derive new fields based on existing ones, enhancing your data analysis capabilities.
  • By mastering the eval command, you can create more context for producing insightful reports and visualizations.