
Using the accum Command

Written by: Robert Caldwell | Last Updated: May 8, 2025

Search Command Of The Week: accum

Originally Published: May 6, 2025

Splunk’s Search Processing Language (SPL) provides a robust framework for analyzing and visualizing machine data. Among its versatile command set, the accum command stands out as a powerful tool for running calculations across events in your dataset. Many Splunk users need to perform cumulative calculations or track running totals across time-series data. The accum command addresses this need by allowing you to accumulate values as events are processed, enabling trend analysis, pattern detection, and progressive calculations that would otherwise require complex workarounds. 

Understanding the accum Command

The accum command, short for “accumulate,” performs ongoing calculations as it processes events in your dataset. Unlike simple aggregation commands that produce a single result, accum maintains a running calculation that updates with each event. This enables you to track progressive changes over time or sequence. This command can also be used with other commands that work with calculations like stats and chart. Understanding how accum can be used with time and calculations will unlock its true potential. 

Benefits of Using The accum Command

The accum command offers several advantages that can enhance your Splunk experience: 

  • Enhanced Time-Series Analysis: By calculating running totals, averages, or other metrics over time, accum enables richer progressive analysis than single-point aggregations. This proves invaluable when tracking system behavior patterns or business metrics that evolve over time. 
  • Simplified Query Complexity: In lieu of writing complex queries with multiple subsearches and joins, accum provides a simple, lightweight way to calculate progressive metrics that reveal trends and patterns in your data. 
  • Baseline Anomaly Detection: By using cumulative values against expected thresholds or baselines, accum can help identify anomalies that might not be apparent when looking at individual events. This is particularly valuable in security monitoring where gradual changes might reveal indicators of compromise. 

Proper Basic Syntax

The basic syntax for the accum command is: 

```
... | accum <field> [AS <newfield>]
```

Where: 

  • <field>: The field whose values you want to accumulate. Without an AS clause, accum overwrites this field's values with the running totals. The field should contain numeric values.
  • AS <newfield>: Writes the running totals to a new field so the original values are preserved. 
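To see the running total in action, here is a minimal, self-contained sketch using makeresults and streamstats (both standard SPL commands) to generate a numeric field, then accumulating it:

```
| makeresults count=5
| streamstats count as n
| accum n as running_total
| table n, running_total
```

Here streamstats numbers the five generated events 1 through 5, and accum turns that sequence into the running totals 1, 3, 6, 10, and 15.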

Example Use Cases

Example #1: Monitoring Cumulative Network Traffic
In this scenario, you need to analyze network traffic volume accumulating throughout the day to identify potential bandwidth issues or unusual patterns. 

```
`cim_datamodel_network_traffic`
| bucket _time span=1h
| stats sum(bytes) as hourly_bytes by _time
| eval GB=round(hourly_bytes/1024/1024/1024,2)
| sort _time
| accum GB as accum_GB
| table _time, GB, accum_GB
```

This search leverages the Common Information Model macro to retrieve network traffic data, groups it by hour, and calculates the bytes transferred in each hour. We then convert the bytes to gigabytes for easier interpretation. Next, we sort by time so the accumulation runs chronologically before applying accum to our new GB field. Finally, we table out our time (one row per hour), the gigabytes transferred in each hour with GB, and the accumulated amount over time with accum_GB. 

Example #2: Tracking Authentication Failures

Security analysts often need to monitor authentication failures to detect potential brute force attacks. Using accum to build a running failure count provides an elegant solution. 

```
`cim_datamodel_authentication`
| where action="failure"
| stats count by action, _time, user, src_ip
| sort _time
| accum count as accum_failures
| where accum_failures > 5
| table _time, src_ip, user, accum_failures
```
This example uses the CIM authentication datamodel to track login failures. We restrict the search to failed authentication attempts, count those failures, and sort them chronologically to see how they add up. After applying accum, we filter to any point where more than 5 failures have accumulated. This gives us a way to track repeated unauthorized login attempts against accounts. 

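One caveat: accum has no by clause, so a running total built this way mixes failures from every user and source together. If you need a per-user running count, the streamstats command offers an equivalent split-by calculation; a sketch of that variation (using the same CIM macro as above):

```
`cim_datamodel_authentication`
| where action="failure"
| sort _time
| streamstats count as failure_count by user, src_ip
| where failure_count > 5
| table _time, src_ip, user, failure_count
```

streamstats count maintains an independent running count for each user and src_ip pair, so a single noisy account cannot inflate or mask another account's total.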
Example #3: Financial Transaction Analysis

When analyzing financial data, calculating running balances provides crucial insights into cash flow patterns. 

```
index=financial sourcetype=transactions account_id=017237543
| sort _time
| accum transfer as balance
| eval status=if(balance<0, "OVERDRAWN", "OK")
| table _time, transfer, balance, status
```
In this example, we track the transactions of a specific bank account as money is added and subtracted. The accum command turns the individual transfers into a running total, which we name balance. We also add a status field that flags the account as overdrawn whenever the balance drops below 0. 
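Because accum simply adds each value to the running total, negative transfers reduce the balance just as deposits increase it. A small self-contained sketch (using makeresults, split, and mvexpand to fabricate sample transfer values) illustrates this:

```
| makeresults count=1
| eval transfer=split("100,-30,50,-200,25", ",")
| mvexpand transfer
| eval transfer=tonumber(transfer)
| accum transfer as balance
| eval status=if(balance<0, "OVERDRAWN", "OK")
| table transfer, balance, status
```

The balances come out as 100, 70, 120, -80, and -55, so the last two events are flagged OVERDRAWN.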

Conclusion

The accum command in Splunk SPL delivers powerful capabilities for progressive calculations that enhance your data analysis workflow. By enabling cumulative calculations across your dataset, it provides: 

  • Deeper Temporal Insights – Revealing patterns and trends that might be missed by point-in-time aggregations, allowing for more comprehensive analysis of how metrics evolve over time. 
  • Streamlined Complex Calculations – Eliminating the need for multiple searches or external processing to track running totals, counts, or other accumulating metrics across your data. 
  • Enhanced Analytical Flexibility – Supporting diverse use cases from financial analysis to security monitoring through its simple, composable syntax. 

Understanding and leveraging the accum command effectively can significantly enhance your analytical capabilities in Splunk, especially when working with time-series data or scenarios requiring progressive calculations. Therefore, mastering this command is highly recommended for Splunk users looking to elevate their data analysis skills. 

To access more Splunk searches, check out Atlas Search Library, which is part of the Atlas Platform. Specifically, Atlas Search Library offers a curated list of optimized searches. These searches empower Splunk users without requiring SPL knowledge. Furthermore, you can create, customize, and maintain your own search library. By doing so, you ensure your users get the most from using Splunk.
