SPL // Splunk

Using the rex Command

Search Command Of The Week: rex

Written by: Robert Caldwell | Last Updated: December 12, 2025 | Originally Published: December 12, 2025

Search Processing Language (SPL) serves as the backbone for searching and analyzing machine data within Splunk. This powerful query language enables security analysts, IT professionals, and data engineers to extract meaningful insights from vast amounts of log data. Among the many commands available in SPL, the rex command stands out as an essential tool for field extraction and data manipulation.

The rex command uses regular expressions to extract new fields from existing event data. In practice, this is invaluable when working with unstructured or semi-structured data that doesn't conform to standard field extractions. For example, when searching web server logs, critical information such as session IDs or error codes is often embedded within raw text strings. In cases like these, the rex command becomes your go-to solution for pulling out that hidden data.

Understanding the rex Command

At its core, the rex command performs pattern matching on event fields. It searches for specific patterns using regular expressions and creates new fields based on what it finds. The command operates on the _raw field by default, though you can specify any field in your events. 

The power of rex lies in its flexibility. Unlike static field extractions defined at index time, rex allows you to dynamically extract fields during search time. This means you can adapt your data extraction to meet changing requirements without re-indexing your data. The command supports both extraction and replacement operations, giving you multiple ways to manipulate your data. 

Regular expressions might seem daunting at first but mastering even basic patterns will significantly enhance your Splunk capabilities. The rex command bridges the gap between raw, unstructured data and the structured fields you need for effective analysis. 

Benefits of the rex Command

#1 Dynamic Field Extraction Without Re-indexing

One of the most significant advantages is the ability to extract fields on-the-fly. Instead of waiting for data to be re-indexed with new field extractions, you can immediately parse and analyze information as your needs evolve. This agility is particularly valuable in scenarios where time is critical like incident response. 

#2 Handling Complex & Unstructured Data

Many data sources don’t follow predictable formats. The rex command excels at extracting information from messy, unstructured text that automatic field extraction might miss. Whether dealing with custom application logs or unusual error messages, rex gives you the precision control needed to capture exactly what you want. 

#3 Enhanced Search Performance & Precision

By creating specific fields through rex, you can write more targeted searches. Rather than using wildcard searches across raw text, you can search against extracted fields, which often improves performance. Extracting fields is a key part of using Splunk and enables better use of its statistical commands and visualization tools.

Basic Syntax

The fundamental syntax for the rex command follows a straightforward pattern (square brackets mark optional arguments):

    rex [field=<field>] "<regex-pattern>" [max_match=<int>] [offset_field=<string>]

Key components include: 

  • <field>: Specifies which field to search (defaults to _raw)
  • <regex-pattern>: Use a named capture group, (?<name><regex>), to create a new field
  • max_match: Controls how many matches to capture (defaults to 1; set to 0 for unlimited matches, which produces a multivalue field)
  • offset_field: If provided, creates a field with this name whose value records the character offsets of each matched capture group

Here’s a simple example structure:

    | rex field=_raw "user=(?<username>\w+)"

This creates a new field, username, populated with the extracted values. We could then use that field and its captured strings in subsequent searches.
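Since rex patterns are just regular expressions, you can sanity-check one outside Splunk before running a search. Below is a small Python sketch using a made-up log line; note that Python spells named capture groups (?P<name>...) where SPL's PCRE syntax uses (?<name>...):

```python
import re

# A made-up raw event, standing in for Splunk's _raw field
raw = "2025-12-12 10:04:31 INFO login ok user=jsmith src=10.0.0.5"

# SPL equivalent: | rex field=_raw "user=(?<username>\w+)"
# Python requires the (?P<name>...) spelling for named groups.
match = re.search(r"user=(?P<username>\w+)", raw)
if match:
    print(match.group("username"))  # -> jsmith
```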

There is also a “sed” mode that lets you modify or replace current field values:

    rex mode=sed [field=<field>] "<regex-pattern>"

Key components include:

  • mode: Set to sed to replace or substitute text instead of extracting fields
  • <field>: Specifies which field to search (defaults to _raw)
  • <regex-pattern>: Use "s/<regex>/<replacement>/<flags>" to replace text in your data, or "y/<string1>/<string2>/" to substitute characters
    • s/: Starts a substitution expression
    • <regex>: Captures the text to replace
    • <replacement>: The string to replace the pattern with
    • <flags>: Use "g" to replace all matches, or a number to replace only that occurrence
    • y/: Starts a character-for-character substitution
    • <string1>: The characters that will be replaced by those in <string2>
    • <string2>: The characters that will replace those in <string1>

Here’s an example for when you need to build a report but must mask a series of IP addresses:

    | rex mode=sed field=_raw "s/(\d{1,3}\.){3}\d{1,3}/XXX.XXX.XXX.XXX/g"
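To preview what a sed-style substitution will do before running it against your data, you can prototype it with Python's re.sub, which replaces every match by default (the equivalent of sed's "g" flag). The log line below is fabricated for illustration:

```python
import re

# Made-up log line containing IP addresses to mask
event = "Blocked connection from 192.168.1.50 to 10.20.30.40 on port 443"

# SPL equivalent: | rex mode=sed field=_raw "s/(\d{1,3}\.){3}\d{1,3}/XXX.XXX.XXX.XXX/g"
# re.sub replaces every match by default (sed's "g" flag);
# pass count=1 to mimic a numeric sed flag that stops after one match.
masked = re.sub(r"(\d{1,3}\.){3}\d{1,3}", "XXX.XXX.XXX.XXX", event)
print(masked)
# -> Blocked connection from XXX.XXX.XXX.XXX to XXX.XXX.XXX.XXX on port 443
```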

Here’s another example if I wanted to convert all the characters in a field to uppercase:

    | rex mode=sed field=_raw "y/abcdefghijklmnopqrstuvwxyz/ABCDEFGHIJKLMNOPQRSTUVWXYZ/"

These examples show how we can not only extract new fields but also modify our data to suit our needs.
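The y/.../ form behaves like a character-for-character mapping, which you can mimic outside Splunk with Python's str.maketrans and str.translate (the input string below is just a sample):

```python
# Each character in the first string maps to the character at the
# same position in the second string, exactly like sed's y/abc/ABC/.
table = str.maketrans("abcdefghijklmnopqrstuvwxyz",
                      "ABCDEFGHIJKLMNOPQRSTUVWXYZ")
uppercased = "error in module auth".translate(table)
print(uppercased)  # -> ERROR IN MODULE AUTH
```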

Usage Examples & Practical Applications

Example #1: Extracting IP Addresses from Log Files

Use Case: We need to extract IP addresses from log files to identify potential security threats or analyze network traffic patterns.  

In this example, we will use the rex command to extract IP addresses from the _raw event field:

    sourcetype=access_combined
    | rex field=_raw "(?<client_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
    | table client_ip

This search starts by filtering down to access_combined sourcetype events. The rex command then extracts IP addresses from the raw data using the provided regular expression pattern. Finally, the search stores the extracted IP addresses in the client_ip field and displays them in the final table output.
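One caveat worth checking when prototyping this pattern (sketched here in Python on a fabricated access-log line): \d{1,3} does not validate octet ranges, so the pattern will also match digit groups that are not real IP addresses.

```python
import re

# Same pattern as the rex example above, in Python's named-group spelling
ip_pattern = re.compile(r"(?P<client_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})")

# A fabricated access_combined-style event
event = '203.0.113.42 - - [12/Dec/2025:10:04:31] "GET /index.html" 200 512'
print(ip_pattern.search(event).group("client_ip"))  # -> 203.0.113.42

# The quantifier \d{1,3} accepts any 1-3 digits, so invalid octets match too
print(ip_pattern.search("host 999.999.999.999").group("client_ip"))
# -> 999.999.999.999
```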

Example #2: Substituting Personal Information to Protect Sensitive Data

Use Case: We need to mask account numbers that are going to be displayed on a dashboard where we do not want them to be shown. These are sensitive account numbers spanning 16 characters in length. 

In this example, we’ll use the rex command with the sed mode to replace sensitive data in a data set:

    sourcetype=vendor_sales
    | rex field=AcctID mode=sed "s/(\d{12})/XXXXXXXXXXXX/g"
    | stats sum(sales_amount) by AcctID

This search filters to the vendor_sales sourcetype before applying rex with the mode set to “sed”, which changes the input format compared to the example above. We also target the “AcctID” field, which already holds the values we want to mask. This reduces resource consumption and speeds up the search, since it scans a smaller string than it would with _raw. The substitution expression starts with “s/”, followed by the regex “(\d{12})”, which matches the first 12 digits of each AcctID value. The replacement, a series of 12 X’s, hides the potentially sensitive information while leaving the last 4 digits visible. Finally, a stats command sums sales for each AcctID, so the resulting table shows sales amounts by masked account ID.
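As a quick check of the masking logic (sketched here in Python with a made-up 16-digit account number), replacing the first 12 digits leaves the last 4 visible for reporting:

```python
import re

# A made-up 16-digit account number, standing in for the AcctID field
acct_id = "4532015112830366"

# SPL equivalent: | rex field=AcctID mode=sed "s/(\d{12})/XXXXXXXXXXXX/g"
# After the first 12 digits are consumed, only 4 digits remain,
# so no further match is possible and the last 4 stay visible.
masked = re.sub(r"\d{12}", "XXXXXXXXXXXX", acct_id)
print(masked)  # -> XXXXXXXXXXXX0366
```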

Conclusion

The rex command represents a fundamental skill for any Splunk practitioner. By leveraging regular expressions, you can unlock valuable information trapped within unstructured data. The command’s flexibility allows you to adapt quickly to new data sources and changing analytical requirements. 

Throughout this guide, we’ve explored how rex enhances your SPL toolkit. From basic syntax to real-world applications, the command proves indispensable for field extraction. Whether you’re investigating security incidents, monitoring application performance, or analyzing user behavior, rex empowers you to extract exactly the data you need. 

Key Takeaways: 

  • Dynamic extraction capability: The rex command enables on-the-fly field creation without requiring data re-indexing, providing flexibility and speed in data analysis 
  • Pattern matching power: Regular expressions allow precise extraction from complex, unstructured data sources that automatic field extraction cannot handle effectively 
  • Enhanced analytical workflow: Extracted fields improve search precision, enable better statistical analysis, and standardize reporting

To access more Splunk searches, check out the Atlas Search Library, which is part of the Atlas Platform. Atlas Search Library offers a curated list of optimized searches that empower Splunk users without requiring SPL knowledge. You can also create, customize, and maintain your own search library, ensuring your users get the most out of Splunk.
