Know Your Knowledge Objects in Splunk

Are you having trouble getting your data from different sources to be useful? Often, our customers have many data sources coming into Splunk but don’t know how to get the full benefit of that data. Why is this such a consistent issue for Splunk customers? In our experience, it mostly comes down to unfamiliarity with Knowledge Objects in Splunk.

The Sales Scenario

In this scenario, let’s take a look at how you can connect your alerting to sales performance. As a sales leader, you want your data to alert on and track the following criteria…

  • Sales performance on products
  • Heavy increase/decrease in sales of a product
  • Track/trend data history, including item number references, location, and customer name
  • A customer has chosen a competitor’s product

First things first: where can we find this data? In this scenario, check out your billing system. We’ve seen that you can get most of this information there; as a general rule, developers already rely on that data to troubleshoot issues. Now, you could submit a request to IT… but we know that process is lengthy.

It’s likely you have the data, but it’s spread across multiple log files, databases, and flat files. Pulling together that very large volume of unstructured data and meshing it into something coherent can feel like an almost impossible task. Thankfully, Splunk has the ability to create knowledge objects, which can save the day. Allow Splunk to complement the processes your data warehouse team has put in place; Splunk knowledge objects can fill the real-time data gaps that exist in many organizations.

Look Back at Old Data

Let’s set out some steps to operationalize your data. Once you’ve grabbed your data, here’s some help on how to build reports, generate alerts, set thresholds, and more on the things you need to be watching.

Check out these sources:

Log_file_1: Captures summary order information: Account_Number, Order_Number, and Total_Order_Amount.

Log_file_2: Captures Order Detail Activity: Order_Number, Product_ID, Status (cancel, activate, suspend), Location, Item_Amount.

Table_1: Has customer information: Account_Number, Customer_Name, Customer_Contact, Customer_Phone, Customer_Location, etc.

You’ll want to use Splunk DB Connect to pull a list of active customers and get it ready to enrich your logs. You’ll run a quick DB Connect SQL job on a schedule to pull back the fields you want. The best part about Splunk knowledge objects is that they can be scheduled to run as needed. If I need to pull the list once a day, I can; if I need to pull it every 15 minutes… done.
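As a rough sketch, that scheduled pull might look like the following, using DB Connect’s dbxquery command. The connection name billing_db and the lookup name customer_info are placeholders for your own environment, and any active-customer filter would depend on your billing schema:

| dbxquery connection="billing_db" query="SELECT Account_Number, Customer_Name, Customer_Contact, Customer_Phone, Customer_Location FROM Table_1" | outputlookup customer_info.csv

Save that as a scheduled report, and the customer_info lookup refreshes itself on whatever cadence you pick, once a day or every 15 minutes.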

To make the data useful, check out these steps…

  1. Sync up the Account_Number in your DB Connect pull with the Account_Number in Log_file_1. You’ve now created a lookup from the DB Connect pull that matches on Account_Number in Log_file_1.
  2. Next, you’ll want to create new fields in your Log_file_1_index called Customer_Name, Customer_Contact, and Customer_Phone.
  3. Finally, you’ll sync all your account information with the data in Log_file_1_index so that it’s available for searching (see the sketch below).
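Here’s a minimal sketch of that enrichment, assuming the DB Connect pull was saved as a lookup named customer_info keyed on Account_Number:

index=Log_file_1_index | lookup customer_info Account_Number OUTPUT Customer_Name, Customer_Contact, Customer_Phone

Define it as an automatic lookup on the index, and those customer fields appear on every event without users ever typing the lookup command themselves.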

Great! Now you can create a search to pull back the time periods you need to report on. It might look something like this:

index=Log_file_1_index | fields Customer_Name, Account_Number, Total_Order_Amount, Customer_Location | table Customer_Name, Account_Number, Customer_Location, Total_Order_Amount

Look Ahead at New Data

Now you’re tracking historical data, but you still need to track current products and statuses. Let’s make it easy on the users. We have all the extra fields we need in Log_file_1_index now that we’ve enhanced it, so why not use the same lookup approach we used above? Essentially, you’re creating a stacked approach to data enrichment: create a lookup that extracts all the fields we need from Log_file_1_index and use it to enhance Log_file_2_index. What does this look like?

  1. Pull your fields. We create a search that pulls the fields needed from Log_file_1_index. It looks like we need Customer_Name, Account_Number, Customer_Location, Order_Number, and Total_Order_Amount. Then, we use that lookup to propagate the data into Log_file_2_index. Do we have any matching fields? Whew, Order_Number looks like our ticket! We match on Order_Number and output Customer_Name, Account_Number, Customer_Location, and Total_Order_Amount. Once that’s done, let Splunk do its magic: those fields become available in Log_file_2_index (see the sketch after this list).
  2. Create your search. It might look something like this: index=Log_file_2_index | stats sum(Item_Amount) as Amount by Account_Number, Customer_Name, Customer_Location, Order_Number, Total_Order_Amount, Product_ID, Status
  3. Visualize your data. Finally, add this to a dashboard with various views so your end users can see and dissect the information further. You’ll want to set alert thresholds on that dashboard, defined by end-user feedback on the various products (see the alert sketch below). Now you’re set to get alerts on product performance, and you can see in real time what is causing changes in sales.
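For step 1, here’s a minimal sketch of that stacked enrichment, assuming the fields pulled from Log_file_1_index were saved as a lookup named order_summary keyed on Order_Number:

index=Log_file_2_index | lookup order_summary Order_Number OUTPUT Customer_Name, Account_Number, Customer_Location, Total_Order_Amount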
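And for step 3, one hedged example of the alerting piece: a scheduled search that flags heavy cancellation activity per product. The 10000 threshold is a placeholder your sales team would tune based on their feedback:

index=Log_file_2_index Status=cancel | stats sum(Item_Amount) as Cancelled_Amount by Product_ID | where Cancelled_Amount > 10000

Save it as an alert, schedule it to run every 15 minutes, and set it to trigger whenever results are returned.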

Kinney Can Help

Kinney Group has years of experience developing unique solutions and creating knowledge objects to support every type of Splunk environment. Need some help creating knowledge objects or other custom features in Splunk? Contact us and fill out the form below.
