Splunk 101: Creating Pivots

 

Hello, Josh here, to walk you through another quick Splunk tutorial that will save you time and give your team a tool that everyone can use. In this video tutorial, I’ll discuss the importance of using the Pivot function in Splunk. Who wants to make Splunk easier… for just about anyone? Pivots are the perfect way to get a non-Splunker started on pulling visualizations from your data. Here are some takeaways from the video for when you’re using Pivot in Splunk…

Key Takeaways from Creating Pivots in Splunk

Getting the right visualizations out of a Pivot in Splunk all starts with your data models…

  • In simple terms, a Pivot is a dashboard panel. Every Pivot relies on your Data Models.
  • Pivots exist to make Splunk easy – any user, whether they have existing search language knowledge or not, can utilize the Pivot function.
  • Be careful not to delete or edit your data models while building your pivots (unless you know exactly what you’re doing).
  • The Pivot function has a drag and drop UI that makes creating dashboards simple.
  • You can manipulate the visualizations around your data – test out which chart looks best with your data!
  • There are limitations to the click and drag functionality of Splunk Pivot visualizations… all dependent on the limitations of your data set.

You may have read a few of my Splunk Search Command Series blogs; both I and our engineers here at Kinney Group produce weekly content around Splunk best practices. My team, the Tech Ops team, runs our Expertise on Demand service, which I’ll touch on a little more below. Our EOD team is responsible for knowing anything and everything around Splunk best practices… that’s why you’ll get access to a ton of video and written content from our team.

Meet our Expert Team

If you’re a Splunker, or work with Splunkers, you probably have a full plate. Finding the value in Splunk comes from the big projects and the small day-to-day optimizations of your environment. Cue Expertise on Demand, a service that can help with those Splunk issues and improvements at scale. EOD is designed to answer your team’s daily questions and break through stubborn roadblocks. We have the team here to support you. Let us know below how we can help.

Case Study: Improved Visualization and Dashboarding Success with Splunk for Judicial Entity

With a mission to provide justice through its systems and operations, visualizing data across all teams is essential to this judicial court entity. Comprising over 100 locations across the state, this customer needs to process massive quantities of information, fast. The customer’s network team set out to modernize their entire system in order to process information from each location with speed and precision. While this SLED customer works fast, the group needs to work with other teams through heatmap and ticketing processes that can slow down their system. With years of data sitting in their systems, maintaining and restoring historical data is a top priority.

In order to advance their data analytics capabilities, this customer purchased and implemented Splunk. After moving off of their legacy, open-source system, this customer knew they would be pressed to prove the value of the new Splunk platform. With little to no official Splunk training, this team needed a guide to show them the power of visualization and dashboarding in Splunk.

Challenges

  1. Following a new Splunk implementation, the customer needed to see results with Splunk fast. After moving over from their legacy system, this customer needed to ensure that they were accurately recording all historical data and moving it forward.
  2. Working with over 100 locations across the state, this customer needed the ability to process their data and display information across all sites with consistent dashboards and visualizations.
  3. With a small team of engineers who also had to work with multiple other teams within their site, the customer had to quickly ramp up their engineers on the Splunk platform.

Solutions

  1. Our Expertise on Demand team quickly jumped in to show the value in the Splunk platform. After multiple resolved issues with ticketing and dashboarding, the judicial entity was able to adopt the Splunk platform across multiple teams.
  2. Through monthly “Lunch and Learns” and best practice knowledge transfer, our EOD team was able to educate this five-person Splunk team to a higher level of success in understanding and utilizing the platform.
  3. The customer was able to gain significant ground on searching, reporting, and dashboarding capabilities.

Business Impacts

Taking on an investment in Splunk can be challenging. With added pressure to see instant results with the platform, our customer was placed under the spotlight to start utilizing Splunk and drive its adoption across multiple teams. With dashboarding and visualization support from our Expertise on Demand service offering, the customer did just that. Through advanced visualization methods in Splunk, the customer can now actively monitor and report metrics across their 100+ sites on all current and historical data.

4 Reasons an Accountant Should Use Splunk

Splunk for Accountants Use Case

I’ll admit, I am not very technical. I mean, I am in the field of Accounting, after all. I can get around Excel spreadsheets pretty well, I can read some code and get a sense of what it’s trying to do, and I’m good at breaking things when I think I’ve figured them out. That’s the gist of it for me, though, and I imagine there are many others out there like me in the Accounting world.

As we all know, data mining is time-consuming and can be a little overwhelming. There is SO much data that it’s hard to figure out what data to use and what to lose that will help our leadership team(s) make data-driven decisions.

Enter Splunk… something I knew nothing about when I first came to Kinney Group. Working with our engineers, I see the vast pool of Splunk use cases for every department in our company. There is so much more that we can do to help our leaders make data-driven decisions with accuracy and in real time.

 

You may be asking… “What could a Finance and Accounting team possibly use with Splunk?”

 

Well, in not-so-technical terms, it’s simple: if you have data in Excel or a CSV file, it can be Splunk’d. Anything. No joke.

In most accounting systems, the financial setup leaves… a lot to be desired. As accountants, we want to push our systems beyond what they can do. You can pull hundreds of reports, dump massive amounts of data into Excel, then sift through all of it, knowing most of this data won’t get you the information or graphs you need.

With Splunk, we get visuals of all that data, summarized on colorful dashboards that allow us to see trends and help us make decisions. If you’re looking to automate some of your recurring reports, you don’t have to create a complicated macro in Excel… Splunk it!
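For instance, here’s a minimal sketch of the kind of search that could replace a recurring Excel report, assuming a hypothetical transactions.csv uploaded to Splunk as a lookup with date and amount columns (your file and field names will differ):

| inputlookup transactions.csv | eval month=strftime(strptime(date, "%Y-%m-%d"), "%Y-%m") | stats sum(amount) AS total_spend BY month

Save something like that as a scheduled report, and the “recurring Excel macro” becomes a dashboard panel that refreshes itself.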

Now you’re asking, “What else can I do with Splunk?” Glad you asked. In my day-to-day as an accountant, here’s how I utilize Splunk to make my life easier.

 

4 Ways Splunk Makes My Life Easier

 

Utilization Tracking

We’re a professional services organization; in our world, tracking billable utilization, project hours, burn rates, and bonus plan targets is essential.

Tracking engineer utilization within Splunk allows us to watch our burn rates on projects… and the potential burnout of our engineers. Although we have a team of go-getter engineers, it’s important to keep an eye on the hours (and extra hours) they’re working!

 

Time Tracking

As any accountant knows, timesheet tracking is a constant battle. Splunk helps hold managers and colleagues accountable to submitting their time ON time. In fact, we just installed a Splunk alert that notifies colleagues if they have a timesheet missing before the deadline!

We even have Splunk set up to monitor our colleagues’ progress and trends in submitting time. It’s great information for managers, and also great for us to send a little extra motivation to colleagues who notoriously submit their time late.
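As a rough illustration of what that alert can look like under the hood (every name here is hypothetical, not our actual setup), a search along these lines flags anyone with no timesheet entries since the start of the week:

| inputlookup expected_submitters.csv | join type=left employee [ search index=timesheets earliest=@w0 | stats count AS submissions BY employee ] | where isnull(submissions)

Schedule it to run just before the deadline, attach an email action, and the nagging takes care of itself.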

 

Schedule Tracking

In our organization, it’s crucial to see the availability of engineers and when they’re available for a project. Balancing existing projects, training, education, internal efforts, and of course their personal time off can be a lot. As a resource-driven organization, we set a high standard for planning our colleagues’ time and capacity.

And with time segmented correctly, our forecast becomes stronger. We can now forecast accurate revenue numbers and bookings months out.

 

Project Tracking

I’m not a project manager, but since I work in our systems and projects are so closely tied to finances, project tracking is crucial. Splunk goes hand in hand with our availability matrix, giving us visibility to track whether a project is on schedule, behind, or ahead of schedule.

And then there’s the immediate, real-time visibility into the hours/billings on any given project. Cut the manual spreadsheets and let your data speak to you through Splunk.

 

Splunk Made Easy

If you can dream it (and communicate it), it can be achieved through Splunk’s amazing dashboards. The sky truly is the limit with the right data and engineers in Splunk. If you know your company is using Splunk… but you’re unsure how to get it adopted in your department, reach out! Our Expertise on Demand service is designed to do just that: spread adoption of the Splunk platform across your organization.

 

Data Models Made Easy in Splunk

As Splunkers know, searching and reporting in Splunk Enterprise are quick and powerful tools for extracting valuable insights from your data. How are you maximizing your searching and reporting abilities?

That’s where the Splunk Data Model comes in. Splunk Data Models can take your searches—and their efficiency—to the next level. Let’s walk through how we do this…

Why You Should Use Data Models

One advantage of Data Models is the ability to combine various sourcetypes into a common model by utilizing field aliases. While vendors such as Cisco, Juniper, and Palo Alto may develop products with similar roles, their devices often log in different formats. The Splunk Data Models in the Splunk Common Information Model (CIM) utilize common field names for searching events regardless of the original vendor or format. A Splunk Add-on for any proprietary log format may comply with the CIM by defining field aliases and tags. The CIM Data Models then pull in the logs from various vendors and sourcetypes by utilizing a simple Splunk query with the appropriate tags.

Figure 1 – Data Models using Splunk queries as tags

The Database Data Model from the Common Information Model includes any events tagged “database” and stored in an index included in the configurable cim_Databases_indexes macro.
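In other words, the root search behind that Data Model effectively boils down to a constraint like this (conceptually; the exact definition ships with the CIM app):

(`cim_Databases_indexes`) tag=database

Any add-on that tags its events with “database” automatically feeds this model, no matter what the original sourcetype looks like.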

When particular reports are used frequently in Splunk, report acceleration can be useful for improving report load time and reducing duplicate indexer activity. Likewise, certain sets of data may be used frequently across a variety of reports. Just as with report acceleration, Data Model acceleration provides faster search performance and reduces duplicate activity on your indexers. A variety of reports can be run against the accelerated Data Model without pulling the results from your raw logs each time.
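Once a Data Model is accelerated, those reports can read from the summaries with tstats instead of rescanning raw events. A minimal sketch against the CIM Web data model (field names assume that model):

| tstats summariesonly=true count from datamodel=Web where Web.status=404 by Web.src

The same report run repeatedly now hits the pre-built summaries rather than your raw logs.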

Splunk searches extract valuable information from your data, but Splunk Processing Language (SPL) can be hard to learn for some users. Data Models provide a benefit for these users in the form of Pivot searches. Splunk’s Pivot search allows you to search without writing an SPL query. You can table results using row and column splits along with statistics functions such as sum or average.

Figure 2 – Count of the Data Model’s events split by rows of host and columns of sourcetype

This pivot search can now be converted to many of the standard Splunk visualizations such as a column chart.
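Under the hood, the Pivot UI builds this with the pivot search command. A rough sketch of what Figure 2 generates might look something like this (using a hypothetical data model and root dataset both named Web; exact syntax varies by version):

| pivot Web Web count(Web) AS "Count of Events" SPLITROW host SPLITCOL sourcetype

You can see the generated SPL yourself by opening the pivot in Search or checking the Job Inspector.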

Creating a Data Model

The first step in creating a Data Model is to define the root event and root data set. The root data set includes all data possibly needed by any report against the Data Model. For example, the Web Data Model:

Figure 3 – Define Root Data Set in your Data Model

Additionally, you can define child data sets so that you can search smaller subsets of your data. For instance, you can search the “proxy” child dataset of the Web Data Model.

Figure 4 – Define child data in your Data Model
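Once the child dataset exists, you can scope a search to just that slice of the model with the datamodel command, for example (assuming a Web model with a Proxy child):

| datamodel Web Proxy search

This returns only the events that match the Proxy dataset’s constraints, which is handy for spot-checking the dataset before you build reports on it.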

After creating one or more datasets, you can then add fields to your Data Model. While entire raw events are stored in your Splunk indexes, Data Models only store the fields you specify. You can add fields from eval expressions, lookups, regular expressions, or automatic field extractions. Your child datasets inherit the fields of their parents while optionally having their own additional fields.

Figure 5 – Add fields to your Data Model

Data Model… Done!

And there you have it! A quick breakdown of Data Models and how they can take your Splunk efficiency to the next level. After defining your root data, child data, and fields, creating your data model gives you a completely new set of eyes on your data. From this lesson, you can take away tips on how to improve search performance and speed up report processing time.

Setting up Data Models in Splunk can do wonders for search performance. Kinney Group services can do wonders for your overall Splunk performance. We make Splunk easy for our customers – fill out the form below to learn how.

Splunk 101: Workflow Actions

 

Hey, and welcome to the video! My name is Elliot Riegner and I’m here with the Kinney Group to bring you a tutorial on Splunk Workflow Actions.

To get started, we’ll learn about the different types of workflow actions, how to configure them in Splunk’s graphical interface, and a few use cases.

In order to implement workflow actions in your Splunk environment, you’ll need to assess which action works best for what you are trying to achieve. Then you will create a new workflow action, configure it using Splunk’s Web interface, and validate the results.

Splunk provides two main workflow actions: GET and POST. Both of these will create HTTP requests in order to either receive field-specific results or push out data.

The GET workflow action lets a user pass a selected field (or fields) from an event to a web resource in order to get results from another website. An example of this workflow action is taking an HTTP error code found within an event and Googling what the code means.

The POST workflow action allows a user to send data to a remote web server. Examples include filling out online forms and creating tickets based on alerts.

The more advanced Search workflow actions launch secondary searches that use specific field values from an event, such as a search that looks for the occurrence of specific combinations of ipaddress and http_status field values in your index over a specific time range.

Using the Whois Lookup Website, I want to create a Workflow Action that will search for any chosen IP address found within Splunk events.

Let us take a look at how this works:
Manually typing in any IP will generate a report providing useful information. I now see that the address I searched is Google.
Looking at the URL, I can see that it contains the IP address previously entered. This looks perfect for a GET workflow action.

Let us take a look at some events within Splunk Web. This example will be using Splunk’s tutorial dataset. After increasing my search time range to maximize results and searching within my main index, I see quite a few events with interesting fields. Let’s dive in. Taking a closer look at an event, I can see that the field clientIP could be usable for our GET workflow action. You can view configured Workflow Actions via the Event Actions dropdown.

Next, let’s configure the workflow action.

To get started, on Splunk Web navigate to Settings > Fields > Workflow Actions.

After doing so, we can click Add New to create a new Workflow Action.

Our GET workflow action will take any IP found in an event and send the HTTP request to The WHO_IS Lookup website, so let’s name it accordingly.
A label is needed, which can be dynamically named with the event’s field value by enclosing the field name in dollar signs. We will see what this looks like shortly.

Next, I will be choosing to apply this workflow action only to events that have the ClientIP field.

To configure the Link, copy the website’s URL and place the field name, surrounded by dollar signs, where the IP address would normally appear when searching manually.
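For example, with the Whois site used above, the finished URI could look something like this (a hypothetical URL pattern; the token must match your field name exactly):

https://www.whois.com/whois/$clientIP$

At runtime, Splunk replaces $clientIP$ with the value from the selected event.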

I will leave the default “Open in New Window” setting and, lastly, select GET as the link method.

As you can see, my Workflow Action was successfully created. Now, let’s see what it looks like under the Event Actions dropdown.

After searching again for events within my main index and expanding one of the returned events, the newly created and dynamically named Workflow Action can be found. The name depends on the ClientIP field of each individual event. After clicking, the Workflow Action executes in a new window, and information regarding the IP address is displayed.

Thank you for joining me in today’s video, I hope you enjoyed yourself and learned something new about workflow actions!

Meet our Expert Team

Be on the lookout for more Splunk tutorials! My team, the Tech Ops team, runs our Expertise on Demand service, which I’ll touch on a little more below. Our EOD team is responsible for knowing anything and everything around Splunk best practices… that’s why you’ll get access to a ton of video and written content from our team. EOD is designed to answer your team’s daily questions and break through stubborn roadblocks. Let us know below how we can help.

A Lesson on Macros in Splunk (Part Two)

Let’s talk about macros (again). Macros in Splunk are built into a lot of apps found on Splunkbase and are heavily used in the Monitoring Console. In part one, we talked through all the prep work and the foundation of Splunk macros. Now in part two, let’s jump into some methods to create macros, and talk a little about context and sharing. Let’s get cooking!

 

Making Macros

Let’s go ahead and make some macros.   We’ll create a few macros through the web interface, then I’ll take you behind the scenes to see what actually happens in the conf files.

In my lab, I’m going to switch to the Search & Reporting app and add a macro for a search I personally run quite often.  As I’m experimenting and developing in the lab, I always run this search when I get unexpected behavior as a starting point for root cause analysis.

 

index=_internal AND earliest=-5m AND (log_level=WARN* OR log_level=ERROR) AND sourcetype=splunkd

 

But, it’s kind of long and I get lazy so I’m going to set it as a short macro.  So now that we’re in the Search & Reporting app, I’m going to go back to Settings > Advanced Search and click on the “+ Add new” button on the Search macros line.

Figure 1 – Add new Splunk macro

The destination app is already set correctly.  I’ll name the macro “myissues” – that should be a unique name that’s descriptive of its purpose.  In the Definition field, I’ll paste in my search from above.  We’ll keep it simple for now and not use another of the options to include arguments.  Click Save.

Now we’ll go back to the Search & Reporting app and use our macro to run a search.  Enter `myissues` (remember the backticks) and click the search button.

Figure 2 – Use macro to run a search in Splunk

There you have it!  A short macro name in the search bar and I have my results.  And with way less typing that, to be honest, usually includes a typo or two.

Add Parameters to Your Macro

It’s kind of inflexible, though. What if the issue isn’t in the splunkd logs? What if it occurred more than 5 minutes ago? Or, in a large environment, what if I want to restrict the time range further to speed up results? Sure, you can use the time picker, but where’s the macro fun in that? Let’s add some parameters to our macro to make it more useful.

Going back to the Advanced Search settings, I’m going to click the Clone button to create a copy of my macro, then edit that clone. This time, I’m going to give my search a unique name and add “(2)” to the end of the name, indicating that it will expect two arguments. Then, in the Definition field, I’m going to tokenize the search so Splunk knows where to place the arguments in the search. In the Arguments field, I’m going to list my arguments, separated by commas.

Now, I’m going to add a little validation to this macro. The timeframe submitted should be a number. Any text would cause the search to fail, so before running the search we’ll validate that the field is, in fact, a number. In the Validation Expression box, I’m going to put a simple eval statement that should return TRUE if the input is correct. If that validation fails, I can write a custom error message to show when the macro runs. Once set, click Save.

Figure 3 – Add validation to your macro in Splunk
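To make that concrete, here’s roughly what my cloned macro looks like once it’s tokenized (the argument names are my own choice; pick whatever reads well to you):

Name: myissues2(2)
Arguments: minutes, st
Definition: index=_internal AND earliest=-$minutes$m AND (log_level=WARN* OR log_level=ERROR) AND sourcetype=$st$
Validation Expression: isnum(minutes)
Validation Error Message: The first argument must be a number of minutes.

The dollar signs in the definition mark where each argument gets dropped in when the macro expands; check the macros.conf documentation for your version if the validation expression gives you trouble.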

Now we’ll go back to the Search app and test it.  I’ll search the following, to find web errors in the last 30 minutes: `myissues2(30,splunk_web_service)`

And we get results!

Figure 4 – Splunk validation of macro results

And a look at the Job Inspector shows the search expanded with the tokens replaced by my parameters.

Figure 5 – Review job inspector

And if I use something other than numbers for my $earliest$ token, I get an error with the message we just set.

Figure 6 – Watch out for this error in Splunk

Storing Macros in Splunk

Great!  Now, if you’re a fan of the command line and get tired of GUIs, let’s look behind the scenes.  If you’re not interested in how Splunk actually stores macros, then jump ahead.

OK, we know that most of Splunk’s knowledge objects and settings are stored in .conf files, and so it’s no surprise that macros are in a file called macros.conf.  Macros are user-level knowledge objects, at least when you create them in the web interface.  Since I was logged in as admin and working in the Search & Reporting app, I’ll navigate to /opt/splunk/etc/users/admin/search/local to find my personal configs for that app.

Cat’ing the macros.conf file, we see both of my macros in their own stanzas.  The settings we provided are now in alphabetical order rather than in the order we saw them on the web interface, but it should look familiar.

Figure 7 – Find your personal configs for the app
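For reference, the stanzas in that file come out roughly like this (reconstructed from the settings entered above, so treat it as a sketch rather than something to paste verbatim):

[myissues]
definition = index=_internal AND earliest=-5m AND (log_level=WARN* OR log_level=ERROR) AND sourcetype=splunkd

[myissues2(2)]
args = minutes, st
definition = index=_internal AND earliest=-$minutes$m AND (log_level=WARN* OR log_level=ERROR) AND sourcetype=$st$
errormsg = The first argument must be a number of minutes.
validation = isnum(minutes)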

Permissions and Contexts

As mentioned briefly above, macros are privately owned by default – they’re only available to the user that created them and only in that app.  In the image below, I’ve logged in as a regular user and tried to run the admin macro created above.

Figure 8 – Error running macros in Splunk

The error is actually really descriptive.  We’ll follow instructions and go share this macro as the admin user.

Now, even though this particular macro is more admin oriented and not really useful to most users, I still want to be able to access it across apps.  That way I can troubleshoot from anywhere.  So it’s back to the Search Macros page under Settings > Advanced Search.  Filter for my macros and click the Permissions button.

Figure 9 – Filter for your macros in Splunk

To make sure I can use this macro in any app context, I’m going to select “All apps” under the “Object should appear in” section.  I’ll go ahead and leave the Permissions the same here because a regular user won’t have the right to read the _internal index anyway.  But that gets into another discussion of user roles and permissions that’s best left for another post.

Just to check that I can access my macro from anywhere, I’ll switch to another app and try accessing it again.  (Don’t mind the blanked out names, this is experimental and not ready for release….yet).

Figure 10 – Access your macro from anywhere

If you still have a terminal open where you just cat’ed macros.conf, hit the up arrow and enter.  You can see that the macro we just shared globally is gone.  It’s been moved to the search app, in the macros.conf in the local directory.  It hasn’t changed at all, but it did find a new home.

Figure 11 – Find your macros with an open terminal

One other advanced tip, if you want to see all the macros available to a given user, you can use a simple rest search:

 

| rest /servicesNS/-/-/admin/macros

 

This may be more information than you need to see but could help with some admin down the line, so I thought I’d share it.

Good luck with your Splunking!

Until Next Time…

Hopefully, this helps your understanding of macros: what they are, how to create and use them, and how to share them with other apps and users.  I’d love to give you a list of custom macros that everybody should have, but every Splunk customer and user has different needs and a different environment.

Splunk Search Command Series: Table and Fields

In the Splunk search world, the table command and the fields command are really similar, but they serve different functions. The fields command allows you to bring back only specific fields that live within your data, cutting down the time it takes for Splunk to retrieve the events associated with those fields. The table command limits results to specified fields in the same way; however, it also displays those fields’ values in a tabular format. Let me show you an example of both of these commands in action. First up: the fields command.

Fields Command

In this first example, notice the search and the fields that it brings back.

Figure 1 – Start with your Splunk search

By the way, that search above took a little over 10 seconds to complete. Let’s see how much faster Splunk can retrieve the data once we specify the fields that we’re looking for.

Job inspector results before using the fields command:

Figure 2 – Job Inspector results from Splunk search

The interesting fields that were brought back from the above search:

Figure 3 – Interesting Fields list

Now that you have seen the interesting fields in the main index and the sourcetype in the above search, let’s say that we are only interested in action, ProductName, file, and JSESSIONID. By using the fields command, we can bring only these four fields back once Splunk completes the search. After that, we’ll check the job inspector to see how much faster Splunk was able to accomplish this search.

Here we have our new search introducing the fields command:

Figure 4 – New search with Splunk fields command
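Spelled out, the search in Figure 4 runs along these lines (the sourcetype here assumes the Splunk tutorial data; substitute your own):

index=main sourcetype=access_combined_wcookie | fields action, ProductName, file, JSESSIONID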

The results from the job inspector after using the fields command:

Figure 5 – Job inspector results with fields command

As you can see, after introducing the fields command to specify what fields we’re interested in, we cut the time Splunk takes to complete the search by almost seven seconds. Notice that Splunk only brought back the fields specified by the fields command.

Figure 6 – Splunk fields command results

Table Command

Switching gears to the table command. We are going to use the table command on the same four fields that we used in the fields command demonstration. The table command is a transforming command, which means it will take your search results and output them into a tabular format. As I mentioned before, it will only bring back the fields specified after the command. Let’s take a look at the table command in action.

Here you can see the table command used on the same four fields. The results are now put into a table format displaying the values of the fields specified after the table command.

Figure 7 – Table command results in Splunk
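In SPL terms, it’s the same base search with table swapped in for fields:

index=main sourcetype=access_combined_wcookie | table action, ProductName, file, JSESSIONID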

There you have it! The fields command and the table command: two very useful and powerful commands that you should definitely add to your arsenal of search commands. Enjoy!

 

Ask the Experts

Our Splunk Search Command Series is created by our Expertise on Demand (EOD) experts. Every day, our team of Splunk-certified professionals works with customers through Splunk troubleshooting support, including Splunk search command best practices. If you’re interested in learning more about our EOD service or chatting with our team of experts, fill out the form below!

The Results: Modernizing Your Splunk Environment

We’ve taken the past few weeks to dive into a powerful solution that proves every bit of value in your Splunk environment. In our white paper, 3 Powerful Benefits of Modernizing Your Splunk Environment, you get a well-tested and credible design that will launch your Splunk environment to the next level. After recapping the huge benefits a design like this can provide, let’s talk about the objectives and results that this reference design could bring to your organization. And if that’s not enough, we want to show you the power of our Reference Design via our upcoming webinar.

First, let’s walk through a quick recap of why modernizing your Splunk environment with the Kinney Group and Pure Storage reference design could lead to unprecedented results for your organization.

3 Key Benefits of Modernizing Splunk

 

1) Unmatched Performance

The beauty of this reference design lies in the unmatched performance provided by combining Pure Storage FlashBlade, Splunk SmartStore, and Kinney Group’s advanced Splunk configuration tuning in a virtualized environment.

2) Simplified Scaling

Accommodating scale is an ever-present struggle for IT teams and data center operators — providing sufficient infrastructure to facilitate more demanding requirements such as increasing compute, storage, and network needs. Complexities introduced by Splunk’s specialized data ingest requirements only make the situation more challenging (not to mention costly).

3) Lower Cost of Ownership

The distributed data center model provides high availability by replicating data, but effectively eliminates any benefits gained from Splunk data compression by increasing storage requirements. Co-locating storage and compute means when you need more storage, you have to add both compute and storage. To further increase the total cost of ownership (TCO), Splunk indexers with a distributed scale-out architecture usually have more servers with less storage to minimize the amount of data and time associated with server maintenance and failures.

 

The Power of Pure Storage and Kinney Group

Why are Pure and Kinney Group the right resources? We’ve built a reference design serving as your proof. Let us show you the power of what you can do in Splunk… through our 3 Powerful Benefits of Modernizing Your Splunk Environment White Paper. Let’s take a glimpse into what goals and objectives you’ll find met within our reference architecture.

The goal of this reference architecture is to showcase the scalability, performance, manageability, and simplicity of the virtualized FlashStack® solution for a large scale Splunk Enterprise Security deployment…

  • Design repeatable architecture that can be implemented quickly in production sites
  • Utilize VMware to reduce datacenter footprint and scale environment quickly
  • Utilize SmartStore to take advantage of shared object storage to improve operational efficiency and reduce overall disk requirements for the environment

I’m not going to give away ALL of our testing results right here; that’s why you should download your own copy. But here’s a sneak peek at our findings…

 

“For data ingest of 500GB/day, Splunk recommends a minimum of 5 indexers* (100GB per indexer), meaning you would need 20 indexers or more for 2TB of volume. This reference design allows for 2TB+ of daily ingest using only 5 indexers — a 75% decrease in hardware requirements.”

Now, Let Us Show You…

In the past few weeks, we discussed the three key benefits of modernizing your Splunk environment through our reference design model in our blog series. Now, let us show you. On Thursday, October 1, we’ll show you the value and benefits of utilizing our Reference Architecture during our webinar, Pure Splunk: 3 Powerful Benefits of Modernizing Your Environment.

This webinar will provide benefits, insights, and a technical overview of a high-performance, scalable, and resilient data center infrastructure for the Splunk Enterprise platform comprised of VMware virtualization, Pure Storage hardware, Splunk SmartStore, and Kinney Group engineering expertise. Sign up for the webinar today!

Case Study: Military Organization Increases Utilization of Splunk and Adoption of the Platform


A Military Organization of the U.S. government faced the significant challenge of adopting a new technology platform, Splunk, within their organization.

This customer leans on technology solutions that drive mission-critical operations. By leading the development, design, production, and sustainment of enterprise technology systems, this customer plays an important role in ensuring our military services are operating safely and efficiently. Adoption of and education around Splunk was critical for this customer to achieve operational success.

Because the team failed to adopt the platform, the Splunk license sat untouched and underutilized for years. Throughout those years of underutilization, Splunk could have been doing what it does best for organizations like this: providing logging and secured services while maintaining operations, audit trails, and RMF processes. This customer was disappointed in the platform and was hitting tight budget numbers, which pushed the team to find help with adopting Splunk.

Challenges

1. The customer purchased a 100 GB Splunk license that sat underutilized for years due to poor adoption of the platform. When faced with a tight budget, the customer needed to prove the value of Splunk, fast.

2. The team needed help tracking and managing audit trails and RMF processes, work that was extremely time-consuming.

3. The small team managing Splunk was unable to onboard new users within their organization and expand Splunk usage and collaboration across teams. With only one expert Splunker on staff, that team member did not have the capacity to educate others on the platform. They needed help educating other users so they could create dashboards and other visualizations to support operations.

Solutions

1. Expertise on Demand provided this team with the ongoing support and education needed to expand their Splunk utilization. Within months of using the EOD service, this team rapidly began educating team members on the platform.

2. In total, Expertise on Demand has educated and trained over 10 additional members of this division on Splunk best practices, dashboard building, and basic visualization methods in Splunk. The team is actively seeking education, looking to Kinney Group as the Splunk evangelists to continue spreading Splunk education to other divisions.

3. Due to the rapid and successful adoption of the Splunk platform, this customer purchased an additional 200 GB in Splunk licensing covering Core, ES, and ITSI.

Business Impacts

An underutilized Splunk environment is costly to both teams and their budgets. Prior to onboarding the Expertise on Demand service and Kinney Group guidance on Splunk, this customer spent hours of their week dedicated to slow processes and frustrating results. This customer also faced tight budget restrictions, needing to see immediate value in Splunk. Now, we’ve seen adoption of the Splunk platform across multiple teams and users. We’ve delivered quick proof of value behind the Splunk platform, with the customer’s license size nearly tripling in under two years of engaging our Expertise on Demand team. Adding Expertise on Demand is like hiring an entire team of Splunk-certified ninjas to support and educate your team. It’s a value-add that has an immediate and sustained impact on your business objectives.

Splunk 101: Creating Event Types and Tags

Josh again, to walk you through another quick Splunk tutorial that will help you track toward CIM compliance. In this video tutorial, I’ll discuss the importance of creating event types and tags in Splunk. Creating event types and tags may seem simple… but taking the steps to categorize your data early on is crucial when building your data models. Here are some takeaways from the video when you’re using event types and tags in Splunk…

Key Takeaways from Creating Event Types and Tags in Splunk

  • Utilize event types and tags to categorize events within your data… making it easier to search and look at your data collectively.
  • Match your tag names to your actions. For example, if your field/value pair is action = purchase, your tag name will be purchase. You can create custom values if there is a specific type of information you want to see (see the sketch after this list).
  • Within Enterprise Security, there are a lot of dashboards and searches that run off information being pushed to the data models. Not only should that information be CIM compliant, but it also needs event types and tags, because those dashboards and searches look for those specific categories of information.
  • Within a data model, there can be different types of events: logins, logoffs, timeouts, lockouts, etc. Use tags and event types to categorize these events… rather than relying only on your default fields, indexes, and sourcetypes… allowing your data models to quickly identify the data you’re looking for.
  • Tagging and naming with event types is an essential step BEFORE you start building your data models, and it will save you time in the long run.
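To make that concrete, here’s a minimal sketch of what an event type and its tag boil down to on disk, in eventtypes.conf and tags.conf (the stanza name and base search are hypothetical; the video walks through the equivalent steps in Splunk Web):

# eventtypes.conf
[purchase_events]
search = sourcetype=access_combined_wcookie action=purchase

# tags.conf
[eventtype=purchase_events]
purchase = enabled

With that in place, a data model constraint like tag=purchase picks up these events no matter which index or sourcetype they came from.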

Meet our Expert Team

You may have read a few of my Splunk Search Command Series blogs; both I and our engineers here at Kinney Group produce weekly content around Splunk best practices. My team, the Tech Ops team, runs our Expertise on Demand service, which I’ll touch on a little more below. Our EOD team is responsible for knowing anything and everything around Splunk best practices… that’s why you’ll get access to a ton of video and written content from our team. EOD is designed to answer your team’s daily questions and break through stubborn roadblocks. Let us know below how we can help.