Reducing Costs with Splunk

As of this writing, we are arguably in turbulent times. Publicly traded companies have recently entered a bear market, cryptocurrencies are down 70% (or more) from recent highs, and inflation is at a 40-year high. Leaders of companies big and small are rightfully concerned that the US and global economies are entering a recession.

In preparation for a potential economic downturn, most organizations are looking internally to determine where costs can be reduced, what platforms are enablers for weathering an economic storm, and what should be cut.

Since 2013, our team has helped hundreds of commercial and public sector organizations with their implementation of Splunk, both on-prem and in the cloud. From many customers, we hear a recurring refrain of “Splunk is expensive.”

My first reaction to this comment is always “Splunk is expensive? Relative to what?”

Before Splunk, getting real-time analytics from disparate critical systems to address security, operations, and observability was genuinely hard. In good times or bad, all organizations must stay vigilant about security and optimal application performance; that is the reality of a software-driven world. The ability to harness insights from the “digital exhaust” produced by logs and machine data is invaluable. Splunk remains the best platform of its kind for gaining real-time intelligence from machine data, and organizations that have chosen Splunk have chosen wisely.

I understand the “Splunk is expensive” observation. If organizations are not getting enough tangible returns on their Splunk investments, then Splunk is expensive, regardless of how good the Splunk technology is. For that matter, any enterprise software or SaaS offering that does not provide measurable mission, financial, or human returns on investment should justifiably be viewed as “expensive.”

Optimize Splunk, and Turn It Into a Cost Reducer

We at Kinney Group view “reducing costs with Splunk” through two lenses:

  1. How can we reduce the costs associated with deploying, operating, and sustaining investments in Splunk technologies?
  2. How can we harness the power of Splunk to be a cost-reduction engine?

In 2021, our organization released Atlas, the Creator Empowerment Platform for Splunk. Purpose-built from the ground up to help customers in their Splunk journeys, Atlas addresses both of the cost-reduction lenses referenced above.

To address lens #1, we suggest pursuing a “1-2 punch” using the Atlas platform.

First, diagnose the health of a Splunk environment via the Atlas Assessment application, available free on Splunkbase. Using Atlas Assessment, customers can get visibility into areas of cost reduction and optimization for Splunk technologies, whether on-prem or in the cloud. Remarkably, Atlas Assessment returns actionable insights in less than 30 minutes.

The second punch is using the Atlas platform to address the identified areas of improvement that have been illuminated by the Atlas Assessment. Not sure if Atlas can help? We offer a full, 30-day trial of the Atlas platform absolutely free. Our experience is that Atlas Assessment, combined with the Atlas platform, provides tangible optimization and cost-reduction results for any Splunk implementation. And you can get started without spending a single dollar.

More specifically, customers find that Atlas reduces Splunk operating costs in the following ways:

  • License optimization: Whether the license is based on data ingest or workload, Atlas specifically identifies how any Splunk Enterprise or Splunk Cloud license can be optimized for maximum ROI.
  • Operational optimization: Atlas streamlines the daily operation and sustainment of Splunk implementations. These capabilities provide direct labor savings, while at the same time freeing valued personnel to spend more time creating analytics value from Splunk.
  • UX and adoption optimization: Splunk admins and users are the “creators” that drive organizational value from Splunk. Atlas helps drive adoption by making the use of Splunk much easier. More people using Splunk means more value for your organization.

Splunk as a Powerful Cost Reduction Engine

All systems and applications produce log data. And Splunk is the best platform on the planet for turning log data into insights for security and observability. Since we began using Splunk in 2013, we have consistently found that Splunk can help organizations reduce the sprawl of siloed, single-use tools and monitors.

As organizations look to reduce costs, we encourage them to take a hard look at their entire landscape of software tools. If Splunk can deliver the outcome, why does an organization need another tool to deliver the same results?

When we optimize a Splunk environment using Atlas, we free up additional Splunk capacity within existing license investments. This newfound capacity can then be leveraged to help any organization reduce the footprint (and costs) associated with the sprawl of single-use tooling.

Reducing Costs Now for Weathering a Potential Storm

With Atlas and Atlas Assessment, we can deliver tangible cost savings immediately, and do so through the two lenses referenced above. Now is the time to prepare for the potential of an economic storm brought on by a recession. Atlas can help get you prepared.

Is Splunk expensive? Yes — it sure can be if it isn’t optimized and delivering tangible returns for the organization.

Is Splunk expensive when fully optimized with Atlas? NO! When running correctly, Splunk is the most powerful platform of its kind in the industry. Splunk customers have chosen wisely. We argue that once customers get Splunk optimized, it can be one of the most powerful cost-reduction weapons any organization can have.

Ready to take your next step?

Download the FREE Atlas Assessment application from Splunkbase for actionable (and no-cost) discoveries in your environment, or get started with a free 30-day trial of the Atlas Platform. Have questions? We’d love to answer them! Click here to schedule an introductory discovery call.

Defining Data Sprawl in Splunk: Why it Matters, and What it’s Costing You

“Data Sprawl” isn’t really a technical term you’ll find in the Splexicon (Splunk’s glossary). Here at Kinney Group, however, we’ve been around Splunk long enough to identify and define this concept as a real problem in many Splunk environments.

What exactly is Data Sprawl? It’s not one single thing you can point to, but rather a combination of symptoms that generally contribute to poorly performing and difficult-to-manage Splunk implementations. Let’s take a look at each of the three symptoms we use to define Data Sprawl, and break down the impacts to your organization:

  1. Ingesting unused or unneeded data in Splunk
  2. No understanding of why certain data is being collected by Splunk
  3. No visibility into how data is being utilized by Splunk

Ingesting unused or unneeded data in Splunk

When you ingest data you don’t need into Splunk, the obvious impact is on your license usage (if your Splunk license is ingest-based). This may not be terribly concerning if you aren’t pushing your ingest limits, but there are other impacts lurking behind the scenes.

For starters, your Splunk admins could be wasting time managing this data. They may or may not know why the data is being brought into Splunk, but it’s their responsibility to ensure this happens reliably. This is valuable time your Splunk admins could be using to achieve high-value outcomes for your organization rather than fighting fires with data you may not be using.

Additionally, you may be paying for data ingest you don’t need. If you’re still on Splunk’s ingest-based pricing model, and you’re ingesting data you don’t use, there’s a good chance you could lower Splunk license costs by reducing your ingest cap. In many cases, we find that customers have purchased larger licenses than they currently need in order to plan for future growth.
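
One quick way to see where an ingest-based license is actually going (a rough sketch; it assumes you can search Splunk’s _internal index, where license_usage.log is collected) is to break daily license usage down by index and sourcetype:

    index=_internal source=*license_usage.log type=Usage
    | stats sum(b) AS bytes BY idx, st
    | eval GB = round(bytes / 1024 / 1024 / 1024, 2)
    | sort - GB
    | rename idx AS index, st AS sourcetype

Sourcetypes that sit near the top of this list but never show up in searches or dashboards are prime candidates for filtering, routing, or dropping before they count against your license.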

We commonly run into scenarios where data was brought in for a specific purpose at one point in the past, but is no longer needed. The problem is that no one knows why it’s there, and they’re unsure of the consequences of no longer bringing this data into Splunk. Knowing why the data exists, and what depends on it, gives you control of the Splunk environment and empowers educated decisions.

No understanding of why certain data is being collected by Splunk

Another common symptom of Data Sprawl is a lack of understanding of why certain data is being collected by Splunk in your environment. The ability to store and manage custom metadata about your index and sourcetype pairs — in a sane and logical way — is not a feature that Splunk gives you natively. Without this knowledge, your Splunk administrators may struggle to prioritize how they triage data issues when they arise. Additionally, they may not understand the impact on the organization if the data stops coming into Splunk.

The key is to empower your Splunk admins and users with the information they need to appropriately make decisions about their Splunk environment. This is much more difficult when we don’t understand why the data is there, who is using it, how frequently it is being used, and how it is being used. (We’ll cover that in more detail later.)
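
Short of a dedicated capability, one rough stopgap (a sketch only, not how Atlas does it; the data_definitions lookup name, its columns, and the example rows are hypothetical) is to maintain a simple CSV lookup that records an owner and purpose for each index/sourcetype pair:

    index,sourcetype,owner,purpose
    firewall,cisco:asa,netops,Perimeter security monitoring
    compliance,wineventlog:security,audit team,PCI DSS retention requirement

Once that file is defined as a lookup in Splunk, a search like this flags any index/sourcetype pair nobody has documented:

    | tstats count WHERE index=* BY index, sourcetype
    | lookup data_definitions index, sourcetype OUTPUT owner, purpose
    | where isnull(purpose)

Anything returned by that last search is data flowing into Splunk that no one has claimed or explained, which is exactly the conversation you need to have.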

This becomes an even bigger issue with Splunk environments that have scaled fast. As time passes, it becomes easier to lose the context, purpose, and value the data is bringing to your Splunk mission.

Let’s consider a common example we encounter at Kinney Group.

Many organizations must adhere to compliance requirements related to data retention. These requirements may dictate that specific logs be collected and retained for a set period of time. This means that many organizations have audit data coming into Splunk regularly, but that data rarely gets used in searches or dashboards. It’s simply there to meet a compliance requirement.
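
In Splunk terms, that retention requirement usually ends up expressed in indexes.conf. A minimal sketch (the index name and one-year retention figure are made up for illustration) looks like this:

    [pci_audit]
    homePath   = $SPLUNK_DB/pci_audit/db
    coldPath   = $SPLUNK_DB/pci_audit/colddb
    thawedPath = $SPLUNK_DB/pci_audit/thaweddb
    # Keep events roughly one year (in seconds) before they are frozen
    frozenTimePeriodInSecs = 31536000

The index keeps doing its compliance job whether or not anyone ever searches it, which is exactly why it tends to fade from view.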

Understanding the “why” is key for Splunk admins because that data is critical, but the importance of the data to end users is likely minimal.

(If this sounds like your situation, it might be time to consider putting that compliance data to work for you. See how we’re helping customers do this with their compliance data today with Atlas.)

The Atlas Data Management application allows you to add “Data Definitions,” providing clear understanding of what data is doing in your environment.

No visibility into how data is being utilized by Splunk

You’ve spent a lot of time and energy getting your data into Splunk, but now you don’t really know much about how it’s being used. This is another common symptom of Data Sprawl. Important decisions about how you spend your time managing Splunk are often based on who screams the loudest when a report doesn’t work. But do your Splunk admins really have the information they need to put their focus in the right place? When they know how often a sourcetype appears in a dashboard or a scheduled search, they have a much clearer picture of how data is being consumed.

Actively monitoring how data is utilized within Splunk is extremely important because it helps you effectively support your existing users and brings to light what Splunk calls “dark data” in your environment. Dark data is all of the unused, unknown, and untapped data generated by an organization that could be a tremendous asset if the organization knew it existed.
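
One rough, native approximation of this visibility (a sketch; it assumes you can search the _audit index and that searches reference indexes explicitly by name) is to mine completed searches from Splunk’s own audit trail for the indexes they touched:

    index=_audit sourcetype=audittrail action=search info=completed search=*
    | rex field=search max_match=0 "index\s*=\s*\"?(?<searched_index>[\w*-]+)"
    | mvexpand searched_index
    | stats count AS search_count, dc(user) AS distinct_users BY searched_index
    | sort - search_count

Indexes that rarely or never show up here, yet consume meaningful license and storage, are where your dark data conversation should start.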

Kinney Group’s Atlas platform includes Data Utilization — an application designed to show you exactly what data you’re bringing in, how much of your license that data is using, and if it’s being utilized by your users and admins.

Conclusion

Most organizations may not realize that Data Sprawl is impacting their Splunk environment because its effects don’t usually appear until something bad has happened. While not all symptoms of Data Sprawl are necessarily urgent, they can be indicators that a Splunk environment is growing out of control. If these symptoms go unchecked over a period of time, they could lead to bigger, more costly problems down the line.

Knowledge is power when it comes to managing your Splunk environment effectively. Kinney Group has years of experience helping customers keep Data Sprawl in check. In fact, we developed the Atlas platform for just this purpose. Atlas applications are purpose-built to keep Data Sprawl (and a host of other admin headaches) at bay by empowering Splunk admins with the tools they need.

Click here to learn more about the Atlas platform, get a video preview, schedule a demo, or start a free 30-day trial of the platform.
