Splunk Enterprise 6.4 claims to reduce big data storage costs by up to 80%

As organizations collect, analyze, and retain ever more data, storage often becomes the most expensive aspect of data analytics. Big data analytics company Splunk has announced an upgrade (v6.4) to its flagship on-premises software, Splunk Enterprise, which it claims reduces the storage costs of historical data by 40 to 80 percent, whether deployed on-premises or in the cloud. The company also offers a SaaS product, Splunk Cloud, into which it is engineering long-term data archiving functionality that it expects to deliver later this year.

What’s new?

Both Splunk Cloud and Splunk Enterprise include new interactive visualizations and an open library on Splunkbase where customers and partners can develop and share custom visualizations. Other new features in both platforms include enhanced big data analytics, improved query performance, and platform security and management improvements. The company also released new cloud analytics apps for Akamai Content Delivery Network (CDN) services, Amazon Web Services (AWS), and ServiceNow.

“Splunk is passionate about making big data analytics more affordable for organizations of every size. Reducing the cost of historical data retention and analysis is a major part of delivering that value to our customers,” said Shay Mowlem, Vice President of Product Marketing and Management at Splunk.

The company claims that a customer indexing 10TB of data per day, with a one-year data retention policy, could save over $4 million in storage costs over five years with Splunk Enterprise 6.4; if the customer retains the data longer or replicates it, the savings multiply.

Splunk also claims that Splunk Enterprise can collect machine data from virtually any source and location, and that its schema-on-read technology can “freely analyze and correlate data without the limitations of conventional database structures.”
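The savings figure above can be sanity-checked with back-of-envelope arithmetic. The sketch below uses the 10TB/day ingest, one-year retention, and five-year horizon from the article; the per-terabyte storage price and the 80 percent reduction ratio are illustrative assumptions, not Splunk's own figures.

```python
# Back-of-envelope check of the claimed retention savings.
# COST_PER_TB_YEAR and REDUCTION are assumed values for illustration only.

DAILY_INDEX_TB = 10        # daily ingest, per the article's example
RETENTION_DAYS = 365       # one-year retention policy
YEARS = 5                  # horizon quoted for the savings estimate
COST_PER_TB_YEAR = 250.0   # assumed fully loaded storage cost (USD/TB/year)
REDUCTION = 0.80           # upper end of the claimed 40-80% reduction

# Steady-state volume on disk: one year of ingest is retained at any time.
retained_tb = DAILY_INDEX_TB * RETENTION_DAYS

# Baseline cost of holding that volume for the full five-year horizon.
baseline_cost = retained_tb * COST_PER_TB_YEAR * YEARS

# Savings if the storage footprint is cut by the claimed reduction.
savings = baseline_cost * REDUCTION

print(f"Steady-state retained data: {retained_tb:,} TB")
print(f"Five-year baseline storage cost: ${baseline_cost:,.0f}")
print(f"Estimated savings at {REDUCTION:.0%} reduction: ${savings:,.0f}")
```

Under these assumptions the estimate lands in the same ballpark as the quoted "over $4 million"; with replication or longer retention, the retained volume and hence the savings scale up proportionally.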