Data Growth: How Financial Organisations Can Contain Rising Costs

  • Mark Molyneux, CTO of EMEA at Cohesity

  • 19.10.2023 10:15 am
  • #finance #data

Organisations will need more storage as their data grows by an average of 40 to 50 percent each year. At the same time, storage prices are rising, whether on-premises or in the cloud. Mark Molyneux, CTO of EMEA at Cohesity, explains strategies to counteract this, drawing on a report by Enterprise Strategy Group (ESG) that underlines the enormous potential of data reduction.

Several factors are making data more expensive to store. The war in Ukraine drove up energy prices and, among other factors, pushed the inflation rate in industrialised countries above nine percent in 2022, the highest it has been since the 1980s. Technology market analysts such as Canalys expect public cloud providers, especially in Europe, to raise their prices by at least 30 percent to account for rising energy costs.

At the same time, the amount of data continues to grow rapidly, as the ESG report shows. For every TB of production data, companies need an additional four TB of storage for secondary data, which they keep for privacy and other non-production reasons. Other factors will ensure that this growth does not slow down. IBM Storage Evangelist Shawn Brume, for example, predicts that autonomous driving by more than 48 million vehicles on US roads will generate 23 exabytes of data for deep storage. Emerging services like these will by default generate massive amounts of data that organisations will need to retain for 20 to 30 years before it becomes valuable.

As a result, organisations face storing ever more data while prices for new storage resources climb, as the US government's December 2022 Producer Price Index shows: the price of computer storage rose 1.1 percent in December after rising 3.9 percent in November. Companies therefore appear to be spending more and more money simply to make room for data growth.

So if, as Gartner expects, IT budgets increase by an average of 2.4 percent this year, the bottom line is that CIOs urgently need to find ways to cut costs, because inflation alone will more than cancel out that budget growth. At the same time, it is important not to jeopardise forward-looking steps towards greater digitisation or agile IT despite the economic pressure, because these new services open up additional revenue streams and help organisations engage customers in a modern way.

Renaissance of data reduction

The data explosion is well documented, and it shows how important any form of smart data reduction technology has become. On a data management platform with a hyperscale architecture, all data is automatically compressed, while deduplication algorithms look for redundant data structures and replace them with small placeholders.
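To make the mechanism concrete, here is a minimal sketch of fixed-size block deduplication combined with compression. It is an illustrative toy, not Cohesity's implementation: each unique block is stored once in compressed form, and repeats are replaced by a small hash placeholder.

```python
import hashlib
import zlib

def deduplicate_and_compress(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks; store each unique block once
    (compressed) and reference repeats via a small hash placeholder."""
    store = {}         # block hash -> compressed block, stored only once
    placeholders = []  # ordered hashes standing in for the original blocks
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(block)
        placeholders.append(digest)
    return store, placeholders

def restore(store, placeholders) -> bytes:
    """Reassemble the original data from placeholders and stored blocks."""
    return b"".join(zlib.decompress(store[h]) for h in placeholders)

# Highly redundant sample: 100 identical 4 KB blocks dedupe to one stored block.
data = (b"A" * 4096) * 100
store, refs = deduplicate_and_compress(data)
print(len(store), len(refs))  # 1 unique block backing 100 references
assert restore(store, refs) == data
```

Real platforms use variable-length chunking and stronger reduction pipelines, but the principle is the same: redundant structures collapse to references, so highly repetitive secondary data shrinks dramatically.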

Data reduction can massively help companies store data more cost-effectively, since these mechanisms automatically shrink secondary data in the background as soon as it is generated, without anyone having to initiate the process. This pays off immediately when you look closely at the costs. Although hardware is generally becoming cheaper because disks provide more storage per euro, operating costs drive the price up: according to an analysis by Nasuni, it costs $3,351 a year to store one TB of file data. Data reduction protects existing storage resources, so investments in new ones can be postponed.
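Combining the figures above (Nasuni's $3,351 per TB per year and ESG's four TB of secondary data per TB of production) gives a back-of-the-envelope cost model. The 3:1 reduction ratio used here is an assumed illustrative value, not a figure from either report:

```python
# Back-of-the-envelope storage cost model (illustrative assumptions).
COST_PER_TB_YEAR = 3351        # USD/year per TB of file data (Nasuni estimate)
SECONDARY_PER_PRODUCTION = 4   # ESG: 4 TB of secondary data per TB of production

def annual_storage_cost(production_tb: float, reduction_ratio: float = 1.0) -> float:
    """Yearly cost for production plus secondary data, with secondary data
    shrunk by a dedup/compression reduction ratio (e.g. 3.0 means 3:1)."""
    secondary_tb = production_tb * SECONDARY_PER_PRODUCTION / reduction_ratio
    return (production_tb + secondary_tb) * COST_PER_TB_YEAR

baseline = annual_storage_cost(100)                      # no data reduction
reduced = annual_storage_cost(100, reduction_ratio=3.0)  # assumed 3:1 ratio
print(f"baseline: ${baseline:,.0f}/yr, with 3:1 reduction: ${reduced:,.0f}/yr")
```

On these assumptions, 100 TB of production data costs roughly $1.68m a year unreduced, versus about $0.78m with a 3:1 reduction on the secondary copies, which is where most of the volume sits.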

Synergies for enhanced cyber resiliency

Organisations should consolidate their disparate application data silos onto a single centralised data management platform based on a scalable hyperconverged file system. The stored data is then automatically analysed by the deduplication and compression functions to achieve the highest reduction rates across the organisation.

To protect stored data, such platforms take the Zero Trust model even further by implementing strict access rules and multi-factor authentication and by encrypting the data automatically, both in transit and at rest, to harden it against cyber threats like ransomware. They also generate immutable backup snapshots that cannot be changed by any external application or unauthorised user.

These backup snapshots are analysed by AI-driven algorithms to identify signs of possible anomalies, which can be passed on to security automation tools from vendors such as Cisco or Palo Alto Networks to examine the potential incident in more detail.

Finally, modern data management platforms also deliver more insights from data analysis thanks to integrated classification. Organisations can better understand their compliance risks by gaining visibility into their dark data, which according to Gartner accounts for between 55 and 80 percent of the data a company stores. They can then decide with confidence whether to keep certain records or delete them without risk.

All of these synergy effects of a modern data platform enhance cyber resilience, reduce operating and storage costs, and help organisations manage their growing data volumes in the long term.
