Adding Software Measurement to Due Diligence during the M&A process

  • Francois Ruchon, Senior Vice President and Head of North America Banking, Financial Services and Insurance at CAST

  • 19.08.2016 09:30 am
  • Francois is the Senior Vice President and Head of North America Banking, Financial Services and Insurance at CAST, a leader in software measurement. He previously served as the company’s Group CFO. To learn more, visit www.castsoftware.com.

2015 was a banner year for M&A, with $4.7 trillion in transaction value reported. Many analysts expect the trend to continue, especially as technological change accelerates and upends entire markets. Consolidation coupled with these ‘digital disruptions’ places any organisation under stress.

When a company moves forward with an M&A deal or integrates post-merger, one of the CFO’s key roles is to take inventory of the acquisition – assessing risks and costs – across almost every department. The CFO is responsible not just for the finances, but for other operations such as HR, marketing and technology. IT in particular is one of the hardest areas in which to conduct proper due diligence or post-merger integration, especially when it comes to reviewing a business’s software. Identifying complementary or redundant functionality is standard practice, but CFOs also need to look at how well newly acquired software performs, how effective it is and, most importantly, how secure it is. Without a proper review, acquired software can be a major barrier to the success of the deal, or a heavy and painful inheritance after integration.

However, CFOs often face a nagging question: how do you account for something you can’t see? One answer is software measurement and analysis. The ability to analyse and monitor over time how well complex applications – developed in-house or outsourced – are engineered is not a new field. CIOs and COOs who understand the benefits of measuring these aspects have long used it for performance assessment. It is now becoming more important for the CFO, however, as the role expands to include more technology management and oversight.

Why is software measurement important for a CFO during M&A? It’s a bit like comparing two cars from different manufacturers. Because they have been engineered differently, one will run without issue for many years, while the other will be in the repair shop frequently, incurring heavy maintenance costs and potentially even causing a major accident. The same is true of software from two separate vendors – one application may run smoothly without security or resiliency issues, while another may be prone to glitches and malfunctions. The analogy holds remarkably well: applications behave like cars, and software economics resemble car economics. One organisation already has bits and pieces from a multitude of software vendors under the hood of its ‘car’ – now imagine two, and the daunting task of getting them to work together without upsetting the delicate software ecosystem that runs the business.

One reason CFOs don’t typically get involved in this work is the misconception that it can’t be done objectively. Without objective visibility into the underlying risk or technical debt, establishing IT budgets is painful, and it becomes harder to demonstrate maximum value to the CEO. The problem is further compounded when a company enters a merger or acquisition.

CFOs must take a vested interest in the company’s core technologies and IT roadmap in order to understand the full value of the technology assets it brings to the table. Software measurement and analysis can help by objectively measuring the key characteristics of software risk, quality and functional size, normalised across the application portfolio to determine priorities. Arming the CFO with fact-based data enables much more meaningful discussions with the rest of the C-suite.
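To make “normalised across the application portfolio” concrete, here is a minimal Python sketch. The application names, raw scores and the priority formula are all hypothetical illustrations, not CAST’s actual methodology: the point is simply that min-max normalisation puts risk, quality and size on a common 0–1 scale so applications can be ranked.

```python
# Illustrative sketch: normalising per-application measurements so that
# risk, quality and functional size become comparable across a portfolio.
# All names and scores below are hypothetical.

applications = {
    # name: (risk_score, quality_score, function_points)
    "core-banking": (72, 61, 12000),
    "payments-api": (45, 80, 3500),
    "crm-portal":   (88, 52, 7800),
}

def normalise(values):
    """Scale values to the 0-1 range (min-max normalisation)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi != lo else 0.0 for v in values]

names = list(applications)
risk    = normalise([applications[n][0] for n in names])
quality = normalise([applications[n][1] for n in names])
size    = normalise([applications[n][2] for n in names])

# A simple priority index: high risk, low quality and large size rank first.
priority = {n: risk[i] + (1 - quality[i]) + size[i] for i, n in enumerate(names)}

for name, score in sorted(priority.items(), key=lambda kv: -kv[1]):
    print(f"{name}: priority {score:.2f}")
```

In practice a tool would weight the characteristics by business criticality, but even this crude index shows how normalised measurements turn an opaque portfolio into a ranked work list.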

In the specific case of M&A, when companies merge they also take on the technology assets of the selling company, including all the buggy software and applications that have been messily cobbled together to support operational needs. More often than not, these technology assets are not considered part of the M&A due diligence process, leaving the buy side without an objective understanding of the application portfolio. Put bluntly, a buyer could acquire a system that introduces severe risk without knowing it.

According to the latest data from CAST Research Labs, today’s average application contains more than 550,000 lines of code across multiple intertwined technologies. That’s a lot to get through, and ideally IT should review acquired applications within the first three to six weeks after a merger to set benchmarks and prepare for a successful integration. If there is no common technology estate, there will surely be few common practices. The application portfolio plays the most crucial role because it drives infrastructure and business processes. Faults in the application layer degrade the quality of the technology at the newly merged or acquired organisation, and a lack of awareness and understanding of the application portfolio is one of the main reasons M&A deals fail.

In general, IT organisations are under pressure to demonstrate agility and seamless execution to stakeholders quickly. That pressure increases the risk that applications won’t comply with architectural best practices and will suffer from poor structural security, resiliency and efficiency. When companies rush to demonstrate their technology prowess or speed of implementation, there is a good chance that architectural “yellow lines” are crossed, making systems more vulnerable to crashes, downtime or even cyber-attacks – all of which can damage the company’s reputation.

Establish a Baseline of Your Software Risk and Improve on It

Before CFOs help define mid-term projections, they must understand the current situation. With regard to software risk, CFOs should first put together a list of the tangible and intangible benefits that software measurement can provide. Second, working alongside the CIO, they should target business-critical applications first – the ones the company can’t afford to have issues with, because problems there carry direct financial impact.

To determine the effectiveness of mission-critical applications, especially during M&A, measure their resiliency, security, efficiency and maintainability at the system level. This is typically the only way to guarantee the discovery of deeply hidden risky hotspots. Measuring the structural risk of core IT systems saves money and increases customer satisfaction in the short, mid and long run. Once existing projects are de-risked, they form a strong basis for future software development: fewer resiliency issues, more productive and agile development teams, and a significant shift of budget from Run-the-Business to Change-the-Business.

On this journey, it is also critical to adopt and adhere to industry standards, such as those set forth by the Consortium for IT Software Quality (CISQ). CISQ defines computable metrics standards for measuring software quality and size, focused on improving IT application quality to reduce cost and risk.
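As a rough illustration of what such a baseline might look like, the sketch below reports violation density per quality characteristic for a single application. The characteristic names follow CISQ’s four structural quality areas, but the violation counts and the violations-per-KLOC formula are assumptions made here for illustration, not CISQ’s official computation.

```python
# Hedged sketch: a per-characteristic baseline for one acquired application.
# Counts and the density metric are illustrative assumptions only.

CHARACTERISTICS = [
    "Reliability",
    "Security",
    "Performance Efficiency",
    "Maintainability",
]

def baseline_report(violations, kloc):
    """Return violations per thousand lines of code, per characteristic.

    Lower density is better; a re-scan after remediation work should
    show these numbers trending down against the baseline.
    """
    return {c: round(violations.get(c, 0) / kloc, 2) for c in CHARACTERISTICS}

# Hypothetical scan of a 550 KLOC application (the article's average size).
violations = {
    "Reliability": 410,
    "Security": 95,
    "Performance Efficiency": 230,
    "Maintainability": 1800,
}

report = baseline_report(violations, kloc=550)
for characteristic, density in report.items():
    print(f"{characteristic}: {density} violations/KLOC")
```

A report like this, repeated on each scan, is what gives the CFO a trend line rather than a one-off snapshot: the baseline is the starting point, and every subsequent measurement shows whether structural risk is actually being paid down.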

For CFOs, asking the CIO to provide a baseline for software measurement lays the foundation for transparency between the CEO, CFO and CIO – ultimately driving significant value for the whole organisation and its business units. As a CFO, if you can help unlock “hidden” IT information, you will solidify your position as an invaluable resource to your company – setting a new standard for CFO capabilities and surpassing expectations when it comes to M&A.
