Banking on AlgoSec: How a Top Commercial Bank Manages its Network Security
- Christopher Walsh, VP Information Security at Itaú Corpbanca
- 07.06.2022 12:30 pm #security
The financial services industry is a cybercrime playground. Not only do institutions have to stay ahead of the ever-changing cyber threat landscape, but they must uphold the standards expected in a complex and highly regulated industry. Banks and other financial institutions are lucrative targets for cybercriminals. In fact, according to a report by IBM, the average cost of a data breach in the financial services sector now exceeds $5 million, nearly $2 million higher than the average cost across all other sectors.
In a customer-centric industry, we are constantly seeking to serve our clients better and stay ahead of the curve through new technology. However, in such a highly regulated sector, we cannot simply install software on a whim. Applications that are rushed into production or poorly implemented could make us, and our customers, more susceptible to cyber-attacks. Research shows that 48% of consumers have negative opinions about banking precisely because of data and cybersecurity concerns.
Almost two-thirds of global financial institutions have experienced a rise in destructive attacks over the last year. At Itaú, some of our biggest cyber concerns are phishing, ransomware, and distributed denial of service (DDoS) attacks.
Security teams under stress
Unfortunately, network security teams too often cannot focus fully on the evolving cyberthreat landscape because they are held back by manual, slow, and error-prone change-management processes. At Itaú Corpbanca, we are inundated with hundreds of change requests each month, and a single request can take several days to process.
So while we work hard to create and maintain a clean, optimized network security policy that reduces points of attack, errors arising from manual processes put the bank at risk, opening up vulnerabilities that attract bad actors. I’m sure this is a familiar challenge for many financial institutions. Add complex compliance requirements and the need to keep up with numerous regulations, and it’s easy to see how things become a box-ticking exercise.
Take firewall rules, for example. We are obliged to review our firewall rules for risk at least once a year. Until recently, this was done by staff members without the in-depth knowledge they needed, so details were glossed over and the review became little more than a sanity check (do we need this rule at all?) rather than an assessment of whether each rule was written correctly to enforce least privilege or least access.
Because we are a small branch of a very large Latin American organization, the rules that had been set up were permissive, allowing information to flow freely back and forth. And since ransomware has become one of our biggest concerns in recent years, we started to review those rules and the processes around them.
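To illustrate what that kind of review looks for (this is a generic sketch, not how AlgoSec performs its analysis), the snippet below flags allow rules that leave the source, destination, or service wide open. The rule structure, field names, and addresses are hypothetical.

```python
# Hedged sketch: flag overly permissive firewall rules in a least-privilege review.
# The rule format and field names here are invented for illustration only.
from dataclasses import dataclass

ANY = "any"

@dataclass
class Rule:
    name: str
    source: str       # CIDR or "any"
    destination: str  # CIDR or "any"
    service: str      # e.g. "tcp/443" or "any"
    action: str       # "allow" or "deny"

def permissive_fields(rule: Rule) -> list:
    """Return the fields of an allow rule that are left wide open."""
    if rule.action != "allow":
        return []
    return [field for field, value in (("source", rule.source),
                                       ("destination", rule.destination),
                                       ("service", rule.service)) if value == ANY]

rules = [
    Rule("branch-to-hq", "10.20.0.0/16", "10.0.0.0/16", ANY, "allow"),   # any service allowed
    Rule("web-out", "10.20.1.0/24", ANY, "tcp/443", "allow"),            # any destination allowed
    Rule("dns-out", "10.20.1.0/24", "10.0.0.53/32", "udp/53", "allow"),  # tightly scoped rule
]

for rule in rules:
    open_fields = permissive_fields(rule)
    if open_fields:
        print(f"Review rule '{rule.name}': 'any' used for {', '.join(open_fields)}")
```

A check like this is only the starting point of a least-privilege review; the harder part is deciding, rule by rule, which scoped replacement still supports the business flows.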
Turning to technology for network object management
Fast forward to today, and we review our firewall rules thoroughly and regularly, at least once a quarter. We now have the right technology in place to dig deep into the details and carry out processes that were sometimes beyond our in-house skillset, making sure we are on the right path.
By implementing this technology and partnering with AlgoSec, we haven’t needed to invest in any additional people, which is a huge operational saving. We also work with several disparate security vendors: for IT, we have three primary vendors plus a few third-party vendors that plug the gaps when needed. Unfortunately, this can increase the risk of vulnerabilities, because third-party entities have access to our data and internal systems. AlgoSec’s Firewall Analyzer helps us make sure that we allow only what is needed, giving us high assurance that these vendors can access only the network assets covered by their contracts.
Mapping out the road ahead
We are starting to implement and manage micro-segmentation initiatives. One of our audit concerns was our flat network, so we began by creating a test segment. With a tool such as Firewall Analyzer, we can validate the traffic that will be allowed into that new segment and restrict it to only what is necessary.
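As a rough illustration of that validation step (a generic sketch under assumed inputs, not AlgoSec functionality), observed flow records can be aggregated into a candidate allow-list for the new segment. The flow format and the 10.50.0.0/24 segment range below are made up for the example.

```python
# Hedged sketch: derive a candidate allow-list for a new segment from observed flows.
# The flow record format and the segment range are illustrative assumptions.
from collections import Counter
from ipaddress import ip_address, ip_network

TEST_SEGMENT = ip_network("10.50.0.0/24")  # hypothetical range for the new test segment

# (source IP, destination IP, service) tuples, e.g. exported from firewall flow logs
observed_flows = [
    ("10.20.1.15", "10.50.0.10", "tcp/443"),
    ("10.20.1.15", "10.50.0.10", "tcp/443"),
    ("10.20.3.7",  "10.50.0.22", "tcp/1433"),
    ("10.20.9.9",  "10.10.0.5",  "tcp/80"),   # not destined for the segment, so ignored
]

# Keep only flows that terminate inside the new segment and count how often each occurs.
inbound = Counter(
    (src, dst, svc)
    for src, dst, svc in observed_flows
    if ip_address(dst) in TEST_SEGMENT
)

print("Candidate allow-list for the test segment:")
for (src, dst, svc), hits in inbound.most_common():
    print(f"  allow {src} -> {dst} on {svc}  (observed {hits} times)")
```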
There is no point in creating an isolated network if risky traffic can still flow back and forth between test and production. Thanks to syslog counters, we can tell over time whether we did a good job in the first place and whether the remaining rules we configured are still needed for business purposes. The extra visibility we now have into our network security policies is ideal. As the information security officer, I can monitor everything about the firewall, from its rules and configurations to its usage patterns, and make sure we are doing everything we can to keep it tight.
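To show the idea behind those counters (again a hedged sketch, with a made-up log format rather than any vendor’s actual syslog schema), the snippet below tallies rule hits from firewall log lines and flags rules that were never matched during the observation window.

```python
# Hedged sketch: count rule hits from syslog-style lines and flag unused rules.
# The "rule=<name>" log format and the rule names are invented for this example.
import re
from collections import Counter

# Rules currently configured on the firewall (hypothetical names).
configured_rules = {"branch-to-hq", "web-out", "legacy-ftp"}

syslog_lines = [
    "Jun 07 12:30:01 fw01 kernel: rule=web-out action=allow src=10.20.1.15 dst=10.50.0.10",
    "Jun 07 12:30:05 fw01 kernel: rule=branch-to-hq action=allow src=10.20.3.7 dst=10.0.0.8",
    "Jun 07 12:31:44 fw01 kernel: rule=web-out action=allow src=10.20.1.16 dst=10.50.0.10",
]

rule_pattern = re.compile(r"rule=(\S+)")
hits = Counter(
    match.group(1)
    for line in syslog_lines
    if (match := rule_pattern.search(line)) is not None
)

# Rules with zero hits over the observation window are candidates for removal.
for rule in sorted(configured_rules):
    count = hits.get(rule, 0)
    status = "in use" if count else "candidate for removal"
    print(f"{rule}: {count} hits ({status})")
```

In practice the observation window has to be long enough to cover infrequent but legitimate flows, such as quarter-end batch jobs, before a rule is retired.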