The impact of SAP HANA on bank architectures

8 July 2012

Mike Curl

Head of Finance, Services & Media

Banks have a lot of data. Transaction volumes have soared over the years, driven by factors such as payment cards, internet banking, online shopping and the shift to direct debits. Now add in the calculations and statistical models being run on that data for regulatory reporting and for risk and profitability analysis, and you have some seriously large data volumes. And that's before considering the system and data impact of new regulations such as Dodd-Frank and Basel III, which will demand far finer granularity and an awful lot more number crunching.

Not content with having some of the biggest data around, banks spend a large portion of their IT budgets moving it around, joining it back together, adding it up, reconciling it and hunting down discrepancies. This has become the norm in most banks due to system proliferation driven by factors such as mergers and acquisitions, new product offerings and new analytical reporting requirements.

Yesterday, all my data seemed so far away

Current bank architectures still largely reflect the technology limitations of yesterday. Slow links with branches result in thousands of end-of-day batch jobs to collect and post transactions. The recent problems at RBS stemmed from a botched patch to a batch payment processing system. And interfaces between operational and analytical systems still rely heavily on overnight batch processing.

Traditional databases and hardware are also partly responsible. Take for example the data warehousing industry, which has grown from the inability of databases to cope with inserting and updating high volumes of records while simultaneously being asked tough analytical questions. The solution up until now has been to copy data onto separate systems, manipulate, augment and cleanse it, and then store it in a format optimised for "decision support".
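
To make that pattern concrete, here is a minimal sketch in Python of the kind of overnight copy-and-aggregate job described above. All names are hypothetical and it stands in for what would normally be an ETL tool or SQL job; the point is the shape of the process, not the detail.

    # A minimal sketch (hypothetical names) of the overnight copy-and-aggregate
    # pattern: granular transactions are extracted from the operational system,
    # pre-aggregated, and loaded into a separate reporting store.
    from collections import defaultdict

    def nightly_batch(operational_rows, warehouse):
        """Extract, aggregate and load -- run once per day, after close of business."""
        totals = defaultdict(float)
        for row in operational_rows:                      # extract the granular records
            totals[(row["account"], row["posting_date"])] += row["amount"]
        for (account, posting_date), amount in totals.items():
            warehouse.append({                            # load a pre-aggregated copy
                "account": account,
                "posting_date": posting_date,
                "daily_total": amount,
            })
        return warehouse

Every report then reads the warehouse copy, which is at best a day old, and any discrepancy between the two stores has to be hunted down and reconciled by hand.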

Most bank processes and thinking are still organised around these limitations.

It doesn't have to be this way

In-memory computing platforms such as SAP HANA have the potential to disrupt and make fundamental changes to underlying data architectures. What if core banking systems could be both transactional and analytical at the same time? What if these systems were fast enough to allow a real-time view of customer profitability or risk? That would reduce the number of additional systems and databases needed, and in turn, the need to move data around. This would mean fewer copies of data and less risk of introducing data quality problems and numerical discrepancies and errors. The end result is that banks would have less complexity and be a few steps closer to the dream of "one version of the truth". The cost savings on storage alone should be significant.

Admittedly, this vision is going to take some time to become reality, but for the smaller, newer banks, it isn't that far-fetched.

For the big guys, a more realistic starting point is on the analytical side, where much of the complexity and inefficiency lies. Removing years and years of 'kludge' will not be easy or for the faint-hearted. Some of my banking clients have decided to steer clear of touching certain management information systems because they have become too complex or are not well understood. Or usually both.

How SAP HANA can help simplify bank architectures

Simplifying and rationalising these banking systems is like peeling an onion because of the different layers involved. In many cases, it might actually be easier to start afresh and build new parallel architectures and systems based on current and future requirements, rather than trying to unravel and reverse engineer what these systems do at the moment and why.

What SAP HANA offers is the ability to re-design and re-think the analytical landscape. If you've got a system that's capable of processing billions of granular records in near real-time, you don't need to create snapshots and aggregates. You can have a number of different analytical views or windows onto the same data, which update automatically when the source data changes. No more batch. No more aggregates. No more reconciliation.
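
As a contrast to the overnight job sketched earlier, here is the same reporting need served straight from the granular store. Again the names are hypothetical, and a real HANA implementation would express the view in SQL rather than Python; the sketch just shows the shape of the idea: writes and reads hit one copy of the data, and the "view" is computed at query time, so there is nothing to refresh or reconcile.

    # A contrasting sketch (hypothetical names): one granular store serves both
    # the transactional writes and the analytical reads, with the aggregate
    # computed on demand rather than maintained as a separate copy.
    from collections import defaultdict

    transactions = []                                     # the single granular store

    def post(account, amount):
        """Transactional write: the new record lands in the same store reports use."""
        transactions.append({"account": account, "amount": amount})

    def balance_view():
        """Analytical read: aggregates the granular records at query time."""
        totals = defaultdict(float)
        for row in transactions:
            totals[row["account"]] += row["amount"]
        return dict(totals)

    post("ACC-1", 250.0)
    post("ACC-1", -40.0)
    print(balance_view())                                 # {'ACC-1': 210.0} -- no batch ran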

The potential rewards for those willing to rise to the challenge are significant. They include: 

  • lower support and maintenance costs as a result of reduced complexity (data flows, batch jobs, systems, reports, consolidations)
  • easier delivery of new business and regulatory reporting requirements
  • significantly reduced storage costs
  • greatly improved data quality and integrity.

SAP HANA needs architects to think differently too. Undoing years and years of conditioning and "best practice" is not going to be easy. We saw this recently on a credit risk proof of concept where the client's immediate inclination was simply to underpin the existing solution with SAP HANA. Yes, this would have led to some immediate performance benefits, but to really get the best from this amazing technology, we needed to simplify, rationalise and remove the redundant copies of data and superfluous layers. This quote came to mind at the time:

"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." 

Antoine de Saint-Exupéry

About the author

Mike Curl

Head of Finance, Services & Media

My interest in technology started in 1981 when a friend’s father built a Sinclair ZX81 and then relied on the two of us to program it. I soon moved on to my own BBC “B” computer (thanks dad!) and amused the family with some very rudimentary INPUT and PRINT statements…

My passion continues but on a much bigger scale than I could have ever imagined.  Today, I advise business and IT teams at some of the largest organisations in the world, helping them design, implement and exploit the latest technology in support of their business priorities and challenges.

I co-founded Bluefin in 2002 after spending many years in the consulting industry, having started at Andersen Consulting in 1994.  With Bluefin, I have been fortunate enough to be involved with some truly ground-breaking projects and technologies over the years.

What I really enjoy is finding the business value of new technology and leading the pioneering engagements to implement it successfully for the first time. Barclays, a client I lead at Bluefin, has won several industry and partner (SAP) awards for the innovative work we did with them around enterprise mobility. At another client we’ve also been doing some truly amazing work with SAP HANA that has the potential to disrupt established business models.
