SAP HANA: In-Memory computing - how 'new' is it?

26 January 2012

Mark Chapman

Consultant

We've had quite a few 'energetic' debates amongst ourselves (and written many blog posts) as we work on pilots and propositions in our own HANA systems. Much of that debate centres on the use cases for SAP HANA and the decision-making process customers are likely to go through as they prepare to put a toe in the water of 'in-memory' analytics.

As with any investment decision a business makes, there is a natural human tendency to weigh the pros and cons - the potential risks against the desired benefits. SAP HANA is, in this respect, just another technology-led solution: customers are balancing the risk of the unknown against the promoted benefits. As we discussed this balancing act, it struck me that we were perhaps lending a little too much weight to the 'newness' of the SAP HANA offering.

In-Memory processing with SAP APO

I thought back over the projects I had delivered over the last few years and recalled that in some of them, in-memory processing and analysis of large (40 Gbyte) in-memory data sets was a core part of the design. These solutions, also from SAP, were the APO (Advanced Planner and Optimizer) applications.

[Image: (c) SAP AG (Joerg Hoffmeister, Dec 2003)]

In these applications, not only is the large volume of planning data stored in memory, but much of the business logic used in the network modelling is pushed down (as C++ function modules) into memory as well, avoiding the 'round trips' between the application server and the data.
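To make the 'round trips' point concrete, here is a minimal, purely illustrative sketch in Python. The class and method names are invented and this is not the liveCache or APO API; it simply contrasts pulling every record back to the application server with asking the in-memory data layer to run the calculation itself.

```python
# Purely illustrative sketch (hypothetical names, not the SAP liveCache/APO API):
# the difference between dragging every record back to the application server
# and letting the in-memory data layer run the calculation and return one value.

from typing import Dict, List


class InMemoryPlanningStore:
    """Stand-in for an in-memory data layer holding planning key figures."""

    def __init__(self, data: Dict[str, List[float]]) -> None:
        self._data = data
        self.rows_transferred = 0  # crude proxy for "round trip" cost

    def fetch_rows(self, product: str) -> List[float]:
        """Round-trip style: every record crosses the boundary to the caller."""
        rows = list(self._data[product])
        self.rows_transferred += len(rows)
        return rows

    def total_demand(self, product: str) -> float:
        """Pushed-down style: the logic runs next to the data; one value returns."""
        self.rows_transferred += 1
        return sum(self._data[product])


store = InMemoryPlanningStore({"PRODUCT_A": [120.0, 135.0, 128.0, 140.0]})

# Application-server-side aggregation: all rows transferred, then summed.
app_side_total = sum(store.fetch_rows("PRODUCT_A"))

# Data-layer aggregation: only the result is transferred.
pushed_down_total = store.total_demand("PRODUCT_A")

print(app_side_total, pushed_down_total, store.rows_transferred)
```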

So how does this relate to analytics, and specifically to a real-life context?

In the SAP APO project I have in mind, the following business scenario had to be supported in the DP (Demand Planning) domain.

The forward-looking demand forecast (24 months) was constructed from a two-year history of actual demand, to which various statistical models were applied to produce a baseline forecast. Various 'overlays' were then applied to this baseline by different functions in the organisation: centrally by Brand Marketing and the central S&OP planners, and regionally by the Sales organisations charged with ultimately delivering the forecast. Each month, a consensus was reached between these parties and a final, official (new) Demand Plan was issued.
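As a simplified sketch of the planning arithmetic this describes, the snippet below combines a statistical baseline with overlay adjustments from several functions into a single consensus plan. The figures, names and the "sum of overlay deltas" rule are illustrative assumptions, not the actual APO Demand Planning configuration.

```python
# Hypothetical illustration of baseline + overlays -> consensus plan.

baseline = {"2012-02": 1000.0, "2012-03": 1100.0, "2012-04": 1050.0}

# Overlay deltas (positive or negative adjustments) proposed by each function.
overlays = {
    "brand_marketing": {"2012-02": +50.0, "2012-03": -20.0},
    "central_sop":     {"2012-02": -10.0, "2012-04": +30.0},
    "sales_region":    {"2012-03": +40.0, "2012-04": -15.0},
}

def consensus_plan(baseline, overlays):
    """Apply every function's overlay delta on top of the statistical baseline."""
    plan = dict(baseline)
    for deltas in overlays.values():
        for month, delta in deltas.items():
            plan[month] = plan.get(month, 0.0) + delta
    return plan

print(consensus_plan(baseline, overlays))
# {'2012-02': 1040.0, '2012-03': 1120.0, '2012-04': 1065.0}
```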

This process started on the first working day of the month and had to complete before the tenth working day - hence the 'WD10' Plan or Forecast. Thereafter, updated forecasts were issued every week, with particular emphasis on the near-term horizon (in-flight corrections). The 'as designed' BI processes extracted both the official monthly WD10 plan (once per month) and the weekly updates (each Sunday), and used this data to provide the required forward-looking variance analytics.

The crucial stage in this process, however, was reaching the consensus needed to publish the new monthly baseline. The SAP BI process and the delivered reports gave users the complete picture of how the various overlays affected the final forecast, but the data behind those variance assessments was refreshed weekly at best. The business process required users to analyse the latest weekly outlook not only against the statistical baseline (and the previous month's baseline) but also, crucially, against the latest (real-time) overlay forecast.

That data existed only in the in-memory 'liveCache', not in BI. The solution delivered was to expose the data source structure already used in the SAP APO system to extract the weekly forecast update (the 'Planning Book' - a type of database view) as a real-time 'virtual provider', enabling a managed read of the live data in a structure identical to the weekly and monthly extracts. This allowed users to analyse every version of the forecast within the same easy-to-'slice and dice' analytical report, improving their ability to understand the impact of all the overlays and so reach the necessary consensus in a timely and efficient manner.
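The reporting idea can be sketched as follows: persisted monthly and weekly snapshots and a live read of the planning data are returned in an identical record structure, so one report can compare every forecast version side by side. The function and field names below are invented for illustration; this is not the actual BI virtual-provider implementation.

```python
# Hypothetical sketch: snapshots and a live read unioned in one common structure.

from typing import Dict, List

Record = Dict[str, object]  # e.g. {"version": ..., "month": ..., "quantity": ...}

WD10_SNAPSHOT: List[Record] = [
    {"version": "WD10", "month": "2012-03", "quantity": 1100.0},
]
WEEKLY_SNAPSHOT: List[Record] = [
    {"version": "WEEKLY", "month": "2012-03", "quantity": 1120.0},
]

def read_live_planning_book() -> List[Record]:
    """Stand-in for the real-time read of the in-memory planning data."""
    return [{"version": "LIVE", "month": "2012-03", "quantity": 1135.0}]

def all_forecast_versions() -> List[Record]:
    """Union of persisted snapshots and the live read, in one common structure."""
    return WD10_SNAPSHOT + WEEKLY_SNAPSHOT + read_live_planning_book()

for rec in all_forecast_versions():
    print(rec["version"], rec["month"], rec["quantity"])
```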
