SAP HANA: In-memory computing

The SAP HANA platform is part of SAP's in-memory computing strategy. SAP's stated goal is to make business data available to its customers anytime (in-memory computing), anywhere (on-demand), and on any device.
Source: http://www.news-sap.com/glossary-sap-hana/
So what is in-memory computing? Suppose a company's entire data set is held in main memory instead of in a traditional disk-based database. Imagine the speed at which that data can be accessed!
Accessing data in main memory also requires much simpler algorithms than accessing it on hard disk. In-memory computing has been a subject of research and study for many years, but technological restrictions stood in the way: limited processing power, expensive memory, and so on.
Though there is a limit to the maximum frequency at which a single core can be clocked, around 2.93 GHz, a smart way was found around it: more processing power can be unleashed by putting more CPUs in a processor unit. For example, an enterprise-class server with 8 Intel 8-core processors has 64 cores, and servers with 8 10-core processors, giving 80 cores, are already available in the market.
Processing power has thus increased substantially while the cost of main memory (RAM) has dropped considerably. Roughly, a rack server with 8 Intel 8-core processors and 2 TB of DDR3 RAM costs around USD 100,000. At that cost, a business can address 20 TB to 100 TB of its data using in-memory computing (based on a compression rate of 10-50 times).
And 8 such servers with 512 cores (running 512 threads in parallel) and up to 16 TB of main memory cost around USD 1 million, which is modest by enterprise-hardware standards. This set-up can address 160-800 TB of enterprise data in memory (again based on a compression rate of 10-50 times).
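The back-of-the-envelope arithmetic behind these figures can be sketched as follows (the RAM sizes and the 10-50x compression range are the assumptions quoted above, not official SAP sizing numbers):

```python
# Illustrative sizing arithmetic: how much uncompressed enterprise data a
# given amount of RAM can address, assuming a 10-50x compression factor.

def addressable_data_tb(ram_tb, compression_min=10, compression_max=50):
    """Return the (min, max) TB of raw data that fits into ram_tb of memory."""
    return ram_tb * compression_min, ram_tb * compression_max

# One server with 2 TB of RAM:
print(addressable_data_tb(2))   # (20, 100) -> 20-100 TB of raw data
# Eight servers with 16 TB of combined RAM:
print(addressable_data_tb(16))  # (160, 800) -> 160-800 TB of raw data
```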
So hardware prices now favor leveraging main memory through higher processing power and the 64-bit operating systems that arrived some time back.
Research carried out in recent years by SAP in partnership with the Hasso Plattner Institute (HPI) has produced software innovations that harness this modern hardware.
For example, by using column-oriented data storage instead of the row-oriented storage of traditional databases, a compression factor of 10 was reached (by eliminating redundancies).
What does that mean? It means an organization with, say, 10 TB of data in its databases will occupy roughly 1 TB of space in memory, which a modern-day server can easily accommodate.
According to SAP, the compression factor may be in the range of 10-50, so a factor of 10 is the minimum.
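As a rough illustration of why column-oriented storage compresses so well: a column with few distinct values can be stored as a small dictionary plus integer codes. This is a toy sketch of dictionary encoding; HANA's actual encodings are more sophisticated:

```python
# Toy dictionary encoding for one column of a column store: values with many
# repetitions shrink to a small dictionary plus compact integer codes.

def dictionary_encode(column):
    """Replace each value with an index into a dictionary of distinct values."""
    dictionary = sorted(set(column))
    index = {v: i for i, v in enumerate(dictionary)}
    codes = [index[v] for v in column]
    return dictionary, codes

country = ["DE", "US", "DE", "DE", "FR", "US", "DE", "DE"]
dictionary, codes = dictionary_encode(country)
print(dictionary)  # ['DE', 'FR', 'US']
print(codes)       # [0, 2, 0, 0, 1, 2, 0, 0]
```

In a row store the repeated strings would be written out once per row; here each repetition costs only a small integer, which is where the redundancy elimination mentioned above comes from.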
In-memory computing is a combination of state-of-the-art development in hardware and software. And now is the time when the hardware and software innovations have reached the point of exploiting these resources.
Traditionally, database systems are divided into two categories based on their use: Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP).
OLTP systems (like SAP R/3) facilitate and manage transaction-oriented applications typically for data-entry and retrieval.
On the other hand, OLAP systems (like SAP Business Objects) are designed for analysis of data which is the result of transactions in OLTP systems.
Companies need to make decisions based on analysis of the available data, so OLAP systems must be fast and deliver high performance. Traditionally there are separate databases: one optimized for the transaction-oriented (OLTP) system and another for analytics (OLAP). Data is synchronized from the OLTP to the OLAP databases using batch jobs that involve an expensive ETL (Extract, Transform, and Load) process. Another drawback is that the data is only as fresh as the last synchronization run.
In-memory computing is going to change this scenario by combining the workloads of OLTP and OLAP database systems. The whole enterprise data set will be available in main memory, accessible to both transaction-based and analytical systems, resulting in a truly real-time system.
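A minimal sketch of this idea: a single in-memory column store where a transactional insert is immediately visible to an analytical aggregate, with no ETL step in between. The class below is a hypothetical illustration, not HANA's actual engine:

```python
# One in-memory structure serving both workloads: OLTP-style writes land in
# column arrays, and OLAP-style aggregates read the same arrays directly.

from collections import defaultdict

class InMemoryColumnStore:
    def __init__(self, columns):
        self.data = {c: [] for c in columns}

    def insert(self, row):                   # OLTP-style write
        for c, v in row.items():
            self.data[c].append(v)

    def sum_by(self, group_col, value_col):  # OLAP-style aggregate
        totals = defaultdict(float)
        for g, v in zip(self.data[group_col], self.data[value_col]):
            totals[g] += v
        return dict(totals)

store = InMemoryColumnStore(["region", "amount"])
store.insert({"region": "EMEA", "amount": 100.0})
store.insert({"region": "APAC", "amount": 50.0})
store.insert({"region": "EMEA", "amount": 25.0})
print(store.sum_by("region", "amount"))  # {'EMEA': 125.0, 'APAC': 50.0}
```

The point of the sketch is that the aggregate sees every inserted row the moment it is written, which is exactly what the batch-synchronized OLTP/OLAP split cannot offer.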
To give an idea of the possible gains, SAP has released a few results. For example, a dunning run that normally takes 20 minutes under standard conditions completed in just 1 second on the SAP HANA platform, which is 1,200 times faster!
The software layer, i.e. the current applications, has to be optimized to make efficient use of this new database layer. Currently, Business Explorer, the BI tool, has been found to scale well with increased hardware capabilities, so it is the first application making use of the HANA platform. The next big news is SAP BW on the HANA platform.
According to SAP, in-memory computing will make existing business applications faster and more efficient, and make possible new business applications that were impossible earlier. SAP HANA thus seems set to change the way businesses are run today, with business leaders taking decisions with truly real-time data at their fingertips!
