Not applicable

What is an in-memory tool? Can you explain briefly?

4 Replies
JonnyPoole
Employee

When a user clicks on a dashboard, the data is fetched from a NoSQL store that can be read directly from RAM, without the latency of reading from disk, running SQL, and so on.

It is generally viewed as delivering industry-leading query performance, and it allows for deep, successive, and rapid analytics and exploration of large data sets.
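To make the disk-vs-RAM contrast concrete, here is a minimal Python sketch (all names and data are hypothetical, not from any BI product): one path re-reads and re-parses a file on every query, the other serves the same rows from an in-memory dictionary built once.

```python
import json
import os
import tempfile
import time

# Hypothetical "dashboard" rows.
rows = [{"id": i, "sales": i * 1.5} for i in range(100_000)]

# Disk-backed path: every query pays for file I/O plus parsing.
path = os.path.join(tempfile.mkdtemp(), "rows.json")
with open(path, "w") as f:
    json.dump(rows, f)

def query_from_disk(row_id):
    with open(path) as f:
        data = json.load(f)          # re-read and re-parse on every request
    return next(r for r in data if r["id"] == row_id)

# In-memory path: load once, then answer from RAM.
by_id = {r["id"]: r for r in rows}   # built once, kept in memory

def query_from_memory(row_id):
    return by_id[row_id]             # constant-time dictionary lookup

t0 = time.perf_counter(); query_from_disk(99_999); disk_s = time.perf_counter() - t0
t0 = time.perf_counter(); query_from_memory(99_999); mem_s = time.perf_counter() - t0
print(f"disk: {disk_s:.4f}s  memory: {mem_s:.6f}s")
```

Real in-memory engines add compression and indexing on top, but the core trade is the same: pay the load cost once, then answer queries at memory speed.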

Siva_Sankar
Master II

WHAT IS IN-MEMORY BI?

In-Memory Business Intelligence (BI) refers to business intelligence software that uses an in-memory database (IMDB) for data processing. In-memory BI's claim to fame is providing an alternative to hefty data warehouse and OLAP projects. An IMDB is a database management system (DBMS) designed for best performance when there is enough computer memory (RAM) to hold the needed data. This is in contrast to relational database management systems (RDBMS), for example, which are designed for best performance when the data cannot fit entirely in memory and (slow) disk I/O operations must take place in real time.

WHY ALL THE HYPE OVER IN-MEMORY BI?

In-memory databases have been around for 30 years, but they have only been catching headlines in the business intelligence space for the past few years. The main reason in-memory BI gained popularity recently is that it wasn't feasible before 64-bit computing became commonly available. Before 64-bit processors, the maximum amount of RAM a computer could use was barely 4 GB, which is hardly enough to accommodate even the simplest of multi-user BI solutions. Only when 64-bit systems became cheap enough did it become possible to consider in-memory technology a practical option for BI.

IN-MEMORY, OR OUT-OF-MEMORY BI?

However, unlike hard disk-based database solutions, for which it is easy to continuously add more storage at low cost, memory-based solutions require more and more relatively expensive memory to grow. While 64-bit PCs theoretically provide a very high maximum memory threshold, in practice deploying the required volumes of memory becomes prohibitive. This is because, with the in-memory approach, the entire data set must be loaded into memory at once. When the size of the data (after compression) exceeds the amount of RAM, in-memory BI solutions become unusable (or even crash). Of course, as data volumes grow, so does the amount of memory required to hold it all at once.

Companies with rapidly growing data volumes will find that in-memory solutions soon reach limits that make them impractical. This becomes substantially more obvious when multiple users access the same in-memory store.
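The sizing constraint described above can be sketched as back-of-the-envelope arithmetic. All figures here (data volume, compression ratio, overhead factor) are hypothetical assumptions for illustration, not vendor numbers:

```python
# Hypothetical sizing sketch: does the compressed data set fit in RAM?
raw_gb = 500             # assumed uncompressed source data
compression_ratio = 10   # assumed 10:1 in-memory compression
overhead_factor = 1.3    # assumed headroom for aggregations, caches, sessions

compressed_gb = raw_gb / compression_ratio        # 50 GB after compression
required_ram_gb = compressed_gb * overhead_factor # 65 GB needed in practice

available_ram_gb = 64
fits = required_ram_gb <= available_ram_gb
print(f"need ~{required_ram_gb:.0f} GB RAM, have {available_ram_gb} GB -> fits: {fits}")
```

With these assumed numbers the data set narrowly fails to fit, which is exactly the failure mode the reply describes: growth in raw data translates directly into growth in required RAM.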

awhitfield
Partner - Champion

Hi Raja,

Associative In-Memory Technology:

QlikView uses an associative in-memory technology to let users analyze and process data very quickly. Unique entries are stored only once in memory; everything else is a pointer to the parent data. That is why QlikView is faster and stores more data in memory than traditional cubes. Memory and CPU sizing is very important for QlikView; the end-user experience is directly tied to the hardware QlikView runs on. The main performance factors are data-model complexity, the amount of unique data, UI design, and concurrent users.
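The "unique entries stored once, everything else pointers" idea is essentially dictionary encoding. Here is a minimal Python sketch of that storage scheme (the data and names are illustrative, not QlikView's actual internals):

```python
# A column with repeated values, as loaded from a source table.
column = ["London", "Paris", "London", "Berlin", "Paris", "London"]

# Symbol table: every distinct value is stored exactly once.
symbols = sorted(set(column))                  # ['Berlin', 'London', 'Paris']
index_of = {v: i for i, v in enumerate(symbols)}

# Data column: small integer pointers into the symbol table,
# instead of repeating each string in full.
pointers = [index_of[v] for v in column]       # [1, 2, 1, 0, 2, 1]

def decode(i):
    """Recover the original value for row i from the encoded form."""
    return symbols[pointers[i]]

print(symbols, pointers, decode(3))
```

Because each distinct value is held once and rows carry only compact pointers, memory use scales with the amount of *unique* data rather than the row count, which is why "amount of unique data" appears among the performance factors above.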

See the attached Technical Brief for further information

Andy

Not applicable
Author

In simple words, in-memory means storing the dashboard data in RAM rather than on disk.

So when a user accesses the dashboard, the data is fetched from RAM, and the access speed is good.