InetSoft BI Webcast: In-Memory Analytics

This is the continuation of the transcript of a Webinar hosted by InetSoft titled "What's New in BI." The speaker is Mark Flaherty, CMO at InetSoft.

Mark Flaherty: Now let’s talk about in-memory analytics. Clearly the benefit is speed-of-thought queries. Accessing data in memory is orders of magnitude faster than accessing it from disk, and it minimizes the cost of ownership because IT gets another way of speeding up queries besides the expensive process of creating aggregates and summarizations and figuring out what to index. But the next big benefit is for answering complex business questions.

So if any of you have tried to author a challenging query, such as “what are my sales as a percent of total?” or “who is buying both products A and B?”, you know that potentially involves authoring a subquery, which can be really slow in a traditional relational environment where the data is coming from disk.
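To make those two questions concrete, here is a minimal pandas sketch; the table, column names, and data are invented for the example, and pandas simply stands in for any in-memory table. In SQL, each question would typically need a subquery, a window function, or a self-join.

```python
import pandas as pd

# Hypothetical sales table; names and numbers are made up for the example.
sales = pd.DataFrame({
    "customer": ["acme", "acme", "globex", "initech", "initech"],
    "product":  ["A",    "B",    "A",      "B",       "B"],
    "amount":   [100.0,  250.0,  80.0,     40.0,      60.0],
})

# Sales as a percent of total: in SQL this needs a subquery or window
# function to put the grand total on every row.
by_product = sales.groupby("product")["amount"].sum()
print(100 * by_product / by_product.sum())

# Customers buying both products A and B: in SQL, a self-join or an
# INTERSECT of two subqueries.
bought = sales.groupby("customer")["product"].agg(set)
print(bought[bought.apply(lambda s: {"A", "B"} <= s)].index.tolist())  # ['acme']
```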

There is a lot of confusion in the marketplace because people will use in-memory as if it’s synonymous with advanced visualization, and it’s not. In-memory is a type of technology that any vendor can take advantage of, whether an advanced visualization vendor or a full BI platform.


So if we look at the current approaches to fast BI: if you have a slow query environment, usually the burden is on IT to speed up performance, or the users change their approach, and because their queries time out or take too long, they ask smaller, simpler questions. This is one reason why BI is sometimes left to the experts; as you move toward the outer ends of the user spectrum, people don’t have time to wait for answers to their questions.

Look at how traditional queries work. Somebody runs a query. The results are usually retrieved from a relational database on disk. Then you look at those results, and you first ask, “Is this what I expected to get back?” Maybe it’s not right. Maybe you forgot to add a particular filter, so you repeat the process. Depending on your environment, any single query can take a few seconds, a few minutes, or potentially hours. So interactivity arrives only as the final step of what has been a very slow process.
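A minimal sketch of that round trip, with SQLite standing in for a disk-based relational source (the schema and data are invented): every refinement is a fresh query against disk.

```python
import sqlite3

# SQLite standing in for a disk-based relational source; schema and data
# are invented for the sketch.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, year INT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("east", 2011, 100.0), ("west", 2010, 80.0)])

def run(sql):
    # Every refinement is a fresh round trip: parse, plan, scan from disk.
    return conn.execute(sql).fetchall()

rows = run("SELECT region, SUM(amount) FROM sales GROUP BY region")
# Not what was expected -- the year filter was forgotten, so go around again:
rows = run("SELECT region, SUM(amount) FROM sales WHERE year = 2011 GROUP BY region")
print(rows)
```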

Whereas with an in-memory architecture, what happens instead is that somebody takes a big extract, which could be from a data warehouse or from a source system, and that data is loaded into memory. From a user perspective, the user only interacts with a cached result set. Many advanced visualization tools take that approach: you are looking at a finished dashboard and interacting there. Some BI vendors, like InetSoft, will also use in-memory to accelerate even the traditional queries as part of the total BI platform. So the iterative process is fast.
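The same workflow with an in-memory extract, again as an illustrative sketch rather than how any particular vendor implements it: the extract is pulled once, and every subsequent refinement hits only the cached copy.

```python
import sqlite3
import pandas as pd

# Simulate the source system (schema and data invented for the sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("east", 2011, 100.0), ("east", 2010, 70.0), ("west", 2011, 80.0)])

# The one slow step: a big extract, loaded into memory once.
extract = pd.read_sql_query("SELECT region, year, amount FROM sales", conn)
conn.close()

# From here, every interaction hits only the cached result set -- no
# further round trips to the source.
print(extract.groupby("region")["amount"].sum())
print(extract[extract["year"] == 2011].groupby("region")["amount"].sum())
```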

When it comes to evaluating in-memory technologies, whether the software is 64-bit or 32-bit dictates how much memory can be addressed. A 64-bit operating system can address a terabyte or more of memory, so you can keep nearly your whole data mart, or even your data warehouse, in memory. A 32-bit operating system, by contrast, can only give a process roughly 3 gigabytes of usable memory.
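A quick way to see which situation you are in, sketched in Python: the pointer width of the running build tells you how much memory a single process can address.

```python
import struct
import sys

# Pointer width determines how much memory one process can address:
# a 64-bit build can map terabytes, while a 32-bit build tops out
# around 2-4 GB (commonly ~3 GB usable).
bits = struct.calcsize("P") * 8
print(f"{bits}-bit build")
print("can address terabytes of memory" if sys.maxsize > 2**32
      else "limited to a few gigabytes")
```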


From an industry perspective, this innovation is immature. There is no consistent approach, and it’s not widely adopted yet, because few customers have widely deployed 64-bit systems. So it’s a good idea to plan for 64-bit down the road. It will deliver higher value and lower TCO, and it’s something that benefits all users.
