A Cloud BI Tool to Close All the Gaps

This is the continuation of the transcript of a Webinar hosted by InetSoft on the topic of "Analytics and Agile BI." The speaker is Abhishek Gupta, product manager at InetSoft.

What is it that business users can do in a spreadsheet that they potentially cannot do in the desktop BI tool or cloud BI tool that you provide? Find out, and then make some hard decisions. You don’t really want to limit them. What does a BI tool need to have to close all the gaps in functionality between its features and what Excel offers?

Empower your business users with their own sandboxes. Then empower them with capabilities to share their content with their colleagues with as little involvement from IT as possible, and then empower yourself. Empower your BI support organization with the ability to monitor what these people do, and selectively, proactively and reactively promote that content to production.

Say you notice a user who has created a web-based BI application that loads billions of rows from ten different data sources, creates a giant in-memory data mart, and distributes it via email to hundreds of his colleagues. That’s a big success, right? All of this needs to be productionalized, or operationalized.

The Business User Has Done All the Work

The good news is that the business user has already done all the hard work. You don’t really need to inundate him or her with business requirements gathering. He or she has already identified all the data sources, already identified the data transformation logic, already built a data model, and already built the front end. So all you really need to do is harden and productionalize that content, and that’s one of the key features that you should look for in BI tools.

Can you empower business users to do whatever it is they need to do to get their job done, and then take what they created and, with minimal effort, promote it to production? You still need to do all the testing, UAT, and quality assurance, and set up backups and redundancies, everything that a real, large enterprise production environment requires.

User Acceptance Testing (UAT) is the final phase in the software testing process, where the actual end-users validate the system to ensure it meets their requirements and functions correctly in real-world scenarios. UAT involves executing predefined test cases that represent typical user activities and verifying that the software performs as expected. This phase is crucial because it gives users an opportunity to identify issues or discrepancies that may have been overlooked during earlier testing stages. By involving end-users, UAT ensures that the software not only meets technical specifications but also aligns with user needs and business objectives. Successful completion of UAT indicates that the software is ready for production deployment, providing confidence that it will deliver the intended value and functionality to its users.
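As a rough illustration only, here is what one such predefined test case might look like when automated. The monthly_revenue_dashboard function and its baseline figures are hypothetical stand-ins invented for this sketch; in practice each UAT case maps to an acceptance criterion that the end-users themselves have signed off on.

```python
# Hypothetical UAT-style test case sketch. The dashboard function and the
# expected figures are stand-ins, not a real report or real baselines.

import unittest

def monthly_revenue_dashboard(region: str) -> dict:
    """Stand-in for a promoted report; assumed to return summary figures."""
    return {"region": region, "total_revenue": 125000.0, "row_count": 4321}

class TestDashboardUAT(unittest.TestCase):
    def test_east_region_totals_match_signed_off_baseline(self):
        result = monthly_revenue_dashboard("East")
        self.assertEqual(result["region"], "East")
        self.assertGreater(result["row_count"], 0)       # data actually loaded
        self.assertAlmostEqual(result["total_revenue"],  # matches the baseline
                               125000.0, places=2)       # the users approved

if __name__ == "__main__":
    unittest.main()
```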


Supplement Your SQL with NoSQL Technologies

And last but not least, everything that we just talked about is going to be an academic discussion unless you augment your BI environment, your BI infrastructure, with agile BI technologies. As we said earlier, SQL is definitely very mature, and you can do a lot with it, but it’s not agile. Supplement your SQL with NoSQL technologies.

Look to the cloud for elasticity. Elasticity in cloud computing refers to the ability of a cloud service to dynamically allocate and deallocate resources as needed to match the workload demands. This means that a cloud environment can automatically scale up resources during periods of high demand and scale down when demand decreases, ensuring optimal use of resources and cost-efficiency. Elasticity enables organizations to handle varying workloads without manual intervention, maintaining performance and availability while minimizing costs associated with over-provisioning or under-utilization of computing resources. This flexibility is fundamental to the cloud's value proposition, providing users with the ability to respond quickly to changes in workload and business needs.
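As a minimal sketch of the idea, the following hypothetical autoscaling rule shows elasticity in miniature. The CPU thresholds, node limits, and function names are illustrative assumptions for this sketch, not any particular cloud provider's API.

```python
# Hypothetical autoscaling rule illustrating elasticity: grow or shrink a
# pool of BI worker nodes to match observed load. Thresholds are assumptions.

def desired_node_count(current_nodes: int, avg_cpu_percent: float,
                       min_nodes: int = 2, max_nodes: int = 20) -> int:
    """Return the node count an elastic pool should converge to."""
    if avg_cpu_percent > 75 and current_nodes < max_nodes:
        return current_nodes + 1   # scale out under sustained load
    if avg_cpu_percent < 25 and current_nodes > min_nodes:
        return current_nodes - 1   # scale in to cut idle cost
    return current_nodes           # steady state

if __name__ == "__main__":
    nodes = 4
    for cpu in (82.0, 88.0, 40.0, 15.0):   # simulated load samples
        nodes = desired_node_count(nodes, cpu)
        print(f"cpu={cpu:5.1f}% -> nodes={nodes}")
```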

Look at new types of databases like columnar databases for scale. Columnar databases scale up by efficiently handling large-scale data analytics and read-heavy operations through their unique data storage architecture, which organizes data by columns rather than rows. This format allows for highly compressed storage and faster access to specific data points, significantly reducing I/O operations. To scale up, columnar databases leverage powerful hardware enhancements such as adding more CPUs, increasing memory, and utilizing high-performance storage systems. By distributing the workload across these enhanced resources, they can process complex queries and large datasets more rapidly. Additionally, columnar databases often incorporate advanced indexing, parallel processing, and in-memory capabilities, further enhancing their ability to scale and perform under increased data loads and concurrent query demands. This scalability is crucial for organizations needing to perform real-time analytics and gain insights from vast amounts of data quickly and efficiently.
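A minimal sketch of why that layout helps, using plain Python lists in place of a real storage engine: aggregating one column in the columnar layout touches a single contiguous array, while the row layout visits every field of every row.

```python
# Illustrative sketch (not any specific database engine): the same table
# stored row-wise versus column-wise.

rows = [
    {"region": "East", "units": 120, "revenue": 1450.0},
    {"region": "West", "units": 95,  "revenue": 1210.5},
    {"region": "East", "units": 300, "revenue": 3999.0},
]

# Row-oriented: every field of every row is visited to aggregate one column.
total_row_store = sum(r["revenue"] for r in rows)

# Column-oriented: each column is a separate (highly compressible) array, so
# the aggregation reads only the bytes it actually needs.
columns = {
    "region":  ["East", "West", "East"],
    "units":   [120, 95, 300],
    "revenue": [1450.0, 1210.5, 3999.0],
}
total_column_store = sum(columns["revenue"])

assert total_row_store == total_column_store
```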

If SQL is something that you need to live with for the foreseeable future, hybrid in-memory and disk-based caching is key. This approach combines the speed of in-memory storage with the capacity and persistence of disk-based storage to optimize the performance of running dashboards. Frequently accessed data is stored in memory (RAM), which provides ultra-fast access times and significantly reduces latency for real-time data queries and updates. This is particularly beneficial for dashboards, which require quick data retrieval to display up-to-date information. Less frequently accessed data, or data that does not require real-time performance, is stored on disk, so even large datasets can be managed without overwhelming the available memory. By dynamically managing what data resides in memory and what is offloaded to disk, hybrid caching maximizes both performance and storage efficiency, enabling high-speed access to data for dashboard analytics while remaining scalable and cost-effective.

Mobile delivery is just as key, so that people can make decisions on the spot rather than waiting to get back to the office when it is already too late.
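To make the hybrid caching idea above concrete, here is a minimal single-process sketch, assuming a pickle-based disk tier and an LRU policy for the in-memory tier. It illustrates the general technique, not InetSoft's implementation; the capacity and spill directory are arbitrary assumptions.

```python
# Minimal hybrid cache sketch: hot entries live in an in-memory LRU; evicted
# entries spill to disk, so a later request pays a disk read instead of
# re-running the source query. Single-process illustration only.

import os
import pickle
from collections import OrderedDict

class HybridCache:
    def __init__(self, capacity: int = 1000, spill_dir: str = "/tmp/bi_cache"):
        self.capacity = capacity
        self.spill_dir = spill_dir
        self.memory: OrderedDict = OrderedDict()   # key -> cached result set
        os.makedirs(spill_dir, exist_ok=True)

    def _path(self, key: str) -> str:
        # Assumes keys are filesystem-safe; a real tier would sanitize them.
        return os.path.join(self.spill_dir, f"{key}.pkl")

    def get(self, key: str):
        if key in self.memory:                     # fast path: RAM hit
            self.memory.move_to_end(key)
            return self.memory[key]
        path = self._path(key)
        if os.path.exists(path):                   # slower path: disk hit
            with open(path, "rb") as f:
                value = pickle.load(f)
            self.put(key, value)                   # promote back into memory
            return value
        return None                                # miss: caller runs the query

    def put(self, key: str, value) -> None:
        self.memory[key] = value
        self.memory.move_to_end(key)
        if len(self.memory) > self.capacity:       # evict the coldest entry
            old_key, old_value = self.memory.popitem(last=False)
            with open(self._path(old_key), "wb") as f:
                pickle.dump(old_value, f)
```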


There are lots of different agile BI technologies that you need to consider implementing in order to have an agile BI environment. So with that in mind, the recommendation is to think in terms of this pyramid. One of the reasons you really need to embrace and start deploying agile BI is that it forms about one-third of the foundation for business agility, and business agility is key in the age of the customer, not just to succeed but, as research has shown, to survive.

So hopefully you can deploy some of the agile BI best practices that we talked about in the last half hour. You can have your cake and eat it, too. With today's technology and these best practices, you can indeed have a scalable and robust environment that supports your mission-critical apps while remaining agile and flexible enough to support business agility.
