InetSoft Webinar: Migration from Legacy Technology to Advanced Analytics

Below is the continuation of the transcript of a webinar hosted by InetSoft on the topic of Using Analytics to Increase Staffing Productivity and Improve Hospital Operations. The presenter is Abhishek Gupta, Chief Data Scientist at InetSoft.

Hopefully this presentation will provide some lessons learned from a migration from legacy technology to advanced analytics for anyone watching now or on-demand in the future, so they can overcome some of the hurdles you had to overcome to implement this project and benefit from the takeaways.

The next question is specific to InetSoft: when you hire technical members for your team, do you require that they have knowledge of InetSoft beforehand?
No, we do not. Originally my team had five people: one DBA and four SQL developers. We learned InetSoft from there by going through the documentation on the website, and we also brought in the professional education team from InetSoft to teach us and to help us set up the database and handle maintenance and upgrades for the first two years.

Over those two years my team learned a great deal, and later, when we added more team members, we never put a requirement to know InetSoft in our job postings, because I believe 99% of the time you don't need to write code, and when you do, it's either SQL or something similar to JavaScript.


So my team members who know SQL can work with InetSoft as well. The one skill set an engineer does need to know a little more about is Linux, but those skills can be acquired after you have InetSoft, and the InetSoft team can support you until you are up and running with internal skills. I think they have a very good team there to help us as well.

SQL is still the language of choice for most database administrators and for people doing analytics, so having a SQL engine obviously makes it easier to hire staff who already know the language. Another question, this one more about InetSoft: do you extract large data sets to InetSoft servers to improve InetSoft performance?

The answer is, we don't extract data into InetSoft servers. We put InetSoft right on top of Vertica.
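To make that pattern concrete, here is a minimal sketch of the kind of query a BI layer can push down to Vertica directly, rather than copying data out to its own servers first; the table and column names below are hypothetical illustrations, not the actual schema discussed in the webinar.

    -- Hypothetical example: the BI tool issues SQL straight against Vertica,
    -- so no data has to be staged on the BI server first.
    SELECT department, COUNT(*) AS patient_visits
    FROM ed_visits
    WHERE visit_date >= CURRENT_DATE - 30
    GROUP BY department
    ORDER BY patient_visits DESC;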


That's the importance of having that integration with Vertica or other data stores. InetSoft can run on top of Vertica relatively smoothly, and you don't necessarily have to load data between tools. So another question is, this is a fun one, what will be your team's next project?

Yesterday we met with the business teams. The revenue cycle management team would like to use AI, Vertica, and InetSoft to help reduce claim denials. It's a great opportunity to learn: we'll find out how large, raw data, billions of rows, can be fed into InetSoft, and then we'll see how it goes. My team was so excited to talk to the revenue cycle team yesterday, and hopefully we will come back to this group in the next few months to present how it went.

Great, and I just want to mention one thing: we mentioned IDOL a few times in this presentation. IDOL is another Micro Focus product; it's geared more toward unstructured data, such as text analytics, speech analytics, and video analytics. If you want to learn more about that product, you can go to www.microfocus.com, where there are a number of tools and resources.


So another question that came in: how can we educate students about InetSoft? Please send tutorials to initiate the journey. Excellent presentation. Good question. If you go to www.inetsoft.com, there is an Evaluate tab, and under it you will find a number of tutorials and web-based training videos you can use to learn more about InetSoft. Beyond that, the InetSoft website has plenty of resources, blogs, and documentation.

So another question that came in: can you talk a little bit about the connectivity with the EMR systems you are using for real-time or batch extracts?

We went live with Epic last June, so our team is actually new to Epic, just six months in. What we have done so far is get approval for five large data extractions from Epic that support inpatient, ED, OR, radiology, and throughput. Those are the five extracts we have had from the beginning, when we went live with Epic. My team works with the Epic team to build these data extractions.


The data that comes out can be hourly, daily, weekly, or monthly, and these five extracts support almost all the needs for the ED as well as inpatients. Within the Epic universe, my team learned quickly how to create reports for ad hoc requests.
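As a hedged illustration of that periodic extract pattern, an hourly pull might look something like the query below; the table name and timestamp column are hypothetical.

    -- Hypothetical hourly extract: pull only the rows added since the last run.
    SELECT *
    FROM inpatient_events
    WHERE event_time >= CURRENT_TIMESTAMP - INTERVAL '1 hour';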

We have another level of technical people who get the data into the SQL database that Epic hosts for us. It has thousands of tables. From that information, our plan right now is to take the data out and import it into Vertica for advanced analytics. Yesterday my team finished the first ICU report covering all ICU units at the hospital, and we plan to share this report with the entire system.

The ICU report is really amazing. It has all the metrics that the nurse practitioners need to manage the ICU units. That information came from Epic, and I would say the data is huge, so without a large database engine underneath, it would be difficult to run InetSoft on top.
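The import into Vertica mentioned a moment ago is the kind of job Vertica's COPY statement handles. Here is a minimal hedged sketch; the staging table and file path are hypothetical.

    -- Hypothetical bulk load from an Epic extract file into Vertica.
    COPY clarity_stage.icu_flowsheet
    FROM '/data/epic_extracts/icu_flowsheet.csv'
    DELIMITER ',' NULL '' DIRECT;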

It looks like this is the final question, unless others want to ask more: if you are using normalized tables, does it take a long time to fetch large data sets or not?

It doesn't take much time at all; I believe about five minutes for 10 million rows. We don't have time to wait for something to happen. Every year my team has to finish at least 113 projects, on average about 10 a month, so we don't have time to waste. There is another way we do it, without SQL joins: we load the data into a format where the user doesn't have to join at all. Vertica's parallel processing helps us break the batch into a hundred pieces and load them at the same time, and that saves time if you do it right.
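As a rough sketch of those two ideas, a pre-joined reporting table and a parallel load of many file pieces might look like this in Vertica; all names and paths here are hypothetical.

    -- Hypothetical pre-joined table so end users never write joins themselves.
    CREATE TABLE reporting.ed_visits_flat AS
    SELECT v.visit_id, v.arrival_time, p.mrn, p.age, d.department_name
    FROM ed.visits v
    JOIN ed.patients p ON p.patient_id = v.patient_id
    JOIN ed.departments d ON d.department_id = v.department_id;

    -- Hypothetical parallel load: one COPY over many file pieces lets Vertica
    -- spread the work across nodes.
    COPY reporting.ed_visits_flat
    FROM '/data/batch/part_*.csv' ON ANY NODE
    DELIMITER ',' DIRECT;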

It looks like those are all the questions we have from the audience. I want to mention again, before we wrap up, that a replay of today's event will be available on-demand shortly, and of course we encourage you to share it with colleagues who weren't able to join. With that, I would like to thank our audience for joining. Have a great day, everyone.
