That’s important not just from a data access perspective but also from the meta model perspective, which I will come to in a minute. The second core idea behind InetSoft's innovative thinking is that data mashup will eventually have to be a predominant way of integrating information. The volume of data is growing so fast, and the disparity of information is so high, that physically replicating and consolidating all the information you might want to use is not going to happen. In the future it's going to happen less and less.
So the moment you realize this, you start to think about flexible ways of combining virtual real-time data with cached data and with scheduled batch movement of data at a very granular, node-by-node level, rather than thinking of each of these as an either/or technology.
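To make the node-level idea concrete, here is a minimal sketch of how a mashup plan might let each data node declare its own access strategy. The names (`Node`, `MashupPlan`, the `"live"`/`"cached"`/`"batch"` strategies) are hypothetical illustrations, not InetSoft's actual API; the point is that the live-versus-cached-versus-batch choice is made per node, not for the whole query.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Node:
    name: str
    strategy: str            # "live", "cached", or "batch" -- chosen per node
    fetch: Callable[[], list]

class MashupPlan:
    def __init__(self):
        self.nodes: Dict[str, Node] = {}
        self.cache: Dict[str, list] = {}

    def add(self, node: Node) -> None:
        self.nodes[node.name] = node

    def resolve(self, name: str) -> list:
        node = self.nodes[name]
        if node.strategy == "live":
            return node.fetch()                  # hit the source on every query
        if node.strategy == "cached":
            if name not in self.cache:
                self.cache[name] = node.fetch()  # populate lazily on first use
            return self.cache[name]
        # "batch": assume a scheduled job filled the cache earlier
        return self.cache.get(name, [])

    def refresh_batch(self, name: str) -> None:
        # invoked by a scheduler, not by the query path
        self.cache[name] = self.nodes[name].fetch()

plan = MashupPlan()
plan.add(Node("orders", "live", lambda: [("o1", 100)]))
plan.add(Node("customers", "cached", lambda: [("c1", "Acme")]))
plan.add(Node("history", "batch", lambda: [("h1", 2019)]))
plan.refresh_batch("history")

# One mashup result combines all three strategies transparently.
rows = plan.resolve("orders") + plan.resolve("customers") + plan.resolve("history")
print(rows)
```

Because the strategy lives on the node, a single mashup can mix a live operational feed with a cached dimension table and a nightly batch extract, and the choice for any one node can change without touching the others.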
And the third idea is that once you have abstracted and delivered these data services, you want to make them very reusable and very fast to implement and maintain. There is a separation, if you will, between the logical data service and the actual management of that service from the perspective of access control, security, service levels, governance, and so on. That means the same logical data service might be accessed with high service levels by the CIO and the CFO, while certain other users might have restricted access, or access only during certain times. Behind all of this there is obviously a lot of backend work around security, performance, scalability, and governance.
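The separation described above can be sketched as one logical service definition with per-role policies managed alongside it. Everything here (`LogicalDataService`, `Policy`, the CFO and analyst roles) is a hypothetical illustration of the pattern, not a real product interface; it shows how service level and access windows attach to the policy, not to the service itself.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Policy:
    allowed: bool = False               # default: no access at all
    service_level: str = "standard"     # e.g. "high" for the CIO/CFO
    allowed_hours: range = range(0, 24) # restrict some roles to a time window

@dataclass
class LogicalDataService:
    name: str
    # policies are managed separately from the service definition
    policies: Dict[str, Policy] = field(default_factory=dict)

    def query(self, role: str, hour: int) -> str:
        p = self.policies.get(role, Policy())  # unknown roles get the deny default
        if not p.allowed:
            raise PermissionError(f"{role} may not access {self.name}")
        if hour not in p.allowed_hours:
            raise PermissionError(f"{role} is outside its allowed hours")
        return f"{self.name} served to {role} at {p.service_level} service level"

svc = LogicalDataService("revenue_by_region")
svc.policies["CFO"] = Policy(allowed=True, service_level="high")
svc.policies["analyst"] = Policy(allowed=True, allowed_hours=range(9, 17))

print(svc.query("CFO", 3))       # high service level, any hour
print(svc.query("analyst", 10))  # standard level, business hours only
```

The design point is that the same logical service is defined once, while who can reach it, when, and at what service level is governed by policy objects that administrators can change without redefining the service.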