Implications of Mashup Technology for Enterprises
Below is more from Information Management’s Webcast, “The Last Mile: Data Visualization in a Mashed-Up”. This Webcast was hosted by Eric Kavanagh and included BI consultants William Laurent and Malcolm Chisholm, and InetSoft's Product Manager Byron Igoe.
Eric Kavanagh (EK): Our first official guest is Byron Igoe from InetSoft. Byron, welcome to DM Radio.
Byron Igoe (BI): Hi. Thanks, Eric.
EK: Let’s talk about this enterprise implication. Large organizations have lots of rules and regulations, and there are reasons for all that stuff. We hear all about data governance these days and obviously it’s a big deal, especially for public companies because they have to report on this stuff, and the stock market goes up and down depending on what people hear about things so it’s important. But what do you think are some critical implications of mashup technology with respect to the sort of protocols or processes of large enterprises?
BI: Sure. Well, speaking to the point you made earlier about not seeing mashups used as heavily in large enterprises, I think one of the big reasons for that is the inertia of the old paradigm. It used to be the case that companies would always focus on ETL, data marts, and data warehouses. There was a huge concern about data cleanliness and security rules, and with the new mashup paradigm, a lot of people are thinking, “oh, all the old rules are broken.”
I really don’t think that is the case. If you do it right, you can really straddle the two worlds and have IT providing the access to the enterprise sources with the appropriate levels of security and governance in place, and still allow a whole self-service layer for the users to really fend for themselves and take ad hoc querying to the next level.
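The "straddling the two worlds" idea Byron describes can be sketched in code: IT registers a data source with mandatory, role-based security filters, and users run whatever ad hoc queries they like on top, with governance applied automatically. This is a minimal illustrative sketch, not InetSoft's actual implementation; all class and role names here are hypothetical.

```python
# Hypothetical sketch: IT-governed source with a self-service query layer.
# Row-level filters are set by IT per role and are always enforced, while
# end users supply their own ad hoc predicates on top.

class GovernedSource:
    def __init__(self, rows, role_filters):
        self._rows = rows                  # enterprise data (list of dicts)
        self._role_filters = role_filters  # role -> predicate defined by IT

    def query(self, role, predicate=lambda row: True):
        """Self-service ad hoc query; IT's role filter is always applied."""
        gate = self._role_filters.get(role, lambda row: False)  # deny unknown roles
        return [r for r in self._rows if gate(r) and predicate(r)]

# IT side: register sales data, scoped by region for regional analysts
sales = GovernedSource(
    rows=[
        {"region": "east", "amount": 120},
        {"region": "west", "amount": 340},
        {"region": "east", "amount": 75},
    ],
    role_filters={
        "east_analyst": lambda r: r["region"] == "east",
        "admin": lambda r: True,
    },
)

# User side: an ad hoc query, silently constrained by the governed filter
big_east_deals = sales.query("east_analyst", lambda r: r["amount"] > 100)
print(big_east_deals)  # only east-region rows can ever pass
```

The design point is that the self-service layer never bypasses governance: the user's predicate is composed with, not substituted for, the filter IT defined.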
EK: You see, what you just described seems to me an ideal way to go, because IT still needs to have that critically important role of making sure the right data gets to the right people at the right time. But I think your point about inertia related to the old way of doing things is very well taken. Part of the reason you have that inertia, quite frankly, is that these environments are so brittle, and the people managing this stuff are under a lot of pressure to hit certain batch windows, to make certain this data gets refreshed, that data gets refreshed. So I think that’s probably one of the key reasons for the inertia. Do you think that’s a fair assessment?
BI: Eric, I agree 100%, and the great irony is doing data mashups right can actually alleviate a lot of those problems.