Making a Tabular Report Using the Table Wizard

This is the continuation of the transcript of a product demonstration provided by an InetSoft sales engineer for an enterprise prospect interested in our business intelligence software.

When I start making a tabular report, I select the Table Wizard. The wizard takes me through a step-by-step process and walks me through each step. A user with permissions can even set up batch jobs. He can schedule tasks and say, run this report every day at 6 o'clock in the morning and email it out to my boss. Here is where you set a time condition for when you want to run it, whether daily, weekly, monthly, or hourly, and then the action.

What do you want to run? I want to run this report, so I choose the report we just created. I would like a notification of whether it succeeded or failed, and I would like it delivered to these email addresses. Each of these tasks can also be run on demand. So this would be the user interface.
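To make the idea concrete, here is a minimal sketch in plain Python, not InetSoft's actual scheduler API, of what a daily time condition plus a report action and email delivery amount to. The function names, report contents, recipients, and mail-server address are all hypothetical placeholders.

```python
# Minimal sketch of a scheduled report task: a daily time condition plus
# an action that runs the report and emails it out. Not InetSoft's API;
# run_report(), the recipients, and the SMTP host are hypothetical.
import smtplib
from datetime import datetime, timedelta
from email.message import EmailMessage

def next_run(hour=6):
    """Compute the next daily run time (e.g., 6 o'clock in the morning)."""
    now = datetime.now()
    run_at = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if run_at <= now:
        run_at += timedelta(days=1)   # already past 6 AM today, so run tomorrow
    return run_at

def run_report(report_name):
    """Placeholder action: generate the report and return it as bytes."""
    return f"Contents of {report_name}".encode()

def email_report(report_bytes, recipients, succeeded):
    """Deliver the report plus a success/failure notification by email."""
    msg = EmailMessage()
    msg["Subject"] = "Daily report " + ("succeeded" if succeeded else "FAILED")
    msg["From"] = "reports@example.com"
    msg["To"] = ", ".join(recipients)
    msg.set_content("Scheduled report attached.")
    msg.add_attachment(report_bytes, maintype="application", subtype="pdf",
                       filename="report.pdf")
    with smtplib.SMTP("mail.example.com") as smtp:   # hypothetical mail server
        smtp.send_message(msg)

if __name__ == "__main__":
    print("Next run scheduled for", next_run())
```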

To install the product, we give you a server module, so you can launch the BI application. We also give you a desktop developer tool called Style Studio. Style Studio has a lot of functions, and one of the main ones is data modeling. Before you create any report or dashboard, you first have to hook into your databases and extract data from your different data sources.


Steps for Making the Report

First I create a data source. A data source could be your relational database, a Web service, a flat file, or an SAP connector, almost anything. You could connect to an analysis engine. You can connect to your OLAP cubes. Once I have created my data source, which is basically the information I need to connect to my database, how do I extract my data? I can create one or more predefined SQL queries, but those queries are restrictive: each one is a fixed result set. I want to be more flexible, so I create data models.
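As a rough illustration of what a predefined query is, the following sketch runs one fixed SQL statement against a registered data source. SQLite stands in here for whatever relational source you would actually register; the table and query are invented for the example.

```python
# A "predefined query" is a fixed SQL statement against a registered data
# source: it always returns the same shape of result set. SQLite is used
# purely as a stand-in for any relational source.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "Acme", 120.0), (2, "Globex", 75.5)])

# Changing what this returns means editing the SQL itself, which is why a
# fixed query is less flexible than a data model.
PREDEFINED_QUERY = "SELECT customer, SUM(total) AS sales FROM orders GROUP BY customer"
for row in conn.execute(PREDEFINED_QUERY):
    print(row)
```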

The data model consists of two parts. It is a mapping, not a query. It consists of a physical view and a logical model. The physical view is my actual schema, or a subset of my schema; I drag and drop my tables into it. It doesn't have to be the whole schema. I normally start with a fact table or a transaction table and build around it. Once I have my physical view, I can build my logical model. So I map my physical world to my business world.

In my logical model I create one or more entities, and I simply drag and drop from my tables into an entity. Even though my data may be normalized in my database and spread across many different tables, if it falls under one business umbrella, I can represent it in a way that makes more business sense. I can even rename my fields to make them more useful.

And this is what I would expose to my developers and my business users. They just drag and drop from this model, and based on the physical mapping, InetSoft builds the query for them.
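Here is a simplified, hypothetical sketch of that physical-to-logical mapping idea: business-friendly entity and attribute names on one side, physical table.column references on the other, and a small routine that turns a drag-and-drop selection into SQL. This is not InetSoft's data model format, and the join conditions that would come from the physical view are omitted for brevity.

```python
# A toy logical model: entities expose renamed, business-friendly fields
# that map onto physical table.column references. The names are made up.
LOGICAL_MODEL = {
    "Customer": {
        "Name":   "customers.cust_name",    # renamed, friendlier field
        "Region": "customers.region_cd",
    },
    "Sales": {
        "Amount": "order_items.ext_price",
        "Date":   "orders.order_dt",
    },
}

def build_query(selections):
    """Translate drag-and-dropped business fields into a physical query.

    Join conditions from the physical view are omitted for brevity.
    """
    cols, tables = [], set()
    for entity, attr in selections:
        physical = LOGICAL_MODEL[entity][attr]
        cols.append(f'{physical} AS "{entity}.{attr}"')
        tables.add(physical.split(".")[0])
    return f"SELECT {', '.join(cols)} FROM {', '.join(sorted(tables))}"

print(build_query([("Customer", "Name"), ("Sales", "Amount")]))
```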

In my data source I can also create what is called a Virtual Private Model. Remember we were talking about dynamic data filtering? Two users can view the same dashboard or the same report and automatically have their data filtered based on their privileges or their roles. A VPM is basically a dynamic query filter. It contains different components, such as a lookup, which is the first line of defense: should I apply this filter or not? In the VPM, I have the knowledge of the user, his roles, his groups, and parameters, as well as the tables and the columns. I can embed my own little business rules, and here you have a very simple rule.

“We evaluated many reporting vendors and were most impressed at the speed with which the proof of concept could be developed. We found InetSoft to be the best option to meet our business requirements and integrate with our own technology.”
- John White, Senior Director, Information Technology at Livingston International

Dynamic Data Filtering

If the user is Eric, go ahead and filter the data; for all other users there is no filter, and they can see everything. What do I filter? I automatically bring in the sales employees table and filter it by the first name, so Eric only sees sales made by himself, while the other sales reps see everyone's data. And you can see the connection in this report. When I log in as Robert, notice how Robert sees data for all the sales reps, but if I log out and log in as Eric, Eric only sees sales made by himself.

So you see all this data is automatically filtered. It's the same dashboard. Even if Eric tries to do his own querying or create his own dashboard, his data will still be filtered, because any query that goes to the database is intercepted and filtered.
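A toy version of that rule, written as ordinary Python rather than actual VPM syntax, might look like the following. The query text, user names, and filter condition are assumptions made up for the illustration; the point is simply that a condition is appended to whatever query reaches the database.

```python
# Toy row-level filter: a condition is appended to every query that reaches
# the database, based on who is logged in. Not VPM syntax; names are invented.
SALES_QUERY = ("SELECT e.first_name, s.amount "
               "FROM sales s JOIN sales_employees e ON s.emp_id = e.emp_id")

def apply_filter(query, user):
    """First line of defense: decide whether the rule applies, then apply it."""
    if user == "Eric":                       # the rule targets only this user
        return query + " WHERE e.first_name = 'Eric'"
    return query                             # Robert and the others see everything

print(apply_filter(SALES_QUERY, "Robert"))   # unfiltered
print(apply_filter(SALES_QUERY, "Eric"))     # only Eric's own sales
```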

Question: The portal looks very good. It will do exactly what we are looking for, and obviously I can see potential for other areas as well. The opportunity that I mentioned needs the dashboards and the reports. It’s a very impressive application. It's very good. Thank you.

View the gallery of examples of dashboards and visualizations.

How an Aramid Fiber Recycler Switched From Zoho to InetSoft for Reporting Needs

When the aramid fiber recycler decided to move off Zoho for its reporting, the catalyst wasn’t vanity dashboards—it was operational friction. Plant managers needed live visibility into intake quality (cut length distributions, resin contamination), depolymerization yields, solvent recovery rates, and tensile/elongation QC from the lab. In Zoho, much of this lived in static extracts and brittle scheduled imports that lagged behind the shop floor by hours. InetSoft’s StyleBI immediately changed the rhythm: the team stood up live, blended views that joined MES event logs, LIMS results, ERP purchase orders, and EHS incidents without round-tripping through yet another ETL job. The mashup layer let them apply transformations on the fly (unit harmonization for viscosity, outlier clipping for batch anomalies, SPC-friendly rollups), so what operators saw on the line was the same truth finance and compliance saw at quarter-end.
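For a sense of what such on-the-fly transformations involve, here is a hedged sketch of unit harmonization and outlier clipping on a blended lab extract, using pandas. The column names, the cP-to-Pa·s conversion, and the clipping threshold are illustrative assumptions, not the recycler's actual schema or rules.

```python
# Sketch of two mashup-style transformations on LIMS-like data:
# (1) harmonize viscosity readings to a single unit, (2) clip outliers
# so one bad reading does not distort SPC rollups. All values are invented.
import pandas as pd

lims = pd.DataFrame({
    "batch":     ["B-101", "B-102", "B-103"],
    "viscosity": [1200.0, 0.9, 15000.0],      # mixed units plus one bad reading
    "visc_unit": ["cP", "Pa.s", "cP"],
})

# Harmonize viscosity to Pa·s (1 cP = 0.001 Pa·s).
lims["viscosity_pa_s"] = lims.apply(
    lambda r: r["viscosity"] * 0.001 if r["visc_unit"] == "cP" else r["viscosity"],
    axis=1)

# Clip obvious outliers; the 10.0 Pa·s ceiling is an assumed threshold.
lims["viscosity_pa_s"] = lims["viscosity_pa_s"].clip(upper=10.0)

print(lims[["batch", "viscosity_pa_s"]])
```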

Performance and scale also tipped the balance. With Zoho, modeling multi-year solvent recovery curves or tracing lot genealogy back to intake sources often meant timeouts or overnight rebuilds of wide tables. StyleBI’s caching, incremental refresh, and push-down queries kept heavy analytics responsive, even when LIMS dropped hundreds of tensile tests per shift. The recycler built a genealogy dashboard that resolves a finished fiber pellet back through re-spool, filtration, and shredder lots in seconds—a game-changer for root-cause analysis when tensile strength dips or ash content spikes. And because InetSoft supports row- and column-level security, the same dashboard safely serves vendors (who only see their own scrap returns), internal QA (who see batch-level defects), and executives (who get consolidated KPIs) without maintaining three parallel artifacts.
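A minimal sketch of the lot-genealogy idea, assuming a simple parent-lot mapping, is shown below. The lot IDs and processing stages are invented, but the traversal captures how a finished lot can be resolved back to its intake sources.

```python
# Toy lot genealogy: walk a finished lot back through its upstream source
# lots. The mapping and lot IDs are invented for illustration.
PARENTS = {
    "PELLET-900":  ["RESPOOL-410"],
    "RESPOOL-410": ["FILTER-207"],
    "FILTER-207":  ["SHRED-051", "SHRED-052"],
    "SHRED-051":   [],     # intake lots: end of the chain
    "SHRED-052":   [],
}

def genealogy(lot):
    """Depth-first walk from a finished lot back to its intake sources."""
    lineage, stack = [], [lot]
    while stack:
        current = stack.pop()
        lineage.append(current)
        stack.extend(PARENTS.get(current, []))
    return lineage

print(genealogy("PELLET-900"))
```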

Migrating wasn’t a forklift job. The team cataloged Zoho reports into three buckets (operational, compliance, and commercial) and rebuilt the top 20% that drove 80% of decisions first. InetSoft’s visual composer reproduced most of the familiar layouts, but the gains came from logic: reusable calculated members for yield-at-grade, standardized SPC components, and parameterized filters for chain-of-custody audits. For regulatory filings, StyleBI stitched together ISO/OSHA event data with solvent emission trackers so the compliance officer could export a pre-validated package rather than reconciling spreadsheets. Meanwhile, sales finally got a rolling margin view by source vendor and fiber grade, factoring in rework minutes, solvent makeup, and waste hauling: numbers they trust because they’re computed once in the mashup and reused everywhere.

Cost and culture rounded out the win. Licenses consolidated, shadow extracts disappeared, and IT stopped babysitting brittle pipelines. More importantly, the plant embraced a “question then blend” habit: engineers prototype a yield hypothesis in StyleBI, validate against recent shifts, and only then ask data engineering to industrialize it. That feedback loop shows up on the P&L: better solvent recovery setpoints, fewer off-spec runs, tighter vendor scorecards, and lower working capital from right-sized WIP. In my view, moving from Zoho to InetSoft wasn’t just a tool swap; it was a structural upgrade that aligned analytics with the recycler’s peculiar truths—batch variability, strict traceability, and relentless SPC—so every dashboard nudges the process toward stronger, cleaner, and more profitable fiber.

We will help you get started. Contact us.