The Effective Use of Business Intelligence and Information Management Applications

This is the continuation of the transcript of a Webinar hosted by InetSoft on the topic of "The Movement Towards Using Unstructured Data in Business Intelligence Solutions." The speaker is Mark Flaherty, CMO at InetSoft.

So we did see that employee size did have an impact on how innovative organizations were with respect to the effective use of business intelligence and information management applications. The logical explanation for that is that a larger organization probably has more need to deal with unstructured information and also has more resources to specialize and potentially dedicate to automating some of the processing of that unstructured information.

There are four aspects of maturity. One of them is technology, and an organization could be very mature at looking at unstructured information in the other respects: they may have people trained in evaluating it, they may have processes in place that are repeatable, shared, and documented throughout the organization, but they might be doing a lot of it manually.

So they would get a very high score in several of the categories, but they wouldn't get as high a score in the technology category, and we look at all four of those things together to determine the overall maturity. So I suspect that if somebody were not automated and not using technology in this process, they probably wouldn't make it into the innovative category. The innovative category is a combination of all of those things, the people, the process, the information, and the technology, not just one category.


What Are the Most Common Information Management Applications?

There is an opportunity for improvement. People are not completely satisfied. If you look at the pie chart, 53 percent are somewhat or completely unsatisfied. These are the complaints that people have with their information application processes. These are similar to a lot of the business intelligence and analytics complaints. Systems are too slow. They’re not flexible. They’re hard to build and maintain or there aren’t enough resources to implement the systems. And so again these are issues that are common to most BI application deployments.

What are people trying to accomplish? Increasing productivity and efficiency, and this relates back to that automation comment. The more this can be automated, the more you can leverage technology. Even if it's not fully automated, having technology that makes it easier for people to do their jobs brings the productivity and efficiency gains that more than half the organizations are looking for from information applications. Looking at this in a more holistic way, you need both structured and unstructured information to make decisions. To think that you can get all of the information entirely from structured data is not realistic. That might be true in some cases, but in many cases you need to look at unstructured information as well. To the extent that you're going to support decision making in your organization, this is information that is naturally part of the decision-making process.

What about standards? What types of data do people access? And what technology types do they use in their enterprise?

This somewhat leads up to the types of information accessed. The top standard is SQL, which gets to the structured data. A lot of the unstructured data is now in XML documents, so that was the second most important standard. Then there are other standards that have more to do with information delivery: Java, HTML5, and Microsoft .NET. So you can see the types of information that people are bringing together. As for the unstructured information, you have information from content management systems. Then there are Microsoft Word or Adobe Acrobat types of documents. The next one is text documents, which could be emails or any other type of information, but those are really the primary unstructured sources of information.
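
As a rough illustration of bringing those two kinds of sources together, here is a minimal Python sketch that reads rows over SQL and pulls free text out of an XML document; the database file, table, columns, and XML element names are hypothetical examples, not anything referenced in the webinar.

```python
# Minimal sketch: combine structured rows (SQL) with unstructured text (XML).
# The database file, table, columns, and XML structure are hypothetical.
import sqlite3
import xml.etree.ElementTree as ET

# Structured side: query figures from a relational store.
conn = sqlite3.connect("warehouse.db")
rows = conn.execute(
    "SELECT region, revenue FROM quarterly_sales ORDER BY revenue DESC"
).fetchall()
conn.close()

# Unstructured side: pull free-text commentary out of an XML document.
tree = ET.parse("analyst_notes.xml")
notes = [elem.text.strip() for elem in tree.iter("note") if elem.text]

# Pair each structured row with any commentary that mentions the same region.
for region, revenue in rows:
    related = [n for n in notes if region.lower() in n.lower()]
    print(region, revenue, "|", related[:1])
```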

In terms of functions, you want to be able to search across structured and unstructured content that's accessible via a browser. I think that's a main requirement. If the data is accessible via a browser, there is technology that can scan for it, extract it, and bring it together. So whether it's text or whether it's numerical information, if it's delivered in a browser, you can get it. I don't want to downplay it, but a poor man's description would be screen scraping.


How Can You Extract Unstructured Information?

It's really looking at the HTML source of the Web page, not so much the screen. Within the HTML there are typically some keywords and tagging that would help you get to the information you're looking for. But again, that's just one way potentially to grab some of this unstructured information, whether it's in your own internal content management systems or it's documents or pages that are out on the Web.
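
To make that concrete, below is a minimal Python sketch of this kind of HTML-source extraction using only the standard library; the URL, the tags scanned, and the keyword are hypothetical and not tied to any particular product.

```python
# Minimal sketch of extracting text from the HTML source of a page:
# capture the text inside a few tags of interest, then keep snippets
# containing a keyword. URL, tags, and keyword are hypothetical.
from html.parser import HTMLParser
from urllib.request import urlopen

class TextCollector(HTMLParser):
    TARGET_TAGS = {"p", "td", "li", "h1", "h2"}

    def __init__(self):
        super().__init__()
        self.capture = False
        self.snippets = []

    def handle_starttag(self, tag, attrs):
        self.capture = tag in self.TARGET_TAGS

    def handle_endtag(self, tag):
        self.capture = False

    def handle_data(self, data):
        if self.capture and data.strip():
            self.snippets.append(data.strip())

html = urlopen("https://example.com/report.html").read().decode("utf-8", "ignore")
parser = TextCollector()
parser.feed(html)

# Keep only the snippets that mention the keyword we care about.
keyword = "revenue"
print([s for s in parser.snippets if keyword in s.lower()])
```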

What Is Tracked on a Content Management Dashboard?

A content management dashboard centralizes the KPIs and operational indicators that help editorial teams, content strategists, product owners, and marketers understand how content is created, distributed, discovered, and performing. Below is a detailed, role-friendly list of the metrics and signals typically tracked on a robust Content Management Dashboard, grouped by purpose so you can quickly find what matters to your function.

1. Content Performance Metrics

  • Page Views / Unique Page Views — Raw and unique visits per piece of content over time.
  • Time on Page / Average Engagement Time — How long readers spend with the content; signals depth of attention.
  • Bounce Rate & Exit Rate — Percentage of visitors who leave immediately or exit from that page.
  • Scroll Depth — How far users scroll (e.g., 25%, 50%, 75%, 100%)—useful for longform content performance.
  • Social Shares & Referral Traffic — Visibility and amplification across social platforms and referral sites.
  • Engagement Events — Clicks on CTAs, downloads, video plays, embedded form submissions.
  • Conversion Rate by Content — Percentage of visitors who complete a desired action (signup, download, lead form) after viewing the content; see the calculation sketch after this list.
  • Assisted Conversions — Content that influenced conversions later in the funnel (multi-touch attribution).
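
The ratio metrics in this group come down to simple arithmetic over event counts. A minimal sketch, assuming hypothetical per-page counters pulled from an analytics export:

```python
# Minimal sketch: derive ratio metrics for one piece of content from raw counts.
# All figures are hypothetical analytics-export values.
page = {
    "page_views": 12400,
    "unique_page_views": 9800,
    "single_page_sessions": 4100,   # sessions that viewed only this page
    "sessions_started_here": 7300,  # sessions that landed on this page
    "conversions": 310,             # signups, downloads, lead forms, ...
}

bounce_rate = page["single_page_sessions"] / page["sessions_started_here"]
conversion_rate = page["conversions"] / page["unique_page_views"]

print(f"Bounce rate:     {bounce_rate:.1%}")      # ~56.2%
print(f"Conversion rate: {conversion_rate:.1%}")  # ~3.2%
```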

2. Content Quality & Editorial Health

  • Readability Scores — Flesch–Kincaid, grade level, sentence length, passive voice usage (see the grade-level sketch after this list).
  • SEO Health for Page — On-page SEO checks: title tag, meta description, headings, image alt text, canonical tags.
  • Content Age & Staleness — Age of content and last-updated date; flags for refresh opportunities.
  • Broken Links & 404s — Pages with dead links or requests that returned errors.
  • Duplicate Content / Canonical Issues — Potential cannibalization or indexing problems.
  • Image & Media Optimization — Large images slowing page loads, missing alt text.
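
For the readability item above, one widely used measure is the Flesch-Kincaid grade level, 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59. A minimal sketch with a crude syllable heuristic:

```python
# Minimal sketch: approximate Flesch-Kincaid grade level for a piece of content.
# The syllable counter is a rough heuristic, not a linguistic analysis.
import re

def count_syllables(word: str) -> int:
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade_level(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

sample = "The dashboard shows how readers engage. Short sentences score lower."
print(round(fk_grade_level(sample), 1))  # ~7.6
```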

3. SEO & Discovery Metrics

  • Organic Sessions & Impressions — Visits and impressions from search engines (Search Console signals).
  • Top Search Queries — Keywords driving impressions and clicks for specific content.
  • Average SERP Position — Where pages rank on average for target keywords.
  • Click-Through Rate (CTR) from Search — How compelling your title and meta are in SERPs (see the roll-up sketch after this list).
  • Index Coverage — Pages indexed vs. pages blocked/removed.
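
Search CTR and average position are simple roll-ups from query-level rows such as a Search Console export. A minimal sketch with hypothetical numbers:

```python
# Minimal sketch: roll up search CTR and impression-weighted average position
# for one page from query-level rows. All values are hypothetical.
queries = [
    {"query": "bi dashboard", "impressions": 5400, "clicks": 230, "position": 4.2},
    {"query": "content kpis", "impressions": 1800, "clicks": 95,  "position": 2.8},
    {"query": "cms metrics",  "impressions": 900,  "clicks": 12,  "position": 9.1},
]

impressions = sum(q["impressions"] for q in queries)
clicks = sum(q["clicks"] for q in queries)

ctr = clicks / impressions
avg_position = sum(q["position"] * q["impressions"] for q in queries) / impressions

print(f"CTR: {ctr:.1%}, average position: {avg_position:.1f}")  # CTR: 4.2%, position 4.4
```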

4. Editorial Workflow & Productivity Metrics

  • Content Pipeline Status — Counts by stage (idea, drafting, reviewed, ready, scheduled, published).
  • Time to Publish / Cycle Time — Average time from ideation to publication (see the cycle-time sketch after this list).
  • Approval & Review Times — How long reviewer/approver steps take.
  • Assigned Workload by Author — Active drafts per author or editor to balance capacity.
  • Revision Frequency — How often published posts are edited or updated.
  • Backlog Size — Number of ideas or planned posts waiting to be worked on.
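
Cycle-time metrics like time to publish are averages over workflow timestamps. A minimal sketch, assuming hypothetical rows from a CMS workflow log:

```python
# Minimal sketch: average time-to-publish (cycle time) from workflow timestamps.
# The records below are hypothetical rows from a CMS workflow log.
from datetime import datetime
from statistics import mean

items = [
    {"created": "2024-05-01", "published": "2024-05-09"},
    {"created": "2024-05-03", "published": "2024-05-20"},
    {"created": "2024-05-10", "published": "2024-05-16"},
]

def days_between(start: str, end: str) -> int:
    fmt = "%Y-%m-%d"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).days

cycle_times = [days_between(i["created"], i["published"]) for i in items]
print(f"Average time to publish: {mean(cycle_times):.1f} days")  # 10.3 days
```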

5. Audience & Engagement Segmentation

  • Audience by Channel — Organic, paid, social, email, direct, referral.
  • New vs. Returning Readers — Loyalty and retention of your audience.
  • Geography & Device Mix — Where users come from and which devices they use.
  • Behavior by Persona / Segment — Engagement and conversions split by audience cohorts where available.
  • Subscriber Growth & Churn — Email list or membership signups and unsubscribes tied to content.

6. Content Inventory & Taxonomy Health

  • Total Content Count — Total pages, blog posts, landing pages, product guides, etc.
  • Content by Type & Topic — Distribution across categories, tags, or taxonomies.
  • Orphan Pages — Pages that no internal links point to, which makes them hard to find (see the detection sketch after this list).
  • Content Gaps & Coverage Radar — Areas/topics underserved relative to strategy or competitors.
  • Reuse & Repurpose Candidates — Content with evergreen potential for repackaging.
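
Orphan detection is essentially a set difference between all pages and the targets of internal links. A minimal sketch over a hypothetical site map:

```python
# Minimal sketch: find orphan pages, i.e. pages that no other page links to.
# The site map below is a hypothetical {page: [internal link targets]} structure.
site = {
    "/home":        ["/pricing", "/blog/a"],
    "/pricing":     ["/home"],
    "/blog/a":      ["/blog/b"],
    "/blog/b":      [],
    "/blog/legacy": [],   # nothing links here -> orphan
}

linked_to = {target for links in site.values() for target in links}
# The homepage is treated as the entry point, so it is never flagged.
orphans = [page for page in site if page not in linked_to and page != "/home"]
print(orphans)  # ['/blog/legacy']
```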

7. Technical Performance & Reliability

  • Page Load Time & Core Web Vitals — LCP, FID/INP, CLS; crucial for search ranking and retention (see the threshold sketch after this list).
  • Error Rates & Uptime — Platform reliability and incidents affecting content access.
  • Cache Hit Rate & CDN Performance — Delivery efficiency for static/asset-heavy pages.
  • API Latency (if headless) — Response times for CMS/content APIs powering apps.
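
For the Core Web Vitals item above, a dashboard often buckets field samples against the commonly cited "good" and "needs improvement" thresholds (LCP 2.5 s / 4.0 s, INP 200 ms / 500 ms, CLS 0.1 / 0.25). A minimal sketch with hypothetical sample values:

```python
# Minimal sketch: bucket Core Web Vitals samples against the commonly cited
# "good" / "needs improvement" thresholds. Sample values are hypothetical.
THRESHOLDS = {            # (good, needs_improvement) upper bounds
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= needs_improvement else "poor"

sample = {"LCP": 3.1, "INP": 180, "CLS": 0.02}
for metric, value in sample.items():
    print(metric, value, "->", rate(metric, value))
```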

8. Governance, Compliance & Security

  • Permission & Role Violations — Unauthorized publishing or access attempts.
  • GDPR / Privacy Flags — Content handling that requires consent or PII redaction.
  • Copyright / Licensing Notices — Missing attribution or license expirations.
  • Audit Trail Coverage — Percentage of content with full change history for accountability.

9. Monetization & Commercial Metrics

  • Ad Impressions & RPM — Revenue per thousand impressions for ad-driven content (see the arithmetic sketch after this list).
  • Lead Volume & Lead Quality — Leads generated by content and their downstream value.
  • Subscription Conversions — Free-to-paid conversions attributed to content pieces.
  • Affiliate Clicks & Revenue — Performance of affiliate links embedded in content.
  • Microconversion Tracking — Smaller conversions like video plays, content shares, or guide downloads that feed revenue funnels.
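
RPM is straightforward arithmetic: revenue divided by impressions, times one thousand. A minimal sketch with hypothetical figures:

```python
# Minimal sketch: RPM (revenue per thousand impressions) for ad-driven content.
# Revenue and impression figures are hypothetical.
ad_revenue = 742.50        # revenue earned by a piece of content
ad_impressions = 215_000   # ad impressions served on that content

rpm = ad_revenue / ad_impressions * 1000
print(f"RPM: ${rpm:.2f}")  # RPM: $3.45
```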

10. Experimentation & Optimization Metrics

  • A/B Test Results — Variant performance for headlines, CTAs, layouts, or pricing blocks.
  • Hypothesis & Lift — Measured lift in engagement or conversions vs. control (see the lift sketch after this list).
  • Personalization Metrics — Performance of personalized content segments.
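
Lift is the relative change in conversion rate between variant and control, usually read alongside a significance check. A minimal sketch using a pooled two-proportion z-score, with hypothetical counts:

```python
# Minimal sketch: relative lift and a two-proportion z-score for an A/B test.
# Visitor and conversion counts are hypothetical.
from math import sqrt

control = {"visitors": 5000, "conversions": 200}   # 4.0% conversion
variant = {"visitors": 5000, "conversions": 245}   # 4.9% conversion

p_c = control["conversions"] / control["visitors"]
p_v = variant["conversions"] / variant["visitors"]
lift = (p_v - p_c) / p_c

# Pooled two-proportion z-test; |z| > 1.96 roughly corresponds to 95% confidence.
p_pool = (control["conversions"] + variant["conversions"]) / (
    control["visitors"] + variant["visitors"]
)
se = sqrt(p_pool * (1 - p_pool) * (1 / control["visitors"] + 1 / variant["visitors"]))
z = (p_v - p_c) / se

print(f"Lift: {lift:.1%}, z = {z:.2f}")  # Lift: 22.5%, z = 2.18
```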

11. Collaboration & Cross-Functional Signals

  • Support Tickets Originating from Content — Content-related confusion or missing information causing support load.
  • Sales Enablement Usage — How often sales reps use or reference content assets.
  • Product & Engineering Requests — Feature requests or fixes triggered by content needs.

How These Metrics Fit Together (Practical Connections)

The most useful dashboards don't just show raw numbers; they connect signals so teams can diagnose and act. A few examples, with a small rule sketch after the list:

  • High page views + low time on page — Content attracts clicks but fails to engage; possibly misleading titles or poor structure.
  • Good organic impressions + low CTR — Improve titles/meta descriptions to increase clicks from search.
  • High bounce after signup CTA — The content promised value but the CTA destination fails to deliver; check landing page UX.
  • Many support tickets tied to a help article — Article might be outdated or unclear; prioritize updates and add FAQs or screenshots.
  • Core Web Vitals regression + traffic drop — Technical performance affecting SEO and user retention; investigate assets and lazy-loading.
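
These combinations can be encoded directly as alert rules. A minimal sketch; the metric names and thresholds below are hypothetical and would be tuned to your own baselines:

```python
# Minimal sketch: encode a few of the diagnostic combinations above as alert
# rules over one page's metrics. Metric names and thresholds are hypothetical.
page = {
    "page_views": 18000,
    "avg_engagement_seconds": 19,
    "organic_impressions": 40000,
    "search_ctr": 0.011,
    "support_tickets_referencing": 14,
}

rules = [
    ("Attracts clicks but fails to engage",
     lambda m: m["page_views"] > 10000 and m["avg_engagement_seconds"] < 30),
    ("Strong impressions but weak titles/meta",
     lambda m: m["organic_impressions"] > 20000 and m["search_ctr"] < 0.02),
    ("Content is driving support load",
     lambda m: m["support_tickets_referencing"] > 10),
]

for label, check in rules:
    if check(page):
        print("ALERT:", label)
```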

Dashboard role suggestions:
  • Editor / Managing Editor view: Pipeline status, time-to-publish, backlog, content age, top underperformers needing refresh.
  • SEO Manager view: Organic impressions, top queries, SERP positions, CTR, index coverage, and on-page SEO issues.
  • Growth / Product view: Conversion by content, assisted conversions, engagement funnels, and A/B test results.
  • Engineering / Ops view: Core Web Vitals, error rates, API latency, and content delivery metrics for headless CMS setups.

Design & Implementation Tips

  • Use multiple dashboard slices tailored to stakeholders — one-size-fits-all dashboards under-serve teams.
  • Prefer trend charts + context over single numbers: show last 7/30/90 day trends and percentage change vs. previous period.
  • Enable drill-downs so an editor can click a poor-performing article and see traffic sources, onsite behavior, and last edit history.
  • Surface automated alerts for content staleness, traffic drops, indexing failures, and large decreases in engagement.
  • Combine quantitative metrics with qualitative signals (reader comments, internal reviews, content scores) for balanced decisions.