Industrializing Business-critical End-user Compute-based Applications using Low-code Platforms
Iris Software Perspective Paper
How Low-code Empowers End Users in Mission-critical Settings
By Anil Sanagavarapu, Leader of the Low-code Practice at Iris Software
Many large and small enterprises utilize business-managed applications (BMAs) in their value chain to supplement technology-managed applications (TMAs). BMAs are applications or software that end users create or procure off-the-shelf and implement on their own; these typically are low-code or no-code software applications. Such BMAs offer the ability to automate or augment team-specific processes or information to enable enterprise-critical decision-making.
Technology teams build and manage TMAs that do much of the heavy lifting: enabling business-unit workflows and transactions, and automating manual processes. TMAs offer traceability and audit capabilities, and they are often the source systems for the analytics and intelligence engines that run on data warehouses, marts, lakes, lakehouses, etc.
BMAs dominate the last mile in how these data infrastructures support critical reporting and decision making. Additionally, business teams resort to BMAs when:
1. They need faster speed-to-market and/or flexibility beyond what TMAs can offer.
2. There is a limited budget and significant ambiguity around how things should work.
When creating BMAs, business teams look for a range of flexibilities. Those include:
• Adjusting data used for reporting/decisions: This could arise from the need to fine-tune what data is included/excluded, or compensate for known issues with data from certain sources, among other factors.
• Augmenting enterprise data: In data warehousing parlance, this often tends to be dimensional data and custom hierarchies (vs. facts) that enable slicing and dicing the numbers in ways beyond the official hierarchies and dimensions supported by technology-managed enterprise data infrastructures.
• Developing and running business-developed models and business-defined scenarios to support the derivation of insights and decision-making. Examples include budgeting and forecasting models in finance, sales, and valuation; models of the impact of price changes on sales; and models of the impact on manufacturing and sales of supply chain disruptions caused by policy changes, such as new tariffs, or by geopolitical issues. (A brief sketch of this kind of business-defined augmentation and calculation follows this list.)
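To make these flexibilities concrete, the sketch below (Python with pandas; all column names, groupings, and figures are hypothetical and purely illustrative) shows the kind of augmentation and calculation a typical BMA layers on top of enterprise data: joining official sales facts with a user-maintained custom hierarchy, applying a business-defined adjustment, and aggregating along the custom grouping.

```python
import pandas as pd

# Hypothetical extract of enterprise facts, as it might arrive from a data warehouse.
facts = pd.DataFrame({
    "product_id": ["P1", "P2", "P3", "P4"],
    "region":     ["NA", "NA", "EU", "EU"],
    "sales":      [120.0, 80.0, 200.0, 50.0],
})

# User-maintained custom hierarchy: a grouping the official dimensions do not support.
custom_hierarchy = pd.DataFrame({
    "product_id":   ["P1", "P2", "P3", "P4"],
    "custom_group": ["Strategic", "Tail", "Strategic", "Tail"],
})

# Business-defined adjustment, e.g. compensating for a source known to overstate EU sales.
adjustment = {"EU": 0.95, "NA": 1.0}

augmented = facts.merge(custom_hierarchy, on="product_id")
augmented["adj_sales"] = augmented["sales"] * augmented["region"].map(adjustment)

# Aggregate along the custom hierarchy instead of the official one.
report = augmented.groupby("custom_group", as_index=False)["adj_sales"].sum()
print(report)
```

In a spreadsheet BMA, the same logic typically lives in lookup formulas and pivot tables; making it this explicit is, in essence, what the industrialization exercise described later in this paper formalizes.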
These flexibilities share common denominators: the addition of calculations, derivations, and aggregations. However, this ease of customizing and configuring BMAs also comes with its own challenges.
While BMAs deliver value and simplify complex processes, they bring with them a large set of challenges. These include:
1. Security: BMAs carry security and operational risks brought about by events like forced platform upgrades or patches, credential breaches, data loss, or unplanned downtime.
2. Opacity: As each BMA matures, the original creator’s approach becomes increasingly opaque to users. In such a setting, users could inadvertently provide wrong inputs, and these in turn could lead to incorrect decisions.
3. Collaboration with controls: While delivering higher flexibility and agility, BMAs typically lack controls that enterprises need in applications to support critical decisions. For example, Excel-based BMAs are used to model critical client or enterprise scenarios with sensitive data and are shared internally and externally with minimal controls.
4. Governance, including traceability and audit: Since most BMAs run with local copies of data, there is no guarantee that the data has not been accidentally or intentionally modified in some significant way, even when it comes from impeccable sources. BMAs also lack a sufficient audit trail across iterations, introducing the risk that critical decisions cannot be explained or supported to internal/external auditors with the right version of the inputs.
Therefore, business-critical BMAs that have become relatively mature in their capabilities must be industrialized on an ongoing basis, with optimal time and investment. Excel is the most common platform used for BMAs, so this paper discusses approaches for industrializing Excel-based BMAs. Low-code platforms, such as Microsoft Power Platform, provide the right blend of ease of development, flexibility, and governance to enable the rapid conversion of BMAs into TMAs.
Excel-based BMAs have many logical components, such as data from enterprise and external sources, mappings, derivations, aggregations, model/scenario inputs & adjustments, model/scenario calculations & outputs, pivots, and charts & visualization. Hence, the first step is to analyze the BMA and clearly map its various parts into these conceptual buckets, as in Figure 2.
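As a starting point for this mapping, a short script can inventory the raw ingredients of a workbook. The sketch below is illustrative only (Python with openpyxl; the file name model.xlsx is a placeholder): it lists defined names and, per sheet, the used range, the number of formula cells, and the number of embedded charts. Pivot tables, external connections, and VBA typically need a dedicated analyzer or manual inspection.

```python
from openpyxl import load_workbook

# Load the workbook without evaluating formulas so that cell values contain the formula text.
wb = load_workbook("model.xlsx", data_only=False)  # "model.xlsx" is a hypothetical file name

# With openpyxl 3.1+, iterating defined_names yields the defined-name keys.
print("Defined names:", list(wb.defined_names))

for ws in wb.worksheets:
    formula_cells = [
        cell.coordinate
        for row in ws.iter_rows()
        for cell in row
        if isinstance(cell.value, str) and cell.value.startswith("=")
    ]
    # ws._charts is openpyxl's internal list of charts embedded in the sheet.
    print(f"{ws.title}: used_range={ws.dimensions}, "
          f"formula_cells={len(formula_cells)}, charts={len(ws._charts)}")
```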
Dependency analyzers, such as Dependency Auditor, Spreadsheet IQ, and TACO-LENS, are helpful in analyzing and understanding complex Excel BMAs. Excel spreadsheets often have formulae repeated across rows/columns with apparent differences in cell references but no conceptual difference. Overlaying the findings from these analyzers with user journeys, data source analysis, and the desired outputs at each stage of the collaboration cycle completes the picture for requirements engineering. It also helps define a Minimum Lovable Product (MLP) and scope a Minimum Viable Product (MVP) for users.
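That observation about repeated formulas can be exploited directly: masking cell references collapses thousands of copied cells into a handful of distinct "conceptual" formulas, which is usually a good first cut of the calculation requirements. The sketch below continues the openpyxl example above; the regular expression is deliberately simplified and will not handle every Excel reference form.

```python
import re
from collections import Counter

from openpyxl import load_workbook

# Mask A1-style references (e.g. B2, $C$10, Sheet1!D4) so that copies of the same formula
# pasted across rows/columns collapse to one normalized "shape". Approximate only: it can
# also match tokens such as LOG10 inside function names.
REF = re.compile(r"(?:'[^']+'|\w+)?!?\$?[A-Z]{1,3}\$?\d+")

wb = load_workbook("model.xlsx", data_only=False)  # hypothetical file name
shapes = Counter()
for ws in wb.worksheets:
    for row in ws.iter_rows():
        for cell in row:
            if isinstance(cell.value, str) and cell.value.startswith("="):
                shapes[REF.sub("REF", cell.value)] += 1

# The most frequent shapes are usually the core calculations worth re-engineering.
for shape, count in shapes.most_common(10):
    print(count, shape)
```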
The final application can be reimagined as a modular set of components that deliver improved outputs and give the user the desired levels of configurability, security, collaboration, and interactivity, along with traceability and audit.
With the availability of Gen AI and AI libraries, it is imperative to design intelligent target-state applications with hooks for AI and Gen AI that enhance user experience and potentially add Intelligent Automation.
When converting a BMA to a TMA, there are many options. In our experience, an approach predicated on a low-code platform for user interactivity, combined with cloud-based data components, provides a way to deliver proof points and functionality to users rapidly. Figure 5 depicts the implementation stack we have the most experience with: the Power Platform paired with any of the prevalent cloud back-ends.
The interoperability of Power Platform components with Microsoft Office products, and with commonly available technologies like SharePoint Online, makes rich user interfaces, powerful end-user configurations, and database integrations easy to develop and deploy. Additionally, the out-of-the-box connectors provided by the Power Platform make it amenable to connecting with any cloud back-end.
The core of the model computations is abstracted away from users through a combination of cloud services and code. DAX-enabled features, custom connectors, and other Power Platform capabilities like Visual Calc provide extensibility to users, who can then focus on configuring model parameters and on analyzing, smoothing, and visualizing data. Write-back functionality can be enabled for Power BI interfaces using a combination of Power Apps, Power Automate, and APIs. When combined with Copilots and Azure AI Builder, these TMAs open up powerful new capabilities for users. An added advantage is that IT teams and business teams can each play a role: the former in the application code base, the latter in its configurations.
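As one hedged illustration of the write-back path, the minimal Flask service below shows the kind of API a Power Automate flow's HTTP action could call to persist a user's adjustment before the model is re-run. All endpoint and field names are hypothetical, and a real implementation would add authentication, validation, database persistence, and an audit trail.

```python
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for the governed cloud database the model reads from.
adjustments = []

@app.post("/adjustments")  # Flask 2+ shortcut for a POST route
def create_adjustment():
    """Persist a user adjustment posted by a Power Automate flow's HTTP action."""
    payload = request.get_json(force=True)
    record = {
        "scenario_id": payload["scenario_id"],   # hypothetical field names
        "measure": payload["measure"],
        "new_value": float(payload["new_value"]),
        "submitted_by": payload.get("submitted_by", "unknown"),
        "submitted_at": datetime.now(timezone.utc).isoformat(),
    }
    adjustments.append(record)  # a real implementation would write to the database
    return jsonify(record), 201

if __name__ == "__main__":
    app.run(port=5000)
```

In practice the same pattern is often implemented with Azure Functions or Logic Apps rather than a standalone service; the essential point is that user edits land in the governed back-end rather than in a local spreadsheet.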
Recently, our developers created, in four months, a TMA that replaced an Excel-based model used by more than six power users in one of the valuation teams at a leading fund management company. The deliverables included numerous time-series outputs with configurable parameters, cross-tab outputs for forecasts, predictive analytics, grid outputs with write-back functionality, and complex visualizations with drill-downs.
Low-code TMAs give end users the flexibility to configure and extend with additional data, calculations, and models, while ensuring that the guardrails of governance, security, and traceability are not compromised. Low-code platforms like the Power Platform make it possible to realize TMAs with predictable timelines, low cost, and high quality. In the past few years, the evolution of connectors, Copilots, Gen AI services, and AI libraries has made it possible to deliver a high-quality product with a rapid time-to-market.
This perspective paper is written by Anil Sanagavarapu, Leader of the Low-code Practice at Iris Software. Anil has more than 20 years of experience in software product engineering and software services, spanning the industrialization of Excel models, Gen AI, and digital transformation across several verticals.
Iris Software has been a trusted software engineering partner to Fortune 500 companies for over three decades. We help clients realize the full potential of technology-enabled transformation by bringing together a unique blend of domain knowledge, best-of-breed technologies, and experience executing essential and critical application development engagements. With over 4,300 skilled professionals, Iris builds mission-critical technology solutions across banking, financial services, insurance, life sciences, and manufacturing. Iris Automation services and solutions include Intelligent Automation, Quality Engineering, and Low-code No-code development.
Learn more about our Automation capabilities at: https://www.irissoftware.com/services/automation/