2.4. Continuity Equations: The Conceptual Basis for Reengineered Business Reporting

In this economy, business processes are measured on a continuous basis through different types of sensors that capture digital measurements of business metrics. These data are captured at a far finer granularity in time and detail than has ever been possible before.[11] Everything else afforded by this ability for more frequent reporting is a by-product of this fundamental change in the capability of data capture. What that data stream makes possible is measurement with an unprecedented degree of correspondence to underlying business processes. Furthermore, the utilization of this data stream and its comparison with a new class of performance models that must be developed[12] will provide the basis for many automatic management decision models in which the slowest element of the process, the human being, is excluded through automation. Figure 8 describes a formalization of these processes of data capture, comparison standards, exception standards, and meta-processes for measurement, control, management, and assurance.

Business processes, defined as “a set of logically related tasks performed to achieve a defined business outcome” (Davenport and Short, 1990), are considered today to be the fundamental atomic elements that make up a company.[13] Thus a company is now described by what it can do rather than by its assets. That changed mindset has yet to be incorporated into traditional management and its assurance. What is fundamental about the real-time economy is that it brings the process approach explicitly into management through the very prompt measurement of processes and the comparison of these metrics with dynamic benchmarks that represent prescribed levels of business performance. Benchmarks that allow for the comparison of business process metrics with a standard (or model) will assume a much larger importance.
The real-time economy discussed above, where processes are constantly monitored and their measurements compared with benchmarks for control purposes, requires highly dynamic, adaptive models that can adequately represent the normative values that metrics must assume. Furthermore, in addition to basic benchmarking for first harmonic data comparison, second harmonic variance is also necessary for control purposes. Figure 8 illustrates this issue: processes are monitored and controlled by information systems, models, and management. When noteworthy exceptions occur, adjusting management actions are effected. Some of these exceptions are also of assurance interest and are alarmed for audit purposes and directed to the audit “control” system.

Figure 8: Meta-processes in measurement and assurance - data capture and control

The monitoring and control of an organization’s processes can be viewed as a five-level set of activities, as described in Figure 10. The structural level (processes) is measured, and metrics are extracted and captured for the data level. Data are stored at a high level of granularity, say, the basic transaction level. This data history may be examined under many distributions (cuts) such as time period, division, product, function, originator, etc. The third level encompasses the relationships perceived or prescribed among metrics, against which the organization performs control functions. For example, all flows from one process that reach the next one would constitute a one-to-one relationship, and any differences would be exceptions. In general, to use metrics captured from level one in a control process it is necessary to have the measurement of the actual (the metric), a model for comparison, and a model of variance (which specifies the acceptable variation). The control process will compare the metric with the model, calculate the variance, and then decide whether the variance is acceptable.
If not, an alarm is triggered that may call for management action and/or assurance. The models may range from very simple univariate levels to very complex multi-entity relationships such as continuity equations. Among the types of models in CA we find:

• A fixed number (normative or empirically derived)
• An adjusted number with some form of analytic related to seasonality, hierarchy, or structure relationships

The structure relationships can be represented by continuity equations and may represent:

1. Reconciliation structures
2. Semi-deterministic relationships
3. Structures across processes
4. Empirical relationships across processes
5. High-level empirical relationships among KPIs

The fourth level is the level of analytic monitoring and links very high-level measures across processes. KPIs (key performance indicators) can be used to help understand process consistency as well as process performance. If measurements are not available at a lower level, this level serves to provide coarse alarms of major process difficulties. The fifth level is a meta-process level where the actual control and monitoring functions are performed based on continuous measurement, monitoring, and proactive exception handling.

Building on this model, the proposed solution is based on a view of a business in a real-time economy that would address some of these ailments through the following factors:

• Creation of a multivariate measurement model that does not focus exclusively on earnings per share and allows users to predict and evaluate a business’s performance on a multivariate basis even if these measurements are in different dimensions (apples and oranges)
• Creation of a measurement model that is oriented not only to investors but to other stakeholders of the business
• Creation of a measurement model that represents not only static measurements of the business but also the types of relationships that characterize the business.
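The level-three control process described above rests on three ingredients: the actual metric, a model for comparison, and a model of variance that defines acceptable deviation; a continuity equation such as the one-to-one flow reconciliation is a special case where the model is the upstream flow itself. A minimal sketch of that loop follows; all function names, figures, and tolerances are illustrative assumptions, not part of any published continuous-auditing system.

```python
# Sketch of a level-three control check: compare an actual metric with
# its comparison model, evaluate the variance against a variance model
# (here, a simple absolute tolerance), and flag an alarm when the
# variance is unacceptable.

from dataclasses import dataclass


@dataclass
class ControlResult:
    actual: float     # the captured metric
    expected: float   # the value given by the comparison model
    variance: float   # actual - expected
    alarm: bool       # True when the variance exceeds tolerance


def control_check(actual: float, expected: float, tolerance: float) -> ControlResult:
    """Compare a captured metric with its model value.

    `tolerance` plays the role of the variance model: it specifies
    the acceptable absolute deviation from the expected value.
    """
    variance = actual - expected
    return ControlResult(actual, expected, variance, abs(variance) > tolerance)


def continuity_check(outflow: float, inflow: float, tolerance: float = 0.0) -> ControlResult:
    """A simple continuity equation: flows leaving one process should
    reach the next one (a one-to-one relationship); any difference
    beyond the tolerance is an exception."""
    return control_check(actual=inflow, expected=outflow, tolerance=tolerance)


# Hypothetical example: daily shipments against an adjusted benchmark.
result = control_check(actual=1180.0, expected=1000.0, tolerance=100.0)
print(result.alarm)  # variance of 180 exceeds the tolerance of 100 -> True

# Hypothetical example: units shipped by process A vs. received by process B.
recon = continuity_check(outflow=500.0, inflow=497.0, tolerance=5.0)
print(recon.alarm)  # difference of 3 is within tolerance -> False
```

In a full implementation the `expected` value would come from one of the model types listed above (a fixed normative number, a seasonally adjusted analytic, or a continuity equation across processes), and a raised alarm would be routed to management action and, where relevant, to the audit “control” system.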
These relationships can be structural, relational, empirical, or comparative in the form of sector benchmarks.

Figure 9: Galileo Enhanced Business Reporting Model

Based on the examination of the current reporting model (GAAP) under this framework, it can be concluded that a dynamic world cannot be well measured with static measurements, and that the technology exists for a more dynamic method of measurement to evolve. The disclosure model is very disjointed when the economic status of a firm has to be shown on a piece of paper (flat) and at very wide discrete intervals. Furthermore, while markets seem to value firms on a wide range of non-financial assets, the GAAP-based model focuses on financial items. It is also concerning that the measurement process focuses on the physical assets of companies, more typical of the industrial age, while the valuable assets of an information economy are neglected. In an age where companies outsource many of their processes, suppliers carry the inventories of many companies, and RFID technology allows for specific identification of inventories, parts, and assets, we still use FIFO and LIFO inventory valuation methods. In an age where dynamic markets value products every minute, we still rely on forms of historical cost as a substantive part of our business reports. In days when it is well known that there is substantial leeway[14][15] of interpretation in every number that determines an entity’s income, we still focus on earnings per share. Another irony is that over the last couple of years, and supposedly the next few, the FASB and the IASC will be focusing on the convergence of standards, converging toward a set of standards that is irremediably obsolete. If the measurement model is seriously compromised, progressively presenting less and less correspondence with reality, the provision of assurance over these numbers is useless and is performed only for statutory purposes.
It is not surprising, therefore, that accounting firms have progressively relied more on cursory analytical reviews and acted more like insurers than auditors. If the measures do not measure, even the best of audits would merely assure bad numbers that do not mean anything. Most litigation against auditors arises in failure situations; bad measures do not detect these, so good or bad auditing does little to change the audit firms’ risk profile. Under these conditions, any downturn will expose the underbelly of weak firms that have stretched their reporting to the limit, and in their demise they will punish CPA firms for supposedly “bad audits” or for irrelevant numbers that had little representativeness of the firm’s economic health.