Diminishing Returns

Sajit Kapurvalapil | August 3, 2015


Ever since businesses turned toward IT for productivity enhancements, lowered labor costs and improved process efficiencies, they have aggressively sought to gain any competitive advantage that IT could bestow. Over the decades, however, companies have harvested nearly all of the available low-hanging fruit. Universal business processes such as bookkeeping, order processing and inventory management have been turned into software solutions that no longer provide any significant strategic differentiation. Corporations found that, for mundane operational processes, the pursuit of marginal efficiencies had passed the point of diminishing returns.

As a result, in the early 1990s, IT spending began to bifurcate, albeit informally, between the mundane keep-the-lights-on expenditures and the more strategic gain-an-edge-over-your-competitors kind. The strategic IT spend went toward business processes that were either unique to the company or industry, or could make a meaningful impact on the company’s bottom line, such as patented firmware on an automobile, or a unique way of mining consumer purchasing patterns. There remained large swaths of IT spend where the marginal gain in competitive advantage for every additional dollar spent was meager.

Enter enterprise resource planning (ERP). This class of software automated common practices that recur within businesses regardless of industry, including financial accounting, customer order fulfillment and supplier relationship management. For IT managers, this was a godsend. Now, they were free to allocate in-house resources to projects that drove strategic value for their employers.

The Rise of ERP

Coinciding with this split between strategic and non-strategic IT spending, ERP software packages such as SAP, Oracle Financials and Sage exploded into ubiquity. This software provided companies with standardization and rock-solid reliability, while driving the adoption of high-quality business practices. ERP systems grew to become the IT backbone of nearly every large global corporation. Today, they command a market size in excess of $25 billion a year. SAP, the market leader in this space, claims that more than 85% of Fortune 500 companies implement its software today.

As companies began to view portions of IT as akin to utilities, they started offloading them onto ERP providers. In return, they expected the software to behave exactly as a utility service would—reliable, dependable and seldom (if ever) failing. The non-strategic nature of these IT functions did not reduce their critical value to businesses, however. Consider a manufacturing plant, where downtime in the ERP's production-planning module can be just as crippling as, say, a disruption in the plant's power supply. Both are ubiquitous utilities as far as the business is concerned, and the unavailability of either can have equally paralyzing effects on operations.

ERP vendors responded to this “reliability challenge” by building a monolithic system that housed a business’s major functions. Human resources, accounting/finance, order processing, inventory management and production planning were melded together into a unified software solution. These integrated systems naturally lent themselves to high levels of reliability and reduced risks of failure, especially when compared to the fragmented IT systems that were previously distributed across the enterprise. These systems also came with an unexpected benefit: improved data quality.

Reliability from a Single Data Source

Before the advent of ERP, every business function (and department) had its own IT system, each with its own copy of essential data. It was common, for instance, that within a single company, identical customer data would reside within the financial IT system, order processing system, shipping system, marketing system and every other business function that needed this customer information. If a customer updated a mailing address, it would not just need to propagate accurately and in a timely manner across the various departmental IT systems, but would also require the recreation of several business documents, including invoices and shipping documents. This process was rife with opportunities for failure, as data hopped through the enterprise trying to stay accurate and relevant.

ERP systems sidestepped these data-quality risks by adopting a path of simplicity. They chose to have a single, shared database—one version of the truth, so to speak—while also providing each user group (sales, service, finance) with function-specific tools that worked transparently with this common data. Essentially, in the integrated ERP implementation, there was just one copy of a piece of information, and it was shared by multiple functions of an organization. This eliminated the risk of stale pools of data forming within the company.

A Proliferation of Complexity

The simplicity of this approach and the system reliability that came with it paid rich dividends to ERP vendors—SAP, for instance, grew to become one of the largest companies on the planet, with a market capitalization exceeding $97 billion.

In recent years, however, maturing markets and slowing growth have caused ERP providers to expand the scope of their software beyond its original mandate of reliably automating common business processes. ERP has spread into software segments that businesses previously considered strategic differentiators. New offerings from providers have included software in niche areas such as predictive analytics, behavioral target marketing and transportation network optimization. These new offerings have been networked with the core ERP system to form a kind of federated solution, with many of the core business transactions then retooled to route through this mesh. These solution suites now require data to leap and replicate between multiple nodes in order to deliver their promised functions.

Several key ERP functionalities are also increasingly being built as separate satellite functions. For instance, financial management, which has been the heart of ERP since its inception, is now being fragmented into independent systems such as credit management, cash management, fixed asset management and distributed ledgering. While this multiplies the licensing revenues for ERP vendors, it also adds to the complexity of the system. These distributed systems require a constant flow of electronic documents between one another in order to operate successfully. A common business process such as the receipt of a goods shipment, which was previously handled within a self-contained software edifice (electronic documents moved seamlessly between the warehouse management, accounting and inventory management modules), now requires data to hop between as many as five different systems before reaching its end target. This increased complexity, and the data replication it demands, reintroduces the very quandaries that ERP originally sought to solve with its self-contained design.

Evolving Toward Fragility

The genius of a monolithic ERP was its ability to keep failure rates down; an integrated monolithic system will always be more failsafe than a fragmented one. Every time an additional semi-autonomous function is bolted onto a system, the overall reliability of the system degrades, and the degradation compounds multiplicatively. Consider a standalone ERP system with a 99% reliability rating: string together five additional software units with similar ratings, and the reliability of the combined system degenerates to an unacceptable 94% (0.99 raised to the sixth power). In the same vein, while a mature ERP system might post a mean time between failures (MTBF) of roughly one failure every 365 days, chaining it to four newly minted additions (newly crafted software usually has a much lower MTBF than mature software) drags the MTBF of the combined system below that of its weakest link.
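For readers who want to check the arithmetic, a quick back-of-envelope sketch follows. The figures are illustrative assumptions (a 99%-reliable unit, a 365-day-MTBF core and four hypothetical 90-day add-ons), not measurements from any vendor or product.

```python
# Back-of-envelope series-reliability arithmetic for a chain of systems.
# All figures below are illustrative assumptions, not vendor data.

def combined_reliability(reliabilities):
    """A chain of independent systems works only if every link works,
    so the individual reliabilities multiply."""
    result = 1.0
    for r in reliabilities:
        result *= r
    return result

def combined_mtbf(mtbfs_in_days):
    """For independent systems with constant failure rates, the failure
    rates add; the combined MTBF is the reciprocal of the summed rates."""
    total_failure_rate = sum(1.0 / m for m in mtbfs_in_days)
    return 1.0 / total_failure_rate

# A standalone ERP core plus five bolt-on units, each 99% reliable.
print(combined_reliability([0.99] * 6))    # ~0.941, i.e. roughly 94%

# A mature core (365-day MTBF) chained to four newer systems at, say, 90 days each.
print(combined_mtbf([365] + [90] * 4))     # ~21 days, well below the 90-day weakest link
```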

Many IT risk managers are not yet fully aware of the frailties being introduced into their hitherto stable and dependable IT backbone. It behooves them to demand that ERP salespeople provide reliability metrics (in standardized measures such as MTBF) that compare the mature single-product solutions with the newer multi-product configurations being aggressively promoted. That would be the prudent approach, rather than presuming that the enormous success ERP systems enjoyed in automating and centralizing a business's core processes will automatically transfer to any solution suites they craft.

ERP’s financial bookkeeping and inventory accuracy tools have long been sacrosanct functions for businesses—functionalities that were untouchable in terms of either taking on new risks or adding complexities. But that sanctity is increasingly being violated. As author Nassim Nicholas Taleb said, “You can expect blowups and explosive errors in fields where there is a penalty for simplicity.” Today’s ERP designers seem to have forgotten that it was simplicity that made their systems so successful.

Sajit Kapurvalapil works as an operations manager at a Fortune 100 company.