ASG Perspectives

Blog > June 2020 > The Impact of Hybrid IT on Enterprise Value Stream Automation - Corralling Clouds Without Wholesale Reengineering

The Impact of Hybrid IT on Enterprise Value Stream Automation - Corralling Clouds Without Wholesale Reengineering

Earlier today, I was talking to a well-respected industry analyst as we shared our views on how things are changing in enterprises’ approach to mainframe modernization. He observed that he had seen a marked increase in efforts to encapsulate mainframe applications over the last two years compared to the prior few years.

What is Mainframe Encapsulation?

Encapsulation is an object-oriented programming (OOP) technique that exposes the capabilities of an application, while concealing the internals for deriving the capabilities, frequently through an application programming interface (API).

Encapsulation ensures that the capabilities of the mainframe application are exposed to the rest of the infrastructure in a way that makes them easy to integrate and, perhaps more importantly, provides some degree of isolation that makes decomposing and re-platforming aspects of the mainframe application easier.
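As an illustrative sketch only – the class name, transaction name, and record layout below are hypothetical, not any specific ASG or IBM interface – an encapsulating facade might hide a legacy transaction call behind a clean method:

```python
# A minimal sketch of mainframe encapsulation: callers see a clean
# Python interface, while the details of invoking the host transaction
# (protocol, fixed-width record layout, field positions) stay hidden.
# The transaction name and reply layout here are hypothetical.

class CustomerAccountFacade:
    """Exposes one mainframe capability behind a simple API."""

    def __init__(self, transport):
        # 'transport' is whatever actually talks to the host
        # (e.g. a terminal emulator, MQ client, or HTTP bridge).
        self._transport = transport

    def get_balance(self, account_id: str) -> float:
        # Internals: build the fixed-width request record the legacy
        # transaction expects, send it, and parse the reply.
        request = f"ACCTINQ {account_id:<10}"
        reply = self._transport.send(request)
        # Hypothetical reply layout: balance in columns 0-11, 2 implied decimals.
        return int(reply[0:12]) / 100.0


class FakeTransport:
    """Stand-in for the real host connection, for demonstration only."""
    def send(self, request: str) -> str:
        assert request.startswith("ACCTINQ")
        return "000001234567"  # 12345.67 with two implied decimals


facade = CustomerAccountFacade(FakeTransport())
print(facade.get_balance("A1000"))  # 12345.67
```

The point of the sketch is that consumers call `get_balance` and never see the record formats or transport details, which is exactly the isolation that later makes re-platforming easier.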

“Get off the mainframe!” Not so fast…

The analyst’s comments echoed, at least to some degree, some of the conversations I have been having with customers in recent weeks.  A decade ago, it was common for an enterprise or federal agency to bring in a new CTO or other IT leader who would immediately voice the directive to get off the mainframe as soon as possible.  Usually, the replatforming target was cloud in one or more of its permutations – public, private, Fed, hybrid, and so on.  In many cases, this big-project approach led to large investments that consumed a great deal of resources and attention and that were sometimes successful – but more often less so, at least in terms of managing to cost and deadline.

Mission essential value streams rely on mainframe processing

More recently, enterprises appear to be adopting a more pragmatic approach that considers the critical role mainframe processing plays in value streams that are absolutely mission essential.  Impulses to leap off the mainframe in a single jump are giving way to taking steps to encapsulate the mainframe application.  The application capabilities are exposed through clean interfaces adhering to contemporary standards, such as REST, OData, and Swagger.  In some cases, this is accomplished using standard tools and techniques already available within IT; more frequently, it is done using tools from vendors such as HostBridge, Rocket Software, Software AG, IBM, GT Software, or even vendors that are not mainframe-centric, such as MuleSoft.
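A capability exposed this way might be described by a minimal OpenAPI (Swagger) document. The path, operation, and schema below are hypothetical, purely to show the shape of such an interface:

```yaml
openapi: 3.0.3
info:
  title: Encapsulated Mainframe Account Service   # hypothetical service name
  version: "1.0"
paths:
  /accounts/{accountId}/balance:
    get:
      summary: Return the balance computed by the host transaction
      parameters:
        - name: accountId
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Current balance
          content:
            application/json:
              schema:
                type: object
                properties:
                  balance:
                    type: number
```

Consumers of this REST interface need know nothing about the transaction, region, or data formats behind it.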

Encapsulation for reduced risk

Integrating the necessary capabilities from the encapsulated mainframe application with the other technology needed to support the end-to-end value stream then becomes easier.  Moreover, this degree of isolation and insulation gives the enterprise a better context for applying enterprise complexity analysis tools to deconstruct the functions of the mainframe application so that those suitable can be replatformed.  A stepwise approach like this reduces risk and increases the chances of success, while allowing the enterprise to retain processing on the most reliable, most available technology stack.

While this approach sounds good at this level, the devil is in the details, as always.  Shifting workloads from the mainframe to another platform has commercial advantages in the cases alluded to above – if and only if the shift can be done with as much transparency and as little risk as possible.  Some of our customers are finding that it is crucial to ensure the automated workload management strategy takes this into account.  Executing jobs that were on the mainframe and are shifted elsewhere must be seamless – at least in the context of the overall completion of value stream instances.

How an ASG customer automates workload management

A major insurance carrier accomplished just that by shifting primary control of workload automation from a mainframe workload scheduler to ASG-Enterprise Orchestrator for automated workload management on the distributed infrastructure.  Over time, more and more of the workloads shifted from mainframe execution to multi-tier client-server and cloud execution, with minimal disruption to the essential value streams on which that enterprise depends.

To learn more about how this approach might benefit your enterprise, please contact your ASG representative….

Definitions

Encapsulation refers to a practice in object-oriented programming of bundling data with the methods that operate on that data, or the restricting of direct access to some of an object's components. Encapsulation is used to hide the values or state of a structured data object inside a class, preventing unauthorized parties' direct access to them. Publicly accessible methods are generally provided in the class (so-called "getters" and "setters") to access the values, and other client classes call these methods to retrieve and modify the values within the object.
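A minimal Python illustration of that getter/setter pattern (the class and field names are arbitrary):

```python
class Account:
    """Encapsulation: state is held in an internal field and reached
    only through public accessor methods that can enforce rules."""

    def __init__(self, balance: float = 0.0):
        self._balance = balance  # leading underscore: internal by convention

    def get_balance(self) -> float:        # "getter"
        return self._balance

    def set_balance(self, value: float):   # "setter" with validation
        if value < 0:
            raise ValueError("balance cannot be negative")
        self._balance = value


acct = Account()
acct.set_balance(100.0)
print(acct.get_balance())  # 100.0
```

Because callers go through `set_balance`, the class can reject invalid state – something direct field access could not prevent.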
Object-oriented programming (OOP) is a programming paradigm based on the concept of "objects", which can contain data, in the form of fields (often known as attributes or properties), and code, in the form of procedures (often known as methods). A feature of objects is that their procedures can access and often modify the data fields of the object with which they are associated (objects have a notion of "this" or "self"). In OOP, computer programs are designed by making them out of objects that interact with one another. OOP languages are diverse, but the most popular ones are class-based, meaning that objects are instances of classes, which also determine their types.

An application programming interface (API) is a computing interface which defines interactions between multiple software intermediaries. It defines the kinds of calls or requests that can be made, how to make them, the data formats that should be used, the conventions to follow, etc. It can also provide extension mechanisms so that users can extend existing functionality in various ways and to varying degrees. An API can be entirely custom, specific to a component, or it can be designed based on an industry standard to ensure interoperability. Some APIs have to be documented, others are designed so that they can be "interrogated" to determine supported functionality. Since other components/systems rely only on the API, the system that provides the API can (ideally) change its internal details "behind" that API without affecting its users.
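That last property – clients depend only on the API, so the provider can change its internals freely – can be sketched in Python (the interface and both implementations are hypothetical):

```python
from abc import ABC, abstractmethod

class BalanceAPI(ABC):
    """The contract clients code against; internals are free to change."""
    @abstractmethod
    def balance(self, account_id: str) -> float: ...

class MainframeImpl(BalanceAPI):
    def balance(self, account_id: str) -> float:
        # Stand-in for a call to the legacy host transaction.
        return 100.0

class CloudImpl(BalanceAPI):
    def balance(self, account_id: str) -> float:
        # Stand-in for a call to a replatformed cloud service.
        return 100.0

def report(api: BalanceAPI, account_id: str) -> str:
    # Client code: works unchanged with either implementation,
    # because it depends only on the BalanceAPI interface.
    return f"{account_id}: {api.balance(account_id):.2f}"

print(report(MainframeImpl(), "A1"))  # A1: 100.00
print(report(CloudImpl(), "A1"))      # A1: 100.00
```

Swapping `MainframeImpl` for `CloudImpl` requires no change to `report` – the same insulation that makes stepwise replatforming behind an API practical.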