Process Orchestration - key to the future of business process management - and IT
Mark Ehmke, Managing Director of Staffware South Africa
Process Orchestration holds the key to the future of Business Process Management, and with it the ability to change the way Business and Information Technology (IT) interact.
To substantiate this claim, it is necessary to look first at the past in order to learn about the future.
IT started because business needed to record its processes. Therefore, IT has always been about data. In fact, it used to be called Data Processing (DP). Traditionally we have captured, stored and analysed data, simply because the associated processes have been too dynamic to capture, store and analyse.
Almost all of the recent technology innovations, such as ERP, CRM and Business Intelligence (BI), have been built upon this data paradigm.
However, because business is about processes rather than data, this has created a Business-IT divide, causing a mismatch between process-based business requirements and data-based IT deliverables.
In the last decade, Process Automation in the form of workflow, or embedded processes within enterprise applications, has moved the control of individual processes from IT back to business. Workflow has been seen as the "glue" that has linked people, applications and systems together into a business process.
However, workflow has been treated as either a departmental point-solution, or as a legacy system in its own right. Adoption has been slow, due to the requirement to analyse and understand each process before automation can occur. Additionally, integration has been a key stumbling block to rapid adoption.
Enter Web Services, which have emerged as a key standard upon which integration methodologies can be based, avoiding many of the problems associated with proprietary integration solutions.
However, Web Services are far more than an integration technique. Together with BPM, they promise to change the role of IT in organisations.
Instead of continuing to fine-tune their legacy applications, companies are now working hard at breaking them down into standalone services according to the emerging Web Services standards such as WSDL and UDDI.
Simultaneously, vendors are themselves being forced to decompose their application products into the selfsame Web Services, in order to keep themselves "open" in the mainstream integration market.
The result is a large number of discrete services that can be used together - but more importantly - totally independently as well.
The analysts' vision for BPM is to provide a Process Engine (PE) that coordinates these services into a variety of compatible business processes - an approach referred to as Process Orchestration.
In other words, Process Orchestration enables the Process Manager to flexibly and dynamically link multiple services to business processes at run-time.
Why is it important?
With the advent of Web Services and the demand for agile business, the ability to define processes that change dynamically in response to data inputs and unplanned business events is crucial to delivering that agility.
With Process Orchestration, business analysts are able to map business processes by describing the services that might be used in the process without necessarily understanding exactly which services will be used, or even whether these services are internal or external.
The PE then dynamically invokes the required services at runtime based on the data content of the work item or even an external system event.
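This run-time routing can be illustrated with a minimal sketch. The service names, the registry structure and the routing rule below are all invented for illustration, not the API of any real process engine:

```python
# Hypothetical sketch: a process engine resolves an abstract process step
# to a concrete service only when the work item arrives, routing on the
# data content of the item. All names here are illustrative.

def credit_check_internal(item):
    return f"internal check for {item['customer']}"

def credit_check_external(item):
    return f"external bureau check for {item['customer']}"

# The process definition names an abstract step ("credit_check"); the
# registry maps (step, risk class) to a concrete service.
SERVICE_REGISTRY = {
    ("credit_check", "low"): credit_check_internal,
    ("credit_check", "high"): credit_check_external,
}

def invoke(step, item):
    # Route on data content: high-value work items go to the external service.
    risk = "high" if item["amount"] > 10000 else "low"
    service = SERVICE_REGISTRY[(step, risk)]
    return service(item)

print(invoke("credit_check", {"customer": "Acme", "amount": 25000}))
```

The process definition never hard-codes which service runs; swapping an internal service for an external broker's service is a registry change, not a process change.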
As the proliferation of Web Services increases, so the nature of computing will change. The "grid" of Services will be managed by Service Catalogues, both internally within a company, and by external Service Brokers, whose entire existence will be about registering published services into public catalogues, from which customers may subscribe to a selected service.
This services approach will enable consumers to establish a Service Oriented Architecture (SOA) on which they can deploy Service Oriented Software (SOS), a key direction for the future according to the analysts.
The key requirement for future modeling tools will be to discover new and/or improved services, compare them against existing services using the organisation's selection criteria (price, speed, quality), and then automatically simulate how these new services would affect the business process.
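A weighted comparison of that kind can be sketched in a few lines. The weights and the price/latency/quality figures below are invented sample data, assuming lower price and latency are better and quality is scored on a 0-1 scale:

```python
# Illustrative sketch: score candidate services against an organisation's
# selection criteria (price, speed, quality). Weights and figures are
# invented for the example.

WEIGHTS = {"price": 0.5, "speed": 0.3, "quality": 0.2}

def score(service):
    # Lower price and lower latency are better, so invert those axes;
    # quality is already "higher is better" on a 0-1 scale.
    return (WEIGHTS["price"] * (1 / service["price"])
            + WEIGHTS["speed"] * (1 / service["latency_s"])
            + WEIGHTS["quality"] * service["quality"])

current = {"price": 2.00, "latency_s": 1.0, "quality": 0.90}
candidate = {"price": 1.50, "latency_s": 2.0, "quality": 0.85}

# Adopt the newly discovered service only if it scores higher.
better = candidate if score(candidate) > score(current) else current
```

In this sample the candidate is cheaper but slower, and the weighting keeps the existing service; a simulation step would then replay historical work items against the winner before switching.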
Since services and BPM will dominate the future role of IT, Process Monitoring and Analysis will become even more critical. For example, management will demand a "digital dashboard" to monitor the status of end-to-end processes, even if parts of the process are external to the company itself. They will want to define Key Performance Indicators (KPIs) across the entire value chain, with triggers and alerts in place to cater for any exception.
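The trigger-and-alert idea can be sketched as a simple KPI check. The KPI names, thresholds and measurements below are hypothetical:

```python
# Hedged sketch: evaluate KPIs across an end-to-end process and raise
# alerts on exceptions. "max" means the value must not exceed the
# threshold; "min" means it must not fall below it.

KPIS = {
    "order_to_cash_days": {"threshold": 5, "direction": "max"},
    "first_time_resolution_rate": {"threshold": 0.8, "direction": "min"},
}

def check_kpis(measurements):
    alerts = []
    for name, rule in KPIS.items():
        value = measurements[name]
        breached = (value > rule["threshold"] if rule["direction"] == "max"
                    else value < rule["threshold"])
        if breached:
            alerts.append(f"ALERT: {name}={value} breaches {rule['threshold']}")
    return alerts

print(check_kpis({"order_to_cash_days": 7, "first_time_resolution_rate": 0.9}))
```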
The concept of the "Data Warehouse" will change to that of the "Process Warehouse", which will contain contextualised processes stored over time. Trends will be much easier to detect and Business Intelligence (BI) will be able to slice and dice the processes themselves, not just the associated data.
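Slicing the processes themselves, rather than just the data, might look like the following sketch: completed process instances are stored with timing context and aggregated per step. The record structure and figures are invented sample data:

```python
# Illustrative sketch of a "process warehouse" query: completed process
# instances stored with timing context, aggregated by process step.

from collections import defaultdict

instances = [
    {"step": "approve", "duration_h": 4},
    {"step": "approve", "duration_h": 6},
    {"step": "ship", "duration_h": 24},
]

def average_duration_by_step(records):
    totals = defaultdict(lambda: [0, 0])  # step -> [total hours, count]
    for r in records:
        totals[r["step"]][0] += r["duration_h"]
        totals[r["step"]][1] += 1
    return {step: total / count for step, (total, count) in totals.items()}
```

Run over months of instances, the same aggregation exposes trends (e.g. approval times creeping upward) that a purely data-centric warehouse would not surface.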
Taking this new process-centric IT paradigm to its logical conclusion, there will be no applications in the future - hence there will be no more software licenses. A company would only need a Process Engine and a connection via the Web to a set of Service Brokers, in order to connect to the relevant Web Services, as and when needed in the process definition (which itself could be downloaded from the Web as a best practice). They would then be charged for their consumption of the services on a monthly basis, much like the consumption of electricity or for connecting to the mobile telephone networks.
Consumers will experience true "computing on demand", and be spoilt for choice of services available (not to mention the competition between vendors, which can only be to their benefit).