
Enterprise visibility: A business imperative, an IT issue

Inefficiencies from duplication of data and systems and an inability to respond to new demands plague most organisations today.
By Charl Barnard, GM of business intelligence at Knowledge Integration Dynamics
Johannesburg, 15 Jun 2005

Over the past two decades, IT spend has grown significantly with the proliferation of disparate, proprietary ERP, CRM, SCM and Web-based systems for managing everything from supply to demand chains.

These systems were introduced to address specific departmental or line-of-business challenges, such as customer relationship management, financial reporting and inventory planning. Although these transactional systems have provided an effective means of managing specific day-to-day operations, they have also contributed to an enterprise-wide epidemic - poor visibility across the organisation, caused by little or no data integration.

As a result, inefficiencies from duplication of data and systems and an inability to respond to new demands plague most organisations today.

The need for increased insight into business operations has brought the requirement for enterprise data integration to the forefront. Despite a resounding need for an enterprise-wide approach, organisations often take on the integration of these complex systems one project at a time, using resource-intensive custom coding. However, they are increasingly discovering that this approach is not only costly and time-consuming, but also does little to deliver the benefits of a true data integration strategy.

With reduced IT budgets and growing integration requirements, organisations are looking to do more with less. Their systems must also adapt to changing environments and requirements, automatically detecting the condition of data and determining the best way to execute against new standards and technologies. As a result, many organisations are looking for a single, adaptive enterprise data integration solution that can rapidly meet immediate needs, while providing the flexibility, scalability and extensibility required for future integration initiatives.

Today, enterprise data integration requires adaptive software that provides improved data integrity and greater visibility of enterprise data and processes, to enable organisations to realise significant time and cost savings.

Adaptive integration: The next wave

Considered the next wave of innovation in enterprise software, "adaptive" software products intelligently adapt to changes in the IT environment and optimise performance by efficiently leveraging IT resources - all without manual coding or modification of existing projects.

Adaptive data integration can help businesses lower IT costs and accelerate return on investment from their integration projects. And, in an era of compliance and governance, it can ensure confidence in real-time information.

Capture any data and move to any system

Given the myriad software systems existing today - including new and existing data formats and standards, as well as legacy applications and other systems - accessing, integrating and delivering data from system to system is a complex undertaking. The benefits, however, are too significant to ignore.

Companies need to easily access organisational data that is stored in a wide variety of transactional applications and systems, data warehouses, operational data stores, flat files and legacy systems. Only broad connectivity can provide uniform access to an array of systems, enabling organisations to more easily meet the integration challenges presented by an ever-changing infrastructure, while simplifying the management and maintenance of integration.

Many data integration initiatives, such as data warehousing, require the ability to process large amounts of data in bulk to populate data stores. However, for ongoing operations, the capture and propagation of smaller subsets of changed data provides a faster, more efficient method for updating systems with the latest changes. And for most time-critical applications, new or changed data needs to be detected and moved as it happens - in real-time.

Data must be provided on demand with batch, real-time, bulk and incremental data integration for efficiently moving operational and transactional data from any system to any system. This enables organisations to access their most critical information when and where they need it so they can effectively monitor, manage and optimise their business performance.
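By way of illustration, the sketch below contrasts a bulk load with an incremental pass that moves only rows changed since the previous run. It is a minimal Python sketch, not a reference implementation: the orders table, its last_modified column and the SQLite source and target are all hypothetical stand-ins for whatever systems are being integrated.

```python
import sqlite3
from datetime import datetime, timezone

def bulk_load(source, target):
    """Initial population: copy every row of the (hypothetical) orders table."""
    rows = source.execute("SELECT id, customer, amount, last_modified FROM orders")
    target.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?, ?)", rows)
    target.commit()

def incremental_load(source, target, last_sync):
    """Ongoing updates: move only the rows changed since the previous run."""
    rows = source.execute(
        "SELECT id, customer, amount, last_modified FROM orders "
        "WHERE last_modified > ?",
        (last_sync,),
    )
    target.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?, ?)", rows)
    target.commit()
    return datetime.now(timezone.utc).isoformat()  # new high-water mark for the next run

source = sqlite3.connect("operational.db")  # hypothetical operational system
target = sqlite3.connect("warehouse.db")    # hypothetical data store
```

A truly real-time feed would replace the polling query with capture from the source system's transaction log, but the division of labour - bulk for initial population, incremental for routine updates - stays the same.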

Any project, any time

More often than not, moving information effectively from one system to another requires reformatting or restructuring the data. Data from more than one source often needs to be combined, filtered by some criterion, or augmented with data from a different location. All of these processes are part of data transformation - the process by which data is resolved into the format required by the receiving system. For example, transactional data requires transformation into a format that business intelligence tools can efficiently query, and master customer data requires transformation into the format required by a system of record.
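A minimal sketch of such a transformation, assuming two hypothetical feeds - a transactional orders extract and a customer master list - that must be filtered, augmented and reshaped into the layout a downstream reporting system expects:

```python
from datetime import date

# Hypothetical source records: one transactional feed, one reference feed.
orders = [
    {"order_id": 1, "cust": "C001", "amount": "1999.50", "date": "2005-06-01"},
    {"order_id": 2, "cust": "C002", "amount": "75.00", "date": "2005-06-03"},
]
customers = {"C001": "Acme Mining", "C002": "Jozi Retail"}

def transform(order):
    """Reshape a raw order into the receiving system's record layout."""
    return {
        "OrderID": order["order_id"],
        "CustomerName": customers.get(order["cust"], "UNKNOWN"),  # augment from a second source
        "AmountZAR": float(order["amount"]),                      # retype text as a number
        "OrderDate": date.fromisoformat(order["date"]),           # standardise the date
    }

# Filter by a criterion, then transform into the target format.
large_orders = [transform(o) for o in orders if float(o["amount"]) > 100]
```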

Change is constant, and an organisation's ability to respond to change is significantly impacted as the complexity of its infrastructure, systems and data increases.

A services-oriented architecture can easily adapt to system and data changes. Changes in underlying operating systems or databases are easily accommodated, and infrastructure additions can easily be adopted within a data integration services architecture without requiring changes to the business logic.
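One way to picture that insulation is sketched below, with invented connector names: the integration logic is written against a small service contract, so replacing the underlying store - a flat file swapped for a database, say - requires no change to the business logic itself.

```python
import csv
import sqlite3
from typing import Iterable, Protocol

class CustomerSource(Protocol):
    """The service contract the business logic depends on."""
    def customers(self) -> Iterable[dict]: ...

class CsvSource:
    """Connector for a flat-file extract."""
    def __init__(self, path):
        self.path = path
    def customers(self):
        with open(self.path, newline="") as fh:
            yield from csv.DictReader(fh)

class SqliteSource:
    """Connector for a relational store."""
    def __init__(self, path):
        self.conn = sqlite3.connect(path)
        self.conn.row_factory = sqlite3.Row
    def customers(self):
        for row in self.conn.execute("SELECT * FROM customers"):
            yield dict(row)

def count_active(source: CustomerSource) -> int:
    """Business logic: unchanged whichever connector is plugged in."""
    return sum(1 for c in source.customers() if c.get("status") == "active")
```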

Ingredients for success

Today's large-scale integration projects often span time zones and geographies. To effectively manage local and global development teams, organisations need integration software that supports collaborative development and deployment.

Security - in all its aspects - has become a primary concern for IT organisations. Authentication through the Lightweight Directory Access Protocol (LDAP, a protocol for locating organisations, individuals and other resources such as files and devices on a network, whether on the Internet or on a corporate intranet) or other directory servers, role-based permissions for granular access control, encrypted data transmission, and detailed audit trails together offer a secure environment throughout the data integration process.
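As a toy illustration of the last two elements - role-based permissions and an audit trail - the sketch below uses only the Python standard library. The roles, actions and user name are invented, and the directory bind is left as a stub, since it depends entirely on the directory server in use.

```python
import logging

# Audit trail: in practice this would be a secured, tamper-evident store;
# a plain log file stands in for it here.
logging.basicConfig(filename="integration_audit.log",
                    format="%(asctime)s %(message)s",
                    level=logging.INFO)

# Hypothetical role-to-permission mapping for granular access control.
ROLE_PERMISSIONS = {
    "developer": {"read_metadata", "run_mapping"},
    "steward": {"read_metadata", "edit_rules"},
    "operator": {"run_mapping"},
}

def authenticate(username, password):
    """Stub: a real deployment would bind against LDAP or another directory server."""
    return bool(username and password)

def authorise(username, role, action):
    """Check the role's permissions and record the decision in the audit trail."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    logging.info("user=%s role=%s action=%s allowed=%s", username, role, action, allowed)
    return allowed

if authenticate("jdube", "secret") and authorise("jdube", "steward", "edit_rules"):
    pass  # proceed with the requested integration task
```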

For organisations seeking to strengthen and standardise their data access, quality and security policies, data stewardship initiatives are increasingly viewed as critical to the business.

Organisations require seamless data profiling and cleansing capabilities. Data profiling automates the discovery of source data patterns and formats, offering a complete understanding of data, including content, quality and structure. Combined with data cleansing and transformation, data profiling shortens time to deployment of integration projects by providing insight into the condition of the data before extraction.
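A minimal profiling sketch, assuming a small hypothetical extract already in memory: it reports null counts, distinct values and the character patterns found in each column - the kind of insight into content, quality and structure described above, gathered before extraction.

```python
import re
from collections import Counter

def pattern_of(value):
    """Reduce a value to its shape: digits become 9, letters become A."""
    return re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", value))

def profile(rows):
    """Summarise nulls, distinct values and dominant patterns per column."""
    report = {}
    for column in rows[0]:
        values = [row[column] for row in rows]
        non_null = [v for v in values if v not in (None, "")]
        report[column] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "patterns": Counter(pattern_of(str(v)) for v in non_null).most_common(3),
        }
    return report

# Hypothetical source extract with an inconsistent phone number format.
sample = [
    {"id": "1001", "phone": "011-555-0100", "country": "ZA"},
    {"id": "1002", "phone": "0115550101", "country": "ZA"},
    {"id": "1003", "phone": "", "country": "za"},
]
print(profile(sample))
```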

Metadata is the DNA of data integration. By capturing important information about the enterprise environment, data and business logic, metadata helps accelerate development, drive integration procedures and improve integration efficiency.
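To make that concrete, here is a small sketch of the kind of record such metadata might hold; the fields and names are hypothetical, not any particular product's repository format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MappingMetadata:
    """Hypothetical metadata captured for one integration mapping."""
    source_system: str   # e.g. the ERP orders table
    target_system: str   # e.g. a warehouse fact table
    business_rule: str   # plain-language description of the logic applied
    columns: dict        # source column -> target column
    last_run: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

orders_mapping = MappingMetadata(
    source_system="erp.orders",
    target_system="warehouse.orders_fact",
    business_rule="Include only invoiced orders; amounts converted to ZAR.",
    columns={"ord_no": "order_id", "cust_cd": "customer_key", "amt": "amount_zar"},
)
```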
