Planning Enterprise-Wide Application Integration for a Multi-Billion-Dollar Corporation
About Our Client
The Client is a conglomerate with multi-billion-dollar revenue that runs 30+ diverse businesses and works with over 300 world-famous brands. The company operates in such areas as engineering, logistics & inventory, the automotive industry, financial consulting (for banks, investment companies, etc.), retail, food & beverage, advertising and more.
Challenge
The Client’s businesses had independent IT infrastructures, so operational data was isolated across multiple CRMs, ERPs, POS and ecommerce solutions, supply chain management systems and other enterprise systems. This data fragmentation caused the Client a number of problems:
- Manual data transfer introduced errors and undermined data reliability. Moreover, the data wasn’t available immediately, as manual input depended on employee schedules.
- The employees couldn’t quickly get a complete picture of all customer interactions with the business.
- It was impossible to implement an advanced conglomerate-wide loyalty program.
- Automated reporting covering all business lines was impossible.
Solution
Assessing the existing needs and problems
ScienceSoft’s consulting team started with a thorough examination of the Client’s applications. During several Q&A sessions with the senior management of each business, our consultants identified existing business processes, information flows, technical details of communication protocols, data formats in use and the required data refresh rates.
After assessing the problems revealed in the examined processes, ScienceSoft’s IT consultants suggested integrating the applications and systems in use so that the Client’s IT infrastructure would better serve current business needs.
The integration solution had to unite dozens of enterprise software components from 6 different industries. Among them were several CRMs, several ERPs, web and mobile ecommerce solutions, multiple point-of-sale systems (POSs), supply chain management systems, several databases, reporting and analytics tools and more.
The integration architecture had to connect applications hosted both on-premises and in the cloud and support different data formats (text, media, etc.). It also had to handle synchronous and asynchronous messaging, batch processing and RFC (Remote Function Call) communication, all while ensuring high performance and preserving the flexibility of the IT infrastructure.
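For illustration, here is a minimal Python sketch of the difference between the two messaging modes the architecture had to support. The endpoint URL and the in-process queue are hypothetical stand-ins for the real systems and message broker:

```python
import json
import queue
import threading
import urllib.request

# --- Synchronous call: the caller blocks until the target system responds.
def sync_send(payload: dict) -> bytes:
    req = urllib.request.Request(
        "https://erp.example.com/api/orders",      # hypothetical endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:      # blocks until the reply
        return resp.read()

# --- Asynchronous messaging: the caller enqueues a message and moves on;
# a broker (here, a plain in-process queue) delivers it later.
outbox: "queue.Queue[dict]" = queue.Queue()

def async_send(payload: dict) -> None:
    outbox.put(payload)                            # returns immediately

def delivery_worker() -> None:
    while True:
        message = outbox.get()                     # consumer side
        print("delivering", message)               # stand-in for real I/O
        outbox.task_done()

threading.Thread(target=delivery_worker, daemon=True).start()
async_send({"event": "pos.sale", "sku": "A-100"})
outbox.join()                                      # wait for delivery
```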
Developing application integration architecture
The team of ScienceSoft’s IT consultants proposed a layered architecture. The suggested architecture pattern included 4 layers, each with its own role and functions:
- Points of initial data entry (CRMs, ERPs and other enterprise systems of different purposes).
- The integration layer (mediator) – responsible for data extraction, transformation into the needed format, enrichment and routing. It was decided to extract data directly from the source applications, since processing data in the data warehouses adversely affected data accuracy and processing speed.
- A single, consistent data storage layer (with well-designed data access controls and encryption) – responsible for storing and processing operational and analytical data.
- The Business Intelligence (BI) layer – responsible for data analytics, reporting and visualization.
The layered architecture was chosen because it made it possible to:
- Isolate future changes within one layer.
- Reuse functionality from the lower layers.
- Contain security concerns within individual layers.
- Make the conglomerate-wide IT system more understandable and simplify its testing and maintenance.
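To make the flow across these four layers concrete, below is a minimal Python sketch. The record fields, the normalization logic and the in-memory “warehouse” are illustrative assumptions, not the Client’s actual schema:

```python
from typing import Any

# Layer 1: a point of initial data entry (a CRM record, for illustration).
def read_from_crm() -> dict[str, Any]:
    return {"customer_id": 42, "TOTAL": "199.90", "currency": "USD"}

# Layer 2: the integration layer extracts, transforms, enriches and routes.
def transform(record: dict[str, Any]) -> dict[str, Any]:
    return {
        "customer_id": record["customer_id"],
        "total": float(record["TOTAL"]),     # normalize field name and type
        "currency": record["currency"],
        "source": "crm",                     # enrichment: provenance tag
    }

# Layer 3: single, consistent data storage (an in-memory stand-in here).
warehouse: list[dict[str, Any]] = []

def store(record: dict[str, Any]) -> None:
    warehouse.append(record)

# Layer 4: the BI layer reads only from storage, never from the sources,
# so changes in any source system stay isolated in the integration layer.
def report() -> float:
    return sum(r["total"] for r in warehouse)

store(transform(read_from_crm()))
print(f"Total revenue: {report():.2f}")
```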
The proposed integration was based on an ESB (enterprise service bus) and/or ETL (extract, transform, and load) services to enable two strategies of smooth data exchange. The ESB was to be responsible for quick and relatively lightweight operations (messaging) between a data source and another data source or the underlying layer. It handled message routing, message exchange monitoring, message storage and message format validation.
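Here is a minimal sketch of the ESB’s routing and format-validation duties, assuming a simple JSON message envelope with “type” and “payload” fields; the routing table and queue names are hypothetical:

```python
import json

# Routing table: message type -> destination queue (names are illustrative).
ROUTES = {"order.created": "erp_queue", "pos.sale": "warehouse_queue"}

def validate(raw: bytes) -> dict:
    """Message format validation: reject anything that isn't well-formed."""
    message = json.loads(raw)
    for field in ("type", "payload"):
        if field not in message:
            raise ValueError(f"missing required field: {field}")
    return message

def route(raw: bytes) -> str:
    message = validate(raw)
    destination = ROUTES.get(message["type"])
    if destination is None:
        raise ValueError(f"no route for message type {message['type']!r}")
    print(f"routing {message['type']} -> {destination}")  # exchange monitoring
    return destination

route(b'{"type": "order.created", "payload": {"id": 1}}')
```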
The ETL tool was to be mostly responsible for loading massive amounts of data. It dealt with transforming structured and unstructured data into usable forms, cleaning data (by applying advanced validation rules), managing master data and loading the transformed data into the data warehouse.
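The sketch below shows a minimal extract-clean-load pass, with SQLite standing in for the warehouse and illustrative validation rules. Dedicated ETL tools such as Talend encapsulate these steps, but the underlying logic is the same:

```python
import sqlite3

# Extract: rows as they might arrive from a source system (illustrative data).
raw_rows = [
    {"email": "ANNA@EXAMPLE.COM ", "amount": "120.50"},
    {"email": "", "amount": "99.00"},             # fails validation below
]

# Transform + clean: apply validation rules, normalize formats.
def clean(row: dict) -> dict | None:
    email = row["email"].strip().lower()
    if not email:
        return None                               # drop rows that fail rules
    return {"email": email, "amount": float(row["amount"])}

# Load: write the surviving rows into the warehouse (SQLite stands in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (email TEXT, amount REAL)")
rows = [r for r in map(clean, raw_rows) if r is not None]
conn.executemany("INSERT INTO sales VALUES (:email, :amount)", rows)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0], "row(s) loaded")
```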
Evaluating and comparing technology options
As part of the IT consulting service, ScienceSoft described, assessed and compared four different tech stacks. The options arose from the Client’s partnerships with IT providers and from the prevalence of certain technologies in the existing IT infrastructure. They included:
- Option 1 – integration based on the SAP tech stack and SAP Process Integration (ESB).
- Option 2 – integration based on the Microsoft tech stack and Confluent for Kafka / Azure Service Bus (ESB-centered).
- Option 3 – integration based on the Microsoft tech stack and Talend Data Fabric (ETL-centered).
- Option 4 – integration based on the Microsoft tech stack, Talend Data Fabric as ETL and Talend ESB.
In view of the Client’s cloud migration strategy and plans to introduce IoT and social media streaming, the options included either cloud-ready solutions (Options 1 and 4) or those requiring minimal effort to migrate to the cloud (Options 2 and 3). ScienceSoft’s consultants highlighted the recommended architecture option for each approach.
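To give a flavor of the ESB-centered Option 2, here is a hedged sketch of publishing a POS event with the confluent-kafka Python client. The broker address, topic name and payload are placeholders, not the Client’s actual configuration:

```python
from confluent_kafka import Producer  # pip install confluent-kafka

# Broker address is a placeholder for the Client's actual cluster.
producer = Producer({"bootstrap.servers": "broker.example.com:9092"})

def on_delivery(err, msg):
    # Delivery callback: one piece of monitoring message exchange on the bus.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}]")

producer.produce(
    "retail.pos.sales",                  # hypothetical topic name
    key=b"store-17",
    value=b'{"sku": "A-100", "qty": 2}',
    callback=on_delivery,
)
producer.flush(10)                       # wait up to 10 s for delivery
```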
Reporting
ScienceSoft’s IT consultants prepared comprehensive descriptions and assessments of each proposed integration scenario. These covered the integration design and integration guidelines, data extraction and loading, data processing operations, data quality and master data management, and the data storage structure. The consultants also provided detailed recommendations on the overall infrastructure design, rated the strengths and weaknesses of each tech stack and listed the available connection types. All the needed tools were compared against key parameters and the expected cost of ownership.
In addition, ScienceSoft described the potential risks related to the layered architecture and came up with a mitigation plan for each of them.
As the integrated enterprise applications had to deal with sensitive personal and business data, our IT consultants also described data security strategies for each proposed integration pattern in the report.
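As one possible building block of such a strategy, the sketch below shows field-level encryption with the Python cryptography package. In production the key would come from a key management service; generating it inline is for illustration only:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustration only: a real deployment would load the key from a key
# management service rather than generate it inline.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive field before it crosses system boundaries or lands
# in shared storage; decrypt it only where access is authorized.
token = fernet.encrypt(b"anna@example.com")
print(fernet.decrypt(token).decode())  # -> anna@example.com
```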
Results
The Client obtained four comprehensively described and assessed integration scenarios and tech stacks for the quick and effective consolidation of the conglomerate’s data. All four action plans provided for clean, accurate and consistent data that is available in a timely manner and moved safely across different systems.
To support the company’s growth and scalability, each suggested integration option allowed for the smooth introduction of new components at minimal extra cost.
Methodologies
Q&A sessions, functional decomposition, business requirements analysis, comparative analysis.