Data Orchestration for a Swiss Private Investment Management firm
About the Client
The client is a Swiss private investment management office serving the interests of a European family. It focuses primarily on long-term strategies to deliver high-quality investment management services.
The client's core challenge was a large estate of legacy monolithic applications that were complex and lacked scalability. In addition, the absence of historical data storage and of real-time reporting created bottlenecks to improving performance. The key challenges were:
Difficulty debugging and recovering the numerous applications and scripts after a failure.
Lack of appropriate tools and technologies to offer scalability and flexibility to the architecture.
Complexity in incorporating new data sources and in ensuring data quality.
Inability to carry out forecasting or analytics, due to the lack of historical data storage and the absence of real-time failure reporting.
The modernization of the platform was planned in two phases:
Laying out the platform:
Define and select appropriate tools with clear segregation of responsibilities.
Selection of new tools to schedule jobs and ETL tools for mapping data.
Removal of custom applications and scripts.
Selection of issue tracking software for real-time reporting of failures.
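The case study does not name the scheduler that was ultimately selected, so the following is only a minimal sketch of what phase one set out to achieve: jobs with explicit dependencies, run in order, with every failure reported the moment it happens and downstream jobs skipped rather than left to fail obscurely. The `Job` class and rule names are illustrative assumptions, not the firm's actual tooling.

```python
from dataclasses import dataclass, field

# Hypothetical job definition -- stands in for whatever scheduler was chosen.
@dataclass
class Job:
    name: str
    action: callable
    depends_on: list = field(default_factory=list)

def run_pipeline(jobs):
    """Run jobs in dependency order; report each failure in real time
    and skip any job whose upstream dependency failed."""
    done, failed, report = set(), set(), []
    pending = list(jobs)
    while pending:
        progressed = False
        for job in list(pending):
            if any(d in failed for d in job.depends_on):
                failed.add(job.name)            # upstream failed: skip
                report.append((job.name, "skipped"))
                pending.remove(job)
                progressed = True
            elif all(d in done for d in job.depends_on):
                try:
                    job.action()
                    done.add(job.name)
                    report.append((job.name, "ok"))
                except Exception as exc:        # surface failures immediately
                    failed.add(job.name)
                    report.append((job.name, f"failed: {exc}"))
                pending.remove(job)
                progressed = True
        if not progressed:
            raise ValueError("cyclic or unsatisfiable dependencies")
    return report
```

Compared with the legacy setup of opaque custom scripts, the value of this shape is that the failure report names exactly which job broke and which jobs were skipped because of it.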
Building on top of the platform:
Introduction of enterprise data management tool - Markit EDM.
Introduction of data warehouse and data lake for historical data with Talend.
Build analytical tools for data analysis, reporting, and prediction using Tableau & Python.
Markit EDM was introduced to enhance data quality and capture historical data. Data from various sources is loaded through Markit EDM, where several checks are performed to ensure data quality. The data is then stored and exported to the ThinkFolio trading system and other downstream systems. Coforge was engaged in the development and maintenance of the Markit EDM components.
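Markit EDM is a proprietary platform, so the checks it actually runs are not shown here; the sketch below only illustrates the pattern the paragraph describes, namely gating inbound records on quality rules and quarantining failures instead of exporting them. The field names and rules are illustrative assumptions.

```python
# Illustrative quality rules -- not the firm's actual Markit EDM checks.
REQUIRED_FIELDS = ("security_id", "price", "currency")

def validate_record(record):
    """Return the list of rule violations for one inbound record."""
    errors = []
    for f in REQUIRED_FIELDS:
        if record.get(f) in (None, ""):
            errors.append(f"missing {f}")
    price = record.get("price")
    if isinstance(price, (int, float)) and price <= 0:
        errors.append("non-positive price")
    return errors

def load_and_gate(records):
    """Split inbound data into exportable rows and quarantined rows,
    keeping the violations alongside each quarantined record so every
    data point stays traceable."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate_record(rec)
        (quarantined if errs else clean).append((rec, errs))
    return clean, quarantined
```

Keeping the violation list attached to each quarantined record is what makes the traceability claim in the results below possible: every rejected data point carries the reason it was rejected.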
Delivering More Value
The solution developed by Coforge’s team of experts resulted in:
A clean and manageable system with all artifacts under version control.
100% real-time reporting of failures and problems in the system.
Reduction of downtime by 99% in case of failure.
80% lower overheads when incorporating new data sources.
Better data quality and 100% traceability of individual data points.
Reduction of reporting time from one week to a few clicks.
Use of custom algorithms to produce analytics data.
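The custom algorithms themselves are not disclosed in the case study; as a minimal, standard-library-only sketch of the general shape of such analytics, the function below computes a rolling moving average over a price history, producing the kind of derived series that could feed a Tableau dashboard. The function name and window parameter are assumptions for illustration.

```python
from statistics import fmean

def moving_average(prices, window):
    """Rolling mean over `window` points; windows shorter than
    `window` at the start of the series are skipped."""
    if window <= 0:
        raise ValueError("window must be positive")
    return [fmean(prices[i - window:i]) for i in range(window, len(prices) + 1)]
```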