Tackling increasing growth in data volume, data warehouse users, and complexity of business intelligence

Client & Case Background

A leading Japanese-headquartered financial services group with an integrated global network spanning over 30 countries. It serves the needs of individuals, institutions, corporates, and governments through four business divisions: Retail, Asset Management, Wholesale (Global Markets and Investment Banking), and Merchant Banking.

Business Unit Overview

The Technology Service division provides solutions and support for all middle-office and back-office functions, Investment Banking, and other corporate divisions. The user base spans Operations (Asset Servicing, Reconciliation, Trade Booking, Settlements, Payments & Confirmations), Finance (Product Control, Finance Control, Tax & Treasury), Reference Data Services (Products, Books & Counterparties), Risk, Investment Banking, and Compliance. This division approached Coforge for assistance with a data technology solution.

Problem Statement

For a financial institution, accuracy of data is paramount, so data quality must take centre stage. The client needed to:

  • Test large volumes of data derived from multiple global sources and loaded into the main data warehouse.
  • Produce audit-trail documentation and reporting for compliance with regulations and data quality standards.
  • Engage a data technology expert to deploy a data quality solution that would scale to support growth in data volume, data warehouse users, and the complexity of business intelligence.
  • Automate and simplify its manual development processes to improve development and deployment, speeding time to market and reducing the cost of doing business.
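As a rough illustration of the first two requirements (not the client's actual implementation), an automated data-quality gate that validates incoming rows and writes an audit trail might look like the following Python sketch; the rule set, field names, and sample feed are all hypothetical:

```python
from datetime import datetime, timezone

# Hypothetical rule set: every trade row needs a positive notional
# and a currency from a known reference list.
KNOWN_CCYS = {"USD", "EUR", "JPY", "GBP"}

def validate_rows(rows):
    """Run simple quality checks; return (clean_rows, audit_trail)."""
    clean, audit = [], []
    for i, row in enumerate(rows):
        errors = []
        try:
            if float(row["notional"]) <= 0:
                errors.append("non-positive notional")
        except (KeyError, ValueError):
            errors.append("missing or invalid notional")
        if row.get("currency") not in KNOWN_CCYS:
            errors.append("unknown currency")
        # Every row gets an audit record, whether it passed or failed.
        audit.append({"row": i,
                      "checked_at": datetime.now(timezone.utc).isoformat(),
                      "errors": errors})
        if not errors:
            clean.append(row)
    return clean, audit

feed = [
    {"trade_id": "T1", "notional": "1000000", "currency": "JPY"},
    {"trade_id": "T2", "notional": "-5", "currency": "USD"},
    {"trade_id": "T3", "notional": "250000", "currency": "XXX"},
]
clean, audit = validate_rows(feed)
print(len(clean))                            # 1 row passes
print(sum(1 for a in audit if a["errors"]))  # 2 rows flagged for review
```

The audit list can then be serialised and retained to satisfy the documentation requirement, while only the clean rows continue into the warehouse.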

Solution:

After a discovery phase and a successful proof of concept (POC), Coforge deployed its team of Core Data Managers and an ETL (Extract, Transform, Load) delivery team, streamlining multiple data collection processes and adding automation tools to save time and cost.

ETL tools collect, read, and migrate large volumes of raw data from multiple structured or unstructured data sources across disparate platforms. They load that data into a single database, data store, or data warehouse for easy access and retrieval. Our Data Managers ensure that the quality of the process and the data is never compromised on its journey to the cloud or internal data warehouses.
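A minimal sketch of this consolidation step, under hypothetical assumptions (two regional feeds with different shapes, and an in-memory SQLite database standing in for the warehouse):

```python
import sqlite3

# Hypothetical feeds from two regional source systems with different shapes.
tokyo_feed = [("T-100", "JPY", 1_000_000.0), ("T-101", "JPY", 2_500_000.0)]
london_feed = [{"id": "L-200", "ccy": "GBP", "amount": 750_000.0}]

def extract():
    """Normalise both feeds into a common (trade_id, currency, notional) shape."""
    for trade_id, ccy, notional in tokyo_feed:
        yield (trade_id, ccy, notional)
    for row in london_feed:
        yield (row["id"], row["ccy"], row["amount"])

def load(rows):
    """Load the normalised rows into a single warehouse table."""
    db = sqlite3.connect(":memory:")  # stand-in for the real warehouse
    db.execute("CREATE TABLE trades (trade_id TEXT PRIMARY KEY,"
               " currency TEXT, notional REAL)")
    db.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    db.commit()
    return db

db = load(extract())
count = db.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
print(count)  # 3 rows consolidated into one table
```

Real ETL tooling adds scheduling, incremental loads, and error handling on top of this basic extract-normalise-load flow.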

One of the tools widely used by BFSI clients for compliance is Axiom, which also serves as a one-stop solution for ETL challenges. It is a difficult tool to master, but with its deep Risk and BFSI domain experience, Coforge has seasoned Axiom experts on its teams.

Additionally, a robust DevOps governance program with measurable key performance indicators was put in place for the client.

Our bespoke solution delivery team leverages the continuous integration (CI) and continuous delivery (CD) features of DevOps, which reduces deployment time, so teams have more time for testing and quality control. Through CI, developers integrate code into a shared repository several times a day, enabling quick error detection and mitigation. With CD, teams produce software in short cycles, helping reduce the time and risk of delivering changes.
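Since Jenkins is listed among the project's tools, a CI/CD flow of this kind can be expressed as a declarative Jenkins pipeline. The sketch below is illustrative only; the stage names, build commands, and branch policy are assumptions, not the client's actual pipeline:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh './gradlew build' }    // hypothetical build command
        }
        stage('Test') {
            steps { sh './gradlew test' }     // quick feedback on every commit
        }
        stage('Deploy') {
            when { branch 'main' }            // deliver in short cycles from main
            steps { sh './deploy.sh staging' }
        }
    }
}
```

Each commit to the shared repository triggers the pipeline, so integration errors surface within minutes rather than at release time.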

To get the maximum benefit and further improve management decision making, our team used virtualisation and containerisation.

Virtualisation enables our DevOps teams to develop and test within simulated environments. Development can therefore occur alongside real-time testing of how changes will impact the entire system. This level of accuracy in testing makes for vastly reduced deployment times and increased stability in such a complex system.

Containerisation furthers the concept of virtualisation by providing a digital configuration that mimics not only the hardware setup but also the OS and libraries that make up the entire runtime environment.
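With Docker in the project's toolset, packaging the runtime environment (OS base, libraries, and application) typically takes the form of a Dockerfile. This is a hedged sketch; the base image, dependency file, and entry point are hypothetical:

```dockerfile
# Hypothetical image for an ETL job; names are illustrative only.
FROM python:3.11-slim
WORKDIR /app

# Install the pinned library dependencies first, so this layer is cached.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code and define its entry point.
COPY . .
CMD ["python", "etl_job.py"]
```

Because the image bundles the OS libraries and Python runtime together, the same container behaves identically in development, test, and production.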

Outcomes

  • Faster service delivery: Agile releases that kept pace with the demands of rapid global teams
  • Visibility across data: Ensured global compliance and sensitive data accuracy
  • Cost-effective service: Increased team productivity and performance with fewer resources
  • Improved legacy technology: Seamless modernisation of the legacy technology stack
  • Bespoke solution: Built to the specific needs of the client
  • 7 times lower change failure rate
  • 106 times faster lead time from commit to deploy
  • 2,604 times faster recovery from failures and unexpected incidents
  • 208 times higher deployment frequency

Technologies used in this project:

  • Jenkins: CI/CD automation
  • Kubernetes: Container orchestration
  • Docker: Software containerisation
  • Grafana: Dashboards, charting, and graphs
  • Prometheus: Metrics collection and monitoring
  • Ansible: Automating Docker deployments
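To illustrate the last item, automating Docker with Ansible usually means a playbook that pulls an image and runs the container on target hosts, using the `community.docker` collection. The sketch below is an assumption-laden example; the host group, registry, and image names are invented:

```yaml
# Hypothetical playbook; host group, registry, and image names are illustrative.
- name: Deploy the ETL job container
  hosts: etl_workers
  become: true
  tasks:
    - name: Pull the latest image
      community.docker.docker_image:
        name: registry.example.com/etl-job:latest
        source: pull

    - name: Run the container
      community.docker.docker_container:
        name: etl-job
        image: registry.example.com/etl-job:latest
        restart_policy: unless-stopped
```

Running the playbook across the inventory keeps every worker on the same image version without manual `docker` commands on each host.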