
Ensuring operational stability during a Passenger Service System migration to Navitaire New Skies for an Australian airline

The Client

An Australian airline providing point-to-point air travel services to passengers.

Problem and Scope

  • Business assurance of the Passenger Service System (PSS)
  • Migration from Navitaire’s New Skies version 3.3 to version 3.4
  • Partition change from TAH Singapore to a TT-only partition
  • Refactoring of existing integrations with third-party systems
  • Implementation of new third-party systems

Key Challenges

  • Functional and business requirements were not available as a test basis or for work forecasting: leveraged the base test case repository and domain knowledge, augmented with SME reviews.
  • Some applications were not accessible or were owned by SMEs: Coforge verified test results and validated sample test cases to certify test coverage and correctness.
  • Data migration testing was not performed by the PSS owner: data was validated through application testing, comparison of the two schemas, and changes to migrated bookings for a sample percentage of bookings.
  • Frequent changes in scope with timelines intact: scope replanning, stringent monitoring of scope, resources, and timelines, and frequent management escalation.
  • UAT 2 scope was reduced by 30%, while UAT 3 scope increased by over 50%.
  • Deviation from the planned integration testing strategy, shifting to SME testing: Coforge verified SME test results and carried out sample validation.
  • A single environment for development, testing, and SMEs was shared by all stakeholders: a test data management plan was circulated and followed stringently.
  • Delayed builds caused uneven team utilization, with periods of peaks and troughs.
  • Dependencies on TT for E2E testing caused delays.
  • Configuration issues encountered after data refreshes impacted execution.
  • The UAT payment gateway was down intermittently, resulting in team idle time; test cases that failed at the final payment step had to be re-executed.
  • Itinerary email performance and connectivity were erratic, with emails working only intermittently; the function went offline for an entire day, and time was spent on escalations and follow-up to bring the emails back online.
  • Bugs from UAT 2/UAT 3 were not fixed, delaying regression and leaving the team idle for a day.
  • PNRs required for data validation were delayed due to SME availability, which delayed the testing of data comparison scenarios.
  • E2E test cases were not reviewed as planned during the test design phase; a parallel review was organized during test execution to proceed with execution.
  • Issues were observed in itinerary emails for delete-flight scenarios, arrival dates, payment details, bar codes, declined bookings, and change-booking cases for IBE and MMB.
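The schema-to-schema data comparison described above can be sketched as follows. This is a minimal illustration only: the table and column names are hypothetical stand-ins, not the actual New Skies schema, and in-memory SQLite databases stand in for the source (3.3) and migrated (3.4) partitions.

```python
import sqlite3

# Hypothetical source (pre-migration) and target (migrated) booking tables.
# Schema and data are invented for illustration.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE bookings (pnr TEXT PRIMARY KEY, pax INTEGER, fare REAL)")

src.executemany("INSERT INTO bookings VALUES (?, ?, ?)",
                [("ABC123", 2, 199.0), ("DEF456", 1, 89.5), ("GHI789", 4, 410.0)])
# Simulate a migration defect: one booking did not carry over.
dst.executemany("INSERT INTO bookings VALUES (?, ?, ?)",
                [("ABC123", 2, 199.0), ("DEF456", 1, 89.5)])

def compare_sample(src, dst, pnrs):
    """Compare a sample of bookings field by field across the two schemas."""
    mismatches = []
    for pnr in pnrs:
        a = src.execute("SELECT pnr, pax, fare FROM bookings WHERE pnr = ?", (pnr,)).fetchone()
        b = dst.execute("SELECT pnr, pax, fare FROM bookings WHERE pnr = ?", (pnr,)).fetchone()
        if a != b:
            mismatches.append((pnr, a, b))
    return mismatches

diffs = compare_sample(src, dst, ["ABC123", "DEF456", "GHI789"])
print(diffs)  # the booking missing after migration is flagged
```

In practice the sample of PNRs would be drawn from production bookings and the comparison would cover every migrated field, but the shape of the check, i.e. the same record fetched from both schemas and diffed, is the same.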

The Solution

  • Test portfolio analysis 
  • Created and implemented test strategies and plans
  • Project monitoring and tracking of test execution
  • Risk, issues and dependency management
  • Defect management
  • Test data management
  • Dashboard / Reporting

The Impact

  • On-time cutover to 3.4.12
  • Smooth transition without any operational delays

Key Activities

  • As-Is and To-Be architecture diagrams analyzed to decide the test approach specific to Tiger
  • Organized workshops with SMEs and Third parties to streamline the test approach
  • Capitalized on past experience to provide a customized test strategy for the program
  • Analyzed project test requirements and deadlines, and applied project management expertise, to produce the most appropriate high-level plan
  • Created integration test plans
  • Leveraged the Virgin Blue Test Designs for test case basis in absence of business and functional requirements
  • Organized SME workshops for test design augmentation and business understanding
  • Developed operations and governance model for integration testing from offshore
  • Created the testing dashboard for the program
  • Managed the risks, issues, and dependency register
  • Test management of information services testing, covering integration, data migration, and vendor system testing
  • Data validation through application testing and testing of migrated data on a sample basis, which was not part of the original scope
  • Managing Stakeholder involvement for acceptance of the system, test designs, and results sign off
  • Risk-Based Approach: Identification of critical business areas, impact areas, pain areas
  • Configuring JIRA in the client environment as a central defect management tool
  • Defect management
  • Coforge role: Process Holder, test management and governance
    • Verification of system test execution results provided by application owners
    • Validation of environment readiness and deployment via sanity tests, and execution of role-based evaluation
    • Testing the overall system developed in the UAT environment through integration testing of critical business processes
    • Design and execution of end-to-end test cases.
    • Performing the final configuration test by end-to-end flows
    • Regression test as a part of Cutover Support
    • Sanity testing during go-live
    • System Test Results Validation: Test results of various applications participating in UAT 2 and 3 were validated.
    • Go-live support: execution of test scripts during cutover
  • System Testing of business-critical Travel Commerce applications:
    • Internet Booking Engine
    • Manage My Booking
    • Web Checkin
    • Travel Agency Portal
    • Mobile Application
    • Staff Travel
    • Itinerary Emails
  • Built a set of automated and reusable test cases, providing the airline with an optimal and efficient testing solution
    • Automated IBE regression suite
    • Reusability of test designs across applications, e.g. IBE test designs customized for the TA portal and mobile application
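The reuse of one test design across the IBE, TA portal, and mobile application can be sketched as a single set of regression steps parameterized over channel adapters. Everything here is illustrative: the real suite drove the actual booking front ends, whereas the class and method names below are invented stand-ins.

```python
class BookingChannel:
    """Hypothetical stand-in for a booking front end (IBE, TA portal, mobile)."""

    def __init__(self, name):
        self.name = name
        self.bookings = {}

    def create_booking(self, pnr, pax):
        # A real adapter would drive the channel's UI or API here.
        if pax < 1:
            raise ValueError("at least one passenger required")
        self.bookings[pnr] = pax
        return pnr

    def retrieve(self, pnr):
        return self.bookings.get(pnr)


def regression_suite(channel):
    """Shared regression steps, run unchanged against every channel."""
    results = {}

    # Case 1: a booking can be created and retrieved.
    pnr = channel.create_booking("XYZ111", 2)
    results["create_and_retrieve"] = channel.retrieve(pnr) == 2

    # Case 2: an invalid booking is rejected.
    try:
        channel.create_booking("BAD000", 0)
        results["rejects_zero_pax"] = False
    except ValueError:
        results["rejects_zero_pax"] = True

    return results


# The same suite is reused across all three channels.
channels = [BookingChannel(n) for n in ("IBE", "TA portal", "Mobile")]
report = {c.name: regression_suite(c) for c in channels}
print(report)
```

Only the adapter differs per channel; the test design itself is written once, which is the efficiency the reusable-suite approach above aims for.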