
Controlled Test Data for Payment Processing Applications

Company Size
Large Corporate
Region
  • America
Country
  • United States
Product
  • GenRocket TDG
  • FeatureFileCreatorScript
  • FeatureFileGen
  • SegmentDataCreatorReceiver
  • SegmentMergeReceiver
Tech Stack
  • API Integration
  • Groovy Script
  • XML
  • JSON
  • CSV
Implementation Scale
  • Enterprise-wide Deployment
Impact Metrics
  • Cost Savings
  • Productivity Improvements
  • Customer Satisfaction
  • Digital Expertise
Technology
  • Analytics & Modeling - Predictive Analytics
  • Application Infrastructure & Middleware - Data Exchange & Integration
  • Application Infrastructure & Middleware - Data Visualization
  • Application Infrastructure & Middleware - Database Management & Storage
Applicable Industries
  • Finance & Insurance
  • E-Commerce
  • Healthcare & Hospitals
Applicable Functions
  • Quality Assurance
  • Business Operation
Use Cases
  • Predictive Maintenance
  • Process Control & Optimization
  • Fraud Detection
  • Regulatory Compliance Monitoring
  • Remote Asset Management
Services
  • System Integration
  • Software Design & Engineering Services
  • Testing & Certification
  • Training
About The Customer
The customer in this case study is a major financial services company in the United States that processes billions of debit and credit card transactions annually, representing trillions of dollars in purchases and payments. The company manages millions of cards issued to both business and consumer cardholders, so the accurate, efficient, and secure processing of these payments is crucial to its operations. The company operates in a highly sophisticated environment where payment processing software must support a variety of vertical markets, including restaurants, hospitality, and e-commerce. The software must also handle a wide assortment of card categories, incentive and loyalty programs, credit histories, and spending limits for both consumer and business accounts. Given the complexity and volume of transactions, the company faces significant challenges in ensuring its payment processing applications are rigorously tested for defects, compliance with data interchange standards, and performance under heavy load. The company is also highly concerned about data privacy and security, particularly in the context of quality assurance testing, which must be conducted without the use of any Personally Identifiable Information (PII).
The Challenge
In order to test their payment processing application, the QA team at this financial services company determined that its data feeds must be simulated in a highly controlled fashion. To reproduce complex transaction data feeds, the team copied a subset of production data and prepared it for testing. Production data is attractive because it contains real transactions in the proper data interchange format. However, preparing the data for testing meant laboriously reworking it by hand to create the data variations and permutations needed for test cases while removing all sensitive customer and merchant information. It took the QA staff 160 man-hours (an entire man-month) to build a test data set, and because the data interchange format was revised every six months, the man-hours required for test data provisioning effectively doubled over the course of a year. The tedious provisioning process limited the variety of test data available for functional, integration, and regression testing, and the limits on the volume of data provisioned were impairing their ability to perform the load and performance testing required to simulate heavy transaction loads. In the end, they concluded there were too many problems with using production data alone for testing purposes. Their rationale is summarized below.

Production data is not controlled data
Without manual modification, test data copied from production data can only test for conditions represented by a given data subset. It does not provide the QA team with the data needed to test edge-case conditions, the presence of invalid data values, or specific input value combinations that might uncover software defects. To maximize code coverage under all potential operating conditions, test data must be controlled so it simulates data feeds containing every data variation required by each test case and its assertions.
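The difference between a production sample and controlled data can be illustrated with a short sketch. The equivalence classes below (card types, boundary amounts, an intentionally invalid currency code) are hypothetical examples, not the company's actual data fields; crossing them yields every input combination a test case might assert on, coverage a production snapshot would provide only by chance.

```python
from itertools import product

# Hypothetical equivalence classes for a payment test matrix; the real
# interchange fields are not described in the case study.
card_types = ["consumer_credit", "business_credit", "consumer_debit"]
amounts = [0.00, 0.01, 999_999.99]   # boundary values, including zero
currencies = ["USD", "XXX"]          # one valid and one invalid code

# Cross every class so each combination (including edge cases and the
# invalid currency) appears exactly once in the generated data set.
rows = [
    {"card_type": c, "amount": a, "currency": cur}
    for c, a, cur in product(card_types, amounts, currencies)
]

print(len(rows))  # 3 card types x 3 amounts x 2 currencies = 18 rows
```

Scaling the same idea to full interchange records is what makes a generator-based approach attractive: the variations are specified once and regenerated on demand instead of hand-edited.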
Production data is not secure data
Business and IT leaders at this financial services company were very concerned about data privacy. The risk of a data breach exposing sensitive customer credit information was too great given the legal and financial consequences. The risk was compounded by the fact that much of the testing was performed by offshore contract resources, limiting internal control over the handling of sensitive customer data.

Secure, high-volume production test data is not practical
Data masking is the conventional approach to mitigating the security risks of working with production data. However, masking all of the PII contained in the transaction data feeds used by payment processing systems is a monumental task. Transaction data feeds are complex, nested, fixed-file data structures that contain control codes, record types, accumulated transaction values, and calculations for reward points and cash-back incentives, along with real cardholder and merchant account numbers and credit information. Finding and masking the sensitive information in this complex data stream while preserving the referential integrity of the data values is both daunting and time-consuming.
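To see why masking fixed-file feeds is hard, consider that PII is not labeled in the byte stream; it can only be located by knowing the exact layout of every record type. The layout below is invented for this sketch (real interchange formats differ and carry many record types, each with its own offsets):

```python
# Illustrative fixed-width transaction record; field names, widths, and
# the sample value are invented for this sketch.
LAYOUT = [
    ("record_type", 2),
    ("card_number", 16),   # PII: must be found and masked consistently
    ("merchant_id", 8),    # PII: masking must preserve referential integrity
    ("amount_cents", 10),
    ("reward_points", 6),
]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into named fields by offset."""
    fields, pos = {}, 0
    for name, width in LAYOUT:
        fields[name] = line[pos:pos + width]
        pos += width
    return fields

# "4111111111111111" is a standard test card number, not real PII.
record = "01" + "4111111111111111" + "MRCH0001" + "0000012599" + "000150"
parsed = parse_record(record)
print(parsed["card_number"])  # the value a masking tool must locate in every record
```

A masking tool must repeat this offset bookkeeping for every record type in the feed, and replace each occurrence of a card or merchant number with the same substitute everywhere it appears, which is what makes the task so labor-intensive at scale.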
The Solution
The team then evaluated the GenRocket TDG platform and the use of real-time synthetic test data to meet their needs. They presented their requirements to GenRocket, and within three weeks GenRocket delivered a fully working proof of concept. First, GenRocket created a custom test data generator to recreate the "feature file" used to control test case conditions. This generator works in combination with custom test data receivers that format the data to match the company's data interchange specification. A custom script was then created to implement an API integration between the company's testing tools and the GenRocket TDG platform, along with test data scenarios containing instructions for generating test data in the volume and variety needed for comprehensive testing. GenRocket worked closely with Cognizant, one of its premier testing partners, to produce an operational test environment that was ready for immediate use by the testing team. The steps taken to set up the new test data provisioning platform were as follows:
  • Cognizant and GenRocket used the financial company's data model to create GenRocket domains and attributes simulating its payment processing database.
  • The Cognizant team used GenRocket data generators to model the company's business data for each GenRocket attribute.
  • GenRocket created a custom FeatureFileGen generator to read "Feature File" data into GenRocket attributes.
  • The GenRocket team implemented custom data receivers to create formatted data.
  • Together, GenRocket and Cognizant created GenRocket test data scenarios using the above components to consume "Feature File" data and produce the test data output.
  • Finally, the GenRocket team created a Groovy script that used the GenRocket API to orchestrate the entire process.
The new custom GenRocket components created for this solution are as follows:
  • FeatureFileCreatorScript: Generates a "Feature File" of 1 to 1,000,000 rows or more.
  • FeatureFileGen: The GenRocket generator used to query columns in a "Feature File".
  • SegmentDataCreatorReceiver: Creates the various segment files that represent the many data elements used in a typical payment transaction process.
  • SegmentMergeReceiver: Merges multiple segment files in the proper sequence and hierarchy to produce a consolidated payment transaction file.
  • GenRocket API Script (300 lines): Integrates the test data generation process with test cases and ensures proper relationships of data in a dynamic data hierarchy.
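The merge step above can be sketched in a few lines. This is a minimal illustration of the general idea, assuming a fixed header/detail/trailer hierarchy; the file names, record format, and ordering are invented for the sketch and are not GenRocket's actual SegmentMergeReceiver behavior.

```python
from pathlib import Path
import tempfile

# Hypothetical hierarchy: segments must appear in this exact order in
# the consolidated output file.
SEGMENT_ORDER = ["header.seg", "detail.seg", "trailer.seg"]

def merge_segments(segment_dir: Path, out_file: Path) -> None:
    """Concatenate segment files in the prescribed sequence."""
    with out_file.open("w") as out:
        for name in SEGMENT_ORDER:  # order enforces the hierarchy
            out.write((segment_dir / name).read_text())

# Demo with throwaway files standing in for generated segment data.
with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    (d / "header.seg").write_text("H|file-header\n")
    (d / "detail.seg").write_text("D|txn-1\nD|txn-2\n")
    (d / "trailer.seg").write_text("T|record-count=2\n")
    merge_segments(d, d / "consolidated.txt")
    merged = (d / "consolidated.txt").read_text()

print(merged)
```

In the real solution each segment file is produced by a data receiver and the hierarchy is far richer, but the orchestration principle is the same: generate segments independently, then assemble them in a deterministic sequence.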
Operational Impact
  • The GenRocket solution allowed the QA team to generate test data in real-time, significantly reducing the time and effort required for test data provisioning.
  • The use of synthetic test data ensured that no Personally Identifiable Information (PII) was used during testing, addressing the company's data privacy and security concerns.
  • The custom components created by GenRocket provided the flexibility to simulate any data feed with 100% secure synthetic data, enabling comprehensive testing under various conditions.
  • The integration with Cognizant and the use of GenRocket's API allowed for seamless integration with the company's existing testing tools and frameworks.
  • The ability to provision test data on demand through a self-service model streamlined the entire testing process, improving overall efficiency and productivity.
Quantitative Benefit
  • The QA staff saved 320 man-hours per year, representing a significant cost and time savings for the organization.
  • The time to create 1 million rows of test data was reduced to less than 30 minutes.
  • The time to create 20 million rows of test data was reduced to less than 8 hours.
