Controlled Test Data for Payment Processing Applications
Company Size
Large Corporate
Region
- America
Country
- United States
Products
- GenRocket TDG
- FeatureFileCreatorScript
- FeatureFileGen
- SegmentDataCreatorReceiver
- SegmentMergeReceiver
Tech Stack
- API Integration
- Groovy Script
- XML
- JSON
- CSV
Implementation Scale
- Enterprise-wide Deployment
Impact Metrics
- Cost Savings
- Productivity Improvements
- Customer Satisfaction
- Digital Expertise
Technology
- Analytics & Modeling - Predictive Analytics
- Application Infrastructure & Middleware - Data Exchange & Integration
- Application Infrastructure & Middleware - Data Visualization
- Application Infrastructure & Middleware - Database Management & Storage
Applicable Industries
- Finance & Insurance
- E-Commerce
- Healthcare & Hospitals
Applicable Functions
- Quality Assurance
- Business Operations
Use Cases
- Predictive Maintenance
- Process Control & Optimization
- Fraud Detection
- Regulatory Compliance Monitoring
- Remote Asset Management
Services
- System Integration
- Software Design & Engineering Services
- Testing & Certification
- Training
About the Customer
The customer in this case study is a major financial services company in the United States that processes billions of debit and credit card transactions annually, representing trillions of dollars in purchases and payments. The company manages millions of cards issued to both business and consumer cardholders, so the accurate, efficient, and secure processing of these payments is crucial to its operations. The company operates in a highly sophisticated environment where payment processing software must support a variety of vertical markets, including restaurants, hospitality, and e-commerce. The software must also handle a wide assortment of card categories, incentive and loyalty programs, credit histories, and spending limits for both consumer and business accounts. Given the complexity and volume of transactions, the company faces significant challenges in ensuring its payment processing applications are rigorously tested for defects, compliance with data interchange standards, and performance under heavy load conditions. The company is also highly concerned about data privacy and security, particularly in the context of quality assurance testing, which must be conducted without the use of any Personally Identifiable Information (PII).
The Challenge
To test their payment processing application, the QA team at this financial services company determined that its data feeds must be simulated in a highly controlled fashion. To reproduce complex transaction data feeds, the team copied a subset of their production data and prepared it for testing. Production data is attractive because it contains real transactions in the proper data interchange format. However, to prepare the data for testing, it had to be laboriously reworked by hand to create the data variations and permutations needed for test cases while removing all sensitive customer and merchant information. It took the QA staff 160 man-hours (an entire man-month) to build a single test data set, and because the data interchange format was revised every six months, the effort required for test data provisioning effectively doubled over the course of a year. The tedious provisioning process limited the variety of test data available for functional, integration, and regression testing, and the limits on the volume of data provisioned impaired the load and performance testing required to simulate heavy transaction loads. In the end, the team concluded there were too many problems associated with using production data alone for testing purposes. Their rationale is summarized below.

Production data is not controlled data
Without manual modification, test data copied from production can only test for conditions represented by a given data subset. It does not provide the QA team with the data needed to test edge-case conditions, the presence of invalid data values, or specific input value combinations that might uncover software defects. To maximize code coverage under all potential operating conditions, test data must be controlled to simulate data feeds that contain all of the data variations required by each test case and its assertions.

Production data is not secure data
Business and IT leaders at this financial services company were very concerned about data privacy. The risk of a data breach exposing sensitive customer credit information was too great given the legal and financial consequences. This risk was compounded by the fact that much of the testing was performed by offshore contract resources, limiting internal control over the handling of sensitive customer data.

Secure, high-volume production test data is not practical
Data masking is the conventional approach for mitigating the security risks of working with production data. However, masking all of the PII contained in the transaction data feeds used by payment processing systems is a monumental task. Transaction data feeds are complex, nested, fixed-file data structures that contain control codes, record types, accumulated transaction values, and calculations for reward points and cash-back incentives, along with real cardholder and merchant account numbers and credit information. Finding and masking the sensitive information in this complex data stream while preserving the referential integrity of the data values is both daunting and time consuming.
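To make the masking difficulty concrete, consider just one field: the card number (PAN). A masked value must still pass the Luhn check that downstream validation applies, and the same real PAN must always map to the same masked PAN to preserve referential integrity across records. The Groovy sketch below is illustrative only and is not the company's tooling; it shows a deterministic, Luhn-preserving masking routine for this single field, one of the many such routines a full masking effort would require (the maskPan helper and the sample secret are invented for the example).

```groovy
import java.security.MessageDigest

// Illustrative sketch only: deterministic, Luhn-valid masking of one card number.
// A real masking pipeline must repeat this kind of per-field work for every PII
// field in every segment of the transaction feed, keeping references consistent.

// Compute the Luhn check digit for a numeric payload (all digits except the last).
int luhnCheckDigit(String digits) {
    int sum = 0
    boolean dbl = true                        // double every second digit from the right
    for (int i = digits.length() - 1; i >= 0; i--) {
        int d = Character.getNumericValue(digits.charAt(i))
        if (dbl) { d *= 2; if (d > 9) d -= 9 }
        sum += d
        dbl = !dbl
    }
    return (10 - (sum % 10)) % 10
}

// Deterministically derive a masked PAN: keep the issuer prefix (first 6 digits),
// replace the account portion with hash-derived digits, recompute the check digit.
String maskPan(String pan, String secret) {
    String prefix = pan.substring(0, 6)
    byte[] hash = MessageDigest.getInstance('SHA-256')
                               .digest((secret + pan).getBytes('UTF-8'))
    StringBuilder middle = new StringBuilder()
    for (int i = 0; middle.length() < 9; i++) {   // 9 middle digits of a 16-digit PAN
        middle.append(Math.abs(hash[i] % 10))
    }
    String body = prefix + middle.toString()
    return body + luhnCheckDigit(body)
}

def masked = maskPan('4539578763621486', 'per-environment-secret')
assert masked.length() == 16
println masked    // the same input always yields the same Luhn-valid masked PAN
```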
The Solution
The team then evaluated the GenRocket TDG platform and the use of real-time synthetic test data to meet their needs. They presented their requirements to GenRocket, and within three weeks GenRocket provided a fully working proof of concept. First, GenRocket created a custom test data generator to recreate the “feature file” used to control test case conditions. This generator works in combination with custom test data receivers that format the data to match the company’s data interchange specification. A custom script was then created to implement an API integration between the company’s testing tools and the GenRocket TDG platform, along with test data scenarios containing instructions for generating test data in the volume and variety needed for comprehensive testing. GenRocket worked closely with Cognizant, one of its premier testing partners, to produce an operational test environment that was ready for immediate use by the testing team.

Here is a summary of the steps taken to set up the new test data provisioning platform:
- First, Cognizant and GenRocket used the financial company’s data model to create GenRocket domains and attributes that simulate the payment processing database.
- The Cognizant team then used GenRocket data generators to model the company’s business data for each GenRocket attribute.
- GenRocket created a custom FeatureFileGen generator for reading “feature file” data into GenRocket attributes.
- The GenRocket team then implemented custom data receivers to create formatted data.
- Together, GenRocket and Cognizant created GenRocket test data scenarios that use the above components to consume “feature file” data and produce the test data output.
- Finally, the GenRocket team created a Groovy script that uses the GenRocket API to orchestrate the entire process (a sketch of this flow follows the component list below).

The new custom GenRocket components created for this solution are as follows:
- FeatureFileCreatorScript: Generates a “feature file” of 1 to 1,000,000 rows or more.
- FeatureFileGen: The GenRocket generator used to query columns in a “feature file”.
- SegmentDataCreatorReceiver: Creates the various segment files that represent the many data elements used in a typical payment transaction process.
- SegmentMergeReceiver: Merges multiple segment files in the proper sequence and hierarchy to produce a consolidated payment transaction file.
- GenRocket API Script (300 lines): Integrates the test data generation process with test cases and ensures the proper relationships of data in a dynamic data hierarchy.
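The case study does not reproduce the 300-line API script, so the outline below is a hypothetical Groovy sketch of the orchestration flow it describes: write a feature file of controlled test conditions, run a data generation scenario for each segment type, and merge the segment files in the order the interchange format requires. The ScenarioRunner class, scenario names, column names, and file paths are all invented for illustration; they are not GenRocket’s actual API.

```groovy
// Hypothetical orchestration sketch; ScenarioRunner and all scenario/file names
// below are invented for illustration and are not GenRocket's actual API.

// Step 1: write a small "feature file" of controlled test conditions.
// Each row drives one test case: card type, merchant category, amount, expected result.
def featureFile = new File('feature.csv')
featureFile.text = [
    'cardType,merchantCategory,amount,expectedResult',
    'CREDIT,RESTAURANT,125.40,APPROVED',
    'DEBIT,ECOMMERCE,0.00,DECLINED_INVALID_AMOUNT',    // edge case: zero amount
    'CREDIT,HOSPITALITY,999999.99,DECLINED_OVER_LIMIT' // edge case: over limit
].join('\n')

// Step 2: run one data generation scenario per segment type, feeding each
// scenario the feature file so its output rows match the test conditions.
class ScenarioRunner {                        // stand-in for the real API client
    File run(String scenario, File features) {
        println "running ${scenario} against ${features.name}"
        return new File("${scenario}.seg")    // pretend each run emits a segment file
    }
}

def runner = new ScenarioRunner()
def segments = ['HeaderSegment', 'TransactionSegment', 'RewardsSegment', 'TrailerSegment']
        .collect { runner.run(it, featureFile) }

// Step 3: merge the segment files in the sequence and hierarchy the data
// interchange spec requires (the role SegmentMergeReceiver plays).
def merged = new File('transaction_feed.dat')
merged.withWriter { w ->
    segments.each { seg -> if (seg.exists()) { seg.eachLine { w.writeLine(it) } } }
}
println "merged ${segments.size()} segments into ${merged.name}"
```

The design point this sketch shares with the solution described above is that the feature file, not production data, is the single source of truth for test conditions, so every generated segment stays consistent with the test case that requested it.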
Every day, vast quantities of data are collected about patients as they pass through health service organizations—from operational data such as treatment history and medications to physiological data captured by medical devices. The insights hidden within this treasure trove of data can be used to support more personalized treatments, more accurate diagnosis and more advanced preparative care. But since the information is generated faster than most organizations can consume it, unlocking the power of this big data can be a struggle. This type of predictive approach not only improves patient care—it also helps to reduce costs, because in the healthcare industry, prevention is almost always more cost-effective than treatment. However, collecting, analyzing and presenting these data-streams in a way that clinicians can easily understand can pose a significant technical challenge.