
5 Secrets to Better Test Data Management

Learn how to build and deliver high-quality applications fast and gain a competitive advantage with test data management.

Software powers today’s modern enterprise. While DevOps and agile methodologies have allowed organizations to automate and accelerate development, QA teams are still constrained by the speed, quality, security, and cost of moving data to testing environments.

If you’re like most testing professionals, you know why a legacy approach to test data management can be a huge blocker for modern QA practices that promise higher quality and faster releases.

On average, it takes nearly 4 days to fulfill a request for a new environment, and one out of every five organizations says it takes a week or more.

[Figure: Average time to fulfill a request for a new test environment. Source: 2017 State of Test Data Management Report, Delphix]

Oftentimes, the time it takes to turn around a new environment is directly correlated with how many people are involved in the process. Our research shows that roughly 4 people are typically involved in setting up a non-production environment, and more than a third of organizations involve more than 5 administrators.

[Figure: Number of people involved in test data delivery. Source: 2017 State of Test Data Management Report, Delphix]

With data playing a central role in today’s core business applications, organizations need to ensure that data plays an equally important role in their testing efforts. Here are 5 techniques to help test teams deliver a high-quality product while managing data environments more efficiently.

Virtualize whenever possible

A manual approach to test data delivery introduces dependencies and the potential for error when restoring old data from backups, provisioning storage, or mounting databases to target servers. That’s why organizations must leverage data virtualization and increase automation to deliver data faster.

Advanced virtualization technology enables teams to automatically deliver data in minutes rather than hours, days, or weeks, even for extremely large datasets. Unlike physical data, virtual data can be provisioned very quickly and consumes a fraction of the storage space, making it easy to distribute test data to application teams. Likewise, automation drives speed, repeatability, and the ability to integrate test data management into DevOps and CI/CD workflows.
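To make this concrete, here is a minimal sketch of how a CI/CD job might request a virtual, masked copy of a source database before running integration tests. The endpoint paths, payload fields, and the TDM_API_TOKEN environment variable are illustrative assumptions, not any specific vendor’s API.

```python
"""
Hypothetical CI step: provision a virtual copy of a source database
before integration tests run. Endpoint paths, payload fields, and
environment variable names are illustrative assumptions.
"""
import os
import sys
import time

import requests  # pip install requests

TDM_API = os.environ.get("TDM_API", "https://tdm.example.com/api/v1")
TOKEN = os.environ["TDM_API_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def provision_virtual_copy(source_db: str, target_env: str) -> str:
    """Request a space-efficient virtual copy and return its job id."""
    resp = requests.post(
        f"{TDM_API}/virtual-copies",
        json={"source": source_db, "target": target_env, "masked": True},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]


def wait_for_ready(job_id: str, poll_seconds: int = 10) -> None:
    """Poll the job until the copy is ready so the test stage can start."""
    while True:
        resp = requests.get(f"{TDM_API}/jobs/{job_id}", headers=HEADERS, timeout=30)
        resp.raise_for_status()
        status = resp.json()["status"]
        if status == "ready":
            return
        if status == "failed":
            sys.exit(f"Provisioning job {job_id} failed")
        time.sleep(poll_seconds)


if __name__ == "__main__":
    job = provision_virtual_copy("orders_prod", "qa-integration")
    wait_for_ready(job)
    print("Virtual test database ready; starting test suite")
```

In practice, a step like this would sit at the start of the pipeline’s test stage so the suite always runs against fresh, space-efficient data rather than a shared, stale copy.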

Use fit-for-purpose data

Development teams often lack access to test data that is fit for purpose. A developer will frequently need a dataset as of a specific point in time, depending on the release version being tested, but the complexity of refreshing an environment often forces them to work with a stale copy of the data instead. On average, 15 percent of software defects are related to data.

[Figure: Share of software defects related to data. Source: 2017 State of Test Data Management Report, Delphix]

Most often, the best data for testing is either a subset of production data, a full copy of production, or test data that includes synthetic data. If the data isn’t of sufficient quality, it may not serve your purpose even if you know where to find it and how to use it.

Testers need access to the right type of data for each test case, which is why fit-for-purpose data increases efficiency while yielding the right test results. The bottom line: companies should leverage technologies that allow them to build and maintain a test data catalog, so developers have easy access to the appropriate dataset.
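As one way to picture such a catalog, here is a minimal sketch of an in-memory lookup that matches a test case’s needs (data domain, point in time, masked subset versus synthetic) to a registered dataset. The entries, field names, and connection strings are illustrative assumptions; a real catalog would live in a shared service or database.

```python
"""
A minimal sketch of a test data catalog lookup. All entries and fields
are illustrative assumptions, not a real inventory.
"""
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass(frozen=True)
class DatasetEntry:
    dataset_id: str
    domain: str    # e.g. "billing", "customers"
    as_of: date    # point in time the data reflects
    kind: str      # "masked_subset", "full_masked", or "synthetic"
    location: str  # connection string or storage path


CATALOG = [
    DatasetEntry("billing-2024-06-30-subset", "billing", date(2024, 6, 30),
                 "masked_subset", "postgres://qa-db/billing_q2"),
    DatasetEntry("customers-synthetic-v3", "customers", date(2024, 9, 1),
                 "synthetic", "postgres://qa-db/customers_synth"),
]


def find_dataset(domain: str, kind: str,
                 as_of: Optional[date] = None) -> Optional[DatasetEntry]:
    """Return the newest matching dataset, optionally at or before a given date."""
    matches = [e for e in CATALOG
               if e.domain == domain and e.kind == kind
               and (as_of is None or e.as_of <= as_of)]
    return max(matches, key=lambda e: e.as_of) if matches else None


if __name__ == "__main__":
    entry = find_dataset("billing", "masked_subset", date(2024, 7, 15))
    print(entry.location if entry else "No suitable dataset; request a refresh")
```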

Design security into TDM processes

Did you know that the average data breach costs an organization almost $4 million, yet only 24 percent of companies are masking their data? Security should be implemented with the same level of automation and speed as the rest of your TDM processes so it doesn’t become a bottleneck.

The vast majority of sensitive data in an enterprise exists in non-production environments used for development and testing, so test data management must implement a security model that identifies sensitive data across the entire enterprise.

However, masking sensitive data often adds operational overhead and prolongs test cycles. Your masking solution should take an automated approach to protecting non-production environments, replacing confidential information, such as social security numbers, patient records and credit card information, with fictitious yet realistic data. That way you’re minimizing security risks without compromising speed.

The masking technology should also maintain referential integrity across heterogeneous data sources while seamlessly integrating with data delivery capabilities to ensure that the data is secure before it’s made available for development and testing.
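One common way to satisfy both requirements is deterministic masking: the same real value always maps to the same fictitious value, so keys stay consistent and joins across tables and sources still line up. The sketch below illustrates the idea with a keyed hash; the key handling and output formats are simplified assumptions, not a production masking design.

```python
"""
A minimal sketch of deterministic masking: identical inputs always
produce identical fictitious outputs, so referential integrity holds
across tables and data sources. Key handling is simplified here.
"""
import hashlib
import hmac

MASKING_KEY = b"rotate-and-store-this-in-a-secrets-manager"  # assumption


def _digest(value: str) -> int:
    """Keyed hash so masked values can't be reversed without the key."""
    mac = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    return int.from_bytes(mac.digest()[:8], "big")


def mask_ssn(ssn: str) -> str:
    """Return a realistic-looking but fictitious SSN; same input, same output."""
    n = _digest(ssn) % 10**9
    return f"{n // 10**6:03d}-{(n // 10**4) % 100:02d}-{n % 10**4:04d}"


def mask_credit_card(pan: str) -> str:
    """Keep the last four digits for realism; replace the rest deterministically."""
    n = _digest(pan) % 10**12
    return f"{n:012d}{pan[-4:]}"


if __name__ == "__main__":
    # The same SSN appearing in two different tables masks identically,
    # so a join between, say, patients and claims still lines up.
    print(mask_ssn("123-45-6789"), mask_ssn("123-45-6789"))
    print(mask_credit_card("4111111111111111"))
```

Because the mapping is repeatable, a record masked in one system and a related record masked in another still join on the same fictitious identifier.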

Without the right tools to safeguard your data, you expose the business to sensitive data loss, as well as to legal consequences and sizable fines for non-compliance with privacy regulations.

Empower testers with self-service

Creating an environment with the appropriate test data is typically a slow, manual, high-touch process. QA teams need the ability to provision the environments they need on demand, in minutes, without help from an administrator. Instead of relying on IT ticketing systems, an effective test data management approach puts sufficient automation in place to let QA engineers provision test data via self-service.

A self-service portal can refresh data across multiple environments and provision synchronized copies to non-production environments to streamline testing. Specifically, the self-service feature should allow teams to do the following (see the sketch after this list):

  • Instantly refresh test data to match the latest state of production

  • Rewind the data back to an original, bookmarked state after destructive testing that alters the state of the data underlying an application

  • Synchronize test data from multiple, heterogeneous sources to the exact same point in time for integration testing of composite/federated apps

  • Easily share datasets with developers or testers to improve collaboration
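Here is a minimal sketch of that self-service workflow, modeled in memory to show the sequence of refresh, bookmark, rewind, and share operations a tester would drive without filing a ticket. A real implementation would call a TDM platform’s API; all class and method names below are illustrative assumptions.

```python
"""
A minimal, in-memory model of a self-service test data workflow:
refresh, bookmark, rewind, and share. Names are illustrative
assumptions; a real tool would call a TDM platform's API.
"""
from datetime import datetime, timezone


class SelfServiceEnvironment:
    def __init__(self, name: str, source: str):
        self.name = name
        self.source = source
        self.current_state = f"{source}@initial"
        self.bookmarks: dict[str, str] = {}

    def refresh(self) -> None:
        """Bring the environment up to the latest state of production."""
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        self.current_state = f"{self.source}@{stamp}"

    def bookmark(self, label: str) -> None:
        """Record the current state so destructive tests can be undone."""
        self.bookmarks[label] = self.current_state

    def rewind(self, label: str) -> None:
        """Return the data to a previously bookmarked state."""
        self.current_state = self.bookmarks[label]

    def share(self, colleague: str) -> str:
        """Hand a teammate a reference to the exact same data state."""
        return f"{colleague} can provision a copy of {self.current_state}"


if __name__ == "__main__":
    env = SelfServiceEnvironment("qa-payments", "payments_prod")
    env.refresh()
    env.bookmark("before-destructive-run")
    # ... destructive test run mutates the data here ...
    env.rewind("before-destructive-run")
    print(env.share("integration-tester"))
```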

Leverage a platform-based approach

Adopt a test data management solution that works across the data sources you depend on. The modern enterprise relies on a heterogeneous set of sources rather than on a single type of data source.

In fact, a single application may be federated or composite in nature, fueled by data from multiple sources. It’s therefore critical that your test data management approach supports these sources and provides a standardized approach to managing, securing, and moving data across them.

An API-driven approach also lets you integrate these capabilities into key workflows that depend on automated testing and database release automation tools.
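As a rough illustration of the platform idea, the sketch below puts a single standardized interface in front of different source types, so pipelines and release automation tools only ever call the common operations. The class and method names are illustrative assumptions, not an existing product’s API.

```python
"""
A minimal sketch of a platform-style abstraction: one standardized
interface for masking and provisioning, with per-source adapters
behind it. All names are illustrative assumptions.
"""
from abc import ABC, abstractmethod


class TestDataSource(ABC):
    """Uniform operations every supported source type must provide."""

    @abstractmethod
    def provision(self, target_env: str) -> str: ...

    @abstractmethod
    def mask(self) -> None: ...


class PostgresSource(TestDataSource):
    def __init__(self, dsn: str):
        self.dsn = dsn

    def provision(self, target_env: str) -> str:
        return f"virtual Postgres copy of {self.dsn} in {target_env}"

    def mask(self) -> None:
        print(f"masking sensitive columns in {self.dsn}")


class FileStoreSource(TestDataSource):
    def __init__(self, path: str):
        self.path = path

    def provision(self, target_env: str) -> str:
        return f"virtual file set from {self.path} in {target_env}"

    def mask(self) -> None:
        print(f"masking sensitive fields in files under {self.path}")


def deliver_federated_dataset(sources: list[TestDataSource], env: str) -> None:
    """Mask, then provision, every source a composite application needs."""
    for source in sources:
        source.mask()
        print(source.provision(env))


if __name__ == "__main__":
    deliver_federated_dataset(
        [PostgresSource("postgres://prod/orders"), FileStoreSource("/data/invoices")],
        env="qa-federated",
    )
```

Because the pipeline only sees the common interface, adding a new source type means adding one adapter rather than rewriting every test workflow.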

Better test data management means faster delivery of apps and more opportunity for innovation. Advanced TDM teams that expand their portfolio of datasets can enable measurable outcomes, such as 30 percent faster releases, 15 percent fewer software defects and 100 percent adherence to data privacy laws and regulations.

Leveraging data at scale and allowing data to be accessible across all layers of an organization will help every team maximize value across the enterprise.

DevOps TDM For Dummies® Delphix Special Edition

Find out why DevOps test data management (TDM) is magic in this eBook. You'll explore how DevOps TDM enables speed, quality, and compliance — without manual provisioning or copying of any resource. And you'll get real-world use cases of test data management in financial services, retail, and human resources.

Get DevOps TDM For Dummies >>