

Automation for SOA Virtualization & Testing


In another post, we discussed the necessities for adopting service virtualization. This is a follow-up with a look at automation, which I've found to be critical for achieving extensibility and easy configuration, both must-haves in any service virtualization solution.

To start, you can save a significant amount of time and hassle by using a graphical interface to customize stubs with different request/response use cases, error conditions, delays, and so forth:

[Screenshot: configuring stub request/response behavior in the graphical interface]

Data-Driven Parameterization & Scripting

To quickly configure responses with a wide variety of values, you can set the stub to use automatically generated inputs for specified operations, or feed it a range of values stored in a data source. In addition, if you have scripts or code that represent a custom response, you can integrate them directly into the emulated environment. This allows you to extend the stub to mimic any level of processing—no matter how complex or sophisticated.
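To make the data-driven idea concrete, here is a minimal sketch (not Parasoft's implementation) of a stub whose responses are parameterized from a CSV data source; the column names and status values are illustrative assumptions:

```python
import csv
import io

# Hypothetical CSV "data source" driving the stub's responses.
# In a real setup this would be an external file or spreadsheet.
CSV_DATA = """customer_id,status,delay_ms
1001,APPROVED,0
1002,DENIED,250
1003,ERROR,0
"""

def load_data_source(text):
    """Index the data source rows by the request key (customer_id)."""
    return {row["customer_id"]: row for row in csv.DictReader(io.StringIO(text))}

def stub_respond(data, customer_id):
    """Return the data-driven response, with a fallback for unmatched requests."""
    row = data.get(customer_id)
    if row is None:
        return {"status": "UNKNOWN", "delay_ms": 0}
    return {"status": row["status"], "delay_ms": int(row["delay_ms"])}

data = load_data_source(CSV_DATA)
print(stub_respond(data, "1002"))  # -> {'status': 'DENIED', 'delay_ms': 250}
```

Swapping in a different CSV changes the stub's behavior without touching any code, which is the point of keeping response values in a data source rather than hard-coding them.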

To get the most out of service virtualization, you need to combine it with test data management to deliver data simulation. This approach simplifies the process of understanding what data is available by leveraging the recordings made while creating virtual services. Parasoft's test data management technology automatically generates data models from interactions in your system, and automatically infers information about the data to make it easier for non-technical users to get the test data they need.

Parasoft Virtualize provides the tools to take those data models and mask sensitive data, generate additional data for use in your virtual services, and snapshot the data so you can easily roll it forward and backward in time. This approach to test data is significantly more approachable than traditional test data management solutions: the majority of the complexity in traditional TDM comes from deriving the data model, whereas in the Parasoft ecosystem that happens automatically.
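The masking step above can be sketched in a few lines. This is a generic illustration, not Parasoft's algorithm: fields flagged as sensitive (the field names here are assumptions) are replaced with a deterministic, non-reversible token, so the same input always masks to the same value and referential consistency survives across records:

```python
import hashlib

# Assumed set of sensitive field names for this illustration.
SENSITIVE_FIELDS = {"ssn", "card_number"}

def mask_record(record):
    """Replace sensitive values with a deterministic one-way token."""
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:8]
            masked[field] = f"MASKED-{digest}"
        else:
            masked[field] = value
    return masked

record = {"name": "Pat", "ssn": "123-45-6789", "balance": 1200}
masked = mask_record(record)
print(masked["ssn"])  # non-sensitive fields pass through unchanged
```

Determinism matters here: if two recorded interactions reference the same SSN, both mask to the same token, so virtual services built from the masked data still behave consistently.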

Deployment Locations

Automation can be used to deploy the stubs locally or make them available as a service so that different teams or business partners can collaborate on evolving their components within the distributed architecture. If the emulated asset changes as the application evolves (for example, the WSDL for an emulated service is extended to include a new operation), the associated stub does not need to be re-created; it can be updated.
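The update-rather-than-recreate point can be automated with a simple drift check. A hypothetical sketch (the operation names and the idea of comparing parsed WSDL operations are illustrative assumptions, not a Parasoft API):

```python
def missing_operations(service_ops, stub_ops):
    """Report operations the evolving service defines but the stub does not yet emulate."""
    return sorted(set(service_ops) - set(stub_ops))

# e.g. operation lists parsed from the updated WSDL and the current stub definition
service_ops = ["getQuote", "submitOrder", "cancelOrder"]
stub_ops = ["getQuote", "submitOrder"]
print(missing_operations(service_ops, stub_ops))  # -> ['cancelOrder']
```

A check like this in a nightly job flags exactly which operations need to be added to the stub, instead of rebuilding the whole emulated asset from scratch.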

Parasoft Virtualize allows you to configure dedicated Virtualize servers—always-running machines that host the specified virtual assets in order to provide the appropriate team members and project stakeholders continuous, stable access to virtualized resources. With such a server, the team gains centralized virtual asset access and management. Such Virtualize servers can be accessed and managed remotely from your team’s various Virtualize desktop installations.

Deployment Strategies

You can also use automation to take advantage of different deployment strategies. One option is to make the stubs available for continuous access. Another is to integrate them into an end-to-end test suite. Parasoft provides a web interface for developers and testers to select and access virtual assets in the context of test environments. Team members can review and provision pre-configured test environments that can include different combinations of real and virtual assets (set to different states with various performance profiles, data sets, etc.). The Virtualize Administrator can decide what environments are available to different users and what configurations and options each environment provides.

End-to-End Test Suite Deployment Example

For instance, to validate a loan approval service that executes a business process workflow with multiple steps (including calling a manager approval service and another external credit rating service), you could construct a scenario with the following tests:

  • Test 1: Send a request to a loan approval service to initiate the process.
  • Test 2: Act as a stub to listen for the incoming credit rating request over HTTP and respond with the desired rating for the scenario (emulate the credit rating service response).
  • Test 3: Act as a stub to consume the manager approval message over a JMS queue and respond with approval, denial, etc.
  • Test 4: Get an asynchronous response from the loan process with the final loan result and validate it.
  • Test 5: Execute a query against a relational database to check that the loan application was properly audited in the database.
  • Test 6: Remove the application data from the database in order to restore it to the original state and make the test scenario repeatable.
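Tests 5 and 6 above can be sketched as a small database harness. This is a generic illustration using SQLite as a stand-in for the real relational database; the table and column names are assumptions, not part of the loan service:

```python
import sqlite3

def audit_is_recorded(conn, loan_id):
    """Test 5: verify exactly one audit row exists for the loan."""
    cur = conn.execute("SELECT COUNT(*) FROM loan_audit WHERE loan_id = ?", (loan_id,))
    return cur.fetchone()[0] == 1

def restore_state(conn, loan_id):
    """Test 6: remove the application data so the scenario is repeatable."""
    conn.execute("DELETE FROM loan_audit WHERE loan_id = ?", (loan_id,))
    conn.commit()

# In-memory stand-in for the application's audit database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loan_audit (loan_id TEXT, status TEXT)")
conn.execute("INSERT INTO loan_audit VALUES ('L-42', 'APPROVED')")  # simulates the workflow's audit write

assert audit_is_recorded(conn, "L-42")  # Test 5 passes
restore_state(conn, "L-42")             # Test 6 cleans up
assert not audit_is_recorded(conn, "L-42")
```

Because Test 6 returns the database to its original state, the whole scenario can run repeatedly in a continuous integration job without manual cleanup between runs.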
Written by

Rami Jaamour