As 2019 commences, I find myself reflecting on the thousands of conversations I had over the last year with QA engineers, test engineers, and managers. This past year, especially its final quarter, was dominated by conversations about accelerating testing, and in particular how to align testing strategies concurrently with development. So I had the distinct pleasure of showing many people how to reduce rework across functional and nonfunctional testing with Parasoft SOAtest, improving collaboration across teams while accelerating testing to keep pace with development.
I can confidently say that SOAtest is a complete solution for functional test creation and automation. It is the blood, sweat, and tears of over 17 years of development, produced by a company whose single, dogged focus is test automation and making testing easier and more streamlined for its customers. For end-to-end testing, it reduces the manual effort of test creation across many types of functional testing: service definition/contract tests, smoke tests, API component tests, API scenario tests, web UI tests, database tests, omnichannel tests, microservices tests, performance/load tests, and API security tests. All of these are easily automated and can be tied into your CI/CD pipeline through a command-line interface or SOAtest's award-winning REST API.
There is a lot of raving that I can do about SOAtest and its depth of technology, innovations in AI and machine learning, and the success it has had in enabling our customers to achieve their goals for quality and delivery; however, today I want to talk about the value that is unlocked when an organization uses SOAtest to bridge the gap between development, QA, and performance testing teams, to reach complete synergy in a testing organization.
So I'm just going to rip the band-aid off and lay it out there: development should test.
And I'm not just talking about unit testing (which is obviously valuable). Development should be involved in testing any new or changed APIs. I am not saying development should be building full testing scenarios, and when you really dig into it, development is most likely already doing some of the testing I'm describing here. When a new API is being built out, or an API has undergone a schema or service change, development typically creates, at a minimum, contract tests and smoke tests for each of those APIs to validate that the service contract is written according to specification, and to validate the schema (request and response) and endpoints (HTTP, MQ/JMS topic/queue, etc.).
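To make the idea concrete, here is a minimal sketch of what such a contract check does in principle. This is a hypothetical Python illustration, not SOAtest's own API: the `CONTRACT` fields, the fake responses, and the function name are all invented for the example.

```python
# Hypothetical sketch of a contract/smoke check: verify that an API
# response carries the fields and types the service contract promises.
# (Illustrative only; SOAtest generates equivalent checks from the
# definition file. All names here are invented for the example.)

CONTRACT = {
    "id": int,        # every order response must carry a numeric id
    "status": str,    # and a string status field
    "items": list,    # and a list of line items
}

def validate_against_contract(response: dict, contract: dict) -> list:
    """Return a list of violations; an empty list means the response conforms."""
    violations = []
    for field, expected_type in contract.items():
        if field not in response:
            violations.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            violations.append(f"wrong type for {field}")
    return violations

# A smoke test: canned responses stand in for live API calls.
good = {"id": 42, "status": "shipped", "items": [{"sku": "A1"}]}
bad = {"id": "42", "status": "shipped"}

assert validate_against_contract(good, CONTRACT) == []
assert validate_against_contract(bad, CONTRACT) == [
    "wrong type for id", "missing field: items"]
```

The point is that these checks are mechanical: they fall directly out of the service definition, which is why a tool can generate them instead of a person writing them by hand.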
If developers can start using the same functional testing tool to create tests, the QA team can simply leverage these tests to form the more complex test scenarios that they need to validate. So how can developers leverage Parasoft SOAtest to help accelerate testing?
With Parasoft SOAtest, developers can very easily validate their service definitions.
Developers using SOAtest can easily create tests that validate and enforce the policies of a WSDL, Swagger, RAML, etc., by consuming that service definition file. SOAtest performs schema and semantic validity checks to make sure the definition file is machine readable and consumable, validates interoperability to make sure the file adheres to industry standards for service definitions, and finally creates a regression test to verify that nothing has changed since the last test run.
These tests provide that stable foundation that QA can leverage to efficiently build a solid and resilient testing strategy (more on that in a moment).
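As a rough illustration of the kinds of checks involved, the sketch below lints a minimal OpenAPI/Swagger document: is it machine readable, and is it semantically complete? This is a hypothetical example in Python, not how SOAtest is implemented; the `lint_definition` function and the sample spec are invented for illustration.

```python
import json

# Hypothetical sketch of definition-file validation: first a
# machine-readability check (does it parse?), then simple structural
# and semantic checks against a minimal OpenAPI/Swagger document.

SPEC = """
{
  "openapi": "3.0.0",
  "info": {"title": "Orders API", "version": "1.0"},
  "paths": {
    "/orders": {"get": {"responses": {"200": {"description": "ok"}}}}
  }
}
"""

def lint_definition(raw: str) -> list:
    """Return a list of problems; an empty list means the file passes."""
    problems = []
    try:
        spec = json.loads(raw)                      # machine-readability check
    except json.JSONDecodeError as e:
        return [f"not machine readable: {e}"]
    for key in ("openapi", "info", "paths"):        # structural completeness
        if key not in spec:
            problems.append(f"missing top-level key: {key}")
    # Semantic check: every operation must declare its responses.
    for path, ops in spec.get("paths", {}).items():
        for verb, op in ops.items():
            if "responses" not in op:
                problems.append(f"{verb.upper()} {path}: no responses declared")
    return problems

assert lint_definition(SPEC) == []
assert lint_definition("{ not json")[0].startswith("not machine readable")
```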
Using Parasoft SOAtest, development can also easily create component tests that exercise and validate the individual components of a service.
With SOAtest, creating these functional smoke tests is literally as easy as uploading your definition file to SOAtest and selecting "create functional test." This parses your API automatically, creating one test for each individual service contained within that API. These tests are immediately runnable, and developers can spend minimal time confirming that any errors they receive are the expected error messages and responses.
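The "one test per service" idea can be sketched as follows. This is a hypothetical Python illustration of the concept, not SOAtest's generator; the sample spec and function name are invented, and SOAtest performs this step automatically from the uploaded definition file.

```python
import json

# Hypothetical sketch of smoke-test generation: walk the definition
# file and emit one runnable (method, path) test per declared operation.

SPEC = json.loads("""
{
  "paths": {
    "/orders":      {"get": {}, "post": {}},
    "/orders/{id}": {"get": {}, "delete": {}}
  }
}
""")

def generate_smoke_tests(spec: dict) -> list:
    """One (method, path) smoke test per operation in the definition."""
    return [(verb.upper(), path)
            for path, ops in sorted(spec["paths"].items())
            for verb in sorted(ops)]

tests = generate_smoke_tests(SPEC)
# Four operations in the spec -> four immediately runnable smoke tests.
assert len(tests) == 4
assert ("POST", "/orders") in tests
```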
Development at this point has done its job -- they have validated the basic functionality for each service, and now it's QA's turn. Testers need to create tests that go beyond basic functionality and test the actual business logic and complex scenarios of the API to discover unforeseen and unexpected behavior. Dev does a beautiful job of building it, and QA's job is to create complex scenarios meant to test the stability of the services as they function in concert. I like to look at it like this: when development has utilized SOAtest for their contract and component tests, QA comes to a kitchen already stocked with ingredients laid out ready to be mixed, blended, and assembled into a meal.
It's amazing to me how valuable this reusability of test artifacts is, and how much it can accelerate testing practices, simply by eliminating the rework of QA creating tests that have already been done by development. In the work-smart paradigm, QA starts with the stocked kitchen, and can get more done in less time. It's just logical.
Let's look at how this can accelerate testing.
QA can reuse the same component tests that the developers created in Parasoft SOAtest to make sure that everything works in a specified scenario.
Because QA already has the building blocks it needs (courtesy of development), they can scriptlessly pick and choose, with simple copy-and-paste commands, the individual components that are going to be used to test their scenario. These components can be dragged and dropped into the right order and restructured to create each scenario. The responses and information from the first test can be parameterized with a few clicks, and used to drive the second test’s request data, and so on and so forth down the line.
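The chaining described above can be sketched in code. This is a hypothetical Python illustration of the concept only: in SOAtest the parameterization is a few clicks in the UI, and here invented in-memory functions stand in for the real API calls.

```python
# Hypothetical sketch of scenario chaining: a value extracted from the
# first test's response is parameterized into the second test's request.
# (Fake in-memory services stand in for real API calls; all names are
# invented for the example.)

def create_order(payload):          # stand-in for step 1's API call
    return {"order_id": 1001, "status": "created"}

def get_order(order_id):            # stand-in for step 2's API call
    return {"order_id": order_id, "status": "created"}

def run_scenario():
    # Step 1: create an order and capture the id from its response.
    created = create_order({"sku": "A1", "qty": 2})
    order_id = created["order_id"]          # the extracted parameter

    # Step 2: the captured id drives the next request in the chain.
    fetched = get_order(order_id)

    # Scenario assertion: both steps agree on the same order.
    assert fetched["order_id"] == order_id
    return fetched

result = run_scenario()
assert result["status"] == "created"
```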
These scenario tests are created more efficiently, benefiting from the components already provided by the development team. With SOAtest, you can take this efficiency a step further, reducing even more rework by "templatizing" business logic (i.e., assertions, validations, authentication) into rules with machine learning. Reusing test logic improves the consistency of API testing while accelerating it by eliminating work that has already been completed by another team member.
One consistent struggle that arises from the gap between Development and QA is the ping-pong in communication that occurs when QA finds a defect. It is a time-consuming task to document that defect, take the screenshots, write out the exact test steps that revealed it, and then communicate all of this to development, who will often come back with the frustrating reply that it works fine in their environment.
This ping-pong between Dev and QA slows down defect remediation times and takes valuable time away from both the developer (as they struggle to recreate the test environment) and the tester (who gets caught up in a broken documentation and communication cycle rather than spending their time creating more tests).
Instead, when both teams are using Parasoft SOAtest, this communication and collaboration gap is filled by re-runnable testing scenarios, dramatically speeding up knowledge-sharing between testers and development. When a QA member finds an issue, they can quickly create a test scenario (.tst file) showcasing the behavior, which can then be shared with the development team. Development can run that test scenario on their own machine to reproduce the behavior and see the exact steps and calls that led to it, shortening defect remediation time.
QA is now working smart. They have created a consistent strategy for testing their APIs that is built upon the existing component tests created by Development, and that reduces rework by templatizing the application of business logic so it can be reused and leveraged across the testing team. But what happens when change is introduced to your applications?
Change can take many forms.
Understanding those changes is usually a giant headache for QA organizations: identifying the test cases impacted by the change, then updating and rerunning those test cases to validate that nothing has broken. Without SOAtest, this requires heavy study of the two versions of an API definition file, along with a herculean effort to work out which tests are impacted and how to edit or rewrite each one to validate the change.
SOAtest gives QA an easy way to manage and mitigate the impact of change through its Change Advisor module. Remember those Service Definition or Contract Tests that were so important for stocking QA's kitchen? Those Service Definition files come back to help with change management.
When change occurs within your API schema or services, Development updates the definition file and provides QA with the newest version. SOAtest's Change Advisor module then automatically compares the new version of the definition file with the old one and creates two maps that graphically lay out the operations and schemas of each. QA can easily identify what needs to change and, with a few simple clicks, review and update tests based on those changes. Once all changes have been reviewed, the change template can be applied to automatically bulk-refactor every existing test impacted by them.
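The underlying idea can be sketched as a diff between the two definitions' operation sets, followed by a lookup of which tests touch the changed operations. This is a hypothetical Python illustration, not Change Advisor's implementation; all names and sample data are invented, and SOAtest presents the result as graphical maps with bulk refactoring.

```python
# Hypothetical sketch of the Change Advisor idea: diff the old and new
# definition files to see which operations changed, then flag the
# existing tests that touch those operations.

OLD_OPS = {("GET", "/orders"), ("POST", "/orders"), ("GET", "/orders/{id}")}
NEW_OPS = {("GET", "/orders"), ("POST", "/orders"),
           ("GET", "/order/{id}"), ("DELETE", "/order/{id}")}

def diff_definitions(old, new):
    """Operations removed from, and added to, the service definition."""
    return {"removed": sorted(old - new), "added": sorted(new - old)}

def impacted_tests(tests, removed):
    """Tests that call a removed operation must be reviewed or refactored."""
    return [name for name, op in tests.items() if op in removed]

delta = diff_definitions(OLD_OPS, NEW_OPS)
existing_tests = {
    "smoke_get_order": ("GET", "/orders/{id}"),
    "smoke_list_orders": ("GET", "/orders"),
}

# The renamed path shows up as one removal plus two additions, and only
# the test that hit the removed operation is flagged for refactoring.
assert delta["removed"] == [("GET", "/orders/{id}")]
assert impacted_tests(existing_tests, set(delta["removed"])) == ["smoke_get_order"]
```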
QA has now done its job. Testers have created multiple complex testing scenarios meant to exercise the business logic of the API and validate the functionality of the services in concert with each other. The business logic is sound, and each use case has been tested and validated. Any defects found have been easily communicated back to development in the form of a .tst file for quick reproduction and remediation. There is a comprehensive, minimally manual strategy in place for maintaining those API tests and updating them when change occurs. Now it's time to break the application -- time for the performance testers to test the behavior of the API when 100, 500, or 1000+ users try to perform the same scenarios at the same time from varying locations around the world.
In many cases, a performance tester would need to create their own testing scenarios specifically for these conditions. Fortunately, by leveraging Parasoft SOAtest, the performance team again doesn't need to reinvent the wheel. They can use the combination of component tests created by Development and scenario tests created by QA to validate their SLAs and the timely performance of the application, all within SOAtest's Load Test module.
Within the Load Test module, existing SOAtest component or scenario tests can be easily leveraged and driven by any number of virtual users, spread across any number of slave machines, to test scenarios under different load profiles such as bell, buffer, linear, and steady load, allowing you to validate that the application behaves as expected under each type of stress.
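A load profile is, at bottom, just a schedule of how many virtual users are active at each step of the run. The sketch below illustrates three of the shapes named above as a hypothetical Python example; the functions only model the shapes themselves, not SOAtest's load engine.

```python
# Hypothetical sketch of load profiles: each profile is a schedule of
# active virtual users per time step. (Illustrative shapes only; the
# Load Test module drives real tests with these profiles.)

def steady(users, steps):
    """Constant load for the whole run."""
    return [users] * steps

def linear(max_users, steps):
    """Ramp from 0 up to max_users over the run."""
    return [round(max_users * i / (steps - 1)) for i in range(steps)]

def bell(max_users, steps):
    """Ramp up to a peak mid-run, then back down."""
    mid = (steps - 1) / 2
    return [round(max_users * (1 - abs(i - mid) / mid)) for i in range(steps)]

assert steady(500, 4) == [500, 500, 500, 500]
assert linear(1000, 5) == [0, 250, 500, 750, 1000]
assert bell(1000, 5) == [0, 500, 1000, 500, 0]
```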
“Work smart, not hard” should be the end goal of your functional testing strategy, but doing the same actions again and again has instead been the norm for testing teams when it comes to API testing. So often, I speak to QA managers and DevOps coaches who are tasked with identifying ways to accelerate testing velocity and increase collaboration, and what I've described here is the answer.
Teams can reduce rework and increase efficiency by leveraging the capabilities of SOAtest. It is easy to adopt at both the enterprise level and the single-project or startup level, as it was created to scale deftly and requires a low level of technical expertise for test creation and automation. Having one unified tool for functional testing used by Development, QA, and Performance allows for a groundbreaking level of collaboration and a reduction in rework that impacts the bottom line, reducing overall testing effort, time, and cost.
An account manager at Parasoft, Jamie works closely with customers and prospective clients. Her expertise in helping organizations reach their goals through scalable and easily adoptable test strategies has helped her customers to organically shift from waterfall to agile and DevOps methodologies.