To enable visibility and tracking of functional test results, Parasoft SOAtest generates HTML reports and XML output with results that can be published into continuous integration systems as well as to Parasoft’s centralized reporting server for additional reporting and analytics.
In agile environments, new features and functionality are created at high speed, and automated delivery pipelines push new offerings out to market. Continuous feedback is vital to understand levels of risk as products are delivered quickly. Without continuously testing and validating constant code change, the organization risks delayed release schedules, defects leaking into the final product, and customers finding bugs.
Parasoft SOAtest aggregates test results from all of your functional testing disciplines (e.g., mobile, UI, API, and database) and presents them in an easy-to-understand, centralized dashboard where you can view your test results in the context of the project and its requirements, enabling stakeholders to make quality decisions the moment the application is ready to go.
By simply executing tests in automation, stakeholders get real-time information about details that matter, such as individual component failures, performance degradation, test stability, or risky code changes. Parasoft’s advanced analytics use data from SOAtest along with code coverage and requirements traceability, for example, to understand the impact of a defect in correlation with its underlying code change. All of this means faster feedback and reduced risk.
When you’re executing hundreds of tests at a time, with some passing and some failing, you need context to make the most effective use of your time. Parasoft SOAtest helps users prioritize actions based on the results of test execution, with actionable reports that address quality in the application.
Users can publish functional test results into a centralized reporting dashboard, where the test results are summarized in easy-to-understand, dynamic widgets that enable navigation of all test results and execution details. Test failures can be assigned to different team members, who can then import failures assigned to them back into SOAtest.
With Parasoft SOAtest, users can automatically identify whether the test environment is ready for testing, and if it isn’t, what actions need to be taken to stabilize the test environment. To do this, SOAtest collects system-level information and looks for outages to help the user understand whether these will affect their tests.
Parasoft SOAtest helps users identify gaps in their API testing strategy by identifying services that were not fully exercised during test execution. SOAtest’s API Coverage report shows how each of the services and operations were tested and provides an end-point view of test results.
In Parasoft’s centralized reporting server, a dynamic coverage dashboard helps users explore which SOAtest test cases covered which lines of code. Armed with this information, testers can create the most optimized strategy for testing every one of their APIs.
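The idea behind this kind of gap analysis can be illustrated with a small sketch. The function and service/operation names below are hypothetical examples, not SOAtest’s actual API: finding untested operations amounts to a set difference between what a service definition declares and what the test run actually exercised.

```python
# Hypothetical sketch of API coverage gap analysis; the operation names
# below are illustrative and not taken from SOAtest itself.

def find_untested_operations(defined_ops, exercised_ops):
    """Return the operations declared by a service but never hit by a test."""
    return sorted(set(defined_ops) - set(exercised_ops))

# Operations declared in a (hypothetical) service definition:
defined = ["getAccount", "createAccount", "updateAccount", "deleteAccount"]
# Operations the functional test suite actually called:
exercised = ["getAccount", "createAccount"]

print(find_untested_operations(defined, exercised))
# → ['deleteAccount', 'updateAccount']
```

A real coverage report adds much more detail (per-operation pass/fail status, line-level code coverage), but the gap list is the starting point for deciding where to add tests.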
Users can seamlessly integrate SOAtest test results into their application’s build-deploy integration test process by automatically publishing SOAtest’s functional test results into CI systems (e.g., Jenkins, Atlassian Bamboo, JetBrains TeamCity, and Microsoft Visual Studio Team Services) in the same familiar way that those systems already present unit test results.
Parasoft SOAtest’s rich and dynamic reporting system enables multiple stakeholders to understand the health of critical applications, with meaningful and actionable tasks displayed in various forms, from a printable PDF report to a dynamic multi-tiered HTML report outlining which tests executed, their status (successes and failures), and which requirements they are associated with. All reporting styles are highly customizable to the individual.
Parasoft can uniquely correlate functional and nonfunctional test results with the underlying API and code coverage, so users can understand the impact of code changes in the context of their functional testing strategy, immediately analyzing where risky functionality needs to be tested as well as getting a holistic view of the entire software development process.
In addition to sending data back into the CI infrastructure, test results can be published to Parasoft DTP’s reporting and analytics dashboard for aggregation with quality data from across the development process and correlation with agile planning and test management systems such as JIRA, CollabNet VersionOne, QMetry, and Micro Focus ALM.
Parasoft SOAtest’s HTML reports contain all the information you need to understand the completeness of your test coverage, as well as diagnose testing failures. The API Coverage report correlates test results to the end-points and services being tested, providing you with a view of missing or incomplete testing and the pass/fail state of test scenarios. In addition, the underlying traffic can be captured and reported to help diagnose, offline, test failures from automated test runs executed as part of CI/CD pipelines.
In addition to comprehensive HTML reporting that can be archived and viewed directly within your CI platform, results from automated test execution can be immediately tied back into the reporting infrastructure of the most popular CI platforms through dedicated plugins for Jenkins, Bamboo, TeamCity, and Microsoft Azure DevOps. These results can be used to automatically pass or fail the build when quality standards are not met.
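A minimal sketch of that kind of quality gate is shown below. It assumes results have been exported in a JUnit/xUnit-style XML format, which is what CI platforms typically consume; the file name `report.xml` and the exact schema here are assumptions for illustration, not SOAtest specifics.

```python
# Minimal sketch of a CI quality gate: count failures in a JUnit/xUnit-style
# XML results file so the pipeline can fail the build when any are present.
# The file name and schema are assumptions, not SOAtest specifics.
import xml.etree.ElementTree as ET

def count_failures(report_path):
    """Sum failures + errors across <testsuite> elements in a JUnit-style report."""
    root = ET.parse(report_path).getroot()
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    return sum(int(s.get("failures", 0)) + int(s.get("errors", 0)) for s in suites)

# Example: write a tiny sample report, then evaluate the gate against it.
with open("report.xml", "w") as f:
    f.write('<testsuites>'
            '<testsuite name="api-smoke" tests="5" failures="1" errors="0"/>'
            '<testsuite name="api-regression" tests="9" failures="0" errors="0"/>'
            '</testsuites>')

failures = count_failures("report.xml")
print(f"{failures} failing test(s)")
# In a real pipeline, exiting non-zero (e.g. sys.exit(1)) when failures > 0
# is what marks the build as failed.
```

The dedicated CI plugins do this wiring for you; the point of the sketch is only that build gating reduces to reading the published results and returning a non-zero exit status on failure.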
Ensuring that you are delivering high quality applications to the market isn’t just about creating a test suite and making sure it passes. You need the complete view of quality across all the stages of the software development process. Parasoft’s reporting and analytics dashboard enables the aggregation of API test results with other testing practices such as static analysis, unit testing, and coverage analysis, and then correlates these metrics back to the requirements and user stories to give you a complete and continuous view of quality.