
Parasoft AI Solutions Cut Testing Failures, Boost Efficiency


Parasoft, a leader in AI-powered software testing, has taken another step in strategically integrating AI and ML quality enhancements where development teams need them most – such as troubleshooting in natural language or checking code in real time.

In its campaign to create reliable, customizable testing solutions, Parasoft's investment in AI serves a singular purpose: addressing the quality risk developers and testers face as they are pressured throughout product release cycles. While some AI advancements may prove to be a passing trend, Parasoft's strategic AI investment has been a steadfast necessity, built on a decade of development. As more development teams turn to AI, the foresight of this approach underscores the company's commitment to helping users augment quality by integrating the Large Language Models (LLMs) they already use into advanced testing workflows.

“Parasoft’s latest innovations are the result of a long-term commitment to AI in software testing,” said Igor Kirilenko, Chief Product Officer, Parasoft. “We’ve been incrementally improving, one release at a time, toward a foundation that offers unparalleled reliability, choice, and control in testing workflows.”

He added, “These latest enhancements are backed by dozens of our technology patents, designed to ease the journey to fully autonomous software testing.”

In Parasoft's latest product releases, developers gain greater control and feedback at strategic stages of the software development lifecycle – from continuous code validation to robust support for various LLMs.

Faster Time to First Feedback

Live Unit Testing has been added for real-time code verification in the latest version of Jtest, Parasoft's AI-powered Java developer productivity solution. The enhancement lets developers automatically and continuously execute the unit tests impacted by code changes directly in their Integrated Development Environment (IDE). Being able to validate code changes before checking them into source control is a major time saver, resulting in fewer build and regression failures.

Parasoft's machine learning engine operates in the background, correlating recent code changes to the unit tests they impact and autonomously executing those tests in the IDE, giving the engineer continuous feedback. As an extension of Parasoft's CLI-based test impact analysis announced earlier this year, this combination offers two major benefits: it can accelerate testing feedback by 90% or more while slashing build and regression failures.
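As a rough illustration of the idea behind test impact analysis – a simplified sketch, not Parasoft's ML engine – the Java snippet below selects only the tests whose covered classes intersect the current change set so that just those tests are re-run. The class, method, and test names are hypothetical.

```java
import java.util.*;

// Simplified illustration of test impact analysis: given a map from each test class to the
// production classes it covers, select only the tests touched by the current change set.
public class TestImpactSketch {

    public static Set<String> impactedTests(Map<String, Set<String>> coverageByTest,
                                            Set<String> changedClasses) {
        Set<String> impacted = new TreeSet<>();
        for (Map.Entry<String, Set<String>> entry : coverageByTest.entrySet()) {
            // A test is impacted if it exercises at least one changed class.
            if (!Collections.disjoint(entry.getValue(), changedClasses)) {
                impacted.add(entry.getKey());
            }
        }
        return impacted;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> coverageByTest = Map.of(
                "OrderServiceTest", Set.of("OrderService", "PricingRules"),
                "InvoiceTest", Set.of("InvoiceFormatter"));
        Set<String> changed = Set.of("PricingRules");

        // Only OrderServiceTest needs to run for this change.
        System.out.println(impactedTests(coverageByTest, changed));
    }
}
```

Running only the impacted subset is what makes feedback fast enough to happen continuously in the IDE rather than waiting for a full build.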


Parasoft's Jtest and dotTEST, its solution for C#/VB.NET development, now offer Live Static Analysis to automate continuous code scanning and remediate defects as they occur. Coupled with Parasoft's AI-generated code fixes and newly extended support for LLM providers, this gives development teams a new advantage: continuous feedback on quality, security, reliability, and maintainability, all while remediating static analysis findings faster.
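To ground what "live" scanning generally means – a conceptual sketch under stated assumptions, not Parasoft's implementation – the snippet below watches a source directory and re-analyzes only the files that change. The directory path and the analyze() placeholder are assumptions for illustration.

```java
import java.nio.file.*;

// Conceptual illustration: "live" static analysis re-scans source files as they change
// rather than waiting for a full build. analyze() is a stand-in for whatever checker runs.
public class LiveScanSketch {

    public static void main(String[] args) throws Exception {
        Path sourceDir = Path.of("src/main/java");
        WatchService watcher = FileSystems.getDefault().newWatchService();
        sourceDir.register(watcher, StandardWatchEventKinds.ENTRY_MODIFY);

        while (true) {
            WatchKey key = watcher.take();                 // block until something changes
            for (WatchEvent<?> event : key.pollEvents()) {
                Path changed = sourceDir.resolve((Path) event.context());
                if (changed.toString().endsWith(".java")) {
                    analyze(changed);                      // re-check only the edited file
                }
            }
            key.reset();
        }
    }

    private static void analyze(Path file) {
        // Placeholder: a real implementation would run static analysis rules on the file
        // and surface findings (and suggested fixes) in the IDE.
        System.out.println("Analyzing " + file);
    }
}
```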

Accelerate Learning and Troubleshooting with AI Assistant

A new AI Assistant in Parasoft SOAtest and Virtualize now integrates with various LLM providers, such as OpenAI and Azure OpenAI.

Developers can ask questions in natural language and receive immediate answers about SOAtest and Virtualize. The assistant is designed to help users learn faster and troubleshoot problems more efficiently, enhancing testing workflows by bringing AI-powered support into existing tester and developer toolsets.
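For readers unfamiliar with how such assistants typically talk to an LLM provider, here is a minimal sketch of a natural-language question sent to OpenAI's chat completions API from Java. The model name, environment variable, and question are illustrative assumptions; this is not Parasoft's integration code.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AssistantQuerySketch {

    // Hosted OpenAI endpoint; Azure OpenAI or an internal gateway would use a different base URL.
    private static final String ENDPOINT = "https://api.openai.com/v1/chat/completions";
    private static final String API_KEY = System.getenv("OPENAI_API_KEY"); // assumed to be set

    public static void main(String[] args) throws Exception {
        // A natural-language question of the kind a tester might ask an AI assistant.
        String question = "How do I parameterize a REST client test with values from a CSV file?";

        // Minimal chat-completions payload; the model name is an assumption.
        String payload = """
            {
              "model": "gpt-4o-mini",
              "messages": [
                {"role": "system", "content": "You are an assistant for an API testing tool."},
                {"role": "user", "content": "%s"}
              ]
            }
            """.formatted(question);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Authorization", "Bearer " + API_KEY)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The raw JSON response contains the assistant's answer under choices[0].message.content.
        System.out.println(response.body());
    }
}
```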

Empowering Users with Choice of LLM

Customers face hurdles as they implement LLMs in their development and testing processes. To help, Parasoft has expanded support for various LLM providers in the newest releases of Jtest, dotTEST, SOAtest, and Virtualize.

By integrating their preferred LLMs with Parasoft's automated software testing solutions, teams can select the model that best fits their needs. The expanded support also addresses data security and privacy concerns by allowing on-premises deployment options.
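Conceptually, provider choice often comes down to configuration: the same client code can point at a hosted API or at an internal endpoint so prompts never leave the organization's network. The sketch below illustrates that idea with hypothetical field and method names; it does not reflect Parasoft's actual settings.

```java
// Illustrative only: a tool that supports "choice of LLM" typically abstracts the provider
// behind a common configuration, so switching from a hosted API to an on-prem deployment
// is a matter of changing the endpoint and credentials.
public record LlmProviderConfig(String name, String baseUrl, String apiKeyEnvVar, String model) {

    // Hosted provider: requests go to the vendor's public API.
    public static LlmProviderConfig openAi() {
        return new LlmProviderConfig("openai", "https://api.openai.com/v1",
                "OPENAI_API_KEY", "gpt-4o-mini");
    }

    // On-prem / self-hosted deployment: same client code, different base URL, so source
    // code and prompts stay inside the organization's network.
    public static LlmProviderConfig onPrem(String internalHost) {
        return new LlmProviderConfig("on-prem", "https://" + internalHost + "/v1",
                "INTERNAL_LLM_KEY", "local-model");
    }
}
```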

Source: PRNewswire