Verifyter was granted a US patent on March 27, 2015 ("Method and Apparatus for Automatic Diagnosis of Software Failures") that protects the novel technology on which its automatic debugger PinDown is based.

A second patent application within the same field is currently pending.

Presented "Standard Regression Testing Does Not Work" at DVCon, San Jose, on March 4, 2015, where I showed what is wrong with today's approach to regression testing and how to fix it.

PinDown presentation from DVCon March 2015. View PDF

Presented our automatic debug solution at Microprocessor Test and Verification, December 15, 2014, in Austin, TX, as part of a special session on failure triage. Other interesting presentations were from Zissis Poulos (University of Toronto), who presented a formal engine approach to automatic debug, and from Brian Hickerson (IBM), who presented "Failure Triage and Debug in the Real World".

I also participated in a panel called "Regression: Resource Blackhole" on how to stop the ever-increasing need for computer and human resources in regression testing. I put forward my opinion that there is a wasteful disconnect: we automatically kick off tests, but we manually debug the failures. The result is a list of test failures that must be debugged by hand, many of which point to the same underlying bug. Worse, because there is no connection between launching tests and debugging failures, a new regression run may be kicked off before the previous failures have even been analyzed.

Panelists: Daniel Hansson (CEO, Verifyter), Fergus Casey (Sr Manager, Synopsys ARC), Harry Foster (Chief Verification Scientist, Mentor Graphics), JL Gray (Sr Architect, Cadence), Olly Stephens (Engineering Systems Architect, ARM) and Adam Abadir (DFx Execution Lead, AMD)

The solution is to automate the whole flow: both the launch of test suites and the debugging of the test failures. With automatic debug of regression failures, bugs get fixed faster and with less human effort, which in turn reduces the chance that the next regression run finds the same failures, a waste of computer resources. Because the launching of tests and the automatic debug are integrated into the same flow, new tests are not launched until debug has finished; launching earlier would, in most cases, only spend compute rediscovering an already-known failure status. An integrated flow with both automatic test launches and automatic debug thus saves both human and computer resources.
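The integrated flow described above can be sketched in a few lines. This is a minimal illustration under assumed names (`run_regression`, `debug_failures`, `regression_cycle` and the test records are all hypothetical, not the actual PinDown implementation); it shows the two key ideas: debug runs before the next launch is allowed, and failures caused by the same underlying bug are grouped so each bug is handled once.

```python
def run_regression(tests):
    """Launch the test suite; return the list of failing tests."""
    return [t for t in tests if not t["passes"]]

def debug_failures(failures):
    """Group failures by the underlying bug that explains them, so
    each bug is diagnosed and fixed once, not once per failing test."""
    bugs = {}
    for f in failures:
        bugs.setdefault(f["bug"], []).append(f["name"])
    return bugs  # bug id -> failing tests it explains

def regression_cycle(tests):
    """One integrated cycle: launch tests, then finish debug before
    any new regression run is kicked off."""
    failures = run_regression(tests)
    if not failures:
        return {}
    return debug_failures(failures)

# Three failing tests but only two underlying bugs: manual triage
# would inspect three failures; grouping leaves two bugs to fix.
suite = [
    {"name": "test_a", "passes": True,  "bug": None},
    {"name": "test_b", "passes": False, "bug": "BUG-1"},
    {"name": "test_c", "passes": False, "bug": "BUG-1"},
    {"name": "test_d", "passes": False, "bug": "BUG-2"},
]
print(len(regression_cycle(suite)))  # prints 2: unique bugs, not raw failures
```

The point of gating the next launch on `regression_cycle` completing is exactly the resource argument made above: re-running the suite before the known bugs are fixed would reproduce the same failure status at full compute cost.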