A test log contains a metric, in this case a performance metric, which must be less than, greater than, or equal to a certain limit (depending on the chosen criterion) in order for the test to be considered a pass.

This example does not use multiple configurations; there is just one build, used by all tests.

Example

The test name, its performance metric and any test failure messages are available in a log file (simulate.log). The performance metric is a cycle count. The compilation result is available in a separate log file (compile.log).

logs/compile.log
Compiling project
...
Done!
CPU time 50.7 seconds
logs/simulate.log
Command: run -test dhrystone
...
HAPPY after 95899 cycles
...
CPU time 133 seconds
Note: In this example we are extracting the cycle count, but the same method can be used to extract any metric, e.g. synthesis area, maximum critical path or functional coverage.

PinDown Extraction

Step 1. Extract Build Results

In step 1 we set the configuration name to the folder where all the logs are found (see the file pindown_config.txt below). Then, by referring to the configuration label ("%configlabel%"), we extract the compilation result from the compilation log file. In this case there is only one configuration, so extracting the configuration name and doing this mapping is not strictly necessary, other than to give the configuration a name (the default name is "build") and to make the setup future-proof in case more configurations are introduced.
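
To make the mapping concrete, here is a minimal Python sketch (illustrative only, not PinDown code; the actual PinDown commands are in pindown_config.txt below) of what step 1 amounts to: the configuration label is taken from the log folder, "%configlabel%" in later paths expands to that label, and the build is classified from the compile log keywords.

import re

# Illustrative sketch of step 1 (not PinDown code). The label is the log
# folder; the trailing "/logs" is stripped later, in step 3.
config_label = "processor_nextgen/logs"

# "%configlabel%" in a path expands to the extracted label.
compile_log = "%configlabel%/compile.log".replace("%configlabel%", config_label)
# -> "processor_nextgen/logs/compile.log"

build_result = "unknown"
with open(compile_log) as f:
    text = f.read()
if "CPU time" in text:                                         # buildpass keyword
    build_result = "pass"
if re.search("Error|syntax error|Compilation aborted", text):  # buildfail keywords
    build_result = "fail"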

pindown_config.txt
group -name "test" -commands {
  // Step 1. Extract the build results
  extract -type "configlabel" -label "configlabel" -source "filename" -path "processor_nextgen/logs" -keywords "";
  extract -type "buildpass" -path "%configlabel%/compile.log" -keywords "CPU time";
  extract -type "buildfail" -path "%configlabel%/compile.log" -keywords "Error|syntax error|Compilation aborted";

  // Step 2. Extract test results. It's a fail if the simulation took >=95850 cycles or if
  // there are any failure messages. Filter out keywords that are not failure messages.
  extract -type "testname" -path "%configlabel%/simulate.log" -keywords "Command:";
  extract -type "testpass" -path "%configlabel%/simulate.log" -keywords "HAPPY after" -assign "if(<95850)" -default "unknown";
  extract -type "testfail" -path "%configlabel%/simulate.log" -keywords "HAPPY after" -assign "if(>=95850)" -default "unknown";
  extract -type "testfail" -path "%configlabel%/simulate.log" -keywords "FAILED|ERROR|Error|FROZEN_PC_ERROR|FATAL" -filter "MAX_ERRORS|ENABLE_ERROR_RESPONSE";

  // Step 3. Clean up
  extract -type "replace" -label "testname" -text ".*run \-test " -with "";
  extract -type "replace" -label "configlabel" -text "\/logs" -with "";
};
extract -group "test" -file "test_results.xml";

Step 2. Extract Test Results

In step 2 we extract the test names and results from the simulation log. A failure is here defined as the test taking >= 95850 cycles to complete. PinDown extracts the only number it finds on the line containing "HAPPY after" and compares this metric to the pass and fail criteria (if there had been two different numbers on the line, we would have had to remove one of them first). We also define the test as failing if the log contains failure messages, but we filter out some text that is not actually a failure despite containing the failure keywords.
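
As an illustration of this pass/fail decision, the equivalent logic in a Python sketch (not PinDown code, purely to show the comparison) could look like this:

import re

LIMIT = 95850                              # cycle-count limit from the config above

line = "HAPPY after 95899 cycles"          # the line matched by the keyword
numbers = re.findall(r"\d+", line)         # there must be exactly one number on the line

if len(numbers) != 1:
    test_result = "unknown"                # mirrors -default "unknown"
else:
    cycles = int(numbers[0])
    test_result = "pass" if cycles < LIMIT else "fail"

print(test_result)                         # -> "fail", since 95899 >= 95850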

Step 3. Clean Results

In step 3 the results are cleaned up so that the test name contains only the actual test name ("dhrystone") and, likewise, the configuration name contains only "processor_nextgen".
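
The two replace commands are ordinary regular-expression substitutions. In a Python sketch (illustrative only, not PinDown code), the same clean-up looks like this:

import re

test_name    = "Command: run -test dhrystone"   # as extracted in step 2
config_label = "processor_nextgen/logs"         # as extracted in step 1

test_name    = re.sub(r".*run -test ", "", test_name)   # -> "dhrystone"
config_label = re.sub(r"/logs", "", config_label)        # -> "processor_nextgen"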

The last line in pindown_config.txt is the one that actually runs the extraction defined by the group "test" and writes the result to the file test_results.xml.

Test Results

The result of running pindown_config.txt is written to the test_results.xml file.

Note that for failures, multiple lines are extracted from the log files in order to show the lines above and below the failure message, as they may contain useful information. The actual error message is shown between PinDownMatchStart and PinDownMatchEnd. For metric failures, PinDown adds an explanation within brackets describing why the metric counts as a failure.

Command: run -test dhrystone
...
HAPPY after 95899 cycles (The metric 95899 is >=95850 which is the definition of failure. This is 49 or 0.05% above limit.)
...
CPU time 133 seconds
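
The annotation in brackets above follows directly from the extracted metric and the limit. A Python sketch of the arithmetic (illustrative only; whether PinDown computes the percentage relative to the limit or to the metric is an assumption here, as both round to 0.05% in this case):

metric = 95899
limit  = 95850

above = metric - limit               # 49 cycles above the limit
pct   = 100.0 * above / limit        # about 0.0511, reported as 0.05%
print(f"This is {above} or {pct:.2f}% above limit.")
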
test_results.xml
<?xml version="1.0" encoding="UTF-8"?>
<com.verifyter.pindown.TestResultList formatOwner="Verifyter" formatVersion="1.1" tool="PinDown">
    <com.verifyter.pindown.TestResults>
        <com.verifyter.pindown.ExtractionCommands>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_pass" label="buildpass"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_fail" label="buildfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_name" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_pass" label="testpass"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_fail" label="testfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_fail" label="testfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_results" label=""/>
            <com.verifyter.pindown.ExtractionCommand command="extract_repository" label="repository"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_revision" label="revision"/>
        </com.verifyter.pindown.ExtractionCommands>
        <com.verifyter.pindown.BuildResult buildResult="%buildpass%pass" buildResultLine="processor_nextgen/logs/compile.log:CPU time 50.7 seconds">
            <com.verifyter.pindown.TestedConfiguration configurationLabel="%configlabel%processor_nextgen"/>
            <com.verifyter.pindown.Test testName="%testname%dhrystone" testResult="%testfail%fail" testResultLine="&lt;PinDownContextStart&gt;processor_nextgen/logs/simulate.log:Command: run -test dhrystone&lt;PinDownNL&gt;processor_nextgen/logs/simulate.log:...&lt;PinDownNL&gt;&lt;PinDownMatchStart&gt;processor_nextgen/logs/simulate.log:HAPPY after 95899 cycles (The metric 95899 is &gt;=95850 which is the definition of failure. This is 49 or 0.05% above limit.)&lt;PinDownMatchEnd&gt;&lt;PinDownNL&gt;processor_nextgen/logs/simulate.log:...&lt;PinDownNL&gt;processor_nextgen/logs/simulate.log:CPU time 133 seconds&lt;PinDownNL&gt;"/>
            <com.verifyter.pindown.LogFile path="processor_nextgen/logs/compile.log"/>
            <com.verifyter.pindown.LogFile path="processor_nextgen/logs/simulate.log"/>
        </com.verifyter.pindown.BuildResult>
    </com.verifyter.pindown.TestResults>
</com.verifyter.pindown.TestResultList>