Both the builds and the tests are compiled in steps. The configurations are compiled across multiple build steps. In the first build step there is no concept of configurations yet: it is a single initial compile that must succeed before the subsequent build steps, which consist of one compile per configuration, can be built.

The tests also run in steps. The first step compiles the test libs used by each configuration (there are no test names at this point, only configuration names), the second step compiles each individual test, and the third step actually runs the test.

Example

Each build step has its own compilation log. In build step 1 there is just one compile and no concept of configurations. In build steps 2 and 3 there is one compilation per configuration.

Test results are available in separate log files, which also contain the names of the configurations that the tests belong to.

compilation/compile_buildstep1.log
Build First Step
Compiling
...
Done!
CPU time 124.8 seconds
compilation/compile_buildstep2.log
Build config_A
Compiling project
...
Done!
CPU time 12.3 seconds
Build config_B
Compiling project
...
Done!
CPU time 14.7 seconds
Build config_C
Compiling project
...
CPU time 12.1 seconds
compilation/compile_buildstep3.log
Build config_A
Compiling project
...
Done!
CPU time 8.2 seconds
Build config_B
Compiling project
...
CPU time 16.2 seconds
Build config_C
Compiling project
...
CPU time 17.5 seconds
testlogs/compile_testlibs_config_a.log
Test Compilation Step 1
Build: config_A
Compiling .......
CPU time 33.7 seconds
testlogs/compile_testlibs_config_b.log
Test Compilation Step 1
Build: config_B
Compiling .......
CPU time 34.8 seconds
testlogs/compile_test_a.log
Test Compilation Step 2
Test: test1
Seed: 22442244
Build: config_A
Compiling ........
CPU time 21 seconds
testlogs/compile_test_b.log
Test Compilation Step 2
Test: test2
Seed: 12341234
Build: config_A
Compiling ........
Error: Compilation failed
CPU time 18 seconds
testlogs/compile_test_c.log
Test Compilation Step 2
Test: test3
Seed: 45674567
Build: config_A
Compiling ........
CPU time 11 seconds
testlogs/compile_test_d.log
Test Compilation Step 2
Test: test4
Seed: 31576784
Build: config_B
Compiling ........
CPU time 12 seconds
testlogs/compile_test_e.log
Test Compilation Step 2
Test: test5
Seed: 89516342
Build: config_B
Compiling ........
CPU time 16 seconds
testlogs/results_a.log
Command: run -test test1
Build: config_A
seed 22442244
Test Result: PASSED
CPU time 122.7 seconds
testlogs/results_c.log
Command: run -test test3
Build: config_A
seed 45674567
Test Result: PASSED
CPU time 217.1 seconds
testlogs/results_d.log
Command: run -test test4
Build: config_B
seed 31576784
Test not run
Test Result: FAILED
CPU time 10.9 seconds
testlogs/results_e.log
Command: run -test test5
Build: config_B
seed 89516342
Test not run
Test Result: FAILED
CPU time 9.9 seconds

PinDown Extraction

Step 1. Extract Build Step 1 Results

Build step 1 is the first compile, at which point there is not yet any concept of configurations (see pindown_config.txt below). In order to simplify debug we want to associate a default configuration name with this build, so that if the build fails this early PinDown can still debug the problem by simply requesting the default configuration name to be re-built on older revisions. During debug the default configuration name must be the name of the build that PinDown actually requested (DEBUG_CONFIGURATION), otherwise PinDown will report that the flow is broken. For example, if PinDown during debug re-runs a test failure on an older revision where build step 1 breaks because of an old compilation issue (one that has since been fixed, otherwise we would not be debugging a test failure), then by returning the requested configuration name PinDown will accept and understand the results and continue to debug.

However, during the initial test phase, when the customer’s regression test script is running, PinDown has not requested anything and consequently the variable DEBUG_CONFIGURATION is set to "all" to indicate that no specific configuration has been requested. In this case we must set the default configuration to any of the known existing configurations; in this example it is hardcoded to "config_A". All this ensures that there is always a useful configuration name that PinDown can use to debug any compilation issue that occurs this early.
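To make the intent of the first two commands concrete, here is a minimal Python sketch of the behaviour they aim for (an illustrative model only, not how PinDown implements it): the configuration label starts out as %DEBUG_CONFIGURATION%, "all" is mapped to a known configuration, and an actual requested configuration name passes through unchanged.

import re

def default_buildstep1_config(debug_configuration):
    # Model of the first two commands in pindown_config.txt below:
    # start from %DEBUG_CONFIGURATION% and map "all" to a known configuration.
    label = debug_configuration                      # extract -type "configlabel" ... -path "%DEBUG_CONFIGURATION%"
    return re.sub(r"^all$", "config_A", label)       # extract -type "replace" ... -text "^all$" -with "config_A"

print(default_buildstep1_config("all"))              # initial test phase -> "config_A"
print(default_buildstep1_config("config_B"))         # debug phase, PinDown requested config_B -> "config_B"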

pindown_config.txt
group -name "test" -commands {

  //
  // Build Results
  //
  // Step 1. Build Step 1 - Only one compilation that affects all configs in later build steps
  extract -type "configlabel" -label "buildstep1" -source "value" -path "%DEBUG_CONFIGURATION%" -keywords "";
  extract -type "replace" -label "buildstep1" -text "^all$" -with "config_A";
  extract -type "buildfail" -path "compilation/compile_buildstep1.log" -keywords "Error|error|failed";
  extract -type "buildend" -path "compilation/compile_buildstep1.log" -keywords "CPU time";

  // Step 2. Build Step 2 - Now there are several configs.
  extract -type "configlabel" -path "compilation/compile_buildstep2.log" -keywords "Build";
  extract -type "buildfail" -path "compilation/compile_buildstep2.log" -keywords "Error|error|failed";
  extract -type "buildend" -path "compilation/compile_buildstep2.log" -keywords "CPU time";
  extract -type "replace" -label "configlabel" -text ".*Build " -with "";
  extract -type "merge" -label "buildstep1,configlabel" -containing "";

  // Step 3. Build Step 3 - Another compile for each config
  extract -type "configlabel" -path "compilation/compile_buildstep3.log" -keywords "Build";
  extract -type "buildpass" -path "compilation/compile_buildstep3.log" -keywords "CPU time";
  extract -type "buildfail" -path "compilation/compile_buildstep3.log" -keywords "Error|error|failed";
  extract -type "buildend" -path "compilation/compile_buildstep3.log" -keywords "CPU time";
  extract -type "replace" -label "configlabel" -text ".*Build " -with "";
  extract -type "merge" -label "configlabel" -containing "";

  //
  // Test Results
  //
  // Step 4. Test Step 1 - Only one test code compilation that affects all tests. Set one default test name per config
  extract -type "configlabel" -label "configlabel1" -source "log" --path "testlogs/compile_testlibs.*log" -keywords "Build:";
  extract -type "replace" -label "configlabel1" -text ":.*" -with "";
  extract -type "testname" -path "%configlabel1%" -keywords "Error|error|failed";
  extract -type "testfail" -path "%configlabel1%" -keywords "Error|error|failed";
  extract -type "testend" -path "%configlabel1%" -keywords "CPU time";
  extract -type "replace" -label "testname" -text ".*" -with "%DEBUG_TESTS%";
  extract -type "move" -label "testname,teststep1" -containing "";
  extract -type "restore" -label "configlabel1" -containing "";
  extract -type "replace" -label "configlabel1" -text ".*Build: " -with "";
  extract -type "move" -label "configlabel1,configlabel" -containing "";
  extract -type "merge" -label "configlabel" -containing "";
  extract -type "replace" -label "teststep1" -text "^all$" -with "%configlabel%_all";
  extract -type "replace" -label "teststep1" -text "config_A_all" -with "test1_seed_9945674567";
  extract -type "replace" -label "teststep1" -text "config_B_all" -with "test4_seed_8845674567";
  extract -type "replace" -label "teststep1" -text "config_C_all" -with "test6_seed_7745674567";

  // Step 5. Test Step 2 - Then each test is compiled
  extract -type "configlabel" -label "configlabel2" -source "log" -path "testlogs/compile_test_.*log" -keywords "Build:";
  extract -type "replace" -label "configlabel2" -text ":.*" -with "";
  extract -type "testname" -path "%configlabel2%" -keywords "Test:";
  extract -type "testseed" -path "%configlabel2%" -keywords "Seed:";
  extract -type "testfail" -path "%configlabel2%" -keywords "Error|error|failed";
  extract -type "testend" -path "%configlabel2%" -keywords "CPU time";
  extract -type "replace" -label "testname" -text ".*Test: " -with "";
  extract -type "replace" -label "testseed" -text ".*Seed: " -with "";
  extract -type "restore" -label "configlabel2" -containing "";
  extract -type "replace" -label "configlabel2" -text ".*Build: " -with "";
  extract -type "move" -label "configlabel2,configlabel" -containing "";
  extract -type "merge" -label "configlabel" -containing "";
  extract -type "merge" -label "teststep1,testname" -containing "";

  // Step 6. Test Step 3 - The tests are run
  extract -type "configlabel" -label "configlabel3" -source "log" -path "testlogs/results.*log" -keywords "Build:";
  extract -type "replace" -label "configlabel3" -text ":.*" -with "";
  extract -type "testname" -path "%configlabel3%" -keywords "run \-test";
  extract -type "testseed" -path "%configlabel3%" -keywords "seed";
  extract -type "testpass" -path "%configlabel3%" -keywords "CPU time";
  extract -type "testfail" -path "%configlabel3%" -keywords "FAILED";
  extract -type "testend" -path "%configlabel3%" -keywords "CPU time";
  extract -type "replace" -label "testname" -text ".*run -test " -with "";
  extract -type "replace" -label "testseed" -text ".*seed " -with "";
  extract -type "restore" -label "configlabel3" -containing "";
  extract -type "replace" -label "configlabel3" -text ".*Build: " -with "";
  extract -type "move" -label "configlabel3,configlabel" -containing "";
  extract -type "merge" -label "configlabel" -containing "";
  extract -type "merge" -label "testname" -containing "";

  // Step 7. Clean the test and build times
  extract -type "replace" -label "testend" -text ".*CPU time " -with "";
  extract -type "replace" -label "testend" -text "\..*" -with "";
  extract -type "replace" -label "buildend" -text ".*CPU time " -with "";
  extract -type "replace" -label "buildend" -text "\..*" -with "";

};
extract -group "test" -file "test_results.xml";

Step 2. Extract Build Step 2 Results

In build step 2 we extract one compilation result per configuration from a single log file. Then we merge the results from Build Step 1 and 2 by specifying the two different labels used to extract the configuration names. There is only one configuration associated with the label "buildstep1", whereas there are three configurations associated with the label "configlabel" from Build Step 2. The effect of this merge is that any build failure in the single configuration from Build Step 1 is merged into all three configurations in the label "configlabel" from Build Step 2. The names of the configurations will still be those of Build Step 2, because "configlabel" is specified last in the list of labels to be merged.

If Build Step 1 failed, then all configurations of Build Step 2 would be marked as failing after this operation. However, if there are no results at all in Build Step 2, because Build Step 1 failed in a fatal way, then the configuration name and results of Build Step 1 are preserved, as there is nothing to merge with.
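As a rough illustration of what this cross-label merge achieves, the following Python sketch models the behaviour described above (a conceptual model, not PinDown internals): the single result from Build Step 1 is folded into every configuration of the last label, and if the last label is empty the Build Step 1 result is kept as-is.

def merge_across_labels(buildstep1_results, configlabel_results):
    # Model of: extract -type "merge" -label "buildstep1,configlabel"
    # Any failure in the single Build Step 1 result propagates to every
    # configuration; the configuration names of the last label are kept.
    if not configlabel_results:
        return list(buildstep1_results)   # fatal Build Step 1 failure: nothing to merge with
    step1_failed = any(b["failed"] for b in buildstep1_results)
    return [{"config": c["config"], "failed": c["failed"] or step1_failed}
            for c in configlabel_results]

step1 = [{"config": "config_A", "failed": False}]
step2 = [{"config": name, "failed": False} for name in ("config_A", "config_B", "config_C")]
print(merge_across_labels(step1, step2))   # three configurations, named as in Build Step 2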

Step 3. Extract Build Step 3 Results

In build step 3 we again extract one compilation result per configuration from one log file and then merge it with the results from Build Step 2. This type of merge is different from the merge that occurred in Build Step 2. Here we specify just one label to merge ("configlabel"), which means the merge takes place within a label (Build Step 2 and 3 use the same label for their configurations), provided the configuration names are the same. For example, in this case the build results for configuration "config_A", from both Build Step 2 and 3, are merged together such that any failure from either step is picked up.
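The within-label merge can be modelled in the same illustrative style: results that share a configuration name are folded together, and a failure in any of the merged steps marks the configuration as failing.

def merge_within_label(results):
    # Model of: extract -type "merge" -label "configlabel"
    merged = {}
    for r in results:
        merged[r["config"]] = merged.get(r["config"], False) or r["failed"]
    return [{"config": name, "failed": failed} for name, failed in merged.items()]

# config_A is built in both Build Step 2 and Build Step 3; a failure in either step survives the merge
print(merge_within_label([
    {"config": "config_A", "failed": False},   # Build Step 2
    {"config": "config_A", "failed": True},    # Build Step 3 (hypothetical failure)
]))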

Step 4. Extract Test Step 1 Results

In this step the test libs are compiled; they are not associated with any tests, but they are associated with one configuration each. We have to assign a default test name in case the test compilation fails this early on. PinDown will use this test name during debug. The default test name is set with the DEBUG_TESTS variable, which is set to whatever PinDown requested during the debug phase.

However, during the initial test phase when the customer’s regression test script is running, PinDown has not requested anything and consequently the variable DEBUG_TESTS is set to "all" to indicate that no specific test has been requested. In this case we must set the default test name to any of the known existing tests. In this example there is a different default test name for each configuration.

Note that we are using a different configuration label, configlabel1, in this step in order not to accidentally modify test results from the previous steps. Once we have done all the cleaning of the test results for this step, we move the data associated with configlabel1 over to configlabel and then merge the results.
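The chain of replace commands that produces the default test name can be sketched as follows (again just a model of the intent, using the default names hardcoded in the Step 4 replace commands): during the initial test phase DEBUG_TESTS is "all", which is expanded to "<config>_all" and then mapped to a known test, whereas during debug the requested test name passes through unchanged.

import re

# Per-configuration defaults, as hardcoded in the Step 4 replace commands
DEFAULTS = {
    "config_A_all": "test1_seed_9945674567",
    "config_B_all": "test4_seed_8845674567",
    "config_C_all": "test6_seed_7745674567",
}

def default_testlib_testname(debug_tests, configlabel):
    name = re.sub(r"^all$", configlabel + "_all", debug_tests)
    return DEFAULTS.get(name, name)

print(default_testlib_testname("all", "config_C"))                   # -> "test6_seed_7745674567"
print(default_testlib_testname("test3_seed_45674567", "config_A"))   # debug phase -> unchanged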

Step 5. Extract Test Step 2 Results

In this step we extract the results from the test compilation of each individual test. Again we modify the data in a specific label, configlabel2, to avoid affecting results from previous steps, which we are already happy with. At the end we move the data over to the label where the results of the previous steps are, configlabel, and merge it together with the data from the previous steps.

The last command in this step is a one-to-many merge, where the single test of test step 1 is merged with each and every one of the test compilation results of the same configuration. The result is that there are as many tests as there are compilation results in test step 2, but any failure in test step 1 is reflected in those results.
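A sketch of this one-to-many merge, in the same illustrative style: the single test-lib result of a configuration is merged into every individual test compiled for that configuration, so a test step 1 failure marks all of that configuration's tests as failing.

def merge_one_to_many(teststep1, testnames):
    # Model of: extract -type "merge" -label "teststep1,testname"
    merged = []
    for t in testnames:
        lib = next((l for l in teststep1 if l["config"] == t["config"]), None)
        failed = t["failed"] or (lib is not None and lib["failed"])
        merged.append({"config": t["config"], "test": t["test"], "failed": failed})
    return merged

# Hypothetical scenario: the test libs fail for config_A, so all its tests are marked as failing
libs  = [{"config": "config_A", "failed": True}]
tests = [{"config": "config_A", "test": t, "failed": False} for t in ("test1", "test2", "test3")]
print(merge_one_to_many(libs, tests))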

Step 6. Extract Test Step 3 Results

In this step we extract the actual test results after the tests have run. Again we do this extraction isolated in a specific label, configlabel3, to avoid modifying results from previous steps. The last three commands in this step move the data over to the label configlabel, merge the configurations and finally merge the test results.
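The move command can be modelled as simply relabelling data from the isolated working label to the common one, after which the merge commands apply as before. Here is a small sketch using the test4 data from this example.

def move_label(src, dst):
    # Model of: extract -type "move" -label "configlabel3,configlabel"
    # Data extracted under the isolated working label is moved over to the label
    # where the results of the previous steps already live.
    dst.extend(src)
    src.clear()
    return dst

working = [{"config": "config_B", "test": "test4", "failed": True}]    # extracted under configlabel3 (run failed)
final   = [{"config": "config_B", "test": "test4", "failed": False}]   # previous steps (compilation passed)
print(move_label(working, final))   # both entries now under configlabel, ready for the merge commands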

Step 7. Clean Results

The final thing to do is to clean the results to get clean test and build times, i.e. whole seconds, as decimals are not allowed.
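For example, the two replace commands turn "CPU time 122.7 seconds" into "122". A minimal Python model of the same two regular expressions:

import re

def clean_cpu_time(line):
    # Strip everything up to and including "CPU time ", then drop everything from the decimal point
    value = re.sub(r".*CPU time ", "", line)
    return re.sub(r"\..*", "", value)

print(clean_cpu_time("CPU time 122.7 seconds"))   # -> "122"
print(clean_cpu_time("CPU time 18 seconds"))      # -> "18 seconds" (no decimal point to cut at, as seen for test2 in the XML below)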

The last line is the one that actually runs the extraction as defined by the group "test" and writes the result to the file test_results.xml.

Test Results

The result of running pindown_config.txt is written to the test_results.xml file.

Note that test step 1 (compilation of test libs) failed for config_C. There is also one test step 2 failure (compilation failure of an individual test) which occurs for test2 in config_A. There are two test step 3 failures (failures when running the tests): test4 and test5 in config_B.

Note that for failures multiple lines are extracted from the log files in order to see the lines above and below the failure message as they may contain interesting information. The actual error message is shown between PinDownMatchStart and PinDownMatchEnd.

test_results.xml
<?xml version="1.0" encoding="UTF-8"?><com.verifyter.pindown.TestResultList formatOwner="Verifyter" formatVersion="1.1" tool="PinDown">
    <com.verifyter.pindown.TestResults>
        <com.verifyter.pindown.ExtractionCommands>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="buildstep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="buildstep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_fail" label="buildfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_end" label="buildend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_fail" label="buildfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_end" label="buildend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="buildstep1,configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_pass" label="buildpass"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_fail" label="buildfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_build_end" label="buildend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_name" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_fail" label="testfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_end" label="testend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_move" label="testname,teststep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_restore" label="configlabel1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_move" label="configlabel1,configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="teststep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="teststep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="teststep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="teststep1"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel2"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel2"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_name" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_seed" label="testseed"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_fail" label="testfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_end" label="testend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testseed"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_restore" label="configlabel2"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel2"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_move" label="configlabel2,configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="teststep1,testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_config_label" label="configlabel3"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel3"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_name" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_seed" label="testseed"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_pass" label="testpass"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_fail" label="testfail"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_test_end" label="testend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testseed"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_restore" label="configlabel3"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="configlabel3"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_move" label="configlabel3,configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="configlabel"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_merge" label="testname"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="testend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="buildend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_replace" label="buildend"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_results" label=""/>
            <com.verifyter.pindown.ExtractionCommand command="extract_repository" label="repository"/>
            <com.verifyter.pindown.ExtractionCommand command="extract_revision" label="revision"/>
        </com.verifyter.pindown.ExtractionCommands>
        <com.verifyter.pindown.BuildResult buildEnd="%buildend%17" buildResult="%buildpass%pass" buildResultLine="compilation/compile_buildstep3.log:CPU time 17.5 seconds">
            <com.verifyter.pindown.TestedConfiguration configurationLabel="%configlabel%config_C"/>
            <com.verifyter.pindown.Test testEnd="%testend%35" testName="%testname%test6" testResult="%testfail%fail" testResultLine="&lt;PinDownContextStart&gt;testlogs/compile_testlibs_config_c.log:Test Compilation Step 1&lt;PinDownNL&gt;testlogs/compile_testlibs_config_c.log:Build: config_C&lt;PinDownNL&gt;testlogs/compile_testlibs_config_c.log:Compiling .......&lt;PinDownNL&gt;&lt;PinDownMatchStart&gt;testlogs/compile_testlibs_config_c.log:Error: testlibs not found&lt;PinDownMatchEnd&gt;&lt;PinDownNL&gt;testlogs/compile_testlibs_config_c.log:CPU time 35.9 seconds&lt;PinDownNL&gt;" testSeed="%testseed%7745674567"/>
            <com.verifyter.pindown.LogFile path="compilation/compile_buildstep3.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/compile_testlibs_config_c.log"/>
        </com.verifyter.pindown.BuildResult>
        <com.verifyter.pindown.BuildResult buildEnd="%buildend%8" buildResult="%buildpass%pass" buildResultLine="compilation/compile_buildstep3.log:CPU time 8.2 seconds">
            <com.verifyter.pindown.TestedConfiguration configurationLabel="%configlabel%config_A"/>
            <com.verifyter.pindown.Test testEnd="%testend%18 seconds" testName="%testname%test2" testResult="%testfail%fail" testResultLine="&lt;PinDownContextStart&gt;testlogs/compile_test_b.log:Test Compilation Step 2&lt;PinDownNL&gt;testlogs/compile_test_b.log:Test: test2&lt;PinDownNL&gt;testlogs/compile_test_b.log:Seed: 12341234&lt;PinDownNL&gt;testlogs/compile_test_b.log:Build: config_A&lt;PinDownNL&gt;testlogs/compile_test_b.log:Compiling ........&lt;PinDownNL&gt;&lt;PinDownMatchStart&gt;testlogs/compile_test_b.log:Error: Compilation failed&lt;PinDownMatchEnd&gt;&lt;PinDownNL&gt;testlogs/compile_test_b.log:CPU time 18 seconds&lt;PinDownNL&gt;" testSeed="%testseed%12341234"/>
            <com.verifyter.pindown.Test testEnd="%testend%122" testName="%testname%test1" testResult="%testpass%pass" testResultLine="testlogs/results_a.log:CPU time 122.7 seconds" testSeed="%testseed%22442244"/>
            <com.verifyter.pindown.Test testEnd="%testend%217" testName="%testname%test3" testResult="%testpass%pass" testResultLine="testlogs/results_c.log:CPU time 217.1 seconds" testSeed="%testseed%45674567"/>
            <com.verifyter.pindown.LogFile path="compilation/compile_buildstep3.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/compile_test_b.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/results_a.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/results_c.log"/>
        </com.verifyter.pindown.BuildResult>
        <com.verifyter.pindown.BuildResult buildEnd="%buildend%16" buildResult="%buildpass%pass" buildResultLine="compilation/compile_buildstep3.log:CPU time 16.2 seconds">
            <com.verifyter.pindown.TestedConfiguration configurationLabel="%configlabel%config_B"/>
            <com.verifyter.pindown.Test testEnd="%testend%10" testName="%testname%test4" testResult="%testfail%fail" testResultLine="&lt;PinDownContextStart&gt;testlogs/results_d.log:Command: run -test test4&lt;PinDownNL&gt;testlogs/results_d.log:Build: config_B&lt;PinDownNL&gt;testlogs/results_d.log:seed 31576784&lt;PinDownNL&gt;testlogs/results_d.log:Test not run&lt;PinDownNL&gt;&lt;PinDownMatchStart&gt;testlogs/results_d.log:Test Result: FAILED&lt;PinDownMatchEnd&gt;&lt;PinDownNL&gt;testlogs/results_d.log:CPU time 10.9 seconds&lt;PinDownNL&gt;" testSeed="%testseed%31576784"/>
            <com.verifyter.pindown.Test testEnd="%testend%9" testName="%testname%test5" testResult="%testfail%fail" testResultLine="&lt;PinDownContextStart&gt;testlogs/results_e.log:Command: run -test test5&lt;PinDownNL&gt;testlogs/results_e.log:Build: config_B&lt;PinDownNL&gt;testlogs/results_e.log:seed 89516342&lt;PinDownNL&gt;testlogs/results_e.log:Test not run&lt;PinDownNL&gt;&lt;PinDownMatchStart&gt;testlogs/results_e.log:Test Result: FAILED&lt;PinDownMatchEnd&gt;&lt;PinDownNL&gt;testlogs/results_e.log:CPU time 9.9 seconds&lt;PinDownNL&gt;" testSeed="%testseed%89516342"/>
            <com.verifyter.pindown.LogFile path="compilation/compile_buildstep3.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/results_d.log"/>
            <com.verifyter.pindown.LogFile path="testlogs/results_e.log"/>
        </com.verifyter.pindown.BuildResult>
    </com.verifyter.pindown.TestResults>
</com.verifyter.pindown.TestResultList>