Importantly, you need to specify the location of the SBST-benchmark Framework's jar file in the script:

```sh
#! /bin/sh
# Specify the dir where you put this script:
TOOL_HOME=somewhere/my_tool_X
# Specify the location of the benchmarks.list configuration file:
CONF=$TOOL_HOME/benchmarks.list
# Specify the location of the SBST-benchmark Framework's jar file:
JAR=$TOOL_HOME/benchmarktool-0.0.2-jar-with-dependencies.jar
# Specify the locations of jacoco, pitest, and junit:
JUNIT_JAR=$TOOL_HOME/junit-4.10.jar
PITEST_JAR=$TOOL_HOME/pitest-0.31.jar
JACOCO_JAR=$TOOL_HOME/jacocoagent.jar
# Specify the commands to invoke javac and java:
JAVAC_CMD=javac
JAVA_CMD=java
# This is the command to start the benchmarking:
exec $JAVA_CMD \
  -Dsbst.benchmark.jacoco="$JACOCO_JAR" \
  -Dsbst.benchmark.java="$JAVA_CMD" \
  -Dsbst.benchmark.javac="$JAVAC_CMD" \
  -Dsbst.benchmark.config="$CONF" \
  -Dsbst.benchmark.junit="$JUNIT_JAR" \
  -Dsbst.benchmark.pitest="$PITEST_JAR" \
  -jar "$JAR" "$@"
```
You can see that we also need the following libraries: jacoco (to measure code coverage), pitest (to perform mutation testing), and junit (to compile and run the generated tests).
The benchmarking is then started as follows:

```sh
runbenchmark <benchmark> <tooldirectory> <debugoutput>
```

`<benchmark>` is the name of a benchmark-group, as you have specified in your file benchmarks.list.
Additionally, once the protocol is implemented, you need to provide a script or an executable named runtool that starts your tool and runs the protocol. The framework will invoke this script automatically, but you have to provide it; a minimal sketch follows.
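A possible shape for such a wrapper, assuming (hypothetically) that your protocol implementation is a main class my.toolx.ProtocolRunner packaged in my_tool_X.jar next to the script:

```sh
#! /bin/sh
# Minimal runtool sketch. The jar name and the main class are assumptions;
# point them at whatever implements the protocol in your tool.
TOOL_HOME="$(cd "$(dirname "$0")" && pwd)"
exec java -cp "$TOOL_HOME/my_tool_X.jar" my.toolx.ProtocolRunner "$@"
```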
The protocol is a very simple line-based protocol over the standard input and output channels. The following table describes it: each step consists of a single line of text that your side of the protocol either receives on STDIN or must send to STDOUT. A minimal Java sketch of the corresponding loop is given after the table.
STEP | MESSAGES STDIN | MESSAGES STDOUT | DESCRIPTION |
---|---|---|---|
1 | BENCHMARK | | Signals the start of a benchmark run; directory $HOME/temp is cleared |
2 | directory | | Directory with the source code of the SUT |
3 | directory | | Directory with the compiled class files of the SUT |
4 | number | | Number of entries in the class path (N) |
5 | directory/jar file | | Class path entry (repeated N times) |
6 | number | | Number of classes to be covered (M) |
7 | | CLASSPATH | Signals that the testing tool requires additional class path entries |
8 | | number | Number of additional class path entries (K) |
9 | | directory/jar file | Class path entry (repeated K times) |
10 | | READY | Signals that the testing tool is ready to receive challenges |
11 | class name | | The name of the class for which unit tests must be generated |
12 | | READY | Signals that the testing tool is ready to receive more challenges; test cases in $HOME/temp/testcases are analyzed; subsequently $HOME/temp/testcases is cleared; go to step 11 until M class names have been processed |
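To make the message sequence concrete, here is a minimal Java sketch of the tool side of the loop. Only the sequence of reads and writes follows the table above; the class name ProtocolRunner and the generateTests hook are assumptions standing in for your tool's actual logic:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class ProtocolRunner {

    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));

        // Steps 1-3: BENCHMARK marker, source directory, class-file directory.
        String benchmark = in.readLine();          // "BENCHMARK"
        String sourceDir = in.readLine();
        String binDir    = in.readLine();

        // Steps 4-5: N class path entries of the SUT.
        int n = Integer.parseInt(in.readLine());
        List<String> classPath = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            classPath.add(in.readLine());
        }

        // Step 6: number of classes to be covered.
        int m = Integer.parseInt(in.readLine());

        // Steps 7-9: declare additional class path entries needed by the tool
        // (none in this sketch, hence K = 0).
        System.out.println("CLASSPATH");
        System.out.println(0);
        System.out.flush();

        // Step 10: signal readiness for the first challenge.
        System.out.println("READY");
        System.out.flush();

        // Steps 11-12: one class name per challenge, answered with READY once
        // the generated tests have been written to $HOME/temp/testcases.
        for (int i = 0; i < m; i++) {
            String className = in.readLine();
            generateTests(className, sourceDir, binDir, classPath);
            System.out.println("READY");
            System.out.flush();
        }
    }

    // Hypothetical hook: the actual test generation of tool X goes here,
    // writing JUnit4 .java files into $HOME/temp/testcases.
    private static void generateTests(String className, String sourceDir,
                                      String binDir, List<String> classPath) {
        // ... tool-specific logic ...
    }
}
```

Flushing after every message matters: each side blocks waiting for the other's next line, so output left sitting in a buffer would stall the run.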
The tests generated by tool X should be in the form of Java files containing JUnit4 tests. The generated test cases will be compiled against the class files of the SUT (together with junit and any additional class path entries declared by the tool) and then executed to collect the coverage and mutation results.
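For reference, a generated test file could look like the following; the class under test (Foo) and the asserted behaviour are hypothetical placeholders:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical example of a generated JUnit4 test; Foo and its bar() method
// stand in for whatever class the framework asked the tool to cover.
public class FooTest {

    @Test
    public void barReturnsExpectedValue() {
        Foo foo = new Foo();
        assertEquals(42, foo.bar());
    }
}
```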