This package contains the class {@link Main}, which is the main entry point to the T2 framework. The framework can be used either as a console application or called from other Java classes.

The framework gives access to a number of tools. Currently there are two: the {@link Sequenic.T2.Engines.BaseEngine automated testing tool} and the {@link Sequenic.T2.Engines.Replay replay (regression) tool}. We plan to add more in the future.

T2 test algorithm

This section describes how T2's automated testing tool works. A more detailed explanation, with examples, can be found in the User Manual.

The units subjected to testing by T2 are Java classes. T2 checks the class invariant as well as the specifications of methods, if they are provided. T2 does not use a special specification language; all specifications are written in plain Java.
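To illustrate the idea of plain-Java specifications, here is a minimal sketch. The class name, the `classinv` method name, and the assertion style are illustrative assumptions; the actual conventions T2 expects are described in the User Manual.

```java
// Hypothetical sketch of plain-Java specifications; names are illustrative.
public class Counter {
    private int count;

    // Class invariant, written as an ordinary boolean method:
    // the count never goes negative.
    public boolean classinv() {
        return count >= 0;
    }

    public void increment() {
        count++;
    }

    public void decrement() {
        // Precondition, checked in plain Java.
        if (count == 0) throw new IllegalStateException("count is 0");
        int old = count;
        count--;
        // Postcondition, checked in plain Java.
        assert count == old - 1;
    }

    public int get() {
        return count;
    }
}
```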

Given a target class C, a test engine tests C by generating random 'executions'. Each execution is a sequence of steps. It starts by creating an instance of C, which takes the role of the target object. Then, at each step of the execution, the test engine either randomly updates a field of the target object, or randomly calls a method of C. When a method m is called, the engine passes the target object either as the receiver of m or as a parameter. After each step, the target object is checked against the class invariant of C, if one is specified. Furthermore, if the step calls a method, internal errors and run-time exceptions are checked for. If the method has a specification, it is checked as well.
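The execution loop above can be sketched roughly as follows. This is a minimal illustration, not T2's actual engine: the Target class, its invariant, and the step count are all assumptions made for the example.

```java
import java.util.Random;

// Illustrative target class (an assumption for this sketch, not T2 code).
class Target {
    int x;
    void inc() { x++; }
    boolean classinv() { return x >= 0; } // invariant: x never negative
}

// Minimal sketch of one random execution, as described above.
public class MiniEngine {
    public static boolean runExecution(long seed, int steps) {
        Random rnd = new Random(seed);
        Target target = new Target();            // create the target object
        for (int i = 0; i < steps; i++) {
            if (rnd.nextBoolean()) {
                target.x = rnd.nextInt(100);     // randomly update a field
            } else {
                target.inc();                    // or randomly call a method
            }
            if (!target.classinv()) {
                return false;                    // violation found
            }
        }
        return true;                             // no violation found
    }
}
```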

When a violation is found, the execution is reported. Reporting an execution requires printing the state of the objects involved at each step, which means we must be able to replay the execution. To this end we maintain a {@link Sequenic.T2.Seq.Trace meta representation} of the ongoing execution. The key property of this meta representation is that it allows us to reproduce the corresponding actual execution exactly as it was. This is important for, e.g., regression testing.

During an execution the test engine needs to generate objects, e.g. to be passed as parameters when methods are called. Since in real executions objects may be linked to each other, the test engine must be able to reuse old objects rather than always generating fresh ones. To facilitate this the test engine maintains an {@link Sequenic.T2.Pool object pool}. Whenever objects are created during an execution, they are put in the pool. When an execution needs an object, the engine can decide to simply pick one (of the right type) from the pool rather than creating a fresh one. Each object in the pool also receives a unique integer ID. This ID is important: when an object from the pool is reused, we record its ID in the meta representation of the test sequence, so that when the execution has to be reproduced (replayed) we know exactly which objects are, e.g., passed as arguments to a method call.

Whenever a new execution is started, the pool is reset. This ensures we start from a fresh pool, free from side effects of the previous execution.
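The pool mechanism described above can be sketched as follows. This is a hypothetical illustration, not the actual Sequenic.T2.Pool API; the method names are assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical sketch of an object pool with unique integer IDs.
// Reused objects are referred to by ID in the trace, so a replay can
// pick exactly the same object.
public class MiniPool {
    private final List<Object> objects = new ArrayList<>();

    /** Store a freshly created object; returns its unique ID. */
    public int put(Object o) {
        objects.add(o);
        return objects.size() - 1; // the list index doubles as the ID
    }

    /** Retrieve the object with the given ID, e.g. during replay. */
    public Object get(int id) {
        return objects.get(id);
    }

    /** Randomly pick some object of the requested type, or null if none exists. */
    public Object pickOfType(Class<?> type, Random rnd) {
        List<Object> candidates = new ArrayList<>();
        for (Object o : objects) {
            if (type.isInstance(o)) candidates.add(o);
        }
        if (candidates.isEmpty()) return null;
        return candidates.get(rnd.nextInt(candidates.size()));
    }

    /** Reset the pool at the start of a new execution. */
    public void reset() {
        objects.clear();
    }
}
```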

The algorithm for 'generating' objects is actually a bit more complicated than described above. Suppose T2's test engine has to generate an object of class E; it goes through the following steps:

  1. Since E can be an interface or an abstract class, and hence cannot be instantiated directly, the engine first consults an {@link Sequenic.T2.InterfaceMap interface map}. This map can tell the engine to generate an instance of another class E' instead, which should be a concrete implementation of E. A standard interface map is provided, but you can pass in your own custom map.

  2. Next the engine tries to find an instance of E (or E') in a {@link Sequenic.T2.BaseDomain base domain}. The domain is passed to the engine upon its creation. A base domain is essentially just a set of objects. When the engine finds an instance of E in the domain, it clones it and uses the clone.

    Because the base domain is always checked first, we can use it to limit the range of values of a class E from which the engine generates objects. For example, if the only integers in the base domain are -1 and 1, then these will be the only integers the engine generates whenever it needs one. This gives a way to constrain the range of the integers we generate. Alternatively, we can choose a base domain that supplies a random integer from the entire range of int values.

    Only cloneable objects can be put in the base domain. Cloning is necessary to make sure that objects in a base domain are safe from the engine's side effects (in contrast, objects in the pool are not, and should not be, protected from the engine's side effects). The cloning relies on serialization, so only serializable objects should be put in a base domain.

    When looking for an instance of E in a base domain, the engine will not consider instances of subclasses of E.

  3. Only when the engine cannot find an instance of E (or E') in the base domain will it either look for one in the pool or create a fresh one; this proceeds as described before.
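The three steps above can be sketched as follows. The helper names (interfaceMap, baseDomain, deepClone) and the example contents are assumptions for this illustration, not T2's real API; the pool lookup of step 3 is elided.

```java
import java.io.*;
import java.util.*;

// Sketch of the three-step generation strategy described above.
public class MiniGenerator {
    // Step 1: map interfaces/abstract classes to concrete implementations.
    private final Map<Class<?>, Class<?>> interfaceMap = new HashMap<>();
    // Step 2: the base domain, a fixed set of serializable objects.
    private final List<Serializable> baseDomain = new ArrayList<>();

    public MiniGenerator() {
        interfaceMap.put(List.class, ArrayList.class); // example mapping
        baseDomain.add(-1); // with only -1 and 1 in the domain, these are
        baseDomain.add(1);  // the only integers that will ever be generated
    }

    public Object generate(Class<?> type) throws Exception {
        // Step 1: resolve interfaces/abstract classes via the interface map.
        Class<?> concrete = interfaceMap.getOrDefault(type, type);
        // Step 2: look in the base domain (exact class only, no subclasses)
        // and hand out a clone so the original stays free of side effects.
        for (Serializable s : baseDomain) {
            if (s.getClass() == concrete) return deepClone(s);
        }
        // Step 3: otherwise create a fresh instance (pool lookup elided).
        return concrete.getDeclaredConstructor().newInstance();
    }

    // Cloning via serialization, as described above.
    static Object deepClone(Serializable s) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new ObjectOutputStream(buf).writeObject(s);
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray()));
        return in.readObject();
    }
}
```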

Non-determinism and Concurrency

T2 cannot test a class that behaves non-deterministically, e.g. one with a method that uses a random generator. For a similar reason, a class that uses multiple threads is also beyond T2's capability. The reason for this restriction is as follows. The ability to reproduce an execution from its meta representation is crucial for T2. It is used in reporting, and from the user's perspective we also want this property for, e.g., regression testing. However, we can only reproduce an execution if it contains no internal non-determinism.
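A small hypothetical example of the problem: a class whose method draws from an unseeded random generator yields different states on every run, so a recorded call sequence cannot be replayed faithfully. Fixing the seed removes the internal non-determinism, and the same call sequence then reproduces the same states.

```java
import java.util.Random;

// Internally non-deterministic: two runs of the same call sequence can
// produce different values, so a recorded execution cannot be replayed.
class Lottery {
    private final Random rng = new Random();
    int draw() { return rng.nextInt(100); }
}

// Deterministic variant: a fixed seed (an assumption for this example)
// makes every run of the same call sequence produce the same values.
class SeededLottery {
    private final Random rng = new Random(42);
    int draw() { return rng.nextInt(100); }
}
```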