Check out the XTS code
The XTS uses autotools and can be obtained with:
git clone git://anongit.freedesktop.org/git/xorg/test/xts
cd xts
./autogen.sh
make
make run-tests
The first "make" command builds the sources; "make run-tests" executes all the tests. Results are stored in the results/ directory under the current timestamp (e.g. results/2010-04-19-18:19:08/). Note that XTS does not need to be installed; if you do want to install it, you're encouraged to pick the appropriate installation prefix for your distribution.
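If you run the suite repeatedly, finding the newest timestamped directory by hand gets tedious. Here is a small helper, not part of XTS, just a sketch assuming the results/ layout described above:

```shell
# Hypothetical helper, not shipped with XTS: print the newest
# timestamped directory under results/ (the YYYY-MM-DD-HH:MM:SS names
# sort correctly as plain strings).
latest_results() {
    ls -1d results/*/ 2>/dev/null | sort | tail -n 1
}
```

Calling latest_results then prints something like results/2010-04-19-18:19:08/, which you can pass to ls or your pager.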
For more information, see the README in the xts repository.
Running a subset of tests
The tests are divided into protocol tests, Xlib tests (one section per chapter), and extension tests. Each of these sections can be run separately with the make test-<section> command, e.g. make test-Xlib13 will only run the Xlib13-related tests.
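If you want to run several sections in one go, a tiny wrapper loop does the trick. This is a sketch: run_sections is a hypothetical helper, not part of XTS, and the section names in the usage example are just illustrations; take the real names from the tet_scen file.

```shell
# Hypothetical wrapper, not part of XTS: run "make test-<section>" for
# each section name given, stopping at the first failure.
run_sections() {
    for section in "$@"; do
        make "test-$section" || return 1
    done
}
```

Usage: run_sections Xlib13 Xlib14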
CVS version of the XTS
The following information refers to the CVS, non-autotooled version of the XTS. We do not recommend using this version.
Anonymous CVS is available from:
cvs -z3 -d:pserver:firstname.lastname@example.org:/cvs/xtest login
cvs -z3 -d:pserver:email@example.com:/cvs/xtest co xtest
cd xtest
export TET_ROOT=`pwd`
export PATH=$TET_ROOT/bin:$TET_ROOT/xts5/bin:$PATH
make
cd xts5
(This won't install anything outside of the checkout tree.) This creates an assortment of journal files with the output from the various build commands. Hopefully that will go away eventually; in the meantime, look in xts5/results. Use CFLOCAL=<cflags> to set local C compiler flags (for example -g). Note that this is only supported for xts5, not for tet.
The XTS code uses a call to _XConnectDisplay. This function is provided by libX11, but only if libX11 was compiled without XCB support (--with-xcb=no).
Build the supporting libraries and binaries
Make sure you have a fonts.dir in your fonts directory. Which format the fonts need to be in depends on which X server you're using. If you don't know what to pick, this command will probably take care of everything for you:
make -C fonts comp_pcf
(I had to edit out the *.bdf entries of the fonts.dir to make Xfake happy. With other X servers, YMMV.)
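If you need to make the same edit, something like the following can do it. This is a sketch, not part of XTS: strip_bdf_entries is a made-up name, and it assumes the standard fonts.dir layout (entry count on the first line, then one "file fontname" pair per line).

```shell
# Hypothetical helper, not part of XTS: drop all *.bdf entries from a
# fonts.dir file and rewrite the entry count on its first line.
strip_bdf_entries() {
    awk 'NR > 1 && $1 !~ /\.bdf$/ { lines[++n] = $0 }
         END { print n; for (i = 1; i <= n; i++) print lines[i] }' \
        "$1" > "$1.new" && mv "$1.new" "$1"
}
```

Usage: strip_bdf_entries fonts.dir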
Running the test cases
Start an X server to run the tests against. Remember that if you have an X server running on :0, you'll need to pick another display for this test server: let's assume you picked :1. It's also easiest to disable access control with the -ac flag. Note that you may want to run the tests with multiple screens, which can be easily accomplished using Xvfb or Xfake. Here are a few sample X server invocations:
Xvfb -screen 0 640x480x24 -screen 1 640x480x8 -ac :1
Xfake -screen 640x480x16 -screen 640x480x16 -ac :1
Now you need to make sure that tetexec.cfg matches your test X server. The xts-config script can fill in most values automatically by querying the running X server. Anything it can't handle it leaves unchanged, so you can intersperse xts-config runs with hand-edits safely, and comments are preserved as well.
DISPLAY=:1 xts-config tetexec.cfg
You still need to review tetexec.cfg after running xts-config: if your xdpyinfo and xset binaries use the same Xlib that you want to test, or your X server reports weird values, the test suite may report PASS on tests that should have FAILed. In other words, if you don't review the resulting tetexec.cfg file, your test results may be invalid.
Assuming that your X server is running locally, the default delays in tetexec.cfg are much higher than they need to be. On local tests you can safely set XT_SPEEDFACTOR and XT_RESET_DISPLAY to 1 instead of the default 5. This will make the test suite run somewhat faster.
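The edit itself is easy to script. Here is a sketch using GNU sed: the variable names are the ones mentioned above, but set_local_speed is a made-up helper name and the in-place -i flag requires GNU sed.

```shell
# Hypothetical helper, not part of XTS: set the two delay knobs in a
# tetexec.cfg-style file to 1 for local runs (requires GNU sed -i).
set_local_speed() {
    sed -i -e 's/^XT_SPEEDFACTOR=.*/XT_SPEEDFACTOR=1/' \
           -e 's/^XT_RESET_DISPLAY=.*/XT_RESET_DISPLAY=1/' "$1"
}
```

Usage: set_local_speed tetexec.cfg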
Finally, you can run the full test suite using
tcc -e xts5
You can run subsets of the test suite by picking a subset from the tet_scen file, and then passing the subset name as a parameter
tcc -e xts5 <subset>
A (re)build of all scenarios can be started with
tcc -b xts5
Running and Debugging Individual Test Cases
To run a single assertion, use the run_assert script:
run_assert XGetGeometry 4
To debug a test case, run its Test binary under gdb with the right configuration:
cd tset/Xlib5/gtgmtry
TET_CONFIG=tetexec.cfg gdb Test
Alternatively, one can run:
cd xtest/xts5/tset/path/to/testcase
xtest/xts5/src/bin/scripts/pt -i 3   # run testcase t003() of the Test binary
Reviewing the Results
Use the vswrpt command (which should now be in your path) to see the results of the test run.
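For a quick tally before reading the full report, you can grep the journal file directly. A sketch: summarize_journal is a made-up name, and it assumes the result words PASS and FAIL appear on journal lines, as they do in TET-style journals.

```shell
# Hypothetical helper, not part of XTS: count PASS and FAIL lines in a
# journal file and print a one-line summary.
summarize_journal() {
    printf 'PASS: %s  FAIL: %s\n' \
        "$(grep -c PASS "$1")" "$(grep -c FAIL "$1")"
}
```

Usage: summarize_journal results/<timestamp>/journal (adjust the path to wherever your journal file lives).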
Cleaning the test cases
tcc -c
make clean