Linux Kernel Selftests¶
The kernel contains a set of “self tests” under the tools/testing/selftests/ directory. These are intended to be small tests to exercise individual code paths in the kernel. Tests are intended to be run after building, installing and booting a kernel.
Kselftest from mainline can be run on older stable kernels. Running tests from mainline offers the best coverage. Several test rings run the mainline kselftest suite on stable releases. The reason is that when a new test gets added to test existing code to regression test a bug, we should be able to run that test on an older kernel. Hence, it is important to keep code that can still test an older kernel and make sure it skips the test gracefully on newer releases.
You can find additional information on the Kselftest framework, and how to write new tests using the framework, on the Kselftest wiki:
https://kselftest.wiki.kernel.org/
On some systems, hot-plug tests could hang forever waiting for cpu and memory to be ready to be offlined. A special hot-plug target is created to run the full range of hot-plug tests. In default mode, hot-plug tests run in safe mode with a limited scope. In limited mode, the cpu-hotplug test is run on a single cpu as opposed to all hotplug capable cpus, and the memory hotplug test is run on 2% of hotplug capable memory instead of 10%.
kselftest runs as a userspace process. Tests that can be written/run in userspace may wish to use the Test Harness. Tests that need to be run in kernel space may wish to use a Test Module.
Documentation on the tests¶
For documentation on the kselftests themselves, see:
Running the selftests (hotplug tests are run in limited mode)¶
To build the tests:
$ make headers
$ make -C tools/testing/selftests
To run the tests:
$ make -C tools/testing/selftests run_tests
To build and run the tests with a single command, use:
$ make kselftest
Note that some tests will require root privileges.
Kselftest supports saving output files in a separate directory and then running tests. To locate output files in a separate directory two syntaxes are supported. In both cases the working directory must be the root of the kernel src. This is applicable to the “Running a subset of selftests” section below.
To build, save output files in a separate directory with O=
$ make O=/tmp/kselftest kselftest
To build, save output files in a separate directory with KBUILD_OUTPUT
$ export KBUILD_OUTPUT=/tmp/kselftest; make kselftest
The O= assignment takes precedence over the KBUILD_OUTPUT environment variable.
The above commands by default run the tests and print a full pass/fail report. Kselftest supports a “summary” option to make it easier to understand the test results. Detailed individual test results for each test can be found in /tmp/testname file(s) when the summary option is specified. This is applicable to the “Running a subset of selftests” section below.
To run kselftest with summary option enabled
$ make summary=1 kselftest
Running a subset of selftests¶
You can use the “TARGETS” variable on the make command line to specify a single test to run, or a list of tests to run.
To run only tests targeted for a single subsystem:
$ make -C tools/testing/selftests TARGETS=ptrace run_tests
You can specify multiple tests to build and run:
$ make TARGETS="size timers" kselftest
To build, save output files in a separate directory with O=
$ make O=/tmp/kselftest TARGETS="size timers" kselftest
To build, save output files in a separate directory with KBUILD_OUTPUT
$ export KBUILD_OUTPUT=/tmp/kselftest; make TARGETS="size timers" kselftest
Additionally you can use the “SKIP_TARGETS” variable on the make command line to specify one or more targets to exclude from the TARGETS list.
To run all tests but a single subsystem:
$ make -C tools/testing/selftests SKIP_TARGETS=ptrace run_tests
You can specify multiple tests to skip:
$ make SKIP_TARGETS="size timers" kselftest
You can also specify a restricted list of tests to run together with a dedicated skiplist:
$ make TARGETS="breakpoints size timers" SKIP_TARGETS=size kselftest
See the top-level tools/testing/selftests/Makefile for the list of all possible targets.
Running the full range of hotplug selftests¶
To build the hotplug tests:
$ make -C tools/testing/selftests hotplug
To run the hotplug tests:
$ make -C tools/testing/selftests run_hotplug
Note that some tests will require root privileges.
Install selftests¶
You can use the “install” target of “make” (which calls the kselftest_install.sh tool) to install selftests in the default location (tools/testing/selftests/kselftest_install), or in a user specified location via the INSTALL_PATH “make” variable.
To install selftests in default location:
$ make -C tools/testing/selftests install
To install selftests in a user specified location:
$ make -C tools/testing/selftests install INSTALL_PATH=/some/other/path
Running installed selftests¶
Found in the install directory, as well as in the Kselftest tarball, is a script named run_kselftest.sh to run the tests.
You can simply do the following to run the installed Kselftests. Please note some tests will require root privileges:
$ cd kselftest_install
$ ./run_kselftest.sh
To see the list of available tests, the -l option can be used:
$ ./run_kselftest.sh -l
The -c option can be used to run all the tests from a test collection, or the -t option for specific single tests. Either can be used multiple times:
$ ./run_kselftest.sh -c size -c seccomp -t timers:posix_timers -t timer:nanosleep
For other features see the script usage output, seen with the -h option.
Timeout for selftests¶
Selftests are designed to be quick, so a default timeout of 45 seconds is used for each test. Tests can override the default timeout by adding a settings file in their directory and setting a timeout variable there to the desired upper limit for the test. Only a few tests override the timeout with a value higher than 45 seconds; selftests strives to keep it that way. Timeouts in selftests are not considered fatal because the system under which a test runs may change and this can also modify the expected time it takes to run a test. If you have control over the systems which will run the tests you can configure a test runner on those systems to use a greater or lower timeout on the command line, with the -o or the --override-timeout argument. For example, to use 165 seconds instead one would use:
$ ./run_kselftest.sh --override-timeout 165
You can look at the TAP output to see if you ran into the timeout. Test runners which know a test must run under a specific time can then optionally treat these timeouts as fatal.
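As mentioned above, a test can also raise its own limit by shipping a settings file in its test directory. This is a plain key=value file; as a minimal sketch (the directory name here is only illustrative), a file at tools/testing/selftests/<your-test>/settings raising the limit to 165 seconds would contain:

timeout=165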
Packaging selftests¶
In some cases packaging is desired, such as when tests need to run on a different system. To package selftests, run:
$ make -C tools/testing/selftests gen_tar
This generates a tarball in the INSTALL_PATH/kselftest-packages directory. By default, .gz format is used. The tar compression format can be overridden by specifying a FORMAT make variable. Any value recognized by tar’s auto-compress option is supported, such as:
$ make -C tools/testing/selftests gen_tar FORMAT=.xz
make gen_tar invokes make install, so you can use it to package a subset of tests by using variables specified in the “Running a subset of selftests” section:
$ make -C tools/testing/selftests gen_tar TARGETS="size" FORMAT=.xz
Contributing new tests¶
In general, the rules for selftests are
Do as much as you can if you’re not root;
Don’t take too long;
Don’t break the build on any architecture, and
Don’t cause the top-level “make run_tests” to fail if your feature is unconfigured.
The output of tests must conform to the TAP standard to ensure high testing quality and to capture failures/errors with specific details. The kselftest.h and kselftest_harness.h headers provide wrappers for outputting test results. These wrappers should be used for pass, fail, exit, and skip messages. CI systems can easily parse TAP output messages to detect test results.
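As a rough, hedged illustration of those wrappers (not taken from an existing selftest; check_feature() and the include path are assumptions, and ksft_finished() is only available on recent kernels), a minimal non-harness test emitting TAP via kselftest.h might look like this:

/* Sketch only: check_feature() stands in for the code under test. */
#include "../kselftest.h"

static int check_feature(void)
{
	return 1;	/* hypothetical check; replace with real test logic */
}

int main(void)
{
	ksft_print_header();	/* TAP header line */
	ksft_set_plan(1);	/* number of test results that will be reported */

	if (check_feature())
		ksft_test_result_pass("feature behaves as expected\n");
	else
		ksft_test_result_fail("feature check failed\n");

	ksft_finished();	/* exits pass/fail based on the recorded results */
}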
Contributing new tests (details)¶
In your Makefile, use facilities from lib.mk by including it instead of reinventing the wheel. Specify flags and binary-generation rules on an as-needed basis before including lib.mk.
CFLAGS = $(KHDR_INCLUDES)
TEST_GEN_PROGS := close_range_test

include ../lib.mk

Use TEST_GEN_XXX if such binaries or files are generated during compiling.
TEST_PROGS, TEST_GEN_PROGS mean it is the executable tested by default.
TEST_GEN_MODS_DIR should be used by tests that require modules to be built before the test starts. The variable will contain the name of the directory containing the modules.
TEST_CUSTOM_PROGS should be used by tests that require custom build rules and prevent common build rule use.
TEST_PROGS are for test shell scripts. Please ensure the shell script has its exec bit set. Otherwise, lib.mk run_tests will generate a warning.
TEST_CUSTOM_PROGS and TEST_PROGS will be run by common run_tests.
TEST_PROGS_EXTENDED, TEST_GEN_PROGS_EXTENDED mean it is the executable which is not tested by default.
TEST_FILES, TEST_GEN_FILES mean it is the file which is used by the test.
TEST_INCLUDES is similar to TEST_FILES; it lists files which should be included when exporting or installing the tests, with the following differences:
symlinks to files in other directories are preserved
the part of paths below tools/testing/selftests/ is preserved when copying the files to the output directory
TEST_INCLUDES is meant to list dependencies located in other directories of the selftests hierarchy.
First use the headers inside the kernel source and/or git repo, and then the system headers. Headers for the kernel release, as opposed to headers installed by the distro on the system, should be the primary focus to be able to find regressions. Use KHDR_INCLUDES in the Makefile to include headers from the kernel source.
If a test needs specific kernel config options enabled, add a config file in the test directory to enable them.
e.g: tools/testing/selftests/android/config
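The config file is a plain list of the required options in CONFIG_FOO=y/m form, one per line, as merged by make kselftest-merge; for instance, a fragment for a test that depends on a test module might contain a line such as (shown for illustration only):

CONFIG_TEST_BITMAP=m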
Create a .gitignore file inside the test directory and add all generated objects in it.
Add new test name in TARGETS in selftests/Makefile:
TARGETS += android

All changes should pass:
kselftest-{all,install,clean,gen_tar}
kselftest-{all,install,clean,gen_tar} O=abs_path
kselftest-{all,install,clean,gen_tar} O=rel_path
make -C tools/testing/selftests {all,install,clean,gen_tar}
make -C tools/testing/selftests {all,install,clean,gen_tar} O=abs_path
make -C tools/testing/selftests {all,install,clean,gen_tar} O=rel_path
Test Module¶
Kselftest tests the kernel from userspace. Sometimes things need testing from within the kernel; one method of doing this is to create a test module. We can tie the module into the kselftest framework by using a shell script test runner. kselftest/module.sh is designed to facilitate this process. There is also a header file provided to assist writing kernel modules that are for use with kselftest:
tools/testing/selftests/kselftest_module.h
tools/testing/selftests/kselftest/module.sh
Note that test modules should taint the kernel with TAINT_TEST. This will happen automatically for modules which are in the tools/testing/ directory, or for modules which use the kselftest_module.h header above. Otherwise, you’ll need to add MODULE_INFO(test, "Y") to your module source. Selftests which do not load modules typically should not taint the kernel, but in cases where a non-test module is loaded, TAINT_TEST can be applied from userspace by writing to /proc/sys/kernel/tainted.
How to use¶
Here we show the typical steps to create a test module and tie it into kselftest. We use kselftests for lib/ as an example.
Create the test module
Create the test script that will run (load/unload) the module, e.g. tools/testing/selftests/lib/bitmap.sh
Add a line to the config file, e.g. tools/testing/selftests/lib/config
Add the test script to the makefile, e.g. tools/testing/selftests/lib/Makefile
Verify it works:
# Assumes you have booted a fresh build of this kernel tree
cd /path/to/linux/tree
make kselftest-merge
make modules
sudo make modules_install
make TARGETS=lib kselftest
Example Module¶
A bare bones test module might look like this:
// SPDX-License-Identifier: GPL-2.0+

#define pr_fmt(fmt) KBUILD_MODNAME ": " fmt

#include "../tools/testing/selftests/kselftest_module.h"

KSTM_MODULE_GLOBALS();

/*
 * Kernel module for testing the foobinator
 */

static int __init test_function()
{
        ...
}

static void __init selftest(void)
{
        KSTM_CHECK_ZERO(do_test_case("", 0));
}

KSTM_MODULE_LOADERS(test_foo);
MODULE_AUTHOR("John Developer <jd@fooman.org>");
MODULE_LICENSE("GPL");
MODULE_INFO(test, "Y");
Example test script¶
#!/bin/bash
# SPDX-License-Identifier: GPL-2.0+
$(dirname $0)/../kselftest/module.sh "foo" test_foo
Test Harness¶
The kselftest_harness.h file contains useful helpers to build tests. The test harness is for userspace testing; for kernel space testing see Test Module above.
The tests from tools/testing/selftests/seccomp/seccomp_bpf.c can be used as an example.
Example¶
#include"kselftest_harness.h"TEST(standalone_test){do_some_stuff;EXPECT_GT(10,stuff){stuff_state_tstate;enumerate_stuff_state(&state);TH_LOG("expectation failed with state: %s",state.msg);}more_stuff;ASSERT_NE(some_stuff,NULL)TH_LOG("how did it happen?!");last_stuff;EXPECT_EQ(0,last_stuff);}FIXTURE(my_fixture){mytype_t*data;intawesomeness_level;};FIXTURE_SETUP(my_fixture){self->data=mytype_new();ASSERT_NE(NULL,self->data);}FIXTURE_TEARDOWN(my_fixture){mytype_free(self->data);}TEST_F(my_fixture,data_is_good){EXPECT_EQ(1,is_my_data_good(self->data));}TEST_HARNESS_MAIN
Helpers¶
- TH_LOG¶
TH_LOG(fmt,...)
Parameters
fmt - format string
... - optional arguments
Description
TH_LOG(format,...)
Optional debug logging function available for use in tests. Logging may be enabled or disabled by defining TH_LOG_ENABLED. E.g., #define TH_LOG_ENABLED 1
If no definition is provided, logging is enabled by default.
- TEST¶
TEST(test_name)
Defines the test function and creates the registration stub
Parameters
test_name - test name
Description
TEST(name) { implementation }
Defines a test by name. Names must be unique and tests must not be run in parallel. The implementation containing block is a function and scoping should be treated as such. Returning early may be performed with a bare “return;” statement.
EXPECT_* and ASSERT_* are valid in a TEST() { } context.
- TEST_SIGNAL¶
TEST_SIGNAL(test_name,signal)
Parameters
test_name - test name
signal - signal number
Description
TEST_SIGNAL(name, signal) { implementation }
Defines a test by name and the expected term signal. Names must be unique and tests must not be run in parallel. The implementation containing block is a function and scoping should be treated as such. Returning early may be performed with a bare “return;” statement.
EXPECT_* and ASSERT_* are valid in a TEST() { } context.
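As a hedged sketch of TEST_SIGNAL() (the test name and the faulting helper below are illustrative, not from an existing selftest), a test that is expected to die with SIGSEGV could look like:

#include <signal.h>

#include "kselftest_harness.h"

/* Hypothetical code under test: dereferences NULL and so faults. */
static void trigger_fault(void)
{
	volatile int *p = NULL;

	*p = 1;
}

TEST_SIGNAL(null_deref_is_caught, SIGSEGV)
{
	trigger_fault();	/* the harness records a pass if the child terminates with SIGSEGV */
}

TEST_HARNESS_MAIN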
- FIXTURE_DATA¶
FIXTURE_DATA(datatype_name)
Wraps the struct name so we have one less argument to pass around
Parameters
datatype_name - datatype name
Description
FIXTURE_DATA(datatype_name)
Almost always, you want just FIXTURE() instead (see below). This call may be used when the type of the fixture data is needed. In general, this should not be needed unless the self is being passed to a helper directly.
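For instance, reusing the my_fixture example from the section above, a helper that receives self directly could be declared as in the sketch below (the helper name is illustrative):

/* Hypothetical helper taking the fixture data type named by FIXTURE_DATA(). */
static int data_is_ready(FIXTURE_DATA(my_fixture) *self)
{
	return self->data != NULL;
}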
- FIXTURE¶
FIXTURE(fixture_name)
Called once per fixture to setup the data and register
Parameters
fixture_name - fixture name
Description
FIXTURE(fixture_name) { type property1; ... };
Defines the data provided to TEST_F()-defined tests as self. It should be populated and cleaned up using FIXTURE_SETUP() and FIXTURE_TEARDOWN().
- FIXTURE_SETUP¶
FIXTURE_SETUP(fixture_name)
Prepares the setup function for the fixture. _metadata is included so that EXPECT_*, ASSERT_* etc. work correctly.
Parameters
fixture_name - fixture name
Description
FIXTURE_SETUP(fixture_name) { implementation }
Populates the required “setup” function for a fixture. An instance of the datatype defined with FIXTURE_DATA() will be exposed as self for the implementation.
ASSERT_* are valid for use in this context and will preempt the execution of any dependent fixture tests.
A bare “return;” statement may be used to return early.
- FIXTURE_TEARDOWN¶
FIXTURE_TEARDOWN(fixture_name)
Parameters
fixture_name - fixture name
Description
_metadata is included so that EXPECT_*, ASSERT_* etc. work correctly.
FIXTURE_TEARDOWN(fixture_name) { implementation }
Populates the required “teardown” function for a fixture. An instance of the datatype defined with FIXTURE_DATA() will be exposed as self for the implementation to clean up.
A bare “return;” statement may be used to return early.
- FIXTURE_VARIANT¶
FIXTURE_VARIANT(fixture_name)
Optionally called once per fixture to declare fixture variant
Parameters
fixture_name - fixture name
Description
FIXTURE_VARIANT(fixture_name) { type property1; ... };
Defines the type of constant parameters provided to FIXTURE_SETUP(), TEST_F() and FIXTURE_TEARDOWN() as variant. Variants allow the same tests to be run with different arguments.
- FIXTURE_VARIANT_ADD¶
FIXTURE_VARIANT_ADD(fixture_name,variant_name)
Called once per fixture variant to setup and register the data
Parameters
fixture_name - fixture name
variant_name - name of the parameter set
Description
FIXTURE_VARIANT_ADD(fixture_name, variant_name) { .property1 = val1, ... };
Defines a variant of the test fixture, provided to FIXTURE_SETUP() and TEST_F() as variant. Tests of each fixture will be run once for each variant.
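A hedged sketch of variants in use (the fixture name, field, and sizes below are illustrative, not from an existing selftest): the same TEST_F() runs once per FIXTURE_VARIANT_ADD() entry, with variant selecting the parameter set.

#include <stdlib.h>

#include "kselftest_harness.h"

FIXTURE(buffer)
{
	char *data;
};

FIXTURE_VARIANT(buffer)
{
	size_t size;
};

FIXTURE_VARIANT_ADD(buffer, small)
{
	.size = 16,
};

FIXTURE_VARIANT_ADD(buffer, large)
{
	.size = 4096,
};

FIXTURE_SETUP(buffer)
{
	/* variant->size carries the constant parameter for this run */
	self->data = malloc(variant->size);
	ASSERT_NE(NULL, self->data);
}

FIXTURE_TEARDOWN(buffer)
{
	free(self->data);
}

TEST_F(buffer, allocation_succeeds)
{
	/* Runs once as buffer.small and again as buffer.large */
	EXPECT_NE(NULL, self->data);
}

TEST_HARNESS_MAIN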
- TEST_F¶
TEST_F(fixture_name,test_name)
Emits test registration and helpers for fixture-based test cases
Parameters
fixture_name - fixture name
test_name - test name
Description
TEST_F(fixture, name) { implementation }
Defines a test that depends on a fixture (e.g., is part of a test case). Very similar to TEST() except that self is the setup instance of the fixture’s datatype exposed for use by the implementation.
The _metadata object is shared (MAP_SHARED) with all the potential forked processes, which enables them to use EXPECT_*() and ASSERT_*().
The self object is only shared with the potential forked processes if FIXTURE_TEARDOWN_PARENT() is used instead of FIXTURE_TEARDOWN().
- TEST_HARNESS_MAIN¶
TEST_HARNESS_MAIN
Simple wrapper to run the test harness
Description
TEST_HARNESS_MAIN
Use once to append a main() to the test file.
Operators¶
Operators for use in TEST() and TEST_F(). ASSERT_* calls will stop test execution immediately. EXPECT_* calls will emit a failure warning, note it, and continue.
- ASSERT_EQ¶
ASSERT_EQ(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_EQ(expected, measured): expected == measured
- ASSERT_NE¶
ASSERT_NE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_NE(expected, measured): expected != measured
- ASSERT_LT¶
ASSERT_LT(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_LT(expected, measured): expected < measured
- ASSERT_LE¶
ASSERT_LE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_LE(expected, measured): expected <= measured
- ASSERT_GT¶
ASSERT_GT(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_GT(expected, measured): expected > measured
- ASSERT_GE¶
ASSERT_GE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_GE(expected, measured): expected >= measured
- ASSERT_NULL¶
ASSERT_NULL(seen)
Parameters
seen - measured value
Description
ASSERT_NULL(measured): NULL == measured
- ASSERT_TRUE¶
ASSERT_TRUE(seen)
Parameters
seen - measured value
Description
ASSERT_TRUE(measured): measured != 0
- ASSERT_FALSE¶
ASSERT_FALSE(seen)
Parameters
seen - measured value
Description
ASSERT_FALSE(measured): measured == 0
- ASSERT_STREQ¶
ASSERT_STREQ(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_STREQ(expected, measured): !strcmp(expected, measured)
- ASSERT_STRNE¶
ASSERT_STRNE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
ASSERT_STRNE(expected, measured): strcmp(expected, measured)
- EXPECT_EQ¶
EXPECT_EQ(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_EQ(expected, measured): expected == measured
- EXPECT_NE¶
EXPECT_NE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_NE(expected, measured): expected != measured
- EXPECT_LT¶
EXPECT_LT(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_LT(expected, measured): expected < measured
- EXPECT_LE¶
EXPECT_LE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_LE(expected, measured): expected <= measured
- EXPECT_GT¶
EXPECT_GT(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_GT(expected, measured): expected > measured
- EXPECT_GE¶
EXPECT_GE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_GE(expected, measured): expected >= measured
- EXPECT_NULL¶
EXPECT_NULL(seen)
Parameters
seen - measured value
Description
EXPECT_NULL(measured): NULL == measured
- EXPECT_TRUE¶
EXPECT_TRUE(seen)
Parameters
seen - measured value
Description
EXPECT_TRUE(measured): 0 != measured
- EXPECT_FALSE¶
EXPECT_FALSE(seen)
Parameters
seen - measured value
Description
EXPECT_FALSE(measured): 0 == measured
- EXPECT_STREQ¶
EXPECT_STREQ(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_STREQ(expected, measured): !strcmp(expected, measured)
- EXPECT_STRNE¶
EXPECT_STRNE(expected,seen)
Parameters
expected - expected value
seen - measured value
Description
EXPECT_STRNE(expected, measured): strcmp(expected, measured)