Commit Graph

3 Commits

Janusz Dziedzic
a185e9b10b tests/remote: Add hwsim wrapper
This allows running hwsim test cases.

DUTs are mapped to apdev, while refs are mapped to dev.

For now I tested:
./run-tests.py -d hwsim0 -r hwsim1 -h ap_open -h dfs
./run-tests.py -r hwsim0 -r hwsim1 -h ibss_open -v
./run-tests.py -r hwsim0 -r hwsim1 -r hwsim2 -d hwsim3 -d hwsim4 -h ap_vht80 -v
./run-tests.py -r hwsim0 -r hwsim1 -r hwsim2 -d hwsim3 -d hwsim4 -h all -k ap -k vht

Signed-off-by: Janusz Dziedzic <janusz.dziedzic@tieto.com>
2016-05-14 17:56:37 +03:00
Janusz Dziedzic
a73fa13be2 tests/remote: Add utils file
Add rutils.py for remote tests.

Signed-off-by: Janusz Dziedzic <janusz.dziedzic@tieto.com>
2016-05-14 17:56:37 +03:00
Janusz Dziedzic
5865186e31 tests: Add remote directory to tests
Add tests/remote directory and files:
config.py - handle devices/setup_params table
run-tests.py - run test cases
test_devices.py - run basic configuration tests

You can add your own configuration file (by default cfg.py) and define
devices and setup_params there in the format found in config.py. Use the
-c option to point at a different file, or simply create cfg.py.
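A minimal cfg.py sketch; the hostnames, ports, and flag values below are
hypothetical, and the exact keys expected by the framework should be
checked against config.py:

devices = [
    # Illustrative entries; adjust hostname/ifname/port/flags to your setup
    {"hostname": "192.168.1.10", "ifname": "wlan0", "port": "9877",
     "name": "my-dut", "flags": "AP_HW"},
    {"hostname": "192.168.1.11", "ifname": "wlan0", "port": "9877",
     "name": "my-ref", "flags": "STA_HW"},
]

setup_params = {"setup_hw": "./tests/setup_hw.sh",    # illustrative paths
                "hostapd": "./tests/hostapd",
                "wpa_supplicant": "./tests/wpa_supplicant"}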

Print available devices/test_cases:
./run-tests.py

Check devices (ssh connection, authorized_keys, interfaces):
./run-tests.py -t devices

Run sanity tests (test_sanity_*):
./run-tests.py -d <dut_name> -t sanity

Run all tests:
./run-tests.py -d <dut_name> -t all

Run test_A and test_B:
./run-tests.py -d <dut_name> -t "test_A, test_B"

Set reference device, and run sanity tests:
./run-tests.py -d <dut_name> -r <ref_name> -t sanity

Multiple DUTs/refs/monitors can be set up, e.g.:
./run-tests.py -d <dut_name> -r <ref1_name> -r <ref2_name> -t sanity

A monitor can be set like this:
./run-tests.py -d <dut_name> -t sanity -m all -m <standalone_monitor>

You can also add filters to select the tests you would like to run:
./run-tests.py -d <dut_name> -t all -k wep -k g_only
./run-tests.py -d <dut_name> -t all -k VHT80

./run-tests.py does not start/terminate wpa_supplicant or hostapd; the
test cases are responsible for that, since we do not know the test case
requirements in advance.

Restart (-R), trace (-T), and perf (-P) options are available.
These request trace/perf logs from the hosts (if possible).

As parameters, each test case gets:
- devices - table of available devices
- setup_params
- duts - names of the DUTs that should be tested
- refs - names of the reference devices that should be used
- monitors - list of monitor names

Each test case can return append_text, as in the sketch below.
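A hypothetical test case sketch; the function name and body are
illustrative only, and the parameter order is assumed from the list above:

def test_sanity_example(devices, setup_params, duts, refs, monitors):
    # devices: table of available devices, as defined in cfg.py/config.py
    # duts/refs: device names selected with -d/-r on the command line
    # monitors: list of monitor names selected with -m
    for name in duts:
        if not any(d["name"] == name for d in devices):
            raise Exception("Unknown DUT: " + name)
    # Optional text appended to the test report
    return "checked %d DUT(s)" % len(duts)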

Signed-off-by: Janusz Dziedzic <janusz.dziedzic@tieto.com>
2016-05-14 17:56:35 +03:00