
RDM Test Output

From wiki.openlighting.org


The RDM Responder Tests can produce quite a bit of output. This guide covers how to interpret the output and diagnose failures.

First a quick overview of how the tests are structured:

Test Categories

Tests are grouped according to the sections in the RDM Categories/Parameter ID Defines table in the E1.20 document. There are some extra categories for specific behavior like error conditions and sub device handling.

Test States

There are four possible result states for a test:

Passed
The responder replied with the expected result
Failed
The responder failed to reply, or replied with an unexpected result
Not Run
This test wasn't run because the responder doesn't support the required functionality or a previous test failed.
Broken
An internal error occurred; this indicates a programming error or a problem with the test rig.
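The four states can be modelled with a small enum. The sketch below uses hypothetical names (it's not the actual test framework's code) and tallies results per state, the way the summary section of the output does:

```python
from enum import Enum

class TestState(Enum):
    """Hypothetical names for the four result states described above."""
    PASSED = 'Passed'    # responder replied with the expected result
    FAILED = 'Failed'    # no reply, or an unexpected reply
    NOT_RUN = 'Not Run'  # unsupported functionality, or a prerequisite failed
    BROKEN = 'Broken'    # internal error in the tests themselves

def summarize(results):
    """Count how many tests finished in each state."""
    counts = dict.fromkeys(TestState, 0)
    for state in results:
        counts[state] += 1
    return counts

tally = summarize([TestState.PASSED, TestState.PASSED, TestState.FAILED])
print(tally[TestState.PASSED])  # 2
```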

Log Messages

Warnings
Warnings indicate behavior that doesn't match the standard, but is unlikely to cause usability issues. Warnings are printed in the summary section of the test output.
Advisory Messages
Advisory messages indicate issues that aren't covered by the standard but are likely to cause problems, e.g. a sensor temperature outside the stated scale range.

Test Output

Let's go over the output from the tests using some examples. If you're running the tests from the command line, you can see the verbose output by adding -d as an argument. If you're using the Web UI, the verbose output is visible in the bottom third of the screen.

First a successful test:

GetDeviceModelDescription: GET the device model description.
 GET: uid: 7a70:00000001, pid: DEVICE_MODEL_DESCRIPTION (0x0080), sub device: 0, args: []
 Response: RDMResponse: ACK, PID = 0x0080, data = {u'description': 'Model 1'}
GetDeviceModelDescription: Passed

The first line contains the name of the test, as well as a one-line description of what the test does. The second line is the command we sent to the responder; in this case it was a GET DEVICE_MODEL_DESCRIPTION sent to 7a70:00000001 with no parameter data.

The third line is the response we received from the responder. In this case we received an ACK, with the response data "Model 1". This is a valid response so the test is declared to have passed (line 4).
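Each verbose block ends with a `TestName: State` line, which makes the output easy to post-process. Here's a minimal sketch of picking out those final result lines (the regular expression is my own, not part of the test framework):

```python
import re

# Matches final result lines such as 'GetDeviceModelDescription: Passed'.
RESULT_RE = re.compile(r'^(?P<test>\w+): (?P<state>Passed|Failed|Not Run|Broken)$')

def parse_result_line(line):
    """Return (test_name, state) for a result line, or None for any other line."""
    match = RESULT_RE.match(line.rstrip())
    if match:
        return match.group('test'), match.group('state')
    return None

print(parse_result_line('GetDeviceModelDescription: Passed'))
# Command and response lines are indented, so they don't match:
print(parse_result_line(' GET: uid: 7a70:00000001, pid: ...'))  # None
```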

Now let's consider a test that fails:

AllSubDevicesDeviceInfo: Send a Get Device Info to ALL_SUB_DEVICES.
 GET: uid: 7a70:00000001, pid: DEVICE_INFO (0x0060), sub device: 65535, args: []
Request failed: Response Timeout
 Failed: expected one of:
  CC: Get, PID 0x0060, NACK Sub device out of range

In this test, we send a GET DEVICE_INFO to all the sub devices on the responder. The responder doesn't reply with any data, so the request times out. The test then displays what it was expecting: in this case the correct response is a NACK with the NR_SUB_DEVICE_OUT_OF_RANGE reason code.
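The expectation comes from E1.20: a GET addressed to ALL_SUB_DEVICES (0xffff, i.e. the 65535 in the output above) must be NACKed with NR_SUB_DEVICE_OUT_OF_RANGE (reason code 0x0009). A sketch of the responder-side check the test is looking for (the function name and return convention are illustrative only):

```python
ALL_SUB_DEVICES = 0xFFFF
NR_SUB_DEVICE_OUT_OF_RANGE = 0x0009  # E1.20 NACK reason code

def check_get_sub_device(sub_device, sub_device_count):
    """NACK a GET aimed at ALL_SUB_DEVICES or at a nonexistent sub device."""
    if sub_device == ALL_SUB_DEVICES or sub_device > sub_device_count:
        return ('NACK', NR_SUB_DEVICE_OUT_OF_RANGE)
    return ('ACK', None)

print(check_get_sub_device(0xFFFF, 0))  # GETs may not be broadcast to sub devices
print(check_get_sub_device(0, 2))       # the root device is always valid
```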

Let's consider another test:

GetSensorDefinitionWithNoData: Get the sensor definition with no data.
 GET: uid: 7a70:00000001, pid: SENSOR_DEFINITION (0x0200), sub device: 0, data: ''
 Response: RDMResponse: ACK, PID = 0x0200, data = {u'normal_min': -32768, u'name': 'Device Lifetime Hours', u'range_min': -32768, u'range_max': 32767, u'normal_max': 32767, u'supports_recording': 0, u'prefix': 0, u'sensor_number': 4, u'type': 32, u'unit': 0}
 Failed: expected one of:
  CC: Get, PID 0x0200, NACK Format Error
GetSensorDefinitionWithNoData: Failed

Here we send a GET SENSOR_DEFINITION but without any parameter data. Recall that the GET SENSOR_DEFINITION command takes one byte of data, which represents the index of the sensor to return the definition for. In this case the responder replies with an ACK, even though it doesn't have enough information to determine which sensor we asked for! This fails the test, since the correct result would be a NACK with the NR_FORMAT_ERROR reason code.
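The check the test expects a responder to perform is simple: validate the parameter data length before acting on the request. A sketch under that assumption (names are illustrative; NR_FORMAT_ERROR is the E1.20 NACK reason code 0x0001):

```python
NR_FORMAT_ERROR = 0x0001  # E1.20 NACK reason code

def check_sensor_definition_request(param_data):
    """GET SENSOR_DEFINITION takes exactly one byte: the sensor index."""
    if len(param_data) != 1:
        return ('NACK', NR_FORMAT_ERROR)
    sensor_index = param_data[0]
    return ('ACK', sensor_index)

print(check_sensor_definition_request(b''))      # no data -> format error
print(check_sensor_definition_request(b'\x04'))  # sensor 4, as in the output above
```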