As part of the [[Open Lighting Project]], a suite of tests for [[RDM]] responders has been developed. This enables manufacturers to check how well an RDM device conforms to the E1.20 specification. The test cases are written in Python and use the [[OLA|Open Lighting Architecture]] to communicate with devices.
Useful Links:

* [[Responder Testing FAQ]]
* [[Running the tests]]
* [[Writing RDM Responder Tests]]
== Test Categories ==

Tests are grouped according to the sections in the RDM Categories/Parameter ID Defines table in the E1.20 document. There are some extra categories for specific behavior such as error conditions and sub-device handling.
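As a rough illustration of how this grouping works, each test declares the category it is reported under. The class and attribute names below are simplified assumptions rather than the framework's confirmed API; see [[Writing RDM Responder Tests]] for the real details.

 # Illustrative sketch only: the real attribute names in the OLA test
 # framework may differ. Categories mirror the RDM Categories/Parameter ID
 # Defines table in E1.20, plus extras such as Error Conditions.
 from collections import defaultdict
 
 class GetManufacturerLabel(object):
   CATEGORY = 'Product Information'
 
 class GetManufacturerLabelWithData(object):
   # A GET with extra data should be rejected, so this test is grouped
   # under the extra Error Conditions category.
   CATEGORY = 'Error Conditions'
 
 def group_by_category(tests):
   """Group test classes by category, as the results summary does."""
   groups = defaultdict(list)
   for test in tests:
     groups[test.CATEGORY].append(test.__name__)
   return dict(groups)
 
 print(group_by_category([GetManufacturerLabel, GetManufacturerLabelWithData]))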
 
 
== Setup the Test Rig ==

The following controller devices are supported:
 
* [[RDM-TRI]]
* [[DMXter4 RDM]] / [[MiniDMXter]]
Connect the device under test to the controller device and start ''olad''. Patch the output port on the controller device to a universe (UNIVERSE_NUMBER).
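If the output port isn't already patched, this can be done with ''ola_patch'' (or through the web UI). The device and port indices below are examples only; run ''ola_dev_info'' to list the actual devices and ports:

  $ ola_patch --device 1 --port 0 --universe UNIVERSE_NUMBER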
 
  
Then run ''ola_rdm_discover''; you should see the responder's UID appear:

  $ ola_rdm_discover -u UNIVERSE_NUMBER
  00a1:00010003
  7a70:ffffff00

== Running the Tests ==
 
 
The tests are written in Python and run using ''ola_rdm_test.py''. Below is the output from a typical test run:
 
 
 
  ./ola_rdm_test.py --universe 3 --pid_file ../../python/pids.config 00a1:00010003
  Starting tests, universe 3, UID 00a1:00010003
  SetManufacturerLabel: Passed
  SetSoftwareVersionLabel: Passed
  GetManufacturerLabel: Passed
  GetSoftwareVersionLabelWithData: Failed
  ...
  ------------- Warnings --------------
  ------------ By Category ------------
    Product Information:  7 /  7  100%
        RDM Information:  1 /  1  100%
     Core Functionality:  2 /  2  100%
       Error Conditions: 10 / 16   62%
           DMX512 Setup:  3 /  3  100%
  -------------------------------------
  29 / 30 tests run, 23 passed, 6 failed, 0 broken
 
 
 
== Useful Options ==
 
 
 
''ola_rdm_test.py'' has some options which can assist in debugging failures. For a full list of options, run with ''-h''.
 
 
 
; -d, --debug
: Show all debugging output, including actual & expected responses.
; -l, --log
: Log the output of the tests to a file. The UID and timestamp are appended to the filename.
; -t Test1,Test2, --tests=Test1,Test2
: Only run a subset of the tests. Only the tests listed (and their dependencies) will be run.
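For example, a hypothetical invocation that re-runs only the manufacturer label tests (names taken from the sample run above), with debug output and logging enabled:

  ./ola_rdm_test.py --universe 3 --debug --log results --tests=GetManufacturerLabel,SetManufacturerLabel 00a1:00010003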
 
 
 
== Test States ==
 
 
 
Some tests have dependencies, which are other tests that need to be completed before the test can be run. Dependencies can be used to check for supported parameters and other conditions that may affect responder behavior.
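As a rough model of how this works (the names below are illustrative, not the framework's real API; see [[Writing RDM Responder Tests]]), each test lists the tests it depends on and the runner executes dependencies first:

 # Illustrative model only: the real OLA framework's dependency
 # mechanism may use different names.
 class GetSupportedParameters(object):
   DEPS = []
 
 class GetSensorDefinition(object):
   # Needs the supported parameter list before it can decide whether
   # SENSOR_DEFINITION is implemented by the responder.
   DEPS = [GetSupportedParameters]
 
 def run_order(test, ordered=None):
   """Return the tests in execution order, dependencies first."""
   ordered = [] if ordered is None else ordered
   for dep in test.DEPS:
     run_order(dep, ordered)
   if test not in ordered:
     ordered.append(test)
   return ordered
 
 print([t.__name__ for t in run_order(GetSensorDefinition)])
 # prints: ['GetSupportedParameters', 'GetSensorDefinition']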
 
 
 
There are four possible result states for a test:

; Passed
: The responder replied with the expected result.
; Failed
: The responder failed to reply, or replied with an unexpected result.
; Not Run
: This test wasn't run because the responder doesn't support the required functionality, or a previous test failed.
; Broken
: An internal error occurred; this indicates a programming error or an error with the test rig.
== Log Messages ==
; Warnings
: Warnings indicate behavior that doesn't match the standard, but is unlikely to cause usability issues. Warnings are printed in the summary section of the test output.
; Advisory Messages
: Advisory messages indicate issues that are not covered by the standard but are likely to cause problems, e.g. a sensor temperature outside of the stated scale range.
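A minimal sketch of the idea: a test records these messages on its fixture and the runner prints them in the summary. The method names here are assumptions, not the framework's confirmed API.

 # Sketch only: check the framework source for the real method names.
 class ResponderTest(object):
   def __init__(self):
     self.warnings = []
     self.advisories = []
 
   def AddWarning(self, message):
     # Behavior that deviates from E1.20 but shouldn't hurt usability.
     self.warnings.append(message)
 
   def AddAdvisory(self, message):
     # Outside the standard's scope, but still likely to cause problems.
     self.advisories.append(message)
 
 test = ResponderTest()
 test.AddAdvisory('sensor 0: reading of 150 C is outside the range 0 - 100 C')
 print(test.advisories)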
