Test Results

The Test Results page contains the full collection of test results generated from running test cases. You can view individual test reports for executed test cases. Test reports contain detailed information about the test case, including the success/failure result, important statistics for each ego vehicle, notable events that occurred, and a video of the ego vehicle's drive during the test case.

Test Results

On the left-hand side of the web user interface, clicking "Test Results" brings up the Test Results page. This page contains a table of all executed test cases for which test report generation was enabled.

For each test case, you can see when the report was created, the test case name, the evaluated result, and a button to view the detailed report.

To create and view a test case report, make sure to enable the "Create Test Case report" radio button when creating a simulation.

Test Cases must have test report generation enabled when they are executed in order to appear in Test Results. For video recording, the Video Recording sensor must also be included in the ego vehicle's sensor configuration. Please see the sensor configurations page for the JSON.

Each test result shows the evaluated result: "Success", "Failed", or "In Progress". You can view each test report for additional details or messages about the reason for the evaluation.

Viewing a Test Report

To view a detailed test report, click the "View" button on the right for a given test result.

The top bar shows the Test Report name and the associated simulation, along with buttons to expand the starting simulation configuration details and to delete the report.

The "Iterations" section displays all iterations for this test report. For test cases run with a Python API script, one test case can contain many iterations, each with its own result, statistics, and events. The highlighted box indicates the selected iteration.
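
For reference, a test case that produces multiple iterations might be structured like the following minimal sketch using the `lgsvl` Python API; the simulator address, map name, ego vehicle name, and iteration count are placeholder assumptions, not values required by the report.

```python
import lgsvl

# Connect to a locally running simulator (address/port are assumptions).
sim = lgsvl.Simulator("127.0.0.1", 8181)
sim.load("BorregasAve")  # placeholder map name

for iteration in range(3):  # each pass corresponds to one iteration in the report
    sim.reset()

    # Spawn a single ego vehicle at the first spawn point.
    state = lgsvl.AgentState()
    state.transform = sim.get_spawn()[0]
    ego = sim.add_agent("Lincoln2017MKZ", lgsvl.AgentType.EGO, state)  # placeholder vehicle name

    # Run this iteration for 30 seconds of simulation time.
    sim.run(30.0)
```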

Simulation evaluation

The test report indicates the evaluation status, or result, of the executed simulation. The table below specifies the conditions and criteria for each simulation result. In the future, additional criteria, as well as user-customizable criteria, will be supported.

| Result | Cause/Criteria |
|--------|----------------|
| SUCCESS | The test case did not result in FAILED or ERROR |
| FAILED | The test case involved at least one failure event callback |
| ERROR | The test case resulted in an error and failed to execute properly |
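
The criteria above can be summarized as in the following sketch. This is only an illustration of the table, not the simulator's actual implementation, and the function and argument names are assumptions.

```python
def evaluate_result(failure_events, execution_error):
    """Illustrative mapping from test case outcomes to a report result.

    failure_events: failure event callbacks (e.g. "EgoCollision") raised during the test case.
    execution_error: an exception raised while executing the test case, or None.
    """
    if execution_error is not None:
        return "ERROR"    # the test case failed to execute properly
    if failure_events:
        return "FAILED"   # at least one failure event callback occurred
    return "SUCCESS"      # neither FAILED nor ERROR applies


# Example: a single EgoCollision event evaluates the test case as FAILED.
print(evaluate_result(["EgoCollision"], None))  # FAILED
```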

The start and stop times indicate the cluster's system date and time when the test case was started and stopped. For test cases run on a distributed cluster, the master node's system time is used.

The "Simulation time duration" indicates the total time of the test case in simulation, excluding paused time. Note that this can differ from the "Real time duration", which measures how much total time passed in the real world during execution.
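
The difference between the two durations can be observed from a Python API script by comparing the simulator clock with a wall clock. A minimal sketch, assuming the `lgsvl` package, a locally running simulator, and a placeholder map name:

```python
import time

import lgsvl

sim = lgsvl.Simulator("127.0.0.1", 8181)  # assumed local simulator address
sim.load("BorregasAve")                   # placeholder map name

wall_start = time.monotonic()
sim_start = sim.current_time

sim.run(10.0)  # advance 10 seconds of simulation time

sim_elapsed = sim.current_time - sim_start    # corresponds to "Simulation time duration"
real_elapsed = time.monotonic() - wall_start  # corresponds to "Real time duration"
print(f"simulation time: {sim_elapsed:.1f} s, real time: {real_elapsed:.1f} s")
```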

In the vehicle section, for each ego vehicle in the test case, you can view the vehicle's bridge type, bridge IP address, and port.
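
The bridge type, IP, and port reported here reflect how the ego vehicle was connected to its AD stack. For scripted test cases, the connection is made from the Python API; a minimal sketch, assuming the `lgsvl` package and placeholder map, vehicle, and bridge host/port values:

```python
import time

import lgsvl

sim = lgsvl.Simulator("127.0.0.1", 8181)  # assumed local simulator address
sim.load("BorregasAve")                   # placeholder map name

state = lgsvl.AgentState()
state.transform = sim.get_spawn()[0]
ego = sim.add_agent("Lincoln2017MKZ", lgsvl.AgentType.EGO, state)  # placeholder vehicle name

# Connect the ego vehicle to its bridge; the IP and port shown in the
# test report correspond to the values used here (placeholders below).
ego.connect_bridge("192.168.1.100", 9090)
while not ego.bridge_connected:
    time.sleep(1)

sim.run(30.0)
```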

If you included the Video Recording sensor in at least one ego vehicle, you can see the file location where the recorded video of the simulation has been stored.

The following section, Sensors, exists for every ego vehicle involved in the test case. In test cases involving multiple ego vehicles, you will be able to see sensor statistics and callback event information for each ego vehicle.

Sensors: Statistics

The Statistics section gives notable information about the ego vehicle during the test case run.

| Metric | Description | Units |
|--------|-------------|-------|
| Distance travelled | Total distance traveled, as measured by wheel odometry | km |
| Average speed | The average speed of the ego vehicle | km/hr |
| Max speed | The maximum speed of the ego vehicle | km/hr |
| Min speed | The minimum speed of the ego vehicle | km/hr |
| Max longitudinal acceleration | Maximum instantaneous acceleration along the vehicle's longitudinal axis | m/s² |
| Max lateral acceleration | Maximum instantaneous acceleration along the vehicle's lateral (perpendicular to longitudinal) axis | m/s² |
| Max longitudinal jerk | Maximum instantaneous jerk along the vehicle's longitudinal axis | m/s³ |
| Max lateral jerk | Maximum instantaneous jerk along the vehicle's lateral (perpendicular to longitudinal) axis | m/s³ |
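
For intuition about these metrics, speed, acceleration, and jerk can be approximated from periodically sampled ego speeds by finite differences. The sketch below only illustrates the definitions (and ignores the longitudinal/lateral split); it is not how the simulator computes the reported values.

```python
def summarize_speeds(speeds_mps, dt):
    """Approximate report metrics from ego speeds (m/s) sampled every dt seconds."""
    accel = [(b - a) / dt for a, b in zip(speeds_mps, speeds_mps[1:])]  # m/s²
    jerk = [(b - a) / dt for a, b in zip(accel, accel[1:])]             # m/s³
    return {
        "Average speed (km/hr)": 3.6 * sum(speeds_mps) / len(speeds_mps),
        "Max speed (km/hr)": 3.6 * max(speeds_mps),
        "Min speed (km/hr)": 3.6 * min(speeds_mps),
        "Max acceleration (m/s²)": max(accel, default=0.0),
        "Max jerk (m/s³)": max(jerk, default=0.0),
    }


# Example: speeds sampled once per second.
print(summarize_speeds([0.0, 2.0, 5.0, 9.0, 9.0], dt=1.0))
```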

Callbacks

Callback events are categorized as one-time events during a test case that either affect the ego vehicle or that the ego vehicle causes in the environment. The following types of events are supported:

  • Collision involving ego (EgoCollision)
  • Speed limit violation (SpeedViolation)
  • Ego stuck (EgoStuck)
  • Sudden braking (SuddenBrake)
  • Sudden steer (SuddenSteer)
  • Low simulation performance (LowFPS)

The occurrence of at least one of these events will cause a simulation to be evaluated as "Failed".

For each callback event, the returned information is reported, including the simulation time at which the event occurred.

Selecting the eye icon will cause the visualization to seek to the time of the event in order to view the state of each sensor at that time.

EgoCollision

A collision involving an ego vehicle.

| Field | Description | Units |
|-------|-------------|-------|
| OtherType | Type of agent with which the ego collided | NPC \| Pedestrian \| Obstacle |
| EgoVelocity | Velocity vector of the ego at collision time | x, y, z in m/s |
| OtherVelocity | Velocity vector of the agent/object at collision time | km/hr |
| EgoCollisionTotal | Number of ego collisions in the event | |
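
For scripted test cases, the `lgsvl` Python API also exposes a collision callback on the ego agent, which can be used to observe similar information (the other agent and the ego's velocity) from the script itself. A minimal sketch, with the simulator address, map, and vehicle names as placeholder assumptions:

```python
import lgsvl

sim = lgsvl.Simulator("127.0.0.1", 8181)  # assumed local simulator address
sim.load("BorregasAve")                   # placeholder map name

state = lgsvl.AgentState()
state.transform = sim.get_spawn()[0]
ego = sim.add_agent("Lincoln2017MKZ", lgsvl.AgentType.EGO, state)  # placeholder vehicle name


def on_collision(agent1, agent2, contact):
    # Either agent may be None when a static obstacle is involved.
    name1 = "Obstacle" if agent1 is None else agent1.name
    name2 = "Obstacle" if agent2 is None else agent2.name
    v = ego.state.velocity
    print(f"{name1} collided with {name2}; "
          f"ego velocity ({v.x:.1f}, {v.y:.1f}, {v.z:.1f}) m/s")


ego.on_collision(on_collision)
sim.run(30.0)
```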

SpeedViolation

A speed limit violation by an ego vehicle, based on the lane speed limit in the map annotations.

| Metric | Description | Units |
|--------|-------------|-------|
| Duration | Duration of the speed limit violation event | time |
| Max speed | Maximum speed of the ego vehicle during the event | m/s |
| SpeedLimit | The ego lane's annotated speed limit from the HD map | m/s |
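
As an illustration of what this event captures, the sketch below finds a violation window from sampled ego speeds and a lane speed limit. It is not the simulator's detection logic; the sampling interval, input values, and the choice to report the longest contiguous violation are assumptions.

```python
def detect_speed_violation(speeds_mps, speed_limit_mps, dt):
    """Return (duration_s, max_speed_mps) for the longest contiguous run of
    samples above the speed limit, or None if the limit is never exceeded.

    speeds_mps: ego speeds in m/s, sampled every dt seconds.
    """
    best = None
    run_len, run_max = 0, 0.0
    for speed in speeds_mps + [0.0]:  # sentinel to flush the final run
        if speed > speed_limit_mps:
            run_len += 1
            run_max = max(run_max, speed)
        else:
            if run_len > 0 and (best is None or run_len * dt > best[0]):
                best = (run_len * dt, run_max)
            run_len, run_max = 0, 0.0
    return best


# Example: a 2 s violation of an 11.1 m/s (40 km/hr) limit.
print(detect_speed_violation([10.0, 11.5, 12.0, 10.5], 11.1, dt=1.0))  # (2.0, 12.0)
```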

EgoStuck

An ego stuck event, in which the ego vehicle travels a threshold distance and then stops moving for a threshold time.

SuddenBrake

A sudden braking event, in which the ego vehicle's deceleration exceeds a threshold value.

SuddenSteer

A sudden steering event, in which the absolute magnitude of the ego vehicle's steering angle exceeds a threshold value.

LowFPS

A low simulation performance event, indicating that results may not be accurate because the framerate during simulation on the cluster was too low.