Development and Validation of a Situational Judgement Test (SIT-UAS) for Unmanned Aircraft System Operators: Predicting Non-Technical Skills in Remote Piloting

Keywords: Situational Judgement Test, Unmanned Aircraft Systems, non-technical skills, pilot selection, human factors.

1. Introduction

The proliferation of UAS (drones) across civil, commercial, and military domains has increased demand for effective operator selection and training (ICAO, 2022). While technical proficiency in flight control is measurable, critical incidents often involve failures of judgement, e.g., prioritizing a visual fix over battery state, or misinterpreting controller handoffs.

Traditional technical assessments for UAS pilots focus on procedural knowledge, yet most operational failures stem from poor judgement under ambiguous or time-critical conditions. Existing selection tools lack scenario-based items specific to UAS challenges (e.g., beyond-visual-line-of-sight operations, lost-link procedures).

We used a three-phase mixed-methods approach: (1) critical-incident interviews with 20 expert UAS operators to generate realistic scenarios; (2) an expert panel (n = 10) to establish correct/incorrect response keys; and (3) validation with 150 UAS trainees, comparing SIT-UAS scores against instructor ratings and simulator performance.

An example item (lost-link scenario) offers four response options:

A. Continue the mission, hoping the link recovers.
B. Immediately initiate return-to-home (RTH).
C. Climb to higher altitude to improve line of sight.
D. Wait 30 seconds, then command RTH if no recovery.
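Expert-panel keys like those produced in phase 2 are often turned into consensus-based partial-credit scores, with binary correct/incorrect keying as the special case where all experts agree. The sketch below illustrates that general approach in Python; the function names and the panel data are illustrative assumptions, not taken from the study:

```python
# Illustrative consensus-based SJT scoring (hypothetical; the study's
# actual scoring rule is whatever the expert panel established).
from collections import Counter

def build_key(panel_responses):
    """Derive a scoring key from expert panel picks for one item.

    Each option's credit is the proportion of experts who chose it,
    so responses matching the expert majority earn the most credit.
    """
    counts = Counter(panel_responses)
    n = len(panel_responses)
    return {option: count / n for option, count in counts.items()}

def score_response(key, response):
    """Partial credit: proportion of experts who endorsed this option."""
    return key.get(response, 0.0)

# Hypothetical panel of 10 experts rating a lost-link item (options A-D).
panel = ["D", "D", "D", "D", "D", "D", "B", "B", "C", "A"]
key = build_key(panel)
print(score_response(key, "D"))  # 0.6 -> majority-endorsed option
print(score_response(key, "A"))  # 0.1 -> rarely endorsed option
```

Under this rule a trainee's item score is continuous rather than dichotomous, which typically improves the reliability of correlations against external criteria such as instructor ratings.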