Real-time Driver Monitoring Systems on Edge AI Device
URL: http://arxiv.org/abs/2304.01555v1
Abstract
As road accidents increasingly result from driver inattention, automated driver monitoring systems (DMS) have gained acceptance. In this report, we present a real-time DMS that runs on a hardware-accelerator-based edge device. The system consists of an InfraRed camera to record the driver footage and an edge device to process the data. To port the deep learning models to the edge device while taking full advantage of the hardware accelerators, model surgery was performed. The final DMS achieves 63 frames per second (FPS) on the TI-TDA4VM edge device.
Summary
Engineering report describing a real-time camera-based Driver Monitoring System (DMS) deployed on the Texas Instruments TDA4VM hardware-accelerator edge device. The pipeline uses a Leopard Imaging IR camera and a deep-learning stack (face detection, facial landmarks, head-pose estimation, eye-state classification) to detect distraction and drowsiness. The authors describe the model-porting and 'model surgery' work needed to execute all DL operators on the TI MMA matrix accelerator (e.g., replacing or removing unsupported ops), and report that the optimized system runs at 63 FPS on the edge device. No human-subjects study, behavioral measurement, or driver-cognition experiment is reported.
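The 'model surgery' step above can be sketched as a graph rewrite: each node whose operator the accelerator does not support is either swapped for a supported equivalent or flagged. This is a minimal illustrative sketch only; the `Node` class, the op names, and the `REPLACEMENTS` table are hypothetical and not taken from the paper or the TI SDK.

```python
from dataclasses import dataclass

# Hypothetical mapping from unsupported ops to accelerator-friendly
# substitutes (illustrative only; not the paper's actual op list).
REPLACEMENTS = {
    "HardSwish": "ReLU",    # swap an unsupported activation
    "Resize": "Upsample",   # swap an unsupported resize variant
}

@dataclass
class Node:
    name: str
    op_type: str

def model_surgery(graph: list[Node], supported: set[str]) -> list[Node]:
    """Return a copy of the graph with every node mapped to a
    supported op, or raise if no replacement is known."""
    out = []
    for node in graph:
        if node.op_type in supported:
            out.append(node)
        elif node.op_type in REPLACEMENTS:
            out.append(Node(node.name, REPLACEMENTS[node.op_type]))
        else:
            raise ValueError(f"no accelerator mapping for {node.op_type}")
    return out
```

In practice this kind of rewrite is done on the exported model graph (e.g., an ONNX file) before compilation for the accelerator, so that no operator falls back to the slower general-purpose cores.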
Key finding
After model surgery to fit the TDA4VM SDK and offload all DL operators to the matrix accelerator (MMA), the camera-based DMS achieves 63 FPS inference on the edge device, above the authors' real-time target.
Methodology
Engineering / system-build report. No human-subjects experiment. Deep-learning models for face detection, facial landmarks, head pose, and eye state are ported to the TI TDA4VM edge device; latency and FPS are benchmarked before and after model surgery.
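The before/after FPS benchmarking described above can be sketched as a simple timing loop over repeated inference calls, with a few warm-up iterations excluded so one-time initialization does not skew the rate. The `measure_fps` helper and its parameters are assumptions for illustration, not the authors' benchmark harness.

```python
import time

def measure_fps(infer, frames, warmup=3):
    """Run `infer` once per frame and return frames per second.

    A few warm-up calls are made first (and excluded from timing)
    so caches and one-time setup costs do not distort the result.
    """
    for frame in frames[:warmup]:
        infer(frame)
    start = time.perf_counter()
    for frame in frames:
        infer(frame)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed
```

Running such a loop on the same input set before and after model surgery isolates the speedup attributable to keeping all operators on the matrix accelerator.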
Quality score: 5 / 5