Situation Awareness for Driver-Centric Driving Style Adaptation
URL: http://arxiv.org/abs/2403.19595v1
Abstract
There is evidence that the driving style of an autonomous vehicle is important for increasing the acceptance and trust of its passengers. The driving situation has also been found to have a significant influence on human driving behavior. However, current driving style models only partially incorporate information about the driving environment, limiting the alignment between an agent and the given situation. Therefore, we propose a situation-aware driving style model based on different visual feature encoders pretrained on fleet data, as well as driving behavior predictors, which are adapted to the driving style of a specific driver. Our experiments show that the proposed method significantly outperforms static driving styles and forms plausible situation clusters. Furthermore, we found that feature encoders pretrained on our dataset lead to more precise driving behavior modeling. In contrast, feature encoders pretrained in a supervised or unsupervised manner on other data sources lead to more specific situation clusters, which can be utilized to constrain and control the driving style adaptation for specific situations. Moreover, in a real-world setting, where driving style adaptation happens iteratively, we found that MLP-based behavior predictors achieve good performance initially but suffer from catastrophic forgetting. In contrast, behavior predictors based on situation-dependent statistics can learn iteratively from continuous data streams by design. Overall, our experiments show that important information for driving behavior prediction is contained within the visual feature encoder. The dataset is publicly available at huggingface.co/datasets/jHaselberger/SADC-Situation-Awareness-for-Driver-Centric-Driving-Style-Adaptation.
Summary
Methodology paper (Haselberger, Stuhr, Schick, Müller) proposing a situation-aware neural-network driving-style model for autonomous vehicles. The system pairs visual feature encoders pretrained on fleet data with driving-behaviour predictors that adapt to an individual driver's style. Multiple encoder-pretraining strategies (supervised, unsupervised, and fleet-data) are compared. Experiments show the proposed approach significantly outperforms static driving-style baselines and forms plausible situation clusters. Encoders pretrained on the authors' own fleet dataset gave more precise behaviour modelling, while supervised or unsupervised pretraining on external sources produced more specific situation clusters, useful for constraining or controlling style adaptation. In a real-world iterative driving-style-adaptation setting, MLP-based behaviour predictors performed well initially but suffered from catastrophic forgetting, whereas predictors based on situation-dependent statistics can learn iteratively from continuous data streams by design, motivating future work on continual learning for driver-centric AV style adaptation.
Key finding
A situation-aware NN driving-style model with fleet-pretrained visual encoders outperforms static styles and yields meaningful situation clusters, but MLP behaviour predictors suffer catastrophic forgetting in iterative real-world adaptation.
Methodology
experimental
Sample size: Experiment 1: N = 10; Experiment 2: N = 20
Quality score: 5 / 5