Browsing by Author "Guo, Gang"
Now showing 1 - 2 of 2
Item Open Access
CogEmoNet: A cognitive-feature-augmented driver emotion recognition model for smart cockpit (IEEE, 2021-11-30)
Li, Wenbo; Zeng, Guanzhong; Zhang, Juncheng; Xu, Yan; Xing, Yang; Zhou, Rui; Guo, Gang; Shen, Yu; Cao, Dongpu; Wang, Fei-Yue
Driver emotion recognition is vital to improving driving safety, comfort, and acceptance of intelligent vehicles. This article presents a cognitive-feature-augmented driver emotion detection method based on emotional cognitive process theory and deep networks. Unlike traditional methods, the proposed model takes as input both the driver's facial expression and cognitive process characteristics (age, gender, and driving age). Convolutional techniques were adopted to construct a model that considers the driver's facial expression and cognitive process characteristics simultaneously. A driver emotion data collection was carried out to validate the performance of the proposed method. The collected dataset consists of 40 drivers' frontal facial videos, their cognitive process characteristics, and self-reported assessments of driver emotions. Two other deep networks were also used to compare recognition performance. The results show that the proposed method achieves good detection results on different databases for both the discrete emotion model and the dimensional emotion model.

Item Open Access
A multimodal psychological, physiological and behavioural dataset for human emotions in driving tasks (Nature Publishing Group, 2022-08-06)
Li, Wenbo; Tan, Ruichen; Xing, Yang; Li, Guofa; Li, Shen; Zeng, Guanzhong; Wang, Peizhi; Zhang, Bingbing; Su, Xinyu; Pi, Dawei; Guo, Gang; Cao, Dongpu
Human emotions are integral to daily tasks, and driving is now a typical daily task. Creating a multimodal human emotion dataset in driving tasks is an essential step in human emotion studies. We conducted three experiments to collect a multimodal psychological, physiological and behavioural dataset for human emotions (PPB-Emo). In Experiment I, 27 participants were recruited, and the in-depth interview method was employed to explore drivers' viewpoints on driving scenarios that induce different emotions. In Experiment II, 409 participants were recruited, and a questionnaire survey was conducted to identify driving scenarios that induce specific emotions in human drivers; the results were used as the basis for selecting video-audio stimulus materials. In Experiment III, 40 participants were recruited, and the psychological, physiological and behavioural data of all participants were collected across 280 driving tasks. The PPB-Emo dataset will largely support the analysis of human emotion in driving tasks, and it will also benefit human emotion research in other daily tasks.