Authors: Li, Wenbo; Zeng, Guanzhong; Zhang, Juncheng; Xu, Yan; Xing, Yang; Zhou, Rui; Guo, Gang; Shen, Yu; Cao, Dongpu; Wang, Fei-Yue

Dates: 2021-12-10; 2021-11-30

Citation: Li W, Zeng G, Zhang J, et al. (2021) CogEmoNet: A cognitive-feature-augmented driver emotion recognition model for smart cockpit. IEEE Transactions on Computational Social Systems, Volume 9, Number 3, June 2022, pp. 667-678

ISSN: 2329-924X

DOI: https://doi.org/10.1109/TCSS.2021.3127935

Handle: https://dspace.lib.cranfield.ac.uk/handle/1826/17329

Abstract: Driver emotion recognition is vital to improving driving safety, comfort, and acceptance of intelligent vehicles. This article presents a cognitive-feature-augmented driver emotion detection method based on emotional cognitive process theory and deep networks. Unlike traditional methods, both the driver's facial expression and cognitive process characteristics (age, gender, and driving age) are used as inputs to the proposed model. Convolutional techniques were adopted to construct a model that simultaneously considers the driver's facial expression and cognitive process characteristics. A driver emotion data collection was carried out to validate the performance of the proposed method. The collected dataset consists of 40 drivers' frontal facial videos, their cognitive process characteristics, and self-reported assessments of driver emotions. Two other deep networks were also used to compare recognition performance. The results show that the proposed method achieves good detection results on different databases for the discrete emotion model and the dimensional emotion model, respectively.

Language: en

License: Attribution-NonCommercial 4.0 International

Keywords: Affective computing; driver emotion; facial expression; human-machine interaction (HMI); smart cockpit

Title: CogEmoNet: A cognitive-feature-augmented driver emotion recognition model for smart cockpit

Type: Article
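The abstract describes a model that fuses a convolutional facial-expression branch with cognitive-process characteristics (age, gender, driving age). The sketch below is not the authors' CogEmoNet implementation; it is a minimal illustrative example of that kind of two-branch fusion, assuming a 64x64 face crop, three scalar cognitive features, and a 7-class discrete emotion output.

```python
# Minimal sketch of cognitive-feature-augmented emotion recognition (assumed architecture,
# not the published CogEmoNet): a CNN branch for the face image and a dense branch for
# cognitive-process features, concatenated before classification.
import torch
import torch.nn as nn

class CognitiveFeatureAugmentedNet(nn.Module):
    def __init__(self, num_emotions: int = 7, num_cognitive_features: int = 3):
        super().__init__()
        # Convolutional branch for the frontal facial image (assumed 3x64x64 input).
        self.face_branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Dense branch for cognitive-process characteristics (age, gender, driving age).
        self.cognitive_branch = nn.Sequential(
            nn.Linear(num_cognitive_features, 16), nn.ReLU(),
        )
        # Classifier over the concatenated (fused) feature vector.
        self.classifier = nn.Linear(64 + 16, num_emotions)

    def forward(self, face: torch.Tensor, cognitive: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.face_branch(face), self.cognitive_branch(cognitive)], dim=1)
        return self.classifier(fused)

# Usage with dummy data: a batch of 8 face crops and their cognitive features.
model = CognitiveFeatureAugmentedNet()
face = torch.randn(8, 3, 64, 64)      # frontal face crops
cognitive = torch.randn(8, 3)         # normalized age, gender, driving age
logits = model(face, cognitive)       # shape (8, 7): discrete emotion logits
```

For the dimensional emotion model mentioned in the abstract, the final layer would instead regress continuous valence/arousal values rather than output class logits.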