Existing emotion recognition research is typically limited to static laboratory settings and does not fully capture changes in emotional state in dynamic scenarios. To address this problem, this paper proposes a method for dynamic continuous emotion recognition based on electroencephalography (EEG) and eye movement signals. Firstly, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in current multimodal dynamic continuous emotion datasets. In the two-dimensional valence-arousal space, the stimulus videos were rated every five seconds on a scale of 1 to 9, and the resulting dynamic continuous emotion labels were normalized. Subsequently, frequency band features were extracted from the preprocessed EEG and eye movement data. A cascade feature fusion approach was used to combine the EEG and eye movement features into an information-rich multimodal feature vector. This feature vector was fed into four regression models, namely support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors, to build the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean squared error for both valence and arousal across the six dynamic continuous emotion transitions. The approach accurately recognizes emotion transitions in dynamic situations and offers higher accuracy and robustness than using either EEG or eye movement signals alone, making it well-suited for practical applications.
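A minimal sketch of the fusion-and-regression step described above, assuming per-window EEG band features and eye movement features have already been extracted; the feature dimensions, synthetic data, and model hyperparameters are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_windows = 600                                  # number of 5-second rating windows (assumed)
eeg_feat = rng.normal(size=(n_windows, 310))     # EEG frequency-band features (assumed dimension)
eye_feat = rng.normal(size=(n_windows, 33))      # eye movement features (assumed dimension)
valence = rng.uniform(0.0, 1.0, n_windows)       # normalized continuous labels (synthetic)

# Cascade fusion: concatenate the two modality feature vectors for each window
fused = np.concatenate([eeg_feat, eye_feat], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(fused, valence, test_size=0.2, random_state=0)

models = {
    "SVR (RBF kernel)": SVR(kernel="rbf"),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "K-nearest neighbors": KNeighborsRegressor(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name}: valence MSE = {mse:.4f}")    # the arousal dimension is handled analogously
```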
Musical emotion perception is a key pathway to decoding the nature of human emotion, and analysis of its regulatory mechanisms is needed to develop precise neural modulation strategies. Although electroencephalography (EEG) can capture the dynamic neural activity associated with musical emotion processing, and virtual reality (VR) technology can provide immersive enhancement of emotion regulation, the interaction mechanism among the VR environment, musical emotion, and neural activity remains unclear. This study established a multimodal experimental paradigm of "VR environment-music stimulation-EEG response" and employed multi-band feature analysis to systematically elucidate the neural dynamics of musical emotion perception during transitions between different virtual scenarios. The results demonstrated that the right temporal lobe exhibited significant electrophysiological changes when real and virtual scenarios were compared, while posterior brain regions were sensitive to differences between virtual environments. Furthermore, the environment exerted specific modulation of both low-frequency and high-frequency EEG activity, with the δ energy percentage showing context-dependent differentiation in musical emotion perception. Through these virtual-scenario-modulated musical emotion perception experiments, the study systematically reveals the frequency-band-specific modulation of musical emotion by environmental factors, establishes the δ-band energy ratio as a key biomarker of environment-emotion interaction, and provides a theoretical basis and quantitative assessment method for developing immersive emotion regulation strategies and clinical psychological interventions.
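An illustrative computation of the δ-band energy percentage used as the marker above, based on a Welch power spectral density estimate; the channel count, sampling rate, band edges, and synthetic signal are assumptions rather than the study's exact settings.

```python
import numpy as np
from scipy.signal import welch

fs = 250                                                       # sampling rate in Hz (assumed)
eeg = np.random.default_rng(1).normal(size=(32, fs * 60))      # 32 channels x 60 s (synthetic)

# Common EEG band edges in Hz (assumed, not the study's exact definition)
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)        # PSD per channel

def band_power(lo, hi):
    # Sum PSD bins within the band, scaled by the frequency resolution
    mask = (freqs >= lo) & (freqs < hi)
    return psd[:, mask].sum(axis=-1) * (freqs[1] - freqs[0])

powers = {name: band_power(lo, hi) for name, (lo, hi) in bands.items()}
total = sum(powers.values())
delta_ratio = powers["delta"] / total                          # δ energy percentage per channel
print("mean δ energy ratio across channels:", delta_ratio.mean())
```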