Visual representation of emotions in Virtual Reality
One of the challenges of the post-COVID pandemic era will be to foster social connections between people. Previous research suggests that people who are able to regulate their emotions tend to have better social connections with others. Additional studies indicate that it is possible to train the ability to regulate emotions voluntarily, using a procedure that involves three steps: (1) asking participants to evoke an autobiographical memory associated with a positive emotion; (2) analyzing participants’ brain activity in real time to estimate their emotional state; and (3) providing visual feedback about the emotions evoked by the autobiographical memory. However, there is little research on how to provide the visual feedback required for the third step. Therefore, this manuscript introduces five virtual environments that can be used to provide emotional visual feedback. Each virtual environment was designed based on evidence from previous studies suggesting that certain visual cues, such as colors, shapes, and motion patterns, tend to be associated with particular emotions. In each virtual environment, these visual cues were varied with the intent of representing one of five emotional categories. An experiment was conducted to analyze the emotions that participants associated with the virtual environments. The results indicate that each environment is associated with the emotional category it was meant to represent.