Automatic transfer of musical mood into virtual environments
PubDate: November 2018
Teams: POSTECH
Writers: Sangyoon Han; Amit Bhardwaj; Seungmoon Choi
PDF: Automatic transfer of musical mood into virtual environments
Abstract
This paper presents a method that automatically transforms a virtual environment (VE) according to the mood of input music. We use machine learning to estimate the mood of the music, then select images that exhibit the same mood and transfer their styles, either photorealistically or artistically, onto the textures of objects in the VE. Our user study results indicate that the method is effective in transferring valence-related aspects of the mood, but not arousal-related ones. The method can still provide novel experiences in virtual reality and speed up the production of VEs by automating part of the process.
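To make the three-stage pipeline described in the abstract concrete, below is a minimal Python sketch: mood estimation from audio features, mood-matched image selection, and style transfer onto VE textures. The feature set, the `classifier`, `image_db`, `style_transfer`, and data layout are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch of the mood-to-VE pipeline (all component names are hypothetical).
import librosa
import numpy as np

def extract_mood(audio_path, classifier):
    """Estimate the musical mood, e.g. a (valence, arousal) pair, from audio features."""
    y, sr = librosa.load(audio_path, sr=22050)
    # Aggregate frame-level features into one clip-level descriptor.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20).mean(axis=1)
    chroma = librosa.feature.chroma_stft(y=y, sr=sr).mean(axis=1)
    tempo = librosa.beat.tempo(y=y, sr=sr)
    features = np.concatenate([mfcc, chroma, tempo])
    return classifier.predict([features])[0]  # assumed pretrained mood classifier

def select_style_images(mood, image_db, k=3):
    """Pick the k reference images whose annotated mood is closest to the music's mood."""
    return sorted(
        image_db,
        key=lambda img: np.linalg.norm(np.array(img["mood"]) - np.array(mood)),
    )[:k]

def retexture_environment(ve_objects, style_images, style_transfer):
    """Apply style transfer from the selected images to each VE object's texture."""
    for obj, style in zip(ve_objects, style_images):
        obj["texture"] = style_transfer(content=obj["texture"], style=style["pixels"])
```

The `style_transfer` callable stands in for whichever photorealistic or artistic style-transfer model is used; the rest of the pipeline is agnostic to that choice.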