Abstract:
In the world of musical creation, the integration of artificial intelligence (AI) represents a significant paradigm shift in emotional engagement. This research investigates the human emotional responses evoked when listening to AI-composed music. Focusing on the emotional impact of AI-composed music, the study explores the complex relationship between human emotional experiences and compositions crafted by AI algorithms. Through a comprehensive literature review, this paper examines existing methodologies, insights, and gaps in understanding the emotional dimensions of AI-composed music. Major findings reveal that while AI software such as the artificial intelligence virtual artist (AIVA) demonstrates potential for exploring emotional authenticity, persistent skepticism and a preference for human-created music highlight the need for further research. Attitudes of both listeners and music professionals toward AI-composed music are characterized by skepticism and negative perceptions, emphasizing the urgency of addressing these reservations and investigating the unique emotional qualities of AI-composed music. Furthermore, the complex nature of music emotion recognition, influenced by factors such as music genre, cultural perspective, and age group, complicates understanding of emotional responses to both human-created and AI-composed music. The paper supports the development of analytical methods, particularly machine learning and deep learning approaches, to deepen understanding of the complexities of emotional responses and improve AI music composition. A human-experience-centered framework is proposed to address subjectivity in assessing emotional responses to music. This research aims to characterize the nuances of emotional responses and determine whether AI-composed music can evoke emotions comparable to those elicited by human-created compositions.