Generating Music Transition by Using a Transformer-Based Model

Bibliographic Details
Main Authors: Jia-Lien Hsu, Shuh-Jiun Chang
Format: Article
Language: English
Published: MDPI AG 2021-09-01
Series: Electronics
Online Access: https://www.mdpi.com/2079-9292/10/18/2276
Description
Summary: With the growing prevalence of online video-sharing platforms in recent years, many people have started to create their own videos and upload them to the Internet. In filmmaking, background music is one of the major elements besides the footage. With matching background music, a video can not only convey information but also immerse the viewers in the setting of a story. There is often not just one piece of background music but several, which is why audio editing and music production software are required. However, music editing requires professional expertise, and it can be hard for amateur creators to compose ideal pieces for a video. At the same time, there are online audio libraries and music archives for sharing audio/music samples. For beginners, one possible way to compose background music for a video is to arrange and integrate samples rather than make music from scratch. This, however, leads to a problem: there may be gaps between samples, and transitions must be generated to fill them. In our research, we build a transformer-based model for generating a music transition that bridges two prepared music clips. We design and perform experiments to demonstrate that our results are promising. The results are also analysed with a questionnaire, which reveals a positive response from listeners and supports that our generated transitions are suitable as background music.
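The abstract does not detail the model, but the core idea of bridging two clips with a transformer can be illustrated with a minimal sketch. The PyTorch code below is purely illustrative and is not the authors' implementation: it assumes the clips are tokenized into discrete music-event sequences, uses a small causal transformer over those tokens, and fills the gap by greedy autoregressive decoding; names such as TransitionGenerator, VOCAB_SIZE, and gap_len are hypothetical.

```python
# Illustrative sketch only -- tokenization, sizes, and the infilling scheme
# are assumptions, not the architecture described in the article.
import torch
import torch.nn as nn

VOCAB_SIZE = 512   # assumed size of the music-event vocabulary
D_MODEL = 256
MAX_LEN = 1024

class TransitionGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.pos = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=8, dim_feedforward=1024, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, num_layers=6)
        self.head = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, tokens):
        # tokens: (batch, seq_len) of music-event token ids
        seq_len = tokens.size(1)
        pos_ids = torch.arange(seq_len, device=tokens.device)
        x = self.embed(tokens) + self.pos(pos_ids)
        # causal mask: each position attends only to earlier positions
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=tokens.device),
            diagonal=1,
        )
        h = self.blocks(x, mask=mask)
        return self.head(h)  # (batch, seq_len, vocab)

@torch.no_grad()
def generate_transition(model, clip_a_tail, clip_b_head, gap_len=64):
    """Greedily sample `gap_len` transition tokens between two clips.

    One common infilling trick (assumed here): place the *future* context
    (start of clip B) before the *past* context (end of clip A), so a causal
    model can see both while generating the in-between tokens.
    """
    context = torch.cat([clip_b_head, clip_a_tail], dim=1)
    prefix_len = context.size(1)
    for _ in range(gap_len):
        logits = model(context[:, -MAX_LEN:])
        next_token = logits[:, -1].argmax(dim=-1, keepdim=True)  # greedy decode
        context = torch.cat([context, next_token], dim=1)
    return context[:, prefix_len:]  # transition tokens only
```

In this sketch the generated token sequence would then be rendered back to audio or MIDI and spliced between the two clips; sampling strategies other than greedy decoding (e.g. top-k) could equally be used.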
ISSN:2079-9292