Vision-based beatmap extraction in rhythm game toward platform-aware note generation

Abstract
Recent deep learning-based approaches to music analysis have had a significant impact on procedural content generation for music-based games. However, because they ignore the unique features of each platform and interface, auto-generated content remains less valuable than manually designed content. Hand-crafted datasets are needed to improve content quality across platforms, but most rhythm games expose their content only indirectly, through the player's experience and replay videos of it. We develop a vision-based approach that extracts this content through video analysis into a format named beatmap. We cover common visual features of well-known rhythm games and construct a mapping from their on-screen content to our beatmap model using multiple object detection. Our method correctly detects the action button, type, and timing of each note and extracts beatmap representations for our target game.
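
The abstract outlines the pipeline only at a high level: detect note objects in frames of a replay video, then map detections to timed beatmap entries (button, type, time). As a rough, hypothetical illustration only (the Detection, NoteEvent, and detections_to_beatmap names below are not from the paper, and the frame-to-time assumption is ours), such a mapping might look like:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    """One detected note object in a single video frame (hypothetical schema)."""
    frame_index: int   # frame in which the note was detected
    button: int        # lane / action button the note belongs to
    note_type: str     # e.g. "tap" or "hold", as classified by the detector


@dataclass
class NoteEvent:
    """One extracted beatmap entry: hit time, button, and note type."""
    time_sec: float
    button: int
    note_type: str


def detections_to_beatmap(detections: List[Detection], fps: float) -> List[NoteEvent]:
    """Convert per-frame detections into a time-ordered beatmap.

    Assumes each detection is taken at the frame where the note reaches the
    judgement line, so its hit time is frame_index / fps.
    """
    events = [
        NoteEvent(time_sec=d.frame_index / fps, button=d.button, note_type=d.note_type)
        for d in detections
    ]
    return sorted(events, key=lambda e: e.time_sec)


if __name__ == "__main__":
    # Example: three detections from a 60 fps replay video.
    dets = [
        Detection(frame_index=120, button=1, note_type="tap"),
        Detection(frame_index=90, button=0, note_type="tap"),
        Detection(frame_index=150, button=2, note_type="hold"),
    ]
    for ev in detections_to_beatmap(dets, fps=60.0):
        print(f"{ev.time_sec:.2f}s  button={ev.button}  type={ev.note_type}")
```
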
Publisher
IEEE
Issue Date
2021-08-18
Language
English
Citation

IEEE Conference on Games (IEEE CoG), pp. 901-905

DOI
10.1109/CoG52621.2021.9619108
URI
http://hdl.handle.net/10203/291980
Appears in Collection
CS-Conference Papers (Conference Papers)
