
AIOZ-GDANCE: a large-scale dataset & baseline for music-driven group dance generation. (CVPR 2023)


We demonstrate the AIOZ-GDANCE dataset with in-the-wild videos, music audio, and 3D group dance motion.

Abstract

Music-driven choreography is a challenging problem with a wide variety of industrial applications. Recently, many methods have been proposed to synthesize dance motions from music for a single dancer. However, generating dance motion for a group remains an open problem. In this paper, we present GDANCE, a new large-scale dataset for music-driven group dance generation. Unlike existing datasets that only support single-dancer choreography, our new dataset contains group dance videos, hence supporting the study of group choreography. We propose a semi-autonomous labeling method with humans in the loop to obtain the 3D ground truth for our dataset. The proposed dataset consists of 16.7 hours of paired music and 3D motion from in-the-wild videos, covering 7 dance styles and 16 music genres. We show that naively applying single-dancer generation techniques to group dance motion may lead to unsatisfactory results, such as inconsistent movements and collisions between dancers. Based on our new dataset, we propose a new method that takes an input music sequence and a set of 3D positions of dancers to efficiently produce multiple group-coherent choreographies. We propose new evaluation metrics for measuring group dance quality and perform intensive experiments to demonstrate the effectiveness of our method. Our code and dataset will be released to facilitate future research on group dance generation.

Table of Contents

  1. AIOZ-GDANCE Dataset
  2. Visualizing
  3. Prerequisites
  4. Usage

AIOZ-GDANCE Dataset

[Download] The dataset can be downloaded at Data

[Updated] The music and dance labels are now available at Labels

The data directory is organized as follows:

  • split_sequence_names.txt:
    • a txt file listing the names of all sequences in the data (each sequence has a unique name or id)
  • musics:
    • contains the raw music .wav file of each sequence, with the corresponding name. The music frames are aligned with the motion frames.
  • motions_smpl:
    • contains the motion file of each sequence, with the corresponding name; each motion is provided in .pkl format.
    • Each data dictionary mainly includes the following items:
      • 'smpl_poses': shape [num_persons x num_frames x 72]: per-frame 72-D pose vectors in SMPL format (axis-angle rotations of 24 joints).
      • 'root_trans': shape [num_persons x num_frames x 3]: sequences of root translations.

Here is an example Python script to read a motion file:

import pickle
import numpy as np

# Load the motion dictionary of one sequence
with open("sequence_name.pkl", "rb") as f:
    data = pickle.load(f)
print(data.keys())

smpl_poses = data['smpl_poses']   # [num_persons x num_frames x 72] SMPL pose vectors
smpl_trans = data['root_trans']   # [num_persons x num_frames x 3] root translations

# ... the poses may be utilized with the SMPL forward function: https://github.com/vchoutas/smplx
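
Beyond reading the raw arrays, the poses can be passed through the SMPL forward function to obtain 3D joints and mesh vertices. Below is a minimal sketch for a single dancer, assuming the smplx package and an <SMPL_DIR> containing the renamed model files described in the Prerequisites section; variable names follow the snippet above.

import torch
import smplx

# Sketch only: run the SMPL forward pass for one dancer.
smpl = smplx.SMPL(model_path="<SMPL_DIR>", gender="FEMALE")

person = 0                                             # index of one dancer
poses = torch.from_numpy(smpl_poses[person]).float()   # [num_frames, 72]
trans = torch.from_numpy(smpl_trans[person]).float()   # [num_frames, 3]

output = smpl(
    global_orient=poses[:, :3],   # root rotation (axis-angle)
    body_pose=poses[:, 3:],       # remaining 23 joints (axis-angle)
    transl=trans,                 # root translation
)
joints = output.joints.detach().numpy()       # [num_frames, num_joints, 3]; the first 24 are the SMPL skeleton joints
vertices = output.vertices.detach().numpy()   # [num_frames, 6890, 3] mesh vertices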

Visualizing

We provide demo code for loading and visualizing the motions.

Prerequisites

First, you need to download the SMPL model (v1.0.0) and rename the model files for visualization. The directory structure of the data is expected to be:

<DATA_DIR>
├── motions_smpl/
├── musics/
└── split_sequence_names.txt

<SMPL_DIR>
├── SMPL_MALE.pkl
└── SMPL_FEMALE.pkl

Then run the following to install the necessary packages:

pip install scipy torch smplx chumpy vedo trimesh
pip install numpy==1.23.0

Usage

Visualize the SMPL joints

The following command first computes the SMPL joint locations from the joint rotations and root translation, and then plots them in a 3D figure in real time.

python vis_smpl_kpt.py \
  --data_dir <DATA_DIR>/motions_smpl \
  --smpl_path <SMPL_DIR>/SMPL_FEMALE.pkl \
  --sequence_name sequence_name.pkl
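
Outside the provided script, a single frame can also be sanity-checked with vedo directly. This is only a rough sketch, assuming `joints` was computed as in the SMPL forward example above; the exact vedo API may differ between versions.

from vedo import Points, show

# Plot the joints of one dancer at the first frame as a 3D point cloud.
# `joints` is assumed to have shape [num_frames, num_joints, 3].
show(Points(joints[0], r=10), axes=1)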

Visualize the SMPL Mesh

The following command computes the SMPL meshes and visualizes them in 3D.

python vis_smpl_mesh.py \
  --data_dir <DATA_DIR>/motions_smpl \
  --smpl_path <SMPL_DIR>/SMPL_FEMALE.pkl \
  --sequence_name sequence_name.pkl
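
Similarly, a single frame of the mesh can be inspected with trimesh (installed above). This is a minimal sketch, assuming `vertices` and the `smpl` model from the SMPL forward example; trimesh's built-in viewer requires an extra backend such as pyglet.

import trimesh

# Build a mesh for the first frame from the SMPL vertices and the template faces.
mesh = trimesh.Trimesh(vertices=vertices[0], faces=smpl.faces, process=False)
mesh.show()  # opens trimesh's simple 3D viewer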

TODO

  • Dataset
  • Baseline model & training: TBD

Citation

@inproceedings{aiozGdance,
    author    = {Le, Nhat and Pham, Thang and Do, Tuong and Tjiputra, Erman and Tran, Quang D. and Nguyen, Anh},
    title     = {Music-Driven Group Choreography},
    booktitle = {CVPR},
    year      = {2023},
}

License

Software Copyright License for non-commercial scientific research purposes. Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use AIOZ-GDANCE data, model and software (the "Data & Software"), including 3D meshes, images, videos, textures, software, scripts, and animations. By downloading and/or using the Data & Software (including downloading, cloning, installing, and any other use of the corresponding github repository), you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

Acknowledgement

This repo uses visualization code from AIST++.
