MBAimCLR (based on MotionBERT and AimCLR)
This project proposes a model that applies the self-supervised learning framework from AimCLR, with its extreme data augmentations, to the DSTformer motion encoder from MotionBERT. It is trained and evaluated on the NTU RGB+D 60 and 120 datasets. For more information about the model and its performance, read the files in the report folder.
This project was carried out as a semester project at the VITA lab, as part of the Robotics Master's program at EPFL. The goal was to study the impact of the framework on a state-of-the-art transformer method for the action recognition task.
To install and run the project, follow these steps:
- Clone the repository:
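No clone command is given in the source; the line below is a placeholder in the same angle-bracket style as the other commands in this README, to be filled with the actual repository URL:

```shell
git clone <repository_url>
cd <repository_directory>
```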
- Install the requirements:
Install the required packages and the correct Python version:
conda create -n mbpip python=3.7 anaconda
conda activate mbpip
# Please install PyTorch according to your CUDA version.
conda install pytorch torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia
pip install -r requirements.txt
Then install torchlight, a function wrapper for PyTorch:
# Install torchlight
cd torchlight
python setup.py install
- If you want to use an already-trained model, download the checkpoint from here.
The already generated datasets can be downloaded from NTU-60 and NTU-120. To generate the dataset yourself, see the procedure here under Data Preparation.
- Fill the config file corresponding to the training you want to run with the paths to the dataset.
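As a rough illustration, the dataset-path entries in such a config file might look like the sketch below; the key names (work_dir, train_feeder_args, data_path, label_path) are assumptions, so refer to the provided config files for the actual ones:

```yaml
# Hypothetical excerpt of a pretraining config; key names are assumptions.
work_dir: <path_to_the_working_directory>
train_feeder_args:
  data_path: <path_to_dataset>/train_data.npy
  label_path: <path_to_dataset>/train_label.pkl
```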
- Run the following command:
python main.py pretrain_mbaimclr --config config/<your_config_file>.yaml
To resume training, the command is the same; just adapt the config file by setting resume to True and providing the path to the weights.
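Concretely, resuming amounts to a config change along these lines (the key names resume and weights are assumptions; check the actual config files):

```yaml
# Hypothetical resume settings; key names are assumptions.
resume: True
weights: <path_to_the_weights>
```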
The code will automatically create two folders, train and val, in the working directory set in the config file. You can visualize the logs with TensorBoard by running the following command:
tensorboard --logdir=<path_to_the_working_directory>/
This section explains how to evaluate the model.
- Fill the corresponding config file in the config folder with the paths to the model and the data.
- Run the following command:
python main.py linear_evaluation_mb --config config/<your_config_file>.yaml
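For reference, the model and data entries of a linear-evaluation config might look like this sketch (key names are assumptions; consult the actual config files in the config folder):

```yaml
# Hypothetical excerpt of a linear-evaluation config; key names are assumptions.
weights: <path_to_the_pretrained_model>
test_feeder_args:
  data_path: <path_to_dataset>/val_data.npy
  label_path: <path_to_dataset>/val_label.pkl
```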