SigFormer: Sparse Signal-Guided Transformer for MultiModal Human Action Segmentation

Introduction

This is the implementation repository for our work, SigFormer: Sparse Signal-Guided Transformer for MultiModal Human Action Segmentation.

Installation

Clone the repository and move into its folder:

git clone https://github.com/LIUQI-creat/SigFormer.git

cd SigFormer

To use this source code, you need Python 3.8+ and the following Python packages:

  • pytorch 1.12.1
  • torchvision 0.13.1
  • openpack-torch
  • openpack-toolkit
  • ......
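The pinned versions above can be installed with pip; a minimal sketch, assuming the PyPI package names `torch`, `torchvision`, `openpack-torch`, and `openpack-toolkit` (the "......" entry stands for further unlisted dependencies, which are not filled in here):

```shell
# Install the pinned PyTorch stack, then the OpenPack helper packages.
pip install torch==1.12.1 torchvision==0.13.1
pip install openpack-torch openpack-toolkit
```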

Data

Please download the OpenPack dataset using:

optk-download -d ./data

Train and Test

Training

Use the following command for training:

python src/train.py

Testing

Obtain the final prediction results:

python src/ensemble_mean.py
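Mean ensembling averages per-frame class probabilities across several trained models before taking the argmax. A minimal illustrative sketch of this idea (not the repository's actual `ensemble_mean.py` logic; array shapes are assumptions):

```python
import numpy as np

def ensemble_mean(prob_maps):
    """Average per-frame class probabilities from several models
    and return the per-frame predicted class indices.

    prob_maps: list of arrays, each of shape (num_frames, num_classes).
    """
    stacked = np.stack(prob_maps, axis=0)   # (num_models, num_frames, num_classes)
    mean_probs = stacked.mean(axis=0)       # (num_frames, num_classes)
    return mean_probs.argmax(axis=1)        # (num_frames,)

# Two toy models over 3 frames and 2 classes.
m1 = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
m2 = np.array([[0.7, 0.3], [0.7, 0.3], [0.1, 0.9]])
print(ensemble_mean([m1, m2]))  # [0 0 1]
```

Averaging probabilities (rather than majority-voting hard labels) lets a confident model outvote an uncertain one on ambiguous frames.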

To reproduce the results in the table below, submit the generated submission.zip file to the online evaluation server.

Our submitted file is provided on Baidu Yun (passcode: ubfo).

Main results

| F1 (Macro Average) | U0104 | U0108 | U0110 | U0203 | U0204 | U0207 | ALL |
| ------------------ | ----- | ----- | ----- | ----- | ----- | ----- | --- |
| SigFormer          | 0.971 | 0.969 | 0.960 | 0.966 | 0.903 | 0.923 | 0.958 |
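The macro-averaged F1 reported above is the unweighted mean of per-class frame-wise F1 scores. A small self-contained sketch of how such a metric can be computed (a hypothetical helper for illustration, not the challenge's official evaluator):

```python
import numpy as np

def macro_f1(y_true, y_pred, num_classes):
    """Unweighted mean of per-class F1 scores over frame-wise labels."""
    scores = []
    for c in range(num_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        denom = 2 * tp + fp + fn
        # F1 = 2*TP / (2*TP + FP + FN); define as 0 when the class never occurs.
        scores.append(2 * tp / denom if denom else 0.0)
    return float(np.mean(scores))

y_true = np.array([0, 0, 1, 1, 1])
y_pred = np.array([0, 1, 1, 1, 0])
print(macro_f1(y_true, y_pred, 2))  # ≈ 0.583
```

Because every class contributes equally regardless of frequency, macro F1 penalizes models that ignore rare action classes.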

Acknowledgement

We greatly appreciate the OpenPack-Challenge-1st repository.
