HAHT

History-Aware Hierarchical Transformer for Multi-session Open-domain Dialogue System

Overview of HAHT

[Figure: HAHT model structure]

Prerequisites

  • Python 3.6
  • PyTorch 1.10.0
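
A matching environment can be created, for example, like this (a minimal sketch; the use of conda and the environment name haht are assumptions):

# Create and activate an isolated Python 3.6 environment, then install PyTorch 1.10.0
conda create -n haht python=3.6
conda activate haht
pip install torch==1.10.0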

Getting Started

Installation

python setup.py develop
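
To verify the install, a quick import check may help (a sketch; it assumes the repo installs PyTorch and the ParlAI package that HAHT builds on):

# Check that the core dependencies import cleanly
python -c "import torch; print(torch.__version__)"
python -c "import parlai; print('parlai imported OK')"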

Download the pretrained blender90M model

As ParlAI is archived, the pretrained model is no longer downloaded automatically; this step downloads it manually.

pip install gdown

cd data/
gdown --id 1qRcKgAQvAA-hPLjHDazxV8E3ngyYfKpo
tar -xzvf models.tar.gz
rm models.tar.gz
cd ..

You can also download the archive in a browser, extract it, and put the extracted folder under the data folder.
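
Either way, you can sanity-check the result (a sketch; the exact path is an assumption based on ParlAI's usual data/models/ layout, so adjust it to whatever the archive contains):

# List the extracted pretrained-model files (path assumed, not confirmed by the repo)
ls -R data/models/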

Training

To train the model, run:

cd debug_scripts/blender_haht/
python finetune_haht.py
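
Since finetune_haht.py appears to wrap ParlAI's training loop, common ParlAI flags may also work; for example (a sketch with illustrative values, not the paper's settings):

# Hypothetical invocation with standard ParlAI training flags
python finetune_haht.py --batchsize 16 --num-epochs 10 --learningrate 1e-5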

To evaluate the model, run:

python evaluate_haht.py --skip_generation False --model_parallel False --batchsize 32
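
In ParlAI, --skip_generation False makes the agent actually generate responses during evaluation (rather than reporting only token-level metrics such as perplexity), --model_parallel False keeps the model on a single GPU, and --batchsize sets the evaluation batch size.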

Logging

TensorBoard logs and models will be saved in the HAHT/data/tmp/all_session/ folder. Generated responses and their corresponding input conversation contexts will be saved in HAHT/debug_scripts/blender_haht/logs/.
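
To inspect the training curves, point TensorBoard at that directory (standard TensorBoard usage; run it from the repository root):

# Serve the logged metrics in a browser (default port 6006)
tensorboard --logdir data/tmp/all_session/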

Discussion

If you have difficulty getting the above steps to work, please let us know.
