Step-by-Step

This example loads a BERT Pre-Training of Image Transformers (BEiT) model and confirms its accuracy and performance on the ImageNet-1k dataset. You need to download this dataset yourself.

In this example, the BEiT model was pre-trained in a self-supervised fashion on ImageNet-22k (also called ImageNet-21k; 14 million images, 21,841 classes) at resolution 224x224, then fine-tuned first on ImageNet-22k and subsequently on ImageNet-1k at the same resolution, as the `ft22kto1k` suffix in the model name indicates. It was first released in the microsoft/unilm repository.

Prerequisite

1. Environment

pip install neural-compressor
pip install -r requirements.txt

Note: refer to the list of validated ONNX Runtime versions.

2. Prepare Model

Prepare the BEiT model for image classification:

python prepare_model.py --input_model=beit_base_patch16_224 --output_model=beit_base_patch16_224_pt22k_ft22kto1k.onnx
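As a side note, the model name encodes the architecture: `patch16` and `224` mean a 224x224 input image is split into non-overlapping 16x16 patches, yielding 14x14 = 196 patch tokens per image. A quick arithmetic sketch:

```python
# Patch-token count for beit_base_patch16_224:
# a 224x224 image split into non-overlapping 16x16 patches.
image_size = 224
patch_size = 16
patches_per_side = image_size // patch_size  # 14
num_patches = patches_per_side ** 2          # 196 patch tokens per image
print(num_patches)
```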

3. Prepare Dataset

Download and extract ImageNet-1k to a directory such as /path/to/imagenet. The directory should contain the following folders:

ls /path/to/imagenet
train  val
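Before running the scripts, you can verify the layout programmatically. Below is a minimal sketch using only the standard library; the function name `check_imagenet_layout` is illustrative, not part of this example's scripts:

```python
from pathlib import Path

def check_imagenet_layout(root: str) -> bool:
    """Return True if `root` contains the expected train/ and val/ splits."""
    root_path = Path(root)
    return all((root_path / split).is_dir() for split in ("train", "val"))

# Example: check_imagenet_layout("/path/to/imagenet")
```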

Run

1. Quantization

Quantize the model with QLinearOps:

bash run_quant.sh --input_model=/path/to/model \  # model path as *.onnx
                   --dataset_location=/path/to/imagenet \
                   --output_model=/path/to/save \
                   --quant_format="QOperator"

2. Benchmark

bash run_benchmark.sh --input_model=/path/to/model \  # model path as *.onnx
                      --dataset_location=/path/to/imagenet \
                      --batch_size=batch_size \
                      --mode=performance # or accuracy
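Performance mode reports latency and throughput. The measurement loop can be sketched as follows; `infer` is a stand-in for a real ONNX Runtime session call, and the function name is illustrative:

```python
import time

def benchmark(infer, batch_size, iterations=100, warmup=10):
    """Time `infer` and return (average latency in seconds, samples/second)."""
    for _ in range(warmup):      # warm-up runs are not timed
        infer()
    start = time.perf_counter()
    for _ in range(iterations):
        infer()
    elapsed = time.perf_counter() - start
    latency = elapsed / iterations     # seconds per batch
    throughput = batch_size / latency  # samples per second
    return latency, throughput
```

Accuracy mode instead runs the model over the val split and reports top-1 accuracy.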