[Improve] Add notes for requirements; Improve badges (#277)
* Update README.md

* Update README_zh-CN.md

* Update README.md

* Update deepspeed.txt

* Update runtime.txt

* Update runtime.txt

* Update README.md

* Update README_zh-CN.md

* Update README.md

* Update README.md

* Update README.md

* Update README_zh-CN.md

* fix pre-commit

* Update runtime.txt

* Update runtime.txt
LZHgrla authored Dec 26, 2023
1 parent 1004188 commit e7348af
Showing 5 changed files with 45 additions and 16 deletions.
22 changes: 15 additions & 7 deletions README.md
@@ -2,14 +2,22 @@
<img src="https://github.com/InternLM/lmdeploy/assets/36994684/0cf8d00f-e86b-40ba-9b54-dc8f1bc6c8d8" width="600"/>
<br /><br />

[![GitHub Repo stars](https://img.shields.io/github/stars/InternLM/xtuner?style=social)](https://github.com/InternLM/xtuner/stargazers)
[![license](https://img.shields.io/github/license/InternLM/xtuner.svg)](https://github.com/InternLM/xtuner/blob/main/LICENSE)
[![PyPI](https://badge.fury.io/py/xtuner.svg)](https://pypi.org/project/xtuner/)
[![Generic badge](https://img.shields.io/badge/🤗%20Huggingface-xtuner-yellow.svg)](https://huggingface.co/xtuner)
[![Generic badge](https://img.shields.io/badge/🤖%20ModelScope-xtuner-yellow.svg)](https://www.modelscope.cn/organization/xtuner)
[![PyPI](https://img.shields.io/pypi/v/xtuner)](https://pypi.org/project/xtuner/)
[![Downloads](https://static.pepy.tech/badge/xtuner)](https://pypi.org/project/xtuner/)
[![issue resolution](https://img.shields.io/github/issues-closed-raw/InternLM/xtuner)](https://github.com/InternLM/xtuner/issues)
[![open issues](https://img.shields.io/github/issues-raw/InternLM/xtuner)](https://github.com/InternLM/xtuner/issues)

English | [简体中文](README_zh-CN.md)
👋 join us on [![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=wechat&label=WeChat)](https://cdn.vansin.top/internlm/xtuner.jpg)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=twitter&label=Twitter)](https://twitter.com/intern_lm)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=discord&label=Discord)](https://discord.gg/xa29JuW87d)

🔍 Explore our models on
[![Static Badge](https://img.shields.io/badge/-grey?style=social&label=🤗%20Huggingface)](https://huggingface.co/xtuner)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&label=🤖%20ModelScope)](https://www.modelscope.cn/organization/xtuner)

👋 join us on <a href="https://twitter.com/intern_lm" target="_blank">Twitter</a>, <a href="https://discord.gg/xa29JuW87d" target="_blank">Discord</a> and <a href="https://cdn.vansin.top/internlm/xtuner.jpg" target="_blank">WeChat</a>
English | [简体中文](README_zh-CN.md)

</div>

@@ -138,13 +146,13 @@ XTuner is a toolkit for efficiently fine-tuning LLM, developed by the [MMRazor](
- Install XTuner via pip

```shell
pip install xtuner
pip install -U xtuner
```

or with DeepSpeed integration

```shell
pip install 'xtuner[deepspeed]'
pip install -U 'xtuner[deepspeed]'
```

- Install XTuner from source
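
After upgrading with either command above, a quick sanity check can confirm the new package is active. The following is a minimal sketch, not part of this commit, and assumes the `xtuner` console script is on your PATH:

```shell
# Not part of the commit: post-install sanity check.
pip show xtuner        # shows the installed version and location
xtuner list-cfg        # enumerates the built-in configs via the CLI entry point
```
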
22 changes: 15 additions & 7 deletions README_zh-CN.md
@@ -2,14 +2,22 @@
<img src="https://github.com/InternLM/lmdeploy/assets/36994684/0cf8d00f-e86b-40ba-9b54-dc8f1bc6c8d8" width="600"/>
<br /><br />

[![GitHub Repo stars](https://img.shields.io/github/stars/InternLM/xtuner?style=social)](https://github.com/InternLM/xtuner/stargazers)
[![license](https://img.shields.io/github/license/InternLM/xtuner.svg)](https://github.com/InternLM/xtuner/blob/main/LICENSE)
[![PyPI](https://badge.fury.io/py/xtuner.svg)](https://pypi.org/project/xtuner/)
[![Generic badge](https://img.shields.io/badge/🤗%20Huggingface-xtuner-yellow.svg)](https://huggingface.co/xtuner)
[![Generic badge](https://img.shields.io/badge/🤖%20ModelScope-xtuner-yellow.svg)](https://www.modelscope.cn/organization/xtuner)
[![PyPI](https://img.shields.io/pypi/v/xtuner)](https://pypi.org/project/xtuner/)
[![Downloads](https://static.pepy.tech/badge/xtuner)](https://pypi.org/project/xtuner/)
[![issue resolution](https://img.shields.io/github/issues-closed-raw/InternLM/xtuner)](https://github.com/InternLM/xtuner/issues)
[![open issues](https://img.shields.io/github/issues-raw/InternLM/xtuner)](https://github.com/InternLM/xtuner/issues)

[English](README.md) | 简体中文
👋 Join us: [![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=wechat&label=微信)](https://cdn.vansin.top/internlm/xtuner.jpg)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=twitter&label=推特)](https://twitter.com/intern_lm)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&logo=discord&label=Discord)](https://discord.gg/xa29JuW87d)

🔍 Explore our models:
[![Static Badge](https://img.shields.io/badge/-grey?style=social&label=🤗%20Huggingface)](https://huggingface.co/xtuner)
[![Static Badge](https://img.shields.io/badge/-grey?style=social&label=🤖%20ModelScope)](https://www.modelscope.cn/organization/xtuner)

👋 Join us: <a href="https://twitter.com/intern_lm" target="_blank">Twitter</a>, <a href="https://discord.gg/xa29JuW87d" target="_blank">Discord</a>, and <a href="https://cdn.vansin.top/internlm/xtuner.jpg" target="_blank">WeChat</a>
[English](README.md) | 简体中文

</div>

@@ -138,13 +146,13 @@ XTuner is a lightweight toolkit for fine-tuning large language models, developed by the [MMRazor](https
- Install XTuner via pip:

```shell
pip install xtuner
pip install -U xtuner
```

or install with DeepSpeed integration:

```shell
pip install 'xtuner[deepspeed]'
pip install -U 'xtuner[deepspeed]'
```

- Install XTuner from source:
1 change: 1 addition & 0 deletions requirements/deepspeed.txt
@@ -1,2 +1,3 @@
# Minimum 0.12.3, see https://github.com/microsoft/DeepSpeed/pull/4587
deepspeed>=0.12.3
mpi4py-mpich
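
The added comment records why DeepSpeed is pinned to 0.12.3 or newer. As an illustration only (not part of the diff), installing the `deepspeed` extra resolves both pins in this file; `mpi4py-mpich` may additionally require an MPI toolchain on some platforms:

```shell
# Illustrative only: install the extra, then confirm the resolved DeepSpeed
# version satisfies the >=0.12.3 floor noted above.
pip install -U 'xtuner[deepspeed]'
python -c "import deepspeed; print(deepspeed.__version__)"
```
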
13 changes: 11 additions & 2 deletions requirements/runtime.txt
@@ -1,12 +1,21 @@
bitsandbytes>=0.40.0
datasets
# Minimum 0.40.0.post4 to fix some 4-bit precision bugs
bitsandbytes>=0.40.0.post4
# Minimum 2.16.0 to fix some bugs, see https://github.com/huggingface/datasets/pull/6444
datasets>=2.16.0
einops
# Minimum 0.1.2 to fix some bugs, see https://github.com/InternLM/lagent/pull/44
lagent>=0.1.2
# Minimum 0.10.1 to support exclude_frozen_parameters for DeepSpeedStrategy,
# see https://github.com/open-mmlab/mmengine/pull/1415, https://github.com/open-mmlab/mmengine/pull/1424
mmengine>=0.10.1
# Minimum 0.4.0 to support QLoRA, see https://github.com/huggingface/peft/pull/476
peft>=0.4.0
scipy
SentencePiece
tiktoken
torch
# Minimum 4.32.1 to support the QLoRA fine-tune of ChatGLM2
# Exclude 4.34.1, 4.35.0, 4.35.1, 4.35.2 to avoid BC-break,
# see https://github.com/huggingface/transformers/pull/27020, https://github.com/huggingface/transformers/pull/27073
transformers>=4.32.1,!=4.34.1,!=4.35.0,!=4.35.1,!=4.35.2
transformers_stream_generator
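
The `transformers` requirement combines a minimum version with explicit exclusions. The sketch below is not from the commit; it assumes the `packaging` library is available (it ships with most pip environments) and simply shows how the specifier set evaluates a few candidate versions:

```shell
# Illustrative only: True means the version is acceptable, False means it is
# either below the minimum or explicitly excluded.
python -c "
from packaging.specifiers import SpecifierSet
spec = SpecifierSet('>=4.32.1,!=4.34.1,!=4.35.0,!=4.35.1,!=4.35.2')
for v in ('4.32.0', '4.32.1', '4.34.1', '4.35.2', '4.36.0'):
    print(v, v in spec)
"
```
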
3 changes: 3 additions & 0 deletions xtuner/configs/mixtral/README.md
@@ -3,6 +3,9 @@
## Install

```bash
# Install the latest xtuner
pip install -U 'xtuner[deepspeed]'

# Mixtral requires the latest version of transformers.
pip install git+https://github.com/huggingface/transformers.git

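# Illustrative follow-up, not part of the commit above: since transformers is
# installed straight from the GitHub main branch, confirm the development build
# is active; dev builds typically report a version string ending in ".dev0".
python -c "import transformers; print(transformers.__version__)"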
