feat: update libraries (#30)
* update library version

* minor fix

* fix pytorch version
Ino-Ichan authored Feb 29, 2024
1 parent c4809ff commit 5a6e018
Showing 7 changed files with 67 additions and 5 deletions.
2 changes: 2 additions & 0 deletions .gitattributes
@@ -0,0 +1,2 @@
# this drops notebooks from GitHub language stats
*.ipynb linguist-vendored
4 changes: 4 additions & 0 deletions .gitignore
@@ -11,3 +11,7 @@ data/*
wandb

*egg-info

poetry.lock

.env
18 changes: 18 additions & 0 deletions README.md
@@ -68,6 +68,24 @@ Please sign in to your Hugging Face account.
huggingface-cli login
```

## 4. Flash Attention
Make sure that your environment can use the CUDA toolkit. See also [installation-and-features](https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features) in flash-attention.

To use flash-attention, you need to install the following packages.
```bash
pip install packaging wheel
pip uninstall -y ninja && pip install ninja --no-cache-dir
pip install flash-attn --no-build-isolation
```

If flash-attn doesn't work, please install it from source. ([Related issue](https://github.com/Dao-AILab/flash-attention/issues/821))
```bash
cd /path/to/download
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
python setup.py install
```
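After installing either way, a minimal sanity check (a sketch; it assumes only that the package is importable as `flash_attn`) is:
```bash
# Minimal check: confirm the flash-attn extension imports and print its version
python -c "import flash_attn; print(flash_attn.__version__)"
```
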
# Training
For training, use the yaml configuration files under the `projects` directory.<br>
17 changes: 17 additions & 0 deletions docs/README_CN.md
@@ -67,6 +67,23 @@ pre-commit install
huggingface-cli login
```

## 4. Using Flash Attention
Make sure your environment can use the CUDA Toolkit. See also the installation-and-features section of flash-attention.
To use flash-attention, install the following packages.
```bash
pip install packaging wheel
pip uninstall -y ninja && pip install ninja --no-cache-dir
pip install flash-attn --no-build-isolation
```

If flash-attention does not work properly, please install it from source. ([Related issue](https://github.com/Dao-AILab/flash-attention/issues/821))
```bash
cd /path/to/download
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
python setup.py install
```

# Training

For training, use the yaml configuration files under the `projects` directory.<br>
17 changes: 17 additions & 0 deletions docs/README_JP.md
@@ -67,6 +67,23 @@ To use the Llama-2 models, you must apply for access.
huggingface-cli login
```

## 4. Using Flash Attention
Make sure the CUDA Toolkit works correctly in your environment. See also [installation-and-features](https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features) in flash-attention.
To use flash-attention, install the following packages.
```bash
pip install packaging wheel
pip uninstall -y ninja && pip install ninja --no-cache-dir
pip install flash-attn --no-build-isolation
```

If flash-attention does not work properly, please install it from source. ([Related issue](https://github.com/Dao-AILab/flash-attention/issues/821))
```bash
cd /path/to/download
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
python setup.py install
```

# Training

For training, use the yaml configuration files under the `projects` directory.<br>
10 changes: 6 additions & 4 deletions pyproject.toml
@@ -9,20 +9,20 @@ readme = "README.md"
[tool.poetry.dependencies]
python = ">=3.10, <3.13"
pyyaml = "^6.0.1"
accelerate = "^0.22.0"
accelerate = "~0.27.2"
datasets = "^2.14.4"
deepspeed = "^0.10.2"
deepspeed = "~0.13.2"
einops = "^0.6.1"
evaluate = "^0.4.0"
peft = "^0.5.0"
protobuf = "^4.24.2"
scikit-learn = "^1.3.0"
scipy = "^1.11.2"
sentencepiece = "^0.1.99"
torch = ">=2.0.1"
torch = { url = "https://download.pytorch.org/whl/cu121/torch-2.2.0%2Bcu121-cp310-cp310-linux_x86_64.whl"}
fire = "^0.5.0"
pillow = "^10.0.0"
transformers = "^4.33.0"
transformers = "~4.38.1"
isort = "^5.12.0"
black = "^23.7.0"
wandb = "^0.15.9"
@@ -32,6 +32,8 @@ jupyterlab = "^4.0.5"
matplotlib = "^3.7.2"
japanize-matplotlib = "^1.1.3"
pre-commit = "^3.4.0"
packaging = "^23.2"
wheel = "^0.42.0"

[build-system]
requires = ["poetry-core"]
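With torch now pinned to a CUDA 12.1 wheel URL in `pyproject.toml`, a Poetry-based setup remains the standard install step (a minimal sketch, assuming Poetry itself is already installed):
```bash
# Sketch: resolve and install the dependencies declared in pyproject.toml,
# fetching torch directly from the pinned cu121 wheel URL
poetry install
```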
4 changes: 3 additions & 1 deletion requirements.txt
@@ -1,3 +1,6 @@
--find-links https://download.pytorch.org/whl/torch_stable.html
torch==2.2.0+cu121

PyYAML
accelerate
datasets
@@ -9,7 +12,6 @@ protobuf
scikit-learn
scipy
sentencepiece
torch>=2.0.1
fire
pillow
transformers
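For pip-based environments, the updated requirements file is used as before (a minimal sketch, assuming CUDA 12.1-compatible drivers on the target machine):
```bash
# Sketch: the --find-links line in requirements.txt lets pip locate the torch==2.2.0+cu121 wheel
pip install -r requirements.txt
```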
