
Commit

[doc] fix docs
mikecovlee committed Nov 13, 2024
1 parent 1f9d5a2 commit 8ea22e5
Showing 2 changed files with 2 additions and 2 deletions.
README.md: 2 changes (1 addition & 1 deletion)
@@ -16,7 +16,7 @@ MoE-PEFT is an open-source *LLMOps* framework built on [m-LoRA](https://github.c

- Seamless integration with the [HuggingFace](https://huggingface.co) ecosystem.

- You can try MoE-PEFT with [Google Colab](https://githubtocolab.com/TUDB-Labs/MoE-PEFT/blob/main/misc/finetune-demo.ipynb) before local installation.
+ You can try MoE-PEFT with [Google Colab](https://colab.research.google.com/github/TUDB-Labs/MoE-PEFT/blob/main/misc/finetune-demo.ipynb) before local installation.

## Supported Platform

misc/finetune-demo.ipynb: 2 changes (1 addition & 1 deletion)
@@ -22,7 +22,7 @@
"\n",
"## About this notebook\n",
"\n",
"This is a simple jupiter notebook for showcasing the basic process of fine-tuning TinyLLaMA with dummy data"
"This is a simple jupiter notebook for showcasing the basic process of fine-tuning TinyLLaMA with dummy data."
]
},
{
