mirror of https://github.com/hiyouga/LLaMA-Factory.git
synced 2025-08-06 05:32:50 +08:00

rename repository

Former-commit-id: 197c754d731d495330f33bbf962f8bbc7a10c0cc
parent 2bf5da7b0e
commit 819b5faa9a

README.md (26 changed lines)
@@ -1,11 +1,11 @@
-# LLaMA Efficient Tuning
+# LLaMA Factory: Training and Evaluating Large Language Models with Minimal Effort
 
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/stargazers)
+[](https://github.com/hiyouga/LLaMA-Factory/stargazers)
 [](LICENSE)
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/commits/main)
+[](https://github.com/hiyouga/LLaMA-Factory/commits/main)
 [](https://pypi.org/project/llmtuner/)
 [](https://pypi.org/project/llmtuner/)
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/pulls)
+[](https://github.com/hiyouga/LLaMA-Factory/pulls)
 [](https://discord.gg/7HGMsdxqJ)
 
 👋 Join our [WeChat](assets/wechat.jpg).
@@ -143,10 +143,10 @@ Please refer to `data/example_dataset` for checking the details about the format
 ### Dependence Installation (optional)
 
 ```bash
-git clone https://github.com/hiyouga/LLaMA-Efficient-Tuning.git
+git clone https://github.com/hiyouga/LLaMA-Factory.git
-conda create -n llama_etuning python=3.10
+conda create -n llama_factory python=3.10
-conda activate llama_etuning
+conda activate llama_factory
-cd LLaMA-Efficient-Tuning
+cd LLaMA-Factory
 pip install -r requirements.txt
 ```
 
@@ -468,10 +468,10 @@ Please follow the model licenses to use the corresponding model weights:
 If this work is helpful, please kindly cite as:
 
 ```bibtex
-@Misc{llama-efficient-tuning,
+@Misc{llama-factory,
-  title = {LLaMA Efficient Tuning},
+  title = {LLaMA Factory},
   author = {hiyouga},
-  howpublished = {\url{https://github.com/hiyouga/LLaMA-Efficient-Tuning}},
+  howpublished = {\url{https://github.com/hiyouga/LLaMA-Factory}},
   year = {2023}
 }
 ```
@@ -482,4 +482,4 @@ This repo benefits from [PEFT](https://github.com/huggingface/peft), [QLoRA](htt
 
 ## Star History
 
-
+
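The README.md hunks above boil down to a handful of textual substitutions from the old project name to the new one. A minimal sketch of that mapping, reconstructed from the diff (this helper is illustrative only, not a script shipped by the repository, and it approximates the title change, which also gains a subtitle):

```python
# Name substitutions this commit applies across the READMEs,
# reconstructed from the diff hunks above.
RENAMES = {
    "LLaMA-Efficient-Tuning": "LLaMA-Factory",   # repo slug in URLs and paths
    "llama_etuning": "llama_factory",            # conda environment name
    "llama-efficient-tuning": "llama-factory",   # BibTeX key
    "LLaMA Efficient Tuning": "LLaMA Factory",   # display name
}

def apply_rename(text: str) -> str:
    """Apply each old -> new substitution in order."""
    for old, new in RENAMES.items():
        text = text.replace(old, new)
    return text

print(apply_rename("git clone https://github.com/hiyouga/LLaMA-Efficient-Tuning.git"))
# -> git clone https://github.com/hiyouga/LLaMA-Factory.git
```

The same mapping reproduces the changed lines in both the English and Chinese diffs below.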
README_zh.md (26 changed lines)
@@ -1,11 +1,11 @@
-# LLaMA Efficient Tuning
+# LLaMA Factory: 轻松的大模型训练与评估
 
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/stargazers)
+[](https://github.com/hiyouga/LLaMA-Factory/stargazers)
 [](LICENSE)
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/commits/main)
+[](https://github.com/hiyouga/LLaMA-Factory/commits/main)
 [](https://pypi.org/project/llmtuner/)
 [](https://pypi.org/project/llmtuner/)
-[](https://github.com/hiyouga/LLaMA-Efficient-Tuning/pulls)
+[](https://github.com/hiyouga/LLaMA-Factory/pulls)
 [](https://discord.gg/7HGMsdxqJ)
 
 👋 加入我们的[微信群](assets/wechat.jpg)。
@@ -143,10 +143,10 @@ huggingface-cli login
 ### 环境搭建(可跳过)
 
 ```bash
-git clone https://github.com/hiyouga/LLaMA-Efficient-Tuning.git
+git clone https://github.com/hiyouga/LLaMA-Factory.git
-conda create -n llama_etuning python=3.10
+conda create -n llama_factory python=3.10
-conda activate llama_etuning
+conda activate llama_factory
-cd LLaMA-Efficient-Tuning
+cd LLaMA-Factory
 pip install -r requirements.txt
 ```
 
@@ -467,10 +467,10 @@ CUDA_VISIBLE_DEVICES=0 python src/train_bash.py \
 如果您觉得此项目有帮助,请考虑以下列格式引用
 
 ```bibtex
-@Misc{llama-efficient-tuning,
+@Misc{llama-factory,
-  title = {LLaMA Efficient Tuning},
+  title = {LLaMA Factory},
   author = {hiyouga},
-  howpublished = {\url{https://github.com/hiyouga/LLaMA-Efficient-Tuning}},
+  howpublished = {\url{https://github.com/hiyouga/LLaMA-Factory}},
   year = {2023}
 }
 ```
||||||
@ -481,4 +481,4 @@ CUDA_VISIBLE_DEVICES=0 python src/train_bash.py \
|
|||||||
|
|
||||||
## Star History
|
## Star History
|
||||||
|
|
||||||

|

|
||||||