Mirror of https://github.com/hiyouga/LLaMA-Factory.git (synced 2025-08-02 03:32:50 +08:00)
[misc] update readme (#6917)
Former-commit-id: 499ea45d1f1ea7704ee82f58c35af123a6c2632b
parent: 1679930e00
commit: 07aa7b71a3
.gitignore (vendored) | 3 +++
@@ -162,6 +162,9 @@ cython_debug/
 # vscode
 .vscode/
 
+# uv
+uv.lock
+
 # custom .gitignore
 ms_cache/
 hf_cache/
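
For context on the new ignore entries: `uv` writes a lockfile named `uv.lock` next to `pyproject.toml` when it syncs an environment, so ignoring it keeps the working tree clean after the uv workflow added to the README below. A minimal sketch, assuming the commands are run from the repository root and the extras match the README:

```bash
# Syncing with uv resolves dependencies and writes uv.lock in the project root.
uv sync --extra torch --extra metrics --prerelease=allow

# The lockfile now exists but is matched by the entry added above.
git check-ignore -v uv.lock
```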
README.md | 16 ++++++++++++++++
@@ -436,6 +436,22 @@ Extra dependencies available: torch, torch-npu, metrics, deepspeed, liger-kernel
 > [!TIP]
 > Use `pip install --no-deps -e .` to resolve package conflicts.
 
+<details><summary>Setting up a virtual environment with <b>uv</b></summary>
+
+Create an isolated Python environment with uv:
+
+```bash
+uv sync --extra torch --extra metrics --prerelease=allow
+```
+
+Run LLaMA-Factory in the isolated environment:
+
+```bash
+uv run --prerelease=allow llamafactory-cli train examples/train_lora/llama3_lora_pretrain.yaml
+```
+
+</details>
+
 <details><summary>For Windows users</summary>
 
 #### Install BitsAndBytes
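
The TIP in the hunk above pairs with the standard editable install shown elsewhere in the README (`pip install -e ".[torch,metrics]"`, visible in the README_zh.md hunk header below). A short sketch of the two install paths, assuming they are run from a LLaMA-Factory checkout:

```bash
# Regular editable install with the extras listed in the README.
pip install -e ".[torch,metrics]"

# If the resolver reports conflicts with packages already in the environment,
# the README's tip installs the package without resolving dependencies.
pip install --no-deps -e .
```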
README_zh.md | 17 +++++++++++++++++
@@ -438,6 +438,23 @@ pip install -e ".[torch,metrics]"
 > [!TIP]
 > When you run into package conflicts, use `pip install --no-deps -e .` to resolve them.
 
+<details><summary>Setting up a virtual environment with <b>uv</b></summary>
+
+Create an isolated Python environment:
+
+```bash
+uv sync --extra torch --extra metrics --prerelease=allow
+```
+
+Run LLaMA-Factory in the environment:
+
+```bash
+uv run --prerelease=allow llamafactory-cli train examples/train_lora/llama3_lora_pretrain.yaml
+```
+
+</details>
+
+
 <details><summary>Guide for Windows users</summary>
 
 #### Install BitsAndBytes
@@ -98,7 +98,7 @@ FORCE_TORCHRUN=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ds3.
 #### Fine-tune on 4 GPUs with Ray
 
 ```bash
-USE_RAY=1 llamafactory-cli train examples/train_full/llama3_lora_sft_ray.yaml
+USE_RAY=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ray.yaml
 ```
 
 ### QLoRA Fine-Tuning