mirror of
https://github.com/hiyouga/LLaMA-Factory.git
synced 2025-08-22 13:42:51 +08:00
update wechat
Former-commit-id: 2670f6fb3ddabc4da3b0ed49baa40950744e75d7
This commit is contained in:
parent
11f79ea20e
commit
eabaf0def8
@@ -345,6 +345,8 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
<details><summary>For Ascend NPU users</summary>
Join [NPU user group](assets/wechat_npu.jpg).
To utilize Ascend NPU devices for (distributed) training and inference, you need to install the **[torch-npu](https://gitee.com/ascend/pytorch)** library and the **[Ascend CANN Kernels](https://www.hiascend.com/developer/download/community/result?module=cann)**.
| Requirement | Minimum | Recommended |
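The setup described above can be sketched as a short shell session. This is a minimal sketch, not the project's official install script: the pinned versions and the CANN Kernels run-file name are illustrative assumptions that must be matched to your actual driver, SoC, and CANN release (see the linked download page).

```shell
# Sketch of an Ascend NPU environment setup, assuming a Linux host with
# Python and pip available. All version numbers below are examples only.

# Install a PyTorch build together with the matching torch-npu adapter.
pip install torch==2.1.0 torch-npu==2.1.0

# Install the Ascend CANN Kernels package downloaded from the community
# site linked above (the filename here is a hypothetical example; use the
# run-file matching your SoC and CANN version).
bash ./Ascend-cann-kernels-910b_*.run --install

# Quick sanity check that PyTorch can see the NPU backend.
python -c "import torch, torch_npu; print(torch.npu.is_available())"
```

If the final command prints `True`, the NPU devices are visible to PyTorch and (distributed) training or inference can target them.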
@@ -345,6 +345,8 @@ pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/downl
<details><summary>Guide for Ascend NPU users</summary>
Join the [NPU user group](assets/wechat_npu.jpg).
To use Ascend NPU devices for (distributed) training or inference, you need to install the **[torch-npu](https://gitee.com/ascend/pytorch)** library and the **[Ascend CANN Kernels](https://www.hiascend.com/developer/download/community/result?module=cann)**.
| Requirement | Minimum | Recommended |
BIN
assets/wechat_npu.jpg
Normal file
Size: 146 KiB