update readme

Former-commit-id: 4054e85c664c541f435619baebcd687f80445d4a
Author: hiyouga
Date: 2023-05-31 16:57:43 +08:00
parent 693c049eac
commit a79df3500b


@@ -14,15 +14,15 @@
 - [LLaMA](https://github.com/facebookresearch/llama) (7B, 13B, 33B, 65B)
 - [BLOOM](https://huggingface.co/bigscience/bloom) & [BLOOMZ](https://huggingface.co/bigscience/bloomz) (560M, 1.1B, 1.7B, 3B, 7.1B, 176B)
-## Supported Training Approach
+## Supported Training Approaches
 - [(Continually) pre-training](https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf)
   - Full-parameter training
-  - Selected-parameter training
+  - Partial-parameter training
   - [LoRA](https://arxiv.org/abs/2106.09685)
 - [Supervised fine-tuning](https://arxiv.org/abs/2109.01652)
   - Full-parameter training
-  - Selected-parameter training
+  - Partial-parameter training
   - [LoRA](https://arxiv.org/abs/2106.09685)
 - [RLHF](https://arxiv.org/abs/2203.02155)
   - [LoRA](https://arxiv.org/abs/2106.09685)
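
For context on the LoRA option listed under each training approach: it refers to fine-tuning via low-rank adapters instead of updating all model weights. A minimal sketch, assuming the Hugging Face `transformers` and `peft` libraries and an illustrative checkpoint name; this is not part of the commit and not necessarily how this repository wires it up internally:

```python
# Minimal LoRA setup sketch (illustrative, not the repository's actual code path).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "huggyllama/llama-7b"  # assumed checkpoint; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

# Attach low-rank adapters to the attention projections; only these are trained.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                 # rank of the update matrices (illustrative value)
    lora_alpha=16,       # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the small trainable fraction
```

The same adapter-wrapping step applies whether the surrounding loop is continued pre-training, supervised fine-tuning, or the RLHF stage; only the data and objective change.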