LLaMA2
wikipedia:LLaMA2 pretrained models are trained on 2 trillion tokens.
Related
- q4 (4-bit quantization)
- Ollama uses 4-bit quantization
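The q4 formats actually used by Ollama (GGUF block-wise quantization) are more elaborate, but the core idea of 4-bit quantization can be sketched as mapping floats to 4-bit integers (0..15) with a scale and zero point; the function names below are illustrative, not from any real library:

```python
import numpy as np

def quantize_q4(weights: np.ndarray):
    """Map floats to 4-bit unsigned ints (0..15) with a per-tensor scale and zero point."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 15.0  # 15 = max 4-bit value
    if scale == 0.0:
        scale = 1.0  # avoid division by zero for constant tensors
    q = np.clip(np.round((weights - w_min) / scale), 0, 15).astype(np.uint8)
    return q, scale, w_min

def dequantize_q4(q: np.ndarray, scale: float, w_min: float) -> np.ndarray:
    """Recover approximate floats from the 4-bit codes."""
    return q.astype(np.float32) * scale + w_min

w = np.array([-0.8, -0.1, 0.0, 0.3, 0.9], dtype=np.float32)
q, scale, zero = quantize_q4(w)
w_hat = dequantize_q4(q, scale, zero)
# rounding keeps the reconstruction error within half a quantization step
assert np.all(np.abs(w - w_hat) <= scale / 2 + 1e-6)
```

Storing the 4-bit codes (two per byte in practice) plus one scale and zero point per block is what cuts model size to roughly a quarter of fp16.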