LLaMA2
wikipedia:LLaMA2 pretrained models are trained on 2 trillion tokens.
Related
- q4
- Ollama uses 4-bit quantization (see the sketch below)
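
Ollama serves LLaMA2 weights in llama.cpp's GGUF format, where 4-bit quantization stores each weight as a 4-bit integer plus a per-block scale. A minimal sketch of the idea, assuming a block size of 32 and a simplified signed layout (the real Q4_0 format packs two 4-bit values per byte and uses an unsigned offset, so this is an illustration rather than the exact on-disk encoding):

```python
# Sketch of blockwise 4-bit quantization, in the spirit of llama.cpp's
# Q4_0 format. Block size of 32 and signed [-8, 7] range are assumptions.
import numpy as np

BLOCK_SIZE = 32  # weights are grouped into blocks, each with its own scale

def quantize_q4(weights: np.ndarray):
    """Quantize a 1-D float array to 4-bit ints with one scale per block."""
    assert weights.size % BLOCK_SIZE == 0
    blocks = weights.reshape(-1, BLOCK_SIZE)
    # One scale per block, chosen so the max-magnitude value maps to +/-8.
    scales = np.max(np.abs(blocks), axis=1, keepdims=True) / 8.0
    scales[scales == 0] = 1.0  # avoid division by zero for all-zero blocks
    q = np.clip(np.round(blocks / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize_q4(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    """Recover approximate floats: value ~= q * scale."""
    return (q.astype(np.float32) * scales).reshape(-1)

w = np.random.randn(64).astype(np.float32)
q, s = quantize_q4(w)
w_hat = dequantize_q4(q, s)
print("max abs error:", np.abs(w - w_hat).max())
```

At 4 bits per weight plus per-block scales, a 7B-parameter model shrinks to roughly 4 GB, which is the main reason 4-bit variants are the common default for local inference.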