LLaMA2

From wikieduonline
 
[[wikipedia:LLaMA2]] pretrained models are trained on 2 [[trillion]] [[tokens]].
 
* https://ai.meta.com/llama/
 
 
* {{ollama}}
* {{llama}}
* {{Meta AI}}
* {{LLM}}
 
[[Category:AI]]
 

Latest revision as of 19:24, 22 December 2023
