LLaMA2

From wikieduonline
[[wikipedia:LLaMA2]] pretrained models are trained on 2 [[trillion]] [[tokens]].
 
* https://ai.meta.com/llama/
 
== See also ==

* {{ollama}}
* {{llama}}
* {{Meta AI}}
* {{LLM}}

[[Category:AI]]

Latest revision as of 19:24, 22 December 2023
