Difference between revisions of "Large language model (LLM)"
Line 7:
  * [[ChatBot]]
  * [[Mixtral 8x7B]]
+
+ * [[Apple Ferret]] [[MLLM]]
  == See also ==
Revision as of 07:54, 26 January 2024
wikipedia:Large language model (Trends)
- Gemini (Dec 2023)
See also
- LLM, MLLM, LoRA, LLaMA, LLaMA3, QLoRA, Falcon, PaLM 2, Gemini, Mixtral 8x7B, BitNet, Measuring Massive Multitask Language Understanding (MMLU), NVLM
- Artificial neural networks, Neuronal network (NN), CNN, Micrograd, NPU, ConvNet, AlexNet, GoogLeNet, Apache MXNet, Neural architecture search, DAG, Feedforward neural network, NeurIPS, Feature Pyramid Network, TPU, Apple Neural Engine (ANE), LLM, TFLOPS
- AI: Autonomous driving, OpenAI, Google AI, Eliezer Yudkowsky, DeepMind, Computer Vision, Neural network, Vertex AI, Instadeep, Deep learning, Infogrid, Sapling, AssemblyAI, V7, MTIA, Yann LeCun, AI WiW, Salesforce AI, Pika, Amazon Q, LLM, Ollama, Cloud AI Developer Services, Hugging Face, Databricks