Difference between revisions of "Transformer (machine learning model)"
Tags: Mobile web edit, Mobile edit
Line 6:
  * [[Attention is all you need (2017)]]
  * [[GPT]]: [[GPT-4]], [[GPT-3]]
+ * [[Diffusion]]
  == See also ==
Revision as of 11:12, 9 April 2023
wikipedia:Transformer (machine learning model)
See also
- Transformer, GPT, Transformer 8, Etched, Megatron-Core
- GPT, GPT-2, GPT-3, GPT-4, GPT-4o, Tiktoken, Bigram, Transformer, PaLM, ChatGPT
- Machine learning, Deep learning, AWS Sagemaker, PyTorch, Kubeflow, TensorFlow, Keras, Torch, Spark ML, Tinygrad, Apple Neural Engine, Scikit-learn, MNIST, MLOps, AutoML, ClearML, PostgresML, AWS Batch, Transformer, Diffusion, Backpropagation, JAX, Vector database, LLM, The Forrester Wave: AI/ML Platforms
- OpenAI, GitHub Copilot, ChatGPT, OpenAI Codex, GPT-3, GPT-4, Whisper, Sam Altman, Mira Murati, Greg Brockman, Ilya Sutskever, OpenAI board, John Schulman