Generative Pre-trained Transformer
== Related ==
* [[/usr/lib/systemd/system-generators/systemd-gpt-auto-generator]]
* [[GUID]]
Latest revision as of 15:20, 24 August 2023
wikipedia:Generative Pre-trained Transformer
- GPT-4 (Mar 2023)
- GPT-3 (Jun 2020, beta): a decoder-only transformer network with a 2048-token context window and 175 billion parameters, requiring 800 GB to store.
- GPT-2 (Feb 2019)
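The 800 GB storage figure for GPT-3 is roughly consistent with simple arithmetic over the parameter count. A back-of-the-envelope sketch, assuming 4-byte fp32 weights and ignoring optimizer state or checkpoint overhead:

```python
# Rough storage estimate for GPT-3's weights (assumption: fp32,
# 4 bytes per parameter; real checkpoints add overhead, which is
# why the cited figure is ~800 GB rather than exactly this number).
params = 175e9            # 175 billion parameters
bytes_per_param = 4       # fp32
size_gb = params * bytes_per_param / 1e9
print(f"{size_gb:.0f} GB")  # 700 GB
```

Half-precision (fp16/bf16) weights would halve this to about 350 GB, which is why released model checkpoints are often stored at reduced precision.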
== See also ==
- Transformer, GPT, Transformer 8, Etched, Megatron-Core
- GPT, GPT-2, GPT-3, GPT-4, GPT-4o, Tiktoken, Bigram, Transformer, PaLM, ChatGPT