GPT-3
[[wikipedia:GPT-3]] (Jun 2020)
[[wikipedia:Generative Pre-trained Transformer 3]]
The architecture is a decoder-only transformer network with a 2048-token context window and a then-unprecedented 175 billion parameters, requiring 800 GB to store.
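
For scale, here is a minimal back-of-the-envelope sketch in Python. The layer count and model width are taken from the GPT-3 paper (Brown et al., 2020) rather than from this page, and the 12·d² per-layer weight count is the standard dense-transformer approximation, not an exact accounting:

<syntaxhighlight lang="python">
# Rough check of the figures quoted above, using hyperparameters
# reported in the GPT-3 paper (Brown et al., 2020):
# 96 layers, d_model = 12288, context length 2048.

n_layers = 96
d_model = 12288

# Standard dense-transformer estimate: each layer holds ~4*d^2
# attention weights and ~8*d^2 feed-forward weights, i.e. ~12*d^2.
params = 12 * n_layers * d_model**2
print(f"~{params / 1e9:.0f}B parameters")   # ~174B, matching the 175B figure

# Raw weight storage at common precisions (the quoted 800 GB
# presumably includes checkpoint overhead beyond the bare weights).
print(f"fp32: ~{params * 4 / 1e9:.0f} GB")  # ~696 GB
print(f"fp16: ~{params * 2 / 1e9:.0f} GB")  # ~348 GB
</syntaxhighlight>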
== See also ==