GPT-3
Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text.
175 billion parameters, over 100x more than GPT-2 (1.5B)
Exhibits emergent abilities (e.g., few-shot in-context learning) that appear only once the network is large enough
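The autoregressive idea above can be sketched with a tiny toy model (purely illustrative, not GPT-3's architecture or API): at each step the model conditions on all tokens generated so far, picks the next token, appends it, and repeats. The `toy_next_token_candidates` function below is a hypothetical stand-in for the transformer's next-token distribution.

```python
import random

def toy_next_token_candidates(tokens):
    # Hypothetical stand-in for a learned next-token distribution:
    # simple bigram-style rules, for illustration only.
    transitions = {
        "the": ["cat", "dog"],
        "cat": ["sat"],
        "dog": ["ran"],
        "sat": ["down"],
        "ran": ["away"],
    }
    return transitions.get(tokens[-1], ["<eos>"])

def generate(prompt, max_new_tokens=5, seed=0):
    rng = random.Random(seed)
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        nxt = rng.choice(toy_next_token_candidates(tokens))
        if nxt == "<eos>":
            break
        # Key autoregressive step: the sampled token is fed back in
        # as context for the next prediction.
        tokens.append(nxt)
    return tokens

print(generate(["the"]))
```

A real transformer replaces the lookup table with a neural network that scores every vocabulary token given the context, but the generation loop has the same shape.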