Generative pre-trained transformer

On May 28, 2020, OpenAI introduced GPT-3, a model with 175 billion parameters trained on a substantially larger dataset than GPT-2. It marked a significant advance in few-shot and zero-shot learning abilities.