$\color{black}\rule{365px}{3px}$
“Language Models are Few-Shot Learners” - 2020
Link to Paper:
Table of Contents
$\color{black}\rule{365px}{3px}$
$\color{black}\rule{365px}{3px}$
In one-shot / few-shot learning, the model is given one or a few example(s) (or "shot(s)") of a task in its input prompt and is expected to generalize and perform that task on new inputs. This works because GPT-3 is pretrained on a huge amount of diverse text data, which teaches the model a general understanding of language and of a wide variety of tasks.
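As a rough illustration, a few-shot prompt is just the task description followed by worked examples and an unfinished query for the model to complete. The helper below is a minimal sketch (the function name, `=>` separator, and the English-to-French pairs are illustrative choices, not taken from the paper):

```python
def build_few_shot_prompt(task_description, examples, query):
    """Assemble a few-shot prompt: a task description, worked
    example pairs ("shots"), and a final query the model completes."""
    lines = [task_description, ""]
    for source, target in examples:
        lines.append(f"{source} => {target}")
    lines.append(f"{query} =>")  # the model continues from here
    return "\n".join(lines)

# With one example pair this is one-shot; with several it is few-shot.
prompt = build_few_shot_prompt(
    "Translate English to French:",
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "peppermint",
)
print(prompt)
```

Because the task is specified entirely in the prompt, no gradient updates or fine-tuning are needed; the pretrained model infers the pattern from the shots alone.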