Pages that link to "2020 LanguageModelsAreFewShotLearner"
The following pages link to 2020 LanguageModelsAreFewShotLearner:
Displayed 22 items.
- Brown, Mann et al., 2020 (redirect page) (← links)
- Brown et al., 2020 (redirect page) (← links)
- Ilya Sutskever (← links)
- 2019 LanguageModelsAreUnsupervisedMu (← links)
- 2020 ItsNotJustSizeThatMattersSmallL (← links)
- 2022 EmergentAbilitiesofLargeLanguag (← links)
- 2023 DERAEnhancingLargeLanguageModel (← links)
- Zero-Shot In-Context Learning Task (← links)
- Few-Shot Information Extraction (IE) Task (← links)
- 2023 TowardsExpertLevelMedicalQuesti (← links)
- 2021 CUADAnExpertAnnotatedNlpDataset (← links)
- 2023 TuningLanguageModelsAsTrainingD (← links)
- OpenAI GPT-3 Large Language Model (LLM) (← links)
- 2022 PTuningPromptTuningCanBeCompara (← links)
- 2024 ExtensiblePromptsforLanguageMod (← links)
- Multi-Head Attention Mechanism (← links)
- In-Context-based (ICL) LLM Task (← links)
- Iterative Learning Rate Reduction Technique (← links)
- Dario Amodei (1983-) (← links)
- Language models are few-shot learners (redirect page) (← links)
- Brown et al. (2020) (redirect page) (← links)
- Few-Shot Natural Language Processing (NLP) Task (← links)