Few-Shot Example Discovery Task
A Few-Shot Example Discovery Task is a prompt optimization task that automatically finds effective demonstration examples for in-context prompts through clustering, similarity metrics, and validation-based scoring.
- AKA: Few-Shot Example Discovery, Automatic Example Selection Task, Few-Shot Learning Optimization Task, Demonstration Discovery Task, Example Mining Task.
- Context:
- It can select representative examples using clustering algorithms and centroid selection.
- It can measure example similarity through embedding distances and semantic metrics.
- It can optimize example diversity to cover different patterns and edge cases.
- It can balance example relevance with computational constraints and context limits.
- It can utilize active learning to identify informative examples from large datasets.
- It can apply stratified sampling to ensure class balance and feature coverage.
- It can leverage retrieval systems to find contextually relevant examples at inference time.
- It can implement dynamic selection where examples change based on input characteristics.
- It can evaluate example quality through ablation studies and performance metrics.
- It can combine manual curation with automatic discovery for hybrid approaches.
- ...
- It can range from being a Static Few-Shot Example Discovery Task to being a Dynamic Few-Shot Example Discovery Task, depending on its selection timing.
- It can range from being a Random Few-Shot Example Discovery Task to being an Optimized Few-Shot Example Discovery Task, depending on its selection strategy.
- It can range from being a Task-Specific Few-Shot Example Discovery Task to being a Universal Few-Shot Example Discovery Task, depending on its application scope.
- It can range from being a Small-Set Few-Shot Example Discovery Task to being a Large-Set Few-Shot Example Discovery Task, depending on its example count.
- ...
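The clustering-and-centroid selection described in the context above can be sketched as a small NumPy routine: embed the candidate pool, cluster it with k-means, and return the real example nearest each centroid as a representative demonstration. This is a minimal illustration, not part of any specific framework; the `kmeans_select` name, the deterministic farthest-first initialization, and the synthetic embeddings below are all assumptions made for the sketch.

```python
import numpy as np

def kmeans_select(embeddings: np.ndarray, k: int, iters: int = 10) -> list[int]:
    """Pick k diverse examples: k-means over embeddings, then the
    index of the real example nearest each centroid."""
    # Farthest-first initialization (deterministic): start from point 0,
    # then repeatedly add the point farthest from all chosen so far.
    init = [0]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(embeddings[:, None] - embeddings[init][None], axis=2), axis=1)
        init.append(int(d.argmax()))
    centroids = embeddings[init].astype(float)

    # Standard Lloyd iterations: assign points, then recompute centroids.
    for _ in range(iters):
        labels = np.linalg.norm(embeddings[:, None] - centroids[None], axis=2).argmin(axis=1)
        for j in range(k):
            members = embeddings[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)

    # A centroid is usually not a real example, so return the nearest one.
    d = np.linalg.norm(embeddings[:, None] - centroids[None], axis=2)
    return [int(i) for i in d.argmin(axis=0)]

# Illustrative usage on synthetic embeddings with three well-separated clusters:
rng = np.random.default_rng(1)
emb = np.vstack([rng.normal(loc, 0.1, size=(10, 4)) for loc in (0.0, 5.0, 10.0)])
picked = kmeans_select(emb, k=3)  # one representative index per cluster
```

In practice the embeddings would come from a sentence encoder, and the selected indices would map back to (input, output) demonstration pairs placed into the prompt.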
- Example(s):
- BootstrapFewShot in DSPy, which generates validated examples using teacher programs.
- Retrieval-Augmented Few-Shot, which selects examples based on query similarity.
- Diversity-Based Selection, which maximizes example coverage using clustering.
- Performance-Based Selection, which chooses examples that maximize validation scores.
- ...
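Retrieval-Augmented Few-Shot, as listed above, reduces at inference time to a nearest-neighbor lookup over precomputed example embeddings. A minimal cosine-similarity sketch follows; the `retrieve_examples` helper is an illustrative assumption, and production systems typically replace the brute-force scan with an approximate-nearest-neighbor index.

```python
import numpy as np

def retrieve_examples(query_emb: np.ndarray, pool_embs: np.ndarray, k: int = 3) -> list[int]:
    """Return indices of the k pool examples most cosine-similar to the query."""
    q = query_emb / np.linalg.norm(query_emb)                      # unit-normalize query
    p = pool_embs / np.linalg.norm(pool_embs, axis=1, keepdims=True)  # unit-normalize pool
    sims = p @ q                                                   # cosine similarities
    return np.argsort(-sims)[:k].tolist()                          # top-k, most similar first

# Illustrative usage: the query is closest to pool rows 0 and 2.
pool = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1], [-1.0, 0.0]])
query = np.array([1.0, 0.0])
top = retrieve_examples(query, pool, k=2)
```

The returned indices select which stored demonstrations are inserted into the prompt, so the examples change dynamically with each query, as described under dynamic selection in the context above.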
- Counter-Example(s):
- Random Example Selection, which lacks an optimization objective and explicit selection criteria.
- Manual Example Curation, which relies on human judgment rather than automatic discovery.
- Zero-Shot Learning, which uses no examples for task completion.
- Full Training Set, which uses all available data rather than selected examples.
- See: Prompt Optimization Task, Few-Shot Learning, Example Selection Algorithm, DSPy Framework, Clustering Algorithm, Similarity Metric, Active Learning, Retrieval System, In-Context Learning.