In-Context Learning

470 papers with code • 0 benchmarks • 0 datasets


Libraries

Use these libraries to find In-Context Learning models and implementations
See all 8 libraries.

Most implemented papers

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

automl/tabpfn 5 Jul 2022

We present TabPFN, a trained Transformer that can do supervised classification for small tabular datasets in less than a second, needs no hyperparameter tuning and is competitive with state-of-the-art classification methods.
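A minimal usage sketch follows, assuming the scikit-learn-style TabPFNClassifier interface from the automl/tabpfn repository; the dataset and default constructor arguments here are illustrative choices, not prescribed by the paper.

```python
# Sketch: TabPFN used as a drop-in scikit-learn-style classifier on a small
# tabular dataset. Assumes the TabPFNClassifier interface from automl/tabpfn.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabPFNClassifier()      # pre-trained transformer; no hyperparameter tuning
clf.fit(X_train, y_train)     # "fit" stores the training set as the in-context prompt
preds = clf.predict(X_test)
print(f"test accuracy: {accuracy_score(y_test, preds):.3f}")
```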

Neural Codec Language Models are Zero-Shot Text to Speech Synthesizers

microsoft/unilm 5 Jan 2023

In addition, we find that VALL-E can preserve the speaker's emotion and the acoustic environment of the acoustic prompt in synthesis.

From system models to class models: An in-context learning paradigm

forgi86/sysid-neural-transformers 25 Aug 2023

Is it possible to understand the intricacies of a dynamical system not solely from its input/output pattern, but also by observing the behavior of other systems within the same class?

PanGu-$\alpha$: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation

mindspore-ai/models 26 Apr 2021

To enhance the generalization ability of PanGu-$\alpha$, we collect 1.1 TB of high-quality Chinese data from a wide range of domains to pretrain the model.

Data Distributional Properties Drive Emergent In-Context Learning in Transformers

deepmind/emergent_in_context_learning 22 Apr 2022

In further experiments, we found that naturalistic data distributions elicited in-context learning only in transformers, not in recurrent models.

Large Language Models Are Human-Level Prompt Engineers

keirp/automatic_prompt_engineer 3 Nov 2022

By conditioning on natural language instructions, large language models (LLMs) have displayed impressive capabilities as general-purpose computers.
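Below is a schematic of the APE-style search loop this paper describes: an LLM proposes candidate instructions from demonstrations, and each candidate is scored by how well it makes the model reproduce held-out input/output pairs. The `llm(prompt)` helper is hypothetical, standing in for any text-completion API; this is a sketch of the idea, not the keirp/automatic_prompt_engineer implementation.

```python
# Schematic APE loop: propose candidate instructions, score them on held-out
# examples, return the best one. `llm(prompt) -> str` is a hypothetical helper.
def propose_instructions(demos, n_candidates, llm):
    demo_text = "\n".join(f"Input: {x}\nOutput: {y}" for x, y in demos)
    prompt = (f"{demo_text}\n\n"
              "The instruction that maps these inputs to these outputs was:")
    return [llm(prompt) for _ in range(n_candidates)]

def score_instruction(instruction, held_out, llm):
    hits = 0
    for x, y in held_out:
        answer = llm(f"Instruction: {instruction}\nInput: {x}\nOutput:")
        hits += int(answer.strip() == str(y))
    return hits / len(held_out)

def best_instruction(demos, held_out, llm, n_candidates=8):
    candidates = propose_instructions(demos, n_candidates, llm)
    return max(candidates, key=lambda c: score_instruction(c, held_out, llm))
```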

OpenICL: An Open-Source Framework for In-context Learning

shark-nlp/openicl 6 Mar 2023

However, the implementation of ICL is sophisticated due to the diverse retrieval and inference methods involved, as well as the varying pre-processing requirements for different models, datasets, and tasks.
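As a library-agnostic illustration of the two stages OpenICL organizes, the sketch below retrieves the nearest demonstrations for a query and assembles a few-shot prompt. Function and variable names are illustrative only and do not reflect OpenICL's actual API.

```python
# Sketch of a generic ICL pipeline: (1) retrieve demonstrations for the query,
# (2) build a few-shot prompt to send to the model. Names are illustrative.
import numpy as np

def topk_retrieve(query_vec, demo_vecs, k=4):
    """Indices of the k demonstrations closest to the query (cosine similarity)."""
    demo_norm = demo_vecs / np.linalg.norm(demo_vecs, axis=1, keepdims=True)
    query_norm = query_vec / np.linalg.norm(query_vec)
    scores = demo_norm @ query_norm
    return np.argsort(-scores)[:k]

def build_prompt(demos, query, template="Q: {q}\nA: {a}\n"):
    shots = "".join(template.format(q=q, a=a) for q, a in demos)
    return shots + f"Q: {query}\nA:"

# Usage: embed demonstrations and the query with any sentence encoder,
# call topk_retrieve to pick demos, then send build_prompt(...) to the LLM.
```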

Enhancing In-Context Learning with Answer Feedback for Multi-Span Question Answering

nju-websoft/FBPrompt 7 Jun 2023

Previous research found that in-context learning is an effective approach to exploiting LLMs: a few task-related labeled examples serve as demonstrations to construct a few-shot prompt for answering new questions.
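The sketch below shows one way such a demonstration could carry answer feedback, in the spirit of this paper: each demo pairs a candidate answer with feedback on it before the gold answer spans. The field layout is illustrative, not the paper's exact prompt format.

```python
# Schematic few-shot prompt where each demonstration includes a candidate
# answer and feedback on it. The field names and layout are illustrative.
def feedback_prompt(demos, question):
    blocks = []
    for d in demos:
        blocks.append(
            f"Question: {d['question']}\n"
            f"Candidate answer: {d['candidate']}\n"
            f"Feedback: {d['feedback']}\n"
            f"Answer spans: {', '.join(d['spans'])}\n"
        )
    return "\n".join(blocks) + f"\nQuestion: {question}\nAnswer spans:"
```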

VILA: On Pre-training for Visual Language Models

mit-han-lab/llm-awq 12 Dec 2023

Visual language models (VLMs) have progressed rapidly with the recent success of large language models.

What needs to go right for an induction head? A mechanistic study of in-context learning circuits and their formation

aadityasingh/icl-dynamics 10 Apr 2024

By clamping subsets of activations throughout training, we then identify three underlying subcircuits that interact to drive IH formation, yielding the phase change.
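As a generic illustration of clamping a subset of activations, the sketch below overwrites part of a module's output with fixed reference values via a PyTorch forward hook, so the rest of the network behaves as if that sub-circuit never changed. This shows the technique in general, not the authors' exact experimental setup; the module path in the usage comment is hypothetical.

```python
# Sketch: clamp chosen activation positions to saved reference values using a
# PyTorch forward hook. Returning a tensor from the hook replaces the output.
import torch

def make_clamp_hook(reference, mask):
    """reference: tensor of saved activations; mask: bool tensor, True where clamped."""
    def hook(module, inputs, output):
        return torch.where(mask, reference, output)
    return hook

# Usage (hypothetical module path): clamp part of one attention block's output.
# handle = model.blocks[1].attn.register_forward_hook(make_clamp_hook(ref, mask))
# ...run forward/backward passes...
# handle.remove()
```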