In-context learning (overview)

In-context learning (ICL) is a technique in which a model learns to perform a new task from a few examples supplied directly in its prompt, without any updates to the model's weights. This contrasts with traditional machine learning, where models are trained or fine-tuned on large datasets of labeled data.

ICL is particularly useful when labeled data is difficult or expensive to collect, or when the task is novel and no labeled dataset exists. For example, ICL can be used to adapt a model to translate between a new language pair, or to generate a new type of creative content.

ICL works by leveraging the model's pre-training on a large corpus of text and code. Pre-training teaches the model general patterns of language and code, which it can then apply to pick up new tasks quickly from only a handful of examples.

To use ICL, the user provides the model with a prompt containing a few examples of the task, usually alongside a short description of the task itself. The prompt can be written in natural language and may be as brief as a few sentences.

Once given such a prompt, the model can perform the task on new inputs. For example, if the task is English-to-French translation, the model can translate any new English sentence into French.
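The prompt structure described above can be sketched in a few lines of code. This is a minimal, illustrative example: the function name, formatting, and example pairs are assumptions, and the resulting string would be sent to a language model of your choice rather than to any specific API.

```python
# Sketch of assembling a few-shot ICL prompt for English-to-French
# translation: a task description, worked examples, and the new
# input left for the model to complete.

def build_icl_prompt(examples, query):
    """Build a few-shot prompt from (source, target) example pairs."""
    lines = ["Translate English to French."]
    for en, fr in examples:
        lines.append(f"English: {en}\nFrench: {fr}")
    # The final line ends at "French:" so the model fills in the answer.
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

examples = [
    ("Hello", "Bonjour"),
    ("Thank you", "Merci"),
]
prompt = build_icl_prompt(examples, "Good night")
print(prompt)
```

The same pattern generalizes to other tasks: swap the task description and the example pairs, and the model infers the mapping from the demonstrations alone.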

ICL is a powerful technique, and it has the potential to change how AI applications are developed and deployed.

See Also: In-context learning (examples included)
