
Contrastive Decoding: Open-Ended Text Generation as Optimization

Tuesday, 02/14/2023 11:30am to 12:30pm
Virtual via Zoom
Seminar
Speaker: Xiang Lisa Li

Abstract: Likelihood, although useful as a training loss, is a poor search objective for guiding open-ended generation from language models (LMs). Existing generation algorithms must avoid both unlikely strings, which are incoherent, and highly likely ones, which are short and repetitive. We propose contrastive decoding (CD), a more reliable search objective that returns the difference between likelihood under a large LM (called the expert, e.g., OPT-13B) and a small LM (called the amateur, e.g., OPT-125M). CD is inspired by the fact that the failures of larger LMs are even more prevalent in smaller LMs, and that this difference signals exactly which texts should be preferred. CD requires zero training, and produces higher-quality text than decoding from the larger LM alone. It also generalizes across model types (OPT and GPT-2) and significantly outperforms four strong decoding algorithms in automatic and human evaluations.
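To make the search objective concrete, below is a minimal greedy sketch of contrastive decoding, assuming the Hugging Face transformers API. It is not the authors' implementation: the paper pairs larger models (e.g., OPT-13B as expert, OPT-125M as amateur) and uses beam search rather than greedy selection; the GPT-2 pair and the plausibility threshold alpha=0.1 here are illustrative choices. At each step, the expert's log-probabilities are restricted to tokens the expert itself finds plausible, and the next token maximizes the expert-minus-amateur log-probability gap.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Expert/amateur pair; small GPT-2 variants keep the sketch lightweight.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
expert = AutoModelForCausalLM.from_pretrained("gpt2-large").eval()
amateur = AutoModelForCausalLM.from_pretrained("gpt2").eval()

@torch.no_grad()
def contrastive_decode(prompt, max_new_tokens=50, alpha=0.1):
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        exp_logprobs = torch.log_softmax(expert(ids).logits[0, -1], dim=-1)
        ama_logprobs = torch.log_softmax(amateur(ids).logits[0, -1], dim=-1)
        # Plausibility constraint: keep only tokens whose expert probability
        # is at least alpha times the expert's top probability.
        cutoff = torch.log(torch.tensor(alpha)) + exp_logprobs.max()
        # CD objective: difference in log-likelihood between expert and amateur.
        cd_score = exp_logprobs - ama_logprobs
        cd_score[exp_logprobs < cutoff] = -float("inf")
        next_id = cd_score.argmax().view(1, 1)
        ids = torch.cat([ids, next_id], dim=-1)
    return tokenizer.decode(ids[0], skip_special_tokens=True)

print(contrastive_decode("The meaning of life is"))
```

The plausibility cutoff is what prevents the amateur's penalty from promoting tokens the expert considers implausible; without it, maximizing the gap alone could favor rare, incoherent tokens that the amateur happens to rate even lower.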

Bio: Xiang Lisa Li is a third-year PhD student in computer science at Stanford University, advised by Percy Liang and Tatsunori Hashimoto. She works on controllable text generation/decoding and efficient adaptation of pre-trained language models. Lisa is supported by a Stanford Graduate Fellowship and is the recipient of an EMNLP Best Paper award.

Related paper: https://arxiv.org/pdf/2210.15097.pdf


Host: