
CIIR Talk Series - Universal Models for Language and Beyond

Friday, 09/16/2022 1:30pm to 2:30pm
CS 150/151; Zoom
Seminar

Title: Universal Models for Language and Beyond

Abstract: This talk makes the case for universal models of language and explores potential ways to go beyond the text modality, an emerging and compelling research direction. Universal models are models that dispense with task-specific niches altogether. I discuss the recent trend toward unification with respect to model, task, and input representation, and how these research areas come together to push forward the vision of truly universal language models. Specifically, I discuss several efforts we are working on, such as Unifying Language Learning Paradigms (the UL2 model), Unifying Retrieval and Search with NLP (Differentiable Search Indexes), and Multimodal Unification (ViT, PolyViT, etc.), and finally how scaling up is a safe bet toward this goal.

Bio: Mostafa is a research scientist at Google Brain. He has been working on scaling neural networks for language, vision, and robotics. Besides large-scale models, he works on improving the allocation of compute in neural networks, in particular Transformers, via adaptive and conditional computation. Mostafa obtained his Ph.D. from the University of Amsterdam, where he worked on training neural networks with imperfect supervision.

To attend this talk via Zoom, click here. To obtain the passcode for this series, please see the event advertisement on the seminars email list or reach out to Alex Taubman. For any questions about this event with the Center for Intelligent Information Retrieval, please contact Jean Joyce.