
Neural Generative Models and Representation Learning for Information Retrieval

Monday, 05/20/2019 10:00am to 12:00pm
CS 151
Ph.D. Seminar
Speaker: Qingyao Ai


Information Retrieval (IR) concerns the structure, analysis, organization, storage, and retrieval of information. Among the retrieval models proposed over the past decades, generative retrieval models, especially those grounded in the statistical probabilistic framework, are among the most widely applied techniques in IR. They are known for their well-grounded theory and strong empirical performance in text retrieval. Recently, advances in deep learning have created new opportunities for representation learning and generative modeling in IR. In contrast to statistical models, neural models offer much greater flexibility because they model information and data correlations in latent spaces without explicitly relying on prior knowledge. Semantically meaningful representations of text, images, and many other types of information can be acquired with neural models through supervised or unsupervised training. The effectiveness of neural generative models for information retrieval, however, remains largely unexplored.

In this thesis, we develop a new generative retrieval framework with neural models for information retrieval. We present the first theoretical analysis and adaptation of existing neural embedding models for ad-hoc retrieval tasks. We propose the first embedding-based neural generative model for retrieval tasks with heterogeneous information (i.e., personalized product search). We then generalize our neural generative framework to complex retrieval scenarios involving text, images, knowledge entities, and their relationships. Empirical results show that the proposed models significantly improve the effectiveness of information retrieval systems across a variety of search scenarios.
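As background for the statistical generative retrieval models the abstract contrasts with neural approaches, the sketch below shows the classic query-likelihood model with Dirichlet smoothing, a standard instance of that probabilistic framework: a document is scored by the probability that its (smoothed) language model generates the query. This is a generic textbook illustration, not the specific models proposed in the thesis; the function and parameter names are our own.

```python
from collections import Counter
import math

def query_likelihood(query, doc, collection, mu=2000.0):
    """Score a document for a query under the query-likelihood
    generative model with Dirichlet smoothing.

    query, doc: lists of tokens; collection: all tokens in the corpus,
    used as the background language model P(w|C). mu is the Dirichlet
    smoothing parameter (2000 is a common default, not a tuned value).
    """
    doc_tf = Counter(doc)
    coll_tf = Counter(collection)
    doc_len = len(doc)
    coll_len = len(collection)

    score = 0.0
    for w in query:
        p_coll = coll_tf[w] / coll_len            # background model P(w|C)
        # Dirichlet-smoothed document model P(w|d)
        p = (doc_tf[w] + mu * p_coll) / (doc_len + mu)
        if p > 0:                                 # skip terms unseen everywhere
            score += math.log(p)
    return score
```

Ranking a small corpus then amounts to scoring each document and sorting by `query_likelihood` in descending order; neural generative models replace these explicit count-based probabilities with distributions parameterized by learned latent representations.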

Advisor: W. Bruce Croft