
Incremental Non-Greedy Clustering at Scale

Thursday, 09/09/2021 2:00pm to 4:00pm
Zoom
PhD Thesis Defense
Speaker: Nicholas Monath

Abstract: Clustering is the task of organizing data into meaningful groups. Modern clustering applications, such as entity resolution, place several demands on clustering algorithms: (1) scalability to massive numbers of points as well as clusters, (2) incremental additions and deletions of data, and (3) support for arbitrary user-specified similarity functions.

Hierarchical clustering is often desired because it represents multiple alternative clusterings (e.g., at different granularities), providing both fine-grained clusters and a way to express uncertainty as new data arrive. Previous work on hierarchical clustering does not fully address all of these properties. Work on incremental hierarchical clustering often makes greedy, irrevocable clustering decisions that are regretted in the presence of future data. Work on scalable hierarchical clustering does not support incremental additions or deletions; these methods often impose restrictions on the similarity functions that can be used and/or tend to over-merge clusters, resulting in lower accuracy than desired.

In this thesis, we present incremental and scalable clustering methods that empirically satisfy the above criteria. Such a method must represent uncertainty and meaningful alternative clusterings, efficiently and effectively reconsider past decisions in the incremental case, and exploit parallelism to scale to massive datasets in the batch setting. Our method, Grinch, handles incrementally arriving data in a non-greedy fashion by applying tree-structure rearrangements, driven by any user-specified similarity function, to reconsider past decisions. To use parallel processing for scalability to massive datasets, our method, SCC, operates in a round-based, bottom-up manner, making certain decisions independently in parallel within each level to achieve efficiency. We show how SCC can be combined with the tree-structure rearrangements of Grinch to form a mini-batch online hierarchical clustering method that is both scalable and incremental. Lastly, we generalize our hierarchical clustering approaches to DAG-structured ones, which can better represent uncertainty by capturing overlapping (but not nested) clusters.
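
To make the non-greedy incremental idea concrete, the sketch below inserts each arriving point beside its most similar leaf and then applies a local rotation that can revise the placement. This is an illustrative simplification, not the published Grinch algorithm: the binary-tree structure, the average-linkage similarity, and the single rotation rule used here are all assumptions.

# A minimal sketch of non-greedy incremental hierarchical clustering,
# in the spirit of Grinch. Illustrative assumptions throughout: binary
# tree, dot-product similarity, average linkage, one rotation rule.
import numpy as np

class Node:
    def __init__(self, point=None, left=None, right=None):
        self.point = point                 # set for leaves only
        self.left, self.right = left, right
        self.parent = None
        if left:
            left.parent = self
        if right:
            right.parent = self

    def leaves(self):
        if self.point is not None:
            yield self
        else:
            yield from self.left.leaves()
            yield from self.right.leaves()

    def sibling(self):
        p = self.parent
        return p.right if p.left is self else p.left

def linkage(u, v):
    # Average pairwise similarity between two subtrees; any
    # user-specified similarity function could be dropped in here.
    return float(np.mean([a.point @ b.point
                          for a in u.leaves() for b in v.leaves()]))

def rotate_up(v):
    # Swap v with its aunt, lifting v one level: an earlier greedy
    # placement is revised rather than treated as irrevocable.
    parent, grand = v.parent, v.parent.parent
    aunt = parent.sibling()
    if parent.left is v:                   # aunt takes v's slot
        parent.left = aunt
    else:
        parent.right = aunt
    aunt.parent = parent
    if grand.left is parent:               # v takes the aunt's old slot
        grand.right = v
    else:
        grand.left = v
    v.parent = grand

def insert(root, x):
    """Add one point: greedy placement, then non-greedy repair."""
    v = Node(point=x)
    if root is None:
        return v
    # Greedy step: attach beside the most similar existing leaf.
    nearest = max(root.leaves(), key=lambda leaf: float(leaf.point @ x))
    old_parent = nearest.parent
    joint = Node(left=nearest, right=v)
    joint.parent = old_parent
    if old_parent is None:
        root = joint
    elif old_parent.left is nearest:
        old_parent.left = joint
    else:
        old_parent.right = joint
    # Simplified rotation rule: lift v while its sibling subtree is
    # more similar to the aunt than it is to v itself.
    while v.parent.parent is not None:
        sib, aunt = v.sibling(), v.parent.sibling()
        if linkage(sib, aunt) <= linkage(v, sib):
            break
        rotate_up(v)
    return root

rng = np.random.default_rng(0)
root = None
for x in rng.normal(size=(20, 5)):         # a stream of 20 points
    root = insert(root, x)
print(sum(1 for _ in root.leaves()))       # -> 20

The key property the sketch tries to convey is that an arriving point is never locked into its first position: the rotation step re-examines the local neighborhood of each insertion, so earlier greedy decisions remain revisable.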
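The round-based, bottom-up idea behind SCC can be illustrated in a similar spirit. Again, this is a hedged sketch rather than the published algorithm: the geometric threshold schedule, centroid linkage, and the component-merge rule below are simplifying assumptions. What it does show is why the scheme parallelizes: within a round, each cluster's nearest-neighbor proposal is computed independently.

# A minimal sketch of round-based bottom-up clustering. Each round,
# every cluster proposes a merge with its nearest neighbor if that
# neighbor is within the round's distance threshold; connected
# proposals are merged together before the next round begins.
import numpy as np

def clustering_rounds(points, thresholds):
    """Return one flat clustering per round (each a list of index lists)."""
    clusters = [[i] for i in range(len(points))]    # start with singletons
    history = []
    for tau in thresholds:
        centroids = np.stack([points[c].mean(axis=0) for c in clusters])
        diff = centroids[:, None, :] - centroids[None, :, :]
        dist = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dist, np.inf)
        # Independent per-cluster proposals: parallelizable in practice.
        nn = dist.argmin(axis=1)
        parent = list(range(len(clusters)))          # union-find forest

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]        # path compression
                i = parent[i]
            return i

        for i, j in enumerate(nn):
            if dist[i, j] <= tau:
                parent[find(i)] = find(int(j))       # union the components
        groups = {}
        for i, c in enumerate(clusters):
            groups.setdefault(find(i), []).extend(c)
        clusters = [sorted(g) for g in groups.values()]
        history.append(clusters)
    return history

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
levels = clustering_rounds(X, thresholds=[0.5 * 2 ** r for r in range(6)])
print([len(level) for level in levels])  # cluster counts shrink per round

Because each round's clusters are unions of the previous round's, the sequence of rounds forms a hierarchy bottom-up, which is the structure the mini-batch combination with tree rearrangements builds on.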

For each of the proposed methods, we provide both theoretical and empirical analyses. Empirically, our methods achieve state-of-the-art results on clustering benchmarks in both the batch and incremental settings.
