Featured Speaker: Mustafa Hajij, Assistant Professor, University of San Francisco
Over the past decade, deep learning has been remarkably successful at solving a wide range of problems on data types such as images and sequential data. This success drove the extension of deep learning to other discrete domains such as sets, point clouds, graphs, 3D shapes, and discrete manifolds. While many of the extended schemes have successfully tackled notable challenges in their particular domains, the plethora of fragmented frameworks has created or resurfaced long-standing problems in deep learning, such as explainability, expressiveness, and generalizability. Moreover, theoretical results established for one discrete domain do not naturally carry over to the others. Finally, the lack of a cohesive mathematical framework has led to many ad hoc implementations and ultimately limits the set of practitioners who can benefit from deep learning technologies.

In this talk, I will introduce a generalized higher-order domain called the combinatorial complex (CC) and use it to build a new class of attention-based neural networks called higher-order attention networks (HOANs). CCs generalize many discrete domains of practical importance, such as point clouds, 3D shapes, (hyper)graphs, simplicial complexes, and cell complexes. The topological structure of a CC encodes arbitrary higher-order interactions among its elements. By exploiting the rich combinatorial and topological structure of CCs, HOANs define a new class of higher-order message-passing attention-based networks that unify existing higher-order models based on hypergraphs and cell complexes. I will demonstrate the reducibility of any CC to a special graph called the Hasse graph, which allows certain aspects of HOANs and other higher-order models to be characterized in terms of graph-based models. Finally, the predictive capacity of HOANs will be demonstrated on shape analysis and graph learning tasks, where they compete with state-of-the-art task-specific neural networks.

All lectures are free and open to the public.
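For readers unfamiliar with the central object of the talk, the following is a minimal, illustrative Python sketch of a combinatorial complex and its reduction to a Hasse graph. It assumes the usual definition from topological deep learning: a CC is a collection of non-empty cells (finite vertex sets) equipped with a rank function that is order-preserving under inclusion. The class name `CombinatorialComplex` and its methods are hypothetical illustrations, not the speaker's implementation.

```python
class CombinatorialComplex:
    """Minimal illustrative combinatorial complex (CC).

    Cells are non-empty frozensets of vertices. The rank function must be
    order-preserving: if cell x is a subset of cell y, rank(x) <= rank(y).
    """

    def __init__(self):
        self._rank = {}  # frozenset(cell) -> non-negative integer rank

    def add_cell(self, cell, rank):
        cell = frozenset(cell)
        if not cell:
            raise ValueError("cells must be non-empty")
        # Enforce the order-preserving property against existing cells.
        for other, r in self._rank.items():
            if other < cell and r > rank:
                raise ValueError(f"rank must be >= {r}: contains {set(other)}")
            if cell < other and rank > r:
                raise ValueError(f"rank must be <= {r}: contained in {set(other)}")
        self._rank[cell] = rank

    def cells(self, rank=None):
        if rank is None:
            return list(self._rank)
        return [c for c, r in self._rank.items() if r == rank]

    def hasse_edges(self):
        """Directed edges of the Hasse graph: x -> y whenever x is a proper
        subset of y with no cell strictly in between (a covering relation)."""
        cells = list(self._rank)
        return [(set(x), set(y))
                for x in cells for y in cells
                if x < y and not any(x < z < y for z in cells)]


# Demo: vertices (rank 0), edges (rank 1), and one higher-order face (rank 2).
cc = CombinatorialComplex()
for v in "abc":
    cc.add_cell({v}, rank=0)
cc.add_cell({"a", "b"}, rank=1)
cc.add_cell({"b", "c"}, rank=1)
cc.add_cell({"a", "b", "c"}, rank=2)
print(cc.hasse_edges())
```

The Hasse graph here is an ordinary directed graph over the cells, which is what lets higher-order models on CCs be analyzed with graph-based tools, as the abstract notes.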
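The higher-order attention step can likewise be sketched in a few lines. The snippet below is a hedged approximation of one message-passing layer: cells of one rank attend over their incident cells of another rank through a binary neighborhood matrix. The GAT-style scoring function, parameter shapes, and names (`higher_order_attention`, `masked_softmax`) are illustrative assumptions, not the HOAN architecture as presented in the talk.

```python
import numpy as np

def masked_softmax(scores, mask):
    """Row-wise softmax restricted to entries where mask == 1."""
    scores = np.where(mask > 0, scores, -1e30)   # suppress non-neighbors
    scores = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(scores) * (mask > 0)
    denom = exp.sum(axis=1, keepdims=True)
    return np.where(denom > 0, exp / np.where(denom > 0, denom, 1.0), 0.0)

def higher_order_attention(x_src, x_dst, neighborhood, W, a):
    """One illustrative attention-weighted message-passing step.

    x_src:        (n_src, d)     features on source cells (e.g. rank r)
    x_dst:        (n_dst, d)     features on target cells (e.g. rank r + 1)
    neighborhood: (n_dst, n_src) binary incidence matrix between the ranks
    W:            (d, d_out)     shared linear map
    a:            (2 * d_out,)   attention parameters
    Returns updated target-cell features of shape (n_dst, d_out).
    """
    h_src = x_src @ W
    h_dst = x_dst @ W
    d_out = W.shape[1]
    # Score every (target, source) pair, GAT-style: a^T [h_dst || h_src].
    scores = np.tanh(h_dst @ a[:d_out, None] + (h_src @ a[d_out:, None]).T)
    att = masked_softmax(scores, neighborhood)   # zero outside the neighborhood
    return att @ h_src                           # aggregate incident messages

# Tiny demo with random parameters: 4 edges sending messages to 2 faces.
rng = np.random.default_rng(0)
B = np.array([[1, 1, 1, 0],                      # face 0 is incident to edges 0-2
              [0, 1, 1, 1]])                     # face 1 is incident to edges 1-3
out = higher_order_attention(rng.normal(size=(4, 8)), rng.normal(size=(2, 8)),
                             B, rng.normal(size=(8, 16)), rng.normal(size=32))
print(out.shape)                                 # (2, 16)
```

Because the neighborhood matrix can relate cells of any two ranks, the same pattern covers the hypergraph and cell-complex message-passing schemes that the abstract says HOANs unify.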