AGATE Seminar


Applied Geometry, Algebra, and Topology in Edinburgh (AGATE)

AGATE is an informal, hybrid seminar hosted at the University of Edinburgh, with Heriot-Watt and University of Glasgow as participating institutions, open to anyone who is interested in applied aspects of geometry, algebra and topology. AGATE’s remit spans topics including algebraic statistics, geometric deep learning, and topological data analysis. Each semester has a thematic focus on applications to a particular domain.

When: Wednesdays 15:05 to 16:00
Spring 2025 Location: 2.11 Appleton Tower and on Zoom.

Organizers: Djordje Mihajlovic, Siddharth Setlur, Sjoerd Beentjes, Darrick Lee, and Emily Roff

To join the mailing list, send an email to sympa at mlist.is.ed.ac.uk with an empty subject line and the following in the message body:

SUBSCRIBE agate-seminar [your name]
QUIT

Spring 2025 Talks

Applications to Biology and Medicine

Feb. 12   Guowei Wei (Michigan State University, online)
Topological Deep Learning on Graphs, Manifolds and Curves

In the past few years, topological deep learning (TDL), a term coined by us in 2017, has become an emerging paradigm in artificial intelligence (AI) and data science. TDL is built on persistent homology (PH), a vital tool in topological data analysis (TDA) that bridges the gap between complex geometry and abstract topology through multiscale analysis. While TDA has made huge strides in a wide variety of scientific and engineering disciplines, it has many limitations. I will discuss our recent effort to extend the scope of TDA from graphs to manifolds and curves, through new formulations from algebraic topology, geometric topology, and differential topology. I will also discuss how TDL achieved its victories in worldwide annual competitions in computer-aided drug design, discovered the SARS-CoV-2 evolutionary mechanism, and accurately predicted emerging dominant viral variants.
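As a minimal, self-contained illustration of the multiscale idea behind persistent homology (not code from the talk), degree-zero persistence of a Vietoris-Rips filtration can be computed with a union-find pass over edges sorted by length: each finite bar records the scale at which two connected components merge.

```python
import numpy as np

def h0_persistence(points):
    """Degree-zero persistence bars of a Vietoris-Rips filtration:
    components merge as the distance threshold grows (Kruskal-style union-find)."""
    n = len(points)
    edges = sorted(
        (np.linalg.norm(points[i] - points[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # one component dies at this scale
    # every component is born at scale 0; one survives forever (omitted)
    return [(0.0, d) for d in deaths]

# two tight clusters: two short bars, then one long bar as the clusters merge
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
bars = h0_persistence(pts)
```

The long final bar is what distinguishes genuine cluster structure from noise.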

Feb. 19   Kelly Maggs (MPI-CBG)
Cohomology Classes in the RNA Transcriptome

In this talk, I will discuss the use of persistent cohomology to detect circular structure in scRNA-seq data, which we will use to define a method for statistically enriching gene sets for circular structure. We will also develop a differential-form-based technique for estimating the phase of genes exhibiting cyclic expression patterns. I will present applications of this approach to real datasets studying the cell cycle, tissue regeneration and senescence under diverse experimental conditions.

Feb. 26   Agnese Barbensi (University of Queensland, online) (Different Time: 10:05 - 11:00, Location - 2.04 Appleton Tower)
Topologically steered simulations and the role of geometric constraints in protein knotting

We introduce a method to determine the optimal pathway by which a polymer may knot or unknot while subject to a given set of physics, and we investigate the effect of imposing geometric constraints. We show that with protein-like geometric constraints, the frequency of twist knots increases, consistent with the observed abundance of twist knots in protein structures. This is joint work with A. Klotz and D. Goundaroulis.

Mar. 5   Roan Talbut (Imperial College London)
Tropical Gradient Descent

The field of tropical statistics - motivated by the identification of the tropical Grassmannian and the space of phylogenetic trees - has produced a range of unconstrained optimisation problems over the tropical projective torus. We will review the types of convexity exhibited by tropical loss functions in statistics, and we propose a new gradient descent method for solving tropical optimisation problems. Theoretical results establish global solvability for tropically star-quasi-convex problems, and numerical experiments demonstrate the method's superior performance over classical descent for tropical optimisation problems which exhibit tropical quasi-convexity but not classical convexity. Notably, tropical gradient descent seamlessly integrates into advanced optimisation methods, such as Adam, offering improved overall performance.
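For readers unfamiliar with the setting, a small sketch may help: the tropical metric on the tropical projective torus, and a plain subgradient descent for the tropical Fermat-Weber problem. This is an illustrative baseline, not the speaker's method; the function names, step size, and example data are my own choices.

```python
import numpy as np

def trop_dist(x, y):
    """Tropical metric on the tropical projective torus:
    d(x, y) = max_i(x_i - y_i) - min_i(x_i - y_i)."""
    d = x - y
    return d.max() - d.min()

def trop_fermat_weber(data, steps=2000, lr=0.01):
    """Plain subgradient descent for argmin_x sum_i d(x, data_i).
    A sketch only; the talk's method is a tailored tropical gradient descent."""
    x = data.mean(axis=0)
    for _ in range(steps):
        g = np.zeros_like(x)
        for y in data:
            d = x - y
            g[d.argmax()] += 1.0  # subgradient of the max term
            g[d.argmin()] -= 1.0  # subgradient of the -min term
        x = x - lr * g
    return x - x.mean()  # fix a representative with coordinates summing to 0

data = np.array([[0.0, 0.0, 0.0], [0.0, 2.0, 4.0], [0.0, 4.0, 2.0]])
center = trop_fermat_weber(data)
loss = sum(trop_dist(center, y) for y in data)
```

Note the metric is invariant under adding a multiple of the all-ones vector, which is exactly why optimisation happens on the quotient torus rather than on R^n.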

Mar. 12   Katharina Limbeck (Helmholtz Munich)
Studying the Shape of Omics Data

This seminar explores geometric and topological approaches for analysing omics data across multiple scales, focusing on two case studies. First, we identify spatial patterns in transcriptomics data using persistent homology. Specifically, we leverage functional summaries to perform permutation testing for spatial randomness in gene expression values. Our approach offers greater robustness and accuracy than alternative methods for detecting spatial dependence. Second, we examine metric space magnitude, a recently established geometric invariant that summarises the effective size and diversity of a space. Applied to cancer genomics, magnitude quantifies tumour genomic heterogeneity, a key factor in cancer progression and clinical outcomes, and distinguishes cancer subtypes based on copy-number alterations. These case studies demonstrate the power of using multi-scale geometric descriptors, namely persistent homology and magnitude, within a statistical framework to uncover meaningful structure in complex omics data.
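Metric space magnitude has a concise definition for finite spaces: with similarity matrix Z_ij = exp(-t d(x_i, x_j)), the magnitude at scale t is the sum of the entries of Z^{-1} (when Z is invertible). A minimal sketch, illustrative rather than the speaker's code:

```python
import numpy as np

def magnitude(D, t=1.0):
    """Magnitude of a finite metric space with distance matrix D at scale t:
    the sum of the entries of the inverse of the similarity matrix exp(-t*D)."""
    Z = np.exp(-t * D)
    return np.linalg.inv(Z).sum()

# two points at distance 1: magnitude interpolates between 1 and 2,
# capturing the "effective number of points" at each scale
D = np.array([[0.0, 1.0], [1.0, 0.0]])
small = magnitude(D, t=0.01)   # near 1: the points look like a single point
large = magnitude(D, t=100.0)  # near 2: the points are fully distinguished
```

This "effective size across scales" behaviour is what makes magnitude a candidate summary of diversity and heterogeneity in genomic data.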

Mar. 19   Eleni Panagiotou (Arizona State University, online)
Novel Measures of Complexity of Open Curves in 3-space and the Topological Landscape of Proteins

Filamentous materials may exhibit structure-dependent material properties and function that depend on their entanglement. Although entanglement is often understood intuitively in terms of knotting or linking, many of the filamentous systems in the natural world are not mathematical knots or links. In this talk we will introduce a novel framework in knot theory that can characterize the complexity of (collections of) open curves in 3-space in general. This leads to novel metrics of entanglement of open curves in 3-space that generalize classical topological invariants, such as the Jones polynomial and Vassiliev invariants. For open curves, these are continuous functions of the curve coordinates and tend to topological invariants of classical knots and links when the endpoints of the curves tend to coincide. We will apply our methods to proteins and show that they enable a new framework for understanding protein folding, which is validated by experimental data. Using the topological landscape of proteins, we show that the static native-state geometry and topology reflect protein evolution dynamics. These results suggest that these topological metrics could serve as valuable reaction coordinates, bridging the gap between protein structure topology and dynamics for the first time.
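The prototype of such continuous entanglement measures is the Gauss linking integral, which is defined for any pair of curves, open or closed, and can be approximated for polygonal curves by a midpoint rule over segment pairs. A hedged sketch, not the speaker's framework:

```python
import numpy as np

def gauss_linking(c1, c2):
    """Midpoint-rule approximation of the Gauss linking integral
    (1/4pi) * double integral of (dg1 x dg2) . (g1 - g2) / |g1 - g2|^3
    for two polygonal curves given as (n, 3) arrays of vertices."""
    total = 0.0
    for i in range(len(c1) - 1):
        for j in range(len(c2) - 1):
            a = 0.5 * (c1[i] + c1[i + 1])   # segment midpoints
            b = 0.5 * (c2[j] + c2[j + 1])
            da = c1[i + 1] - c1[i]          # segment vectors
            db = c2[j + 1] - c2[j]
            r = a - b
            total += np.dot(np.cross(da, db), r) / np.linalg.norm(r) ** 3
    return total / (4 * np.pi)

# sanity check: two closed circles forming a Hopf link have linking number +/-1
t = np.linspace(0, 2 * np.pi, 101)
c1 = np.column_stack([np.cos(t), np.sin(t), np.zeros_like(t)])
c2 = np.column_stack([1 + np.cos(t), np.zeros_like(t), np.sin(t)])
lk = gauss_linking(c1, c2)
```

For open curves the same integral returns a real number rather than an integer, which is the sense in which these measures vary continuously with the curve coordinates.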

Mar. 26   Iris Yoon (Wesleyan University, online) (Cancelled)
Topological tracing of encoded circular coordinates between neural populations

Recent developments in in vivo neuroimaging in animal models have made possible the study of information coding in large populations of neurons and even how that coding evolves in different neural systems. Topological methods, in particular, are effective at detecting periodic, quasi-periodic, or circular features in neural systems. Once we detect the presence of circular structures, we face the problem of assigning semantics: what do the circular structures in a neural population encode? Are they reflections of an underlying physiological activity, or are they driven by an external stimulus? If so, which specific features of the stimulus are encoded by the neurons? To address this problem, we introduced the method of analogous bars (Yoon, Ghrist, Giusti 2023). Given two related systems, say a stimulus system and a neural population, or two related neural populations, we utilize the dissimilarity between the two systems and Dowker complexes to find shared features between the two systems. We then leverage this information to identify related features between the two systems. In this talk, I will briefly explain the mathematics underlying the analogous bars method. I will then present applications of the method in studying neural population coding and propagation on simulated and experimental datasets. This work is joint work with Gregory Henselman-Petrusek, Robert Ghrist, Spencer Smith, Yiyi Yu, and Chad Giusti.
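For reference, the Dowker complex mentioned in the abstract has a very direct construction: given a binary relation between two sets, a collection of rows forms a simplex whenever some column "witnesses" all of them. An illustrative sketch with my own naming, not the analogous-bars implementation:

```python
import numpy as np
from itertools import combinations

def dowker_simplices(R, max_dim=2):
    """Simplices (up to dimension max_dim) of the Dowker complex of a binary
    relation R, with rows as vertices: a set of rows is a simplex iff some
    column relates to all of them."""
    witnesses = [frozenset(np.flatnonzero(R[:, j])) for j in range(R.shape[1])]
    simplices = set()
    for w in witnesses:
        for k in range(1, min(len(w), max_dim + 1) + 1):
            for s in combinations(sorted(w), k):
                simplices.add(s)
    return simplices

# three "neurons" and three "stimuli": each stimulus activates two neurons
R = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])
S = dowker_simplices(R)
```

Here the complex has all three edges but no filled triangle, i.e. it is a topological circle; it is exactly this kind of circular feature, shared between two related systems, that the analogous-bars method traces.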

Apr. 2   Marc Fersztand (University of Oxford)
Harder-Narasimhan Filtrations of Persistence Modules

The Harder-Narasimhan types are a family of discrete isomorphism invariants for representations of finite quivers. We evaluate their discriminating power in the context of persistence modules over a finite poset, including multiparameter persistence modules (over a finite grid). In particular, we introduce the skyscraper invariant and prove, amongst other results, that it is strictly finer than the rank invariant and incomparable with the generalised rank invariant. In order to study the stability of the skyscraper invariant, we extend its definition from the finite to the infinite setting and consider multiparameter persistence modules over Z^n and R^n. We then establish an erosion-type stability result for the skyscraper invariant in this setting. This talk is based on the content of the articles 10.1112/tlm3.70003 (with E. Jacquard, V. Nanda and U. Tillmann) and arXiv:2406.05069.

Apr. 9   Iolo Jones (Durham University)
New methods in diffusion geometry

Diffusion geometry is a new framework for geometric and topological data analysis that defines Riemannian geometry for probability spaces. This lets us apply the huge wealth of theory and methods from classical differential geometry as tools for data analysis. In this talk, I will outline the basic theory of diffusion geometry, like the construction of vector fields and differential forms. I will also survey a range of new data analysis tools, including vector calculus, solving spatial PDEs on data, finding integral curves and geodesics, and finding circular coordinates for de Rham cohomology classes. In the very special case of data from manifolds, we can compute the curvature tensors and dimension. These methods are highly robust to noise and fast to compute when compared with comparable methods like persistent homology.
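As a point of orientation, the classical diffusion-map construction that diffusion geometry builds on can be sketched in a few lines: form a Gaussian kernel on the data, normalise it into a Markov diffusion operator, and take its leading nontrivial eigenvectors as coordinates. This is an illustrative precursor (with my own parameter choices), not the diffusion-geometry framework itself:

```python
import numpy as np

def diffusion_map(X, eps=0.5, n_coords=2):
    """Classical diffusion-map coordinates: eigenvectors of a row-normalised
    Gaussian-kernel Markov operator on the data."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    K = np.exp(-D2 / eps)
    P = K / K.sum(axis=1, keepdims=True)  # row-stochastic diffusion operator
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # skip the trivial constant eigenvector (eigenvalue 1)
    return vecs.real[:, order[1:n_coords + 1]], vals.real[order]

# data sampled from a circle: the top nontrivial eigenvectors recover
# circular coordinates, the simplest case of the talk's de Rham construction
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
coords, vals = diffusion_map(X)
```

The diffusion operator plays the role of a Laplacian on the probability space, which is the entry point for defining vector fields and differential forms on data.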



In Spring 2025, AGATE is supported by the Engineering and Physical Sciences Research Council [grant numbers EP/Y035232/1 and EP/R034826/1].

Autumn 2024 Talks

Applications to Machine Learning

Oct. 2   Darrick Lee
Path Signatures in Machine Learning

The path signature is a way to represent a path as an infinite sequence of tensors. We provide a high level introduction to signatures, highlighting the algebraic and geometric aspects of this construction, and discuss how this can be used to study sequences (time series) in machine learning.
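For a piecewise-linear path the first two signature levels can be computed exactly; the antisymmetric part of level two is the signed (Lévy) area, the simplest geometric quantity the signature captures. A minimal sketch, illustrative rather than from the talk:

```python
import numpy as np

def signature_level2(path):
    """Levels 1 and 2 of the path signature of a piecewise-linear path,
    given as an (n, d) array of vertices. Level 1 is the total increment;
    level 2 collects the iterated integrals int (x_i - x_i(0)) dx_j."""
    dX = np.diff(path, axis=0)
    S1 = dX.sum(axis=0)
    d = path.shape[1]
    S2 = np.zeros((d, d))
    cum = np.zeros(d)
    for step in dX:
        # exact contribution of one linear segment (Chen's identity)
        S2 += np.outer(cum, step) + 0.5 * np.outer(step, step)
        cum += step
    return S1, S2

# a unit square traversed counterclockwise: level 1 vanishes (closed loop),
# but the Levy area remembers the enclosed signed area
path = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.], [0., 0.]])
S1, S2 = signature_level2(path)
area = 0.5 * (S2[0, 1] - S2[1, 0])
```

That a closed loop has zero increment but nonzero level-two terms is the basic reason signatures can distinguish time series that plain endpoint statistics cannot.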

Oct. 9   No Seminar

Oct. 16   Amos Storkey
Topological and Geometric Elements in Modern Deep Learning - Benefits and Challenges

This talk will begin with a simple introduction to machine learning, especially as used in computer vision. We then go on to see the different ways in which issues of geometry and topology turn up and are handled within the field. We examine the promise, in terms of generalisation, that building in geometric understanding adds to a model. At the same time, we recognise the challenges that imposing a rigid abstract geometry on a real-world space can bring. I will give one example of our work decomposing structure and motion using a Hamiltonian model structure, before opening things up for discussion as to what the future opportunities are.

Oct. 23   Ting Lin (Peking University)
(Different Location: 2.14 Appleton Tower)
Universal Approximation Properties of Deep Neural Networks: A Control Theory Perspective

In this talk, I will discuss the approximation properties of deep neural networks, with a particular focus on residual-type structures, a popular architecture in deep learning. We will conceptualize ResNet as a continuous control system, specifically as a parametric dynamical system. Based on this framework, we will explore the universal approximation and interpolation properties of deep neural networks. We show that networks with any nonlinear activation function can have the universal approximation property. Furthermore, we will discuss extensions to symmetric cases, including permutation and translation invariance, which are useful in scientific computing. This is based on joint work with Jingpu Cheng (NUS), Qianxiao Li (NUS), and Zuowei Shen (NUS).
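The ResNet-as-dynamical-system viewpoint is concrete: a residual block x -> x + h f(x, theta) is one forward-Euler step of the controlled ODE x'(t) = f(x(t), theta(t)). A toy sketch, where the activation and parameterisation are my own illustrative choices:

```python
import numpy as np

def resnet_flow(x, thetas, h=0.1):
    """A residual network read as the forward-Euler discretisation of
    x'(t) = f(x(t), theta(t)): each block computes x <- x + h * f(x, theta_k)."""
    for W, b in thetas:
        x = x + h * np.tanh(W @ x + b)  # one residual block
    return x

# with W = 0 the ODE is x' = tanh(b), so ten Euler steps of size 0.1
# transport x0 = 0 to exactly 10 * 0.1 * tanh(1) = tanh(1)
d = 1
x0 = np.zeros(d)
thetas = [(np.zeros((d, d)), np.ones(d)) for _ in range(10)]
out = resnet_flow(x0, thetas, h=0.1)
```

Under this reading, universal approximation becomes a controllability question: which flow maps can the time-varying parameters theta(t) realise.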

Oct. 30   Daniel Windisch
Real Algebraic Geometry for Games and their Equilibria

The classical notion of Nash equilibria imposes the somewhat unnatural assumption of independent, non-cooperative action on the players of a game. In 2005, the philosopher Wolfgang Spohn introduced a new concept, called dependency equilibria, that also takes into consideration cooperation of the players. Dependency equilibria are, however, much more involved from a mathematical viewpoint. This talk will give the necessary background in game theory and will show how basic (real) algebraic geometry can be used to study dependency equilibria and game theoretical questions in general. It is based on joint work with Irem Portakal.
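As background for the contrast the talk draws, the classical fully mixed Nash equilibrium of a 2x2 bimatrix game is determined by indifference conditions, which reduce to two linear equations. A hedged sketch of that baseline (dependency equilibria themselves need the algebraic-geometric machinery of the talk):

```python
import numpy as np

def mixed_nash_2x2(A, B):
    """Fully mixed Nash equilibrium of a 2x2 bimatrix game, where A holds the
    row player's payoffs and B the column player's. Each player mixes so that
    the opponent is indifferent between their two pure strategies.
    Assumes such an equilibrium exists (nonzero denominators)."""
    # row player's probability on row 0 makes the column player indifferent
    p = (B[1, 1] - B[1, 0]) / (B[0, 0] - B[0, 1] - B[1, 0] + B[1, 1])
    # column player's probability on column 0 makes the row player indifferent
    q = (A[1, 1] - A[0, 1]) / (A[0, 0] - A[0, 1] - A[1, 0] + A[1, 1])
    return np.array([p, 1 - p]), np.array([q, 1 - q])

# matching pennies: zero-sum, no pure equilibrium, unique mixed one at (1/2, 1/2)
A = np.array([[1., -1.], [-1., 1.]])
p, q = mixed_nash_2x2(A, -A)
```

The indifference equations are linear here; for dependency equilibria the players' joint distribution is no longer a product, and the defining conditions become genuinely polynomial, which is where real algebraic geometry enters.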

Nov. 6   Kaibo Hu
The Bernstein-Gelfand-Gelfand (BGG) machinery and applications

In this talk, we first review the de Rham complex and the finite element exterior calculus, a cohomological framework for structure-preserving discretisation of PDEs. From de Rham complexes, we derive other complexes with applications in elasticity, geometry and general relativity. Algebraic structures (information on cohomology) imply a number of analytic results, such as the Hodge-Helmholtz decomposition, Poincaré-Korn inequalities and compactness. The derivation, inspired by the Bernstein-Gelfand-Gelfand (BGG) construction, also provides a general machinery to establish results for tensor-valued problems (e.g., elasticity) from de Rham complexes (e.g., electromagnetism and fluid mechanics). We discuss some applications in this direction, including the construction of bounded homotopy operators (Poincaré integrals) and finite elements.
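For reference, the de Rham complex underlying finite element exterior calculus is, in three dimensions, the sequence

```latex
0 \longrightarrow H^1(\Omega) \xrightarrow{\ \operatorname{grad}\ }
H(\operatorname{curl},\Omega) \xrightarrow{\ \operatorname{curl}\ }
H(\operatorname{div},\Omega) \xrightarrow{\ \operatorname{div}\ }
L^2(\Omega) \longrightarrow 0,
```

where the composition of consecutive maps vanishes; the BGG machinery described in the talk derives complexes for tensor-valued problems, such as elasticity, from diagrams of such de Rham complexes.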

Nov. 13   Patrick Rubin-Delanchy
(Different Time: 16:05 - 17:00)
What makes a good embedding?

Embeddings are continuous vector representations of entities, such as words or nodes, perhaps most widely known for their role in modern AI systems such as large language models. In this talk I consider a different goal, which is statistical analysis, or the creation of knowledge. An embedding is an instrument which allows us to observe complex, unstructured, or otherwise intractable data, in a way that we can use. In embeddings, simple statistical models are tenable; concepts like similarity, or trend, have a 'shape'; abstract notions such as political opinion, the health of a patient, the function of a cell, can be made geometric and measurable; and we can uncover truths that could have seemed completely absent from the raw data. I illustrate these points with new theory connecting statistical models, embeddings and the manifold hypothesis, and with motivating problems in science, security, and recent work with Southmead Hospital in Bristol. We welcome feedback on our codebase, pyemb, a work in progress implementing these ideas: https://pyemb.github.io/pyemb/html/index.html

Nov. 20   Alexandros Keros
Understanding and navigating the self-assembly of nanoparticles

Material synthesis through nanoparticle self-assembly enables the creation of specialized structures with transformative applications in engineering and biology. However, efficient and robust control of the assembly process, and prediction of macro-scale properties, are obstructed by the inherent stochasticity and complexity of particle dynamics. I will review topological and geometric methods for characterising particle configurations in the context of material science, and explore learning and control approaches for steering their dynamics.

Nov. 27   Rik Sarkar
Hyperbolic representation of trees and applications in machine learning

Hyperbolic geometry has recently become an increasingly important topic in machine learning due to its usefulness in representing hierarchies, graphs and other types of non-Euclidean data. In this talk we will discuss hyperbolic geometry and a theorem that any tree can be embedded in the hyperbolic plane with arbitrarily low distortion. Then we will review how similar ideas are used in several areas of machine learning.
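The mechanism behind such low-distortion tree embeddings is that hyperbolic distances blow up near the boundary of the Poincaré disk. A small sketch (radii and layout are my own illustrative choices, not a full embedding algorithm): placing a root at the origin and its children on a circle of Euclidean radius close to 1 makes each child-to-child distance approach the length of the path through the root, which is exactly what a tree metric requires.

```python
import numpy as np

def poincare_dist(u, v):
    """Geodesic distance in the Poincare disk model of the hyperbolic plane."""
    uu, vv = np.dot(u, u), np.dot(v, v)
    duv = np.dot(u - v, u - v)
    return np.arccosh(1 + 2 * duv / ((1 - uu) * (1 - vv)))

# a root at the origin with four children on a circle of Euclidean radius 0.99;
# pushing the radius toward 1 separates siblings arbitrarily well
root = np.zeros(2)
children = [0.99 * np.array([np.cos(t), np.sin(t)])
            for t in np.linspace(0, 2 * np.pi, 4, endpoint=False)]
```

In the Euclidean plane, by contrast, sibling-to-sibling distances are bounded by a constant times the root-to-child distance, which is why trees cannot be embedded there with low distortion.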