Discrete-to-continuum graphical methods for large-data clustering, classification and segmentation (MS17)

Graph methods for machine learning have proven extraordinarily successful in several imaging and data analysis applications. They build a graph from the data by encoding similarities between elements, possibly using non-local similarity measures, for comparison, clustering and classification. The study of large-data limits of state-of-the-art graph models, such as Ginzburg-Landau functionals and Cheeger cuts, is fundamental for the design of efficient optimisation strategies. In this mini-symposium we gather experts in the field of mathematical graph modelling and large-data convergence to highlight analogies and differences between continuum and discrete variational models for data analysis.

Scaling Results in Lp Regularised Semi-Supervised Learning
Matthew Thorpe (University of Cambridge)
Discrete to continuum limit of the graph Ginzburg-Landau functional
Yves van Gennip (University of Nottingham)
Gromov-Hausdorff limit of Wasserstein spaces on point clouds
Nicolás García Trillos (Brown University)
Large data and zero noise limits of graph-based semi-supervised learning algorithms
Matt Dunlop (Caltech)
Organisers:
Luca Calatroni (CMAP, École Polytechnique CNRS)
Daniel Tenbrinck (University of Münster)
Matthew Thorpe (University of Cambridge)
Keywords: Bayesian methods, graphical methods, image reconstruction, image segmentation, machine learning, partial differential equation models