Basic Notions and Applied Topology:
The Centre is running a hybrid seminar on basic notions and applied topology, jointly organized with the University of Gdansk and Gdansk University of Technology. The seminar is mainly aimed at PhD students, advanced Master's students, and early postdocs. Anyone else who is interested is welcome to join, and to give talks if they would like to; however, preference will be given to the target group mentioned above.
The main idea of this Basic Notions seminar is to bring together scientists from different backgrounds to give introductory talks on the basic concepts they use on a daily basis. In this way, we improve our foundation for interdisciplinary work.
The seminar takes place on Mondays at 12:30 in IMPAN's room 1, in the University of Gdansk's lecture room 1.14 at the Institute of Informatics, and via Zoom under the following coordinates:
Zoom Coordinates: 935 8975 4430
Please contact Julian Brüggemann for the passcode.
If you would like to give a talk as part of our seminar, please contact Julian Brüggemann, Jacek Gulgowski, or Justyna Signerska-Rynkowska.
Upcoming Talks
- 08/12/2025, 12:30,
Potential meeting for a group discussion on implementing the classification tools from the previous week in Python.
- 15/12/2025, 12:30,
Jan Senge: Resampling Methods
Abstract: This talk, based on Chapter 5 of An Introduction to Statistical Learning with Applications in Python, introduces Resampling Methods such as cross-validation and the bootstrap. We’ll discuss how these techniques improve model assessment and selection by providing more accurate estimates of prediction error and model variability, illustrated through practical Python applications.
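For orientation, a minimal sketch of the two ideas with scikit-learn and NumPy; the synthetic data, model, and fold counts below are purely illustrative and not taken from the talk:

```python
# Minimal illustration of cross-validation and the bootstrap on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=1.0, size=200)

# 10-fold cross-validation estimate of the test MSE.
model = LinearRegression()
mse = -cross_val_score(model, X, y, cv=10, scoring="neg_mean_squared_error")
print("CV estimate of test MSE:", mse.mean())

# Bootstrap estimate of the variability of one coefficient.
boot_coefs = []
for _ in range(1000):
    idx = rng.integers(0, len(y), size=len(y))   # resample rows with replacement
    boot_coefs.append(LinearRegression().fit(X[idx], y[idx]).coef_[0])
print("Bootstrap SE of first coefficient:", np.std(boot_coefs))
```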
- Christmas Break
- 12/01/2026, 12:30,
Mateusz Masłowski: Linear Model Selection and Regularization (Part 1)
Abstract: This session, based on the first half of Chapter 6 of An Introduction to Statistical Learning with Applications in Python, explores Linear Model Selection techniques for improving model interpretability and performance. We’ll cover best subset, forward, and backward stepwise selection, discussing how these approaches identify the most informative predictors and balance complexity with predictive power.
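As an illustration of stepwise selection, a short sketch using scikit-learn's SequentialFeatureSelector on synthetic data; the data and the number of selected predictors are assumptions for the example, not material from the talk:

```python
# Forward stepwise selection sketch (illustrative synthetic data).
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 8))
y = 3 * X[:, 0] - 2 * X[:, 4] + rng.normal(size=150)  # only features 0 and 4 matter

# Greedily add predictors, scoring each candidate by cross-validated fit.
selector = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="forward", cv=5
)
selector.fit(X, y)
print("Selected predictors:", np.flatnonzero(selector.get_support()))
```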
- 19/01/2026, 12:30,
Jacek Gulgowski: Linear Model Selection and Regularization (Part 2)
Abstract: In the second session, we turn to Regularization Methods, focusing on ridge regression and the lasso. We'll examine how these techniques use penalty terms to control model flexibility, reduce overfitting, and enhance prediction accuracy, with hands-on Python examples illustrating their practical differences and applications.
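A minimal illustration of the two penalties with scikit-learn; the synthetic data and the penalty strengths (alpha) are illustrative choices only:

```python
# Ridge and lasso on standardized synthetic data; alpha is the penalty strength.
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = StandardScaler().fit_transform(rng.normal(size=(100, 10)))
y = 2 * X[:, 0] - 3 * X[:, 1] + rng.normal(size=100)

ridge = Ridge(alpha=10.0).fit(X, y)   # shrinks coefficients toward zero
lasso = Lasso(alpha=0.1).fit(X, y)    # can set coefficients exactly to zero
print("ridge coefficients:", np.round(ridge.coef_, 2))
print("lasso coefficients:", np.round(lasso.coef_, 2))
```

In practice the penalty strength would itself be chosen by cross-validation.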
- 26/01/2026, 12:30,
No seminar due to ATMCS in Leipzig
- 02/02/2026, 12:30,
Jacek Gulgowski: Moving Beyond Linearity
Abstract: This talk introduces key techniques for modeling nonlinear relationships in supervised learning. We begin by examining polynomial regression and step functions, then develop more flexible approaches using basis functions and splines, including cubic splines and smoothing splines, to capture complex structure in data. The seminar also covers Generalized Additive Models (GAMs), which extend linear models by allowing nonlinear functions of predictors while retaining interpretability.
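A small sketch contrasting polynomial regression with a cubic-spline fit, assuming scikit-learn (SplineTransformer requires a reasonably recent version); the one-dimensional toy data, degrees, and knot counts are illustrative:

```python
# Polynomial regression vs. a cubic-spline basis fit on toy one-dimensional data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, SplineTransformer

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 10, size=200)).reshape(-1, 1)
y = np.sin(x).ravel() + rng.normal(scale=0.3, size=200)

poly = make_pipeline(PolynomialFeatures(degree=4), LinearRegression()).fit(x, y)
spline = make_pipeline(SplineTransformer(degree=3, n_knots=8), LinearRegression()).fit(x, y)
print("polynomial R^2:  ", poly.score(x, y))
print("cubic-spline R^2:", spline.score(x, y))
```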
- 09/02/2026, 12:30,
Marta Marszewska: Tree-based methods
Abstract: This talk provides an accessible overview of tree-based methods for regression and classification. We will explore the fundamental concepts behind decision trees, including recursive partitioning, tree construction, and pruning for improved generalization. Building on these foundations, we will introduce ensemble methods - bagging, random forests, and boosting - which substantially enhance predictive accuracy by aggregating many weak learners.
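For a feel of the gain from ensembling, a toy comparison of a single decision tree with a random forest in scikit-learn (data and hyperparameters are illustrative only):

```python
# A single decision tree vs. a random forest on a toy classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=4)

tree = DecisionTreeClassifier(random_state=4)                    # one deep tree
forest = RandomForestClassifier(n_estimators=200, random_state=4)  # bagged, decorrelated trees
print("tree   CV accuracy:", cross_val_score(tree, X, y, cv=5).mean())
print("forest CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
```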
- 16/02/2026, 12:30,
Jan Senge: Support Vector Machines
Abstract: This seminar provides an intuitive introduction to Support Vector Machines (SVMs). We begin with the maximal margin classifier and support vector classifier, building geometric intuition for how SVMs separate classes with optimal margins. We then extend these ideas to the kernel trick, enabling highly flexible nonlinear decision boundaries through polynomial and radial basis function kernels. The talk also highlights key tuning parameters, practical considerations for model fitting, and strategies for avoiding overfitting.
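A minimal sketch of the kernel idea with scikit-learn's SVC on a toy dataset that is not linearly separable; the values of C and gamma are illustrative, not recommendations from the talk:

```python
# Linear vs. RBF-kernel support vector classifiers; C and gamma are the tuning knobs.
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.25, random_state=5)  # two interleaved half-moons

linear_svm = SVC(kernel="linear", C=1.0)
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale")
print("linear kernel CV accuracy:", cross_val_score(linear_svm, X, y, cv=5).mean())
print("RBF kernel CV accuracy:   ", cross_val_score(rbf_svm, X, y, cv=5).mean())
```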
- 23/02/2026, 12:30,
Jakub Malinowski: Deep Learning (Part 1)
Abstract: This first session introduces the fundamental concepts and motivations behind deep learning. We begin with a discussion of why and when deep learning can outperform traditional statistical methods - especially for large, high-dimensional data. Next, we explore the architecture of neural networks: from simple single-layer networks to multilayer (deep) networks. Key learning mechanisms - including backpropagation, regularization, and stochastic gradient descent (SGD) - will be explained intuitively and with math as appropriate. We will also review practical considerations (e.g., network tuning, overfitting, capacity control), providing Python code examples to illustrate how deep networks are defined and trained in a real-world context.
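As a small illustration, a feed-forward network trained with SGD via scikit-learn's MLPClassifier; the architecture and learning-rate choices below are placeholders, not the talk's examples:

```python
# A small feed-forward network trained with stochastic gradient descent.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=6)
X = StandardScaler().fit_transform(X)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=6)

# Two hidden layers; alpha is an L2 penalty, solver="sgd" uses minibatch gradient descent.
net = MLPClassifier(hidden_layer_sizes=(32, 16), solver="sgd",
                    learning_rate_init=0.05, alpha=1e-4, max_iter=500, random_state=6)
net.fit(X_train, y_train)
print("test accuracy:", net.score(X_test, y_test))
```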
- 02/03/2026, 12:30,
Jakub Malinowski: Deep Learning (Part 2)
Abstract: The second session expands on the foundations by covering more advanced deep-learning techniques and their applications. We will examine methods such as dropout learning, network tuning strategies, and architectural choices that influence model performance. The talk will show how deep learning can tackle complex tasks in domains like image recognition, text classification, or other high-dimensional prediction problems.
- 09/03/2026, 12:30,
Sylwester Piątek: Survival Analysis and Censored Data
Abstract: This seminar introduces the key concepts and methods of survival analysis. We begin by discussing the nature of survival (or time-to-event) data and the complications introduced by censoring - when the event of interest has not occurred for some subjects by the end of the study, or when they are lost to follow-up. The talk then presents classical and modern tools for analyzing such data: we will cover the nonparametric estimation of survival curves (via the Kaplan-Meier estimator), compare survival experiences with the log-rank test, and introduce regression models for survival outcomes - in particular, the Cox proportional hazards model (hazard-based modeling), including discussion of the hazard function and handling of covariates. The talk also touches on more advanced considerations such as shrinkage for Cox models, time-dependent covariates, and diagnostic checks (e.g., verifying the proportional hazards assumption).
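For intuition, a hand-rolled Kaplan-Meier estimator on a tiny made-up dataset, following the product-limit formula S(t) = product over event times t_i <= t of (1 - d_i/n_i); in practice one would use a library such as lifelines or scikit-survival:

```python
# Hand-rolled Kaplan-Meier estimator on toy right-censored data.
import numpy as np

# Observation times and event indicators: 1 = event observed, 0 = censored.
time = np.array([3, 5, 5, 8, 10, 12, 15, 15, 18, 20])
event = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])

survival = 1.0
for t in np.unique(time[event == 1]):            # distinct event times
    at_risk = np.sum(time >= t)                  # n_i: subjects still at risk just before t
    deaths = np.sum((time == t) & (event == 1))  # d_i: events occurring at t
    survival *= 1 - deaths / at_risk
    print(f"S({t}) = {survival:.3f}")
```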
- 16/03/2026, 12:30,
Janusz Przewocki: Unsupervised Learning (Part 1)
Abstract: This first session introduces the motivations and foundational methods for dimensionality reduction under unsupervised learning. We begin by discussing why dimension reduction matters - especially in high-dimensional data settings - and how it helps address issues like the "curse of dimensionality," multicollinearity, overfitting, and challenges in visualization and interpretation. Then we focus on Principal Component Analysis (PCA): its mathematical foundations, how it identifies dominant modes of variation, how to interpret the principal components, and how to choose the number of components.
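A short PCA sketch with scikit-learn on synthetic data, showing the explained-variance ratios used to choose the number of components; the data-generating setup is an assumption for illustration:

```python
# PCA on standardized data: how much variance do the leading components explain?
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
latent = rng.normal(size=(300, 2))                        # two "true" underlying directions
X = latent @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(300, 10))
Xs = StandardScaler().fit_transform(X)                    # PCA is sensitive to scale

pca = PCA(n_components=5).fit(Xs)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
scores = pca.transform(Xs)[:, :2]                         # two-dimensional representation
print(scores.shape)
```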
- 23/03/2026, 12:30,
Janusz Przewocki: Unsupervised Learning (Part 2)
Abstract: The second session delves into clustering methods and other techniques for uncovering latent structure in data without relying on response variables. We cover K-means clustering and hierarchical clustering, including how they work, how to choose the number of clusters, how to decide on distance metrics, and practical pitfalls (e.g. scaling, sensitivity to initialization). We discuss how to interpret clusters, validate clustering solutions, and when unsupervised grouping might be appropriate.
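A toy comparison of K-means and agglomerative (hierarchical) clustering in scikit-learn, including the scaling step mentioned above; the blob data and the choice of three clusters are illustrative:

```python
# K-means and agglomerative (hierarchical) clustering on the same toy data.
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score
from sklearn.preprocessing import StandardScaler

X, true_labels = make_blobs(n_samples=300, centers=3, random_state=8)
X = StandardScaler().fit_transform(X)   # scaling matters for distance-based methods

kmeans = KMeans(n_clusters=3, n_init=10, random_state=8).fit(X)
hier = AgglomerativeClustering(n_clusters=3, linkage="ward").fit(X)
print("k-means vs. truth:     ", adjusted_rand_score(true_labels, kmeans.labels_))
print("hierarchical vs. truth:", adjusted_rand_score(true_labels, hier.labels_))
```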
Past Talks
- 01/12/2025, 12:30,
John Rick Dolor Manzanares: Classification
Abstract: This talk, based on Chapter 4 of An Introduction to Statistical Learning with Applications in Python, explores key methods for classification, including logistic regression, discriminant analysis, and K-nearest neighbors. We’ll discuss how these approaches model categorical outcomes, evaluate their performance using metrics like accuracy and ROC curves, and demonstrate their implementation through practical Python examples.
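A compact illustration comparing the three classifiers with scikit-learn on synthetic data; the dataset and settings are illustrative, not the talk's examples:

```python
# Logistic regression, LDA, and KNN compared by cross-validated accuracy.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=6, random_state=9)
for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("LDA", LinearDiscriminantAnalysis()),
                  ("KNN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```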
- 17/11/2025, 12:30,
Clemens Bannwart: Morse-Smale vector fields: definition, properties and induced structures
Abstract: In this talk we introduce some topics which are important for my second talk (which will be given in the TDA Seminar on the following day). We unpack the definition of Morse-Smale vector fields and discuss some of their properties, such as structural stability and genericity. We devote some time to the gradient-like case, which is closely linked to Morse theory. We see how in this case we can obtain a chain complex, called the Morse complex, as well as a CW decomposition of the underlying manifold. Time permitting, we discuss the relation between Morse theory and persistent homology.
- 03/11/2025, 12:30,
Michał Bogdan: Linear Regression in Practice
Abstract: This will be a tutorial rather than a talk, covering the second part of Chapter 3 of "An Introduction to Statistical Learning with Applications in Python". We will discuss how to use linear regression in Python and consider a couple of examples.
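A minimal sketch of fitting and inspecting a simple linear regression in Python with statsmodels; the simulated data below is an assumption for illustration and not part of the tutorial material:

```python
# Simple least-squares fit with statsmodels on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.7 * x + rng.normal(scale=1.0, size=100)

X = sm.add_constant(x)          # add the intercept column
fit = sm.OLS(y, X).fit()
print(fit.params)               # estimated intercept and slope
print(fit.bse)                  # their standard errors
print(fit.rsquared)             # fraction of variance explained
```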
- 27/10/2025, 12:30,
Iason Papadopoulos (University of Bremen): Multiparameter Persistence
Abstract: This talk is the first in a series of two talks (the second one will be in the Dioscuri TDA seminar), outlining a new vectorization method for multiparameter persistence modules with an arbitrary number of parameters. Multiparameter persistence extends the foundational ideas of persistent homology. Importantly, it can capture topological information of point clouds with several functions. This talk introduces the definition and motivation behind multiparameter persistence. We will compare the structure and interpretability of multiparameter persistence modules with their one-parameter counterparts, highlighting the challenges that arise when working with multiple parameters. To address these issues, we will explore several approaches that extract meaningful topological information without requiring full classification of the modules. In particular, we will take a closer look at the Generalized Rank Invariant Landscape (GRIL), a recent vectorization method that provides a computable and interpretable invariant.
- 20/10/2025, 12:30,
Michał Bogdan: Linear Regression
Abstract: This talk is based on the third chapter of "An Introduction to Statistical Learning with Applications in Python". Linear regression is a standard tool for predicting a quantitative response. This talk explores both simple and multiple linear regression, detailing how these methods model relationships between predictors and a response variable. We will discuss the estimation of coefficients, the interpretation of model parameters, and evaluation metrics such as the residual standard error.
- 13/10/2025, 12:30,
Julian Brüggemann: Introduction to Statistical Learning
Abstract: This talk is the first in a series on statistical learning, machine learning, and related topics. We follow the book "An Introduction to Statistical Learning with Applications in Python". In this talk, I will provide an overview of the series of talks to come and cover some of the topics from Chapters 1 and 2.
- 06/10/2025, 12:30,
Justyna Signerska-Rynkowska: Dynamical and geometrical mechanism shaping response precision in neuron models
Abstract: Experimental studies of neuronal dynamics involve recording both spontaneous activity patterns and the responses to sustained and short-term inputs. In the first part of the talk, I will describe the underlying dynamical structures governing phenomena such as post-inhibitory facilitation (PIF) and slope detection in response to transient inputs in a class of nonlinear adaptive hybrid neuron models. In PIF, an otherwise subthreshold excitatory input can induce a spike if it is applied with proper timing after an inhibitory pulse, while neurons displaying the slope-detection property spike to a transient input only when the input's rate of change lies in a specific, bounded range. A key concept in this analysis is the firing threshold curve, which allows us to explain these phenomena in the non-autonomous setting, building upon our understanding of the corresponding systems with constant stimulus.
On the other hand, studying phenomena such as phase locking requires a time-dependent sustained stimulus, and our knowledge of the underlying autonomous system is of limited use in this case. Nevertheless, phase locking of ongoing oscillations to a periodic signal can be explored with a variety of analytical approaches. However, much less is known about what factors determine the response precision of excitable cells that are intrinsically at rest but are activated by periodic forcing and noise. We shed light on this coding precision by introducing a new tool, the dynamic threshold curve (DTC), which we apply to the study of a well-established auditory neuron model.
The talk is based on joint work with Jonathan Rubin (University of Pittsburgh) and Jonathan Touboul (Brandeis University).
- 09/06/2025, 12:30,
Marta Marszewska: Topological methods in feature extraction and classification of dynamical systems
Abstract: In this talk, we will present several topological methods for analyzing vector fields, including the Conley index, persistent homology, and the Euler Characteristic Curve (ECC) and Profile (ECP). We will discuss the advantages and limitations of each approach, with a focus on how they capture the underlying structure of dynamical systems. Additionally, we will introduce a continuous version of the Euler Characteristic Profile. This new approach is particularly useful for studying interval-valued functions, which arise when exact values are not known or are affected by uncertainty. Such methods are especially relevant in applications where robustness and interpretability are important.
- 26/05/2025, 12:30,
Janusz Przewocki: Applications of Mapper
Abstract: tba
- 19/05/2025, 12:30,
Giuliamaria Menara: From magnitude to path homology: a tour of graph invariants
Abstract: In this talk we provide an introduction to magnitude homology, a homology theory for graphs (and metric spaces) that captures combinatorial and geometric information through the lens of magnitude, a numerical invariant akin to Euler characteristic. We will explore the foundational aspects of magnitude homology, its relationship with graph theory, and its connections to other homology theories. Building on this, we will introduce *Eulerian magnitude homology*, a refined version of magnitude homology which offers new insights into the structure of graphs. We will discuss its properties, computational aspects, and potential applications. Finally, we will compare magnitude homology with *path homology*, another homology theory for digraphs, highlighting similarities, differences, and possible interactions between the two frameworks.
- 28/04/2025, 12:30,
Asier López-Gordón: A friendly invitation to geometric mechanics
Abstract: Differential geometry is the natural language for studying mechanical systems, as well as classical field theories, quantisation, control theory, etc. Understanding the geometry underlying a dynamical system provides a plethora of methods for studying the system, either quantitatively or qualitatively, analytically or numerically. I will start my talk by recalling some basic notions of differential geometry (requiring from the audience only knowledge of linear algebra, elementary calculus, and some general topology). After that, I will present the main ideas of symplectic geometry and Hamiltonian mechanics. I will conclude by explaining how symplectic geometry can be used to construct numerical methods that preserve global properties of the original dynamical system (such as conservation of energy or volume).
- 14/04/2025, 12:30,
Michał Bogdan: Agents, flocks, dynamic transitions - an opening for TDA?
Abstract: The talk will cover the basic strategies for modelling the behaviour of active groups and flocks of biological agents such as fish, birds, cells and bacteria, using models such as the Vicsek model, the Toner-Tu model and related models. We will discuss the basic predictions of such models and progress to more recent developments. We will cover the first existing attempts to advance the field through topological data analysis and ponder what untapped potential of TDA may yet be realised.
- 31/03/2025, 12:30,
Janusz Przewocki: Primer on mapper algorithm and its applications
Abstract: The Mapper algorithm, introduced by Singh, Mémoli, and Carlsson, is a key tool in topological data analysis for uncovering structure in complex data. In this talk, we will explore its theoretical foundations and examples from the original paper. We will also examine its impactful application to disease-specific genomic analysis by Nicolau et al. Overall, we aim to provide both theoretical understanding and practical insights into the use of Mapper in data science.
- 24/03/2025, 12:30,
Julian Brüggemann: Discrete Morse theory, homology, and cancelling critical simplices
Abstract: In this talk, we will introduce discrete Morse functions, their induced combinatorial gradient vector fields, and their flow lines. Then we will use these notions to compute Morse homology, prove the discrete Morse inequalities, and consider special cases in which discrete Morse functions can detect that a space is contractible or equivalent to a wedge of spheres. Afterwards, we will consider cancellation of critical simplices along unique flow lines and will discuss techniques for finding such cancellations.
- 10/03/2025, 12:30,
Mateusz Masłowski: Topological signatures of phase transitions: Insight from the 2D Ising Model
Abstract: Phase transitions mark dramatic shifts in system behaviour as external parameters, such as temperature, change. While classical approaches often depend on defining a Hamiltonian, many complex systems lack a clear energy function. This talk explores a topological perspective on phase transitions, using the 2D Ising model as a classic example. Through Monte Carlo simulations with Metropolis and Wolff algorithms, I examine how spin configurations evolve with temperature, uncovering topological patterns that signal critical behaviour. The broader aim is to establish a general method for detecting phase transitions, even in systems without an explicit Hamiltonian. This work lays the groundwork for future extensions to higher dimensions or more complex models.
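For reference, a bare-bones single-spin-flip Metropolis update for the 2D Ising model with periodic boundaries; the lattice size, temperature, and sweep count are illustrative choices, not the simulation parameters used in the talk:

```python
# Single-spin-flip Metropolis sweeps for the 2D Ising model (J = 1, no external field).
import numpy as np

rng = np.random.default_rng(11)
L, T = 32, 2.3                                   # lattice size and temperature
spins = rng.choice([-1, 1], size=(L, L))

def metropolis_sweep(spins, T):
    for _ in range(spins.size):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours with periodic boundary conditions.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb                # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return spins

for sweep in range(200):
    spins = metropolis_sweep(spins, T)
print("magnetization per spin:", spins.mean())
```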
- 10/02/2025, 14:15,
Davide Gurnari: Dimensionality reduction techniques: the good, the bad, the ugly
Abstract: In this talk we will discuss some of the most popular and widely used techniques for dimensionality reduction and visualization of high-dimensional datasets. By exploring the algorithms behind PCA, LDA, t-SNE and UMAP we will build intuition on the underlying principles and trade-offs of each method. We will discuss common pitfalls and challenges that can arise when applying these tools, offering insights into best practices for selecting the right approach for specific data analysis tasks.
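A short sketch of two of the methods on the scikit-learn digits data; UMAP lives in the separate umap-learn package and is omitted here, and the perplexity value is an arbitrary illustrative choice:

```python
# PCA and t-SNE embeddings of the digits dataset into two dimensions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)
X_pca = PCA(n_components=2).fit_transform(X)        # linear projection, fast, preserves global structure
X_tsne = TSNE(n_components=2, perplexity=30,
              random_state=12).fit_transform(X)     # nonlinear, emphasizes local neighbourhoods
print(X_pca.shape, X_tsne.shape)
```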
- 13/01/2025, 8:15,
Niklas Hellmer: Probability Meets Topological Data Analysis
Abstract: In this talk, I will present a collection of results at the interface between probability and topological data analysis (TDA). From a data-driven perspective, we assume that samples lie on or near a submanifold (hence, topology enters) in high-dimensional feature space and are governed by a (usually unknown) probability distribution. The results I present illustrate different aspects of cross-fertilization between the subjects and were obtained over the course of my PhD studies.
- 09/12/2024, 11:15-12:00,
Julian Brüggemann: Spaces of discrete Morse functions, merge trees, and barcodes
Abstract: This is a practice talk for the talk I am going to present at the AMS Special Session on the Open Neighborhood of Applied Topology at the Joint Maths Meeting of the AMS 2025.
Discrete Morse theory is a versatile tool from combinatorial algebraic topology. In a nutshell, discrete Morse theory uses certain well-behaved functions, the so-called discrete Morse functions, from the face poset of any regular CW complex to the real numbers. In addition to providing sublevel-filtrations, discrete Morse functions provide numerous tools to investigate topological properties and to find theoretic guarantees for various algorithms. Merge trees are combinatorial descriptors for the development of path components within different levels of filtered spaces. They have been introduced in the context of visualization, where they are used to approximate and compute contour trees.
In this talk, we consider the space of discrete Morse functions, investigate some basic properties, and see how this space is connected to spaces of merge trees and barcodes. Previous knowledge of discrete Morse theory will not be required for this talk, but some familiarity with posets (i.e. partially ordered sets) will be very helpful.
- 02/12/2024, 10:15,
Marta Marszewska: Using ECP for the Analysis of Dynamic Systems
Abstract: The seminar will explore the use of Euler Characteristic Profiles (ECP) as a tool for analyzing vector fields in dynamic systems. In addition to ECP, alternative approaches such as optimal transport metrics and histogram analysis will be introduced, offering new perspectives for understanding the structure and behavior of vector fields.
- 25/11/2024, 10:15,
Jakub Malinowski: Symmetries and crystallography
Abstract: This talk is an introduction to the topic of space/crystallographic groups. During the talk, I will present the definition of space groups and classic Bieberbach theorems. As motivation, I will talk briefly about the use of symmetry in crystallography.
- 18/11/2024, 10:15,
Mateusz Masłowski: Efficient Techniques for Simulating One-Dimensional Electromagnetic Waves in Vacuum
Abstract: In this talk, we'll delve into the numerical modeling of one-dimensional electromagnetic waves in a vacuum. I'll introduce the basic concepts of Maxwell's equations in a simplified, one-dimensional form and discuss the finite-difference time-domain (FDTD) method for solving these equations. We'll explore the role of the Green's function in this context, providing a framework for efficiently computing the evolution of electromagnetic fields. As motivation, I'll touch on the importance of electromagnetic wave modeling in applications like wireless communication and data transmission.
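A minimal sketch of a 1D FDTD update loop in vacuum (normalized units, Courant number 1); this is a generic textbook scheme, not the speaker's implementation:

```python
# Minimal 1D FDTD (Yee) update for E and H in vacuum, in normalized units.
import numpy as np

n_cells, n_steps = 200, 300
Ez = np.zeros(n_cells)        # electric field samples
Hy = np.zeros(n_cells - 1)    # magnetic field samples, staggered by half a cell

for step in range(n_steps):
    Hy += Ez[1:] - Ez[:-1]                                # update H from the curl of E
    Ez[1:-1] += Hy[1:] - Hy[:-1]                          # update E from the curl of H
    Ez[n_cells // 2] += np.exp(-((step - 30) / 10) ** 2)  # soft Gaussian source in the middle
print("field energy proxy:", np.sum(Ez**2) + np.sum(Hy**2))
```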
- 04/11/2024, 10:15
John Rick Manzanares: Exploring Centrality in Networks
Abstract: In our interconnected world, the study of networks provides valuable insights into the dynamics of complex systems. This talk will explore some measures of centrality that identify influential elements within networks, examining how we can quantify importance across different network structures. Additionally, we will briefly discuss an interplay between network science and topological data analysis, highlighting how these areas complement each other in understanding network behavior.
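A small illustration of three standard centrality measures using networkx on the classic karate-club graph (an assumed example, not necessarily one from the talk):

```python
# Degree, betweenness, and eigenvector centrality on a small example graph.
import networkx as nx

G = nx.karate_club_graph()                       # classic small social network
measures = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G),
}

# Node 0 (the instructor) and node 33 (the administrator) typically rank highest.
for name, scores in measures.items():
    top = max(scores, key=scores.get)
    print(f"most central node by {name}: {top}")
```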
- 28/10/2024, 10:15
Julian Brüggemann: Discrete Morse Theory and TDA
Abstract: In this talk, we will introduce/recall some basic notions from combinatorial topology, e.g. cell/simplicial complexes, face posets, and discrete Morse functions. Moreover, we will discuss possible applications in TDA.