Edinburgh Research Archive

Methodological contributions to state and parameter inference for state-space models

dc.contributor.advisor
Elvira Arregui, Victor
dc.contributor.advisor
Paulin, Daniel
dc.contributor.author
Tsampourakis, Kostas
dc.date.accessioned
2026-03-13T13:33:54Z
dc.date.issued
2026-03-13
dc.description.abstract
State-space models (SSMs) are a widespread framework for modeling dynamical systems in a wide variety of fields, including signal processing, robotics, neuroscience, and finance. These models represent systems in terms of latent variables that evolve over time according to Markovian dynamics and are linked to observations through a probabilistic measurement process. Bayesian inference within SSMs comprises two core problems: state inference, which involves estimating the latent states given data and parameters, and parameter inference, which seeks to estimate unknown parameters of the model given observed data. Both problems are analytically tractable only in the special case of linear-Gaussian SSMs, where Kalman filtering and smoothing provide closed-form solutions. In general, however, practical inference in SSMs demands approximate methods due to nonlinearities and non-Gaussianity, as well as the intractability of the marginal likelihood over latent states. This thesis is devoted to advancing the methodology for state and parameter inference in general SSMs. It begins by reviewing the classical literature, covering state inference methods such as Gaussian filters, Gaussian sum filters (GSFs), and particle filters (PFs), as well as parameter inference methods, and discussing the limitations and trade-offs inherent to each approach. The thesis then proposes novel solutions for both state and parameter inference. The first main contribution of this thesis is the development of the augmented Gaussian sum filter (AGSF), a novel class of Bayesian filters that addresses longstanding limitations of both GSFs and PFs. The AGSF is based on a Gaussian splitting scheme that exploits a convolution identity to decompose a Gaussian distribution into a weighted mixture of lower-variance Gaussians. This representation enables a controlled trade-off between deterministic Gaussian approximations and stochastic particle-based methods.
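The convolution identity behind the splitting scheme can be illustrated numerically. The sketch below uses hypothetical variances (it does not reproduce the thesis's actual splitting rule): a univariate Gaussian N(mu, sigma2) with sigma2 = c + d is rewritten, via N(x; mu, sigma2) = ∫ N(x; m, d) N(m; mu, c) dm, as an equally weighted mixture of lower-variance Gaussians N(m_i, d) whose means are sampled from N(mu, c), and the mixture's moments are checked against the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target Gaussian N(mu, sigma2), split as sigma2 = c + d
# (hypothetical values for illustration only).
mu, sigma2 = 1.0, 4.0
c, d = 3.0, 1.0   # augmentation variance c; residual variance d = sigma2 - c

# Convolution identity: N(x; mu, sigma2) = ∫ N(x; m, d) N(m; mu, c) dm,
# so means m_i ~ N(mu, c) define a mixture of lower-variance Gaussians
# N(m_i, d) that approximates the target.
M = 200_000
means = rng.normal(mu, np.sqrt(c), size=M)

# Moments of the equally weighted mixture:
#   mean = average of the component means,
#   variance = within-component variance d + variance of the means.
mix_mean = means.mean()
mix_var = d + means.var()

print(mix_mean, mix_var)  # close to mu and sigma2, respectively
```

Shrinking c toward 0 collapses the mixture back to a single Gaussian (the GSF-like limit), while growing c pushes the representation toward a cloud of near-point components (the PF-like limit), which is the trade-off the abstract describes.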
The AGSF generalizes standard GSFs and PFs, recovering both as limiting cases depending on the choice of augmentation covariances. Furthermore, an adaptive version of the AGSF is proposed that automatically sets these covariances via an optimization problem minimizing an upper bound on the mean squared error of moment estimates. The result is a flexible and robust filtering algorithm that dynamically adapts to the local nonlinearity of the model, blending the strengths of GSFs (efficiency) and PFs (accuracy and stability). Empirical results demonstrate that the AGSF achieves superior performance across various tasks, including maneuvering target tracking and systems with mixed linear and nonlinear dynamics. The second major contribution is the introduction of truncated sequential neural likelihood (T-SNL), a new algorithm for parameter inference in SSMs based on simulation-based inference (SBI). Traditional approaches to parameter inference suffer from high computational costs and instabilities due to the intractability of the SSM likelihood. Recent advances in SBI, particularly neural likelihood estimation via autoregressive normalizing flows, have shown promise in high-dimensional, likelihood-free settings. Sequential neural likelihood (SNL) is a popular SBI method that learns a model of the likelihood, trained iteratively on simulated data near the posterior. T-SNL builds on SNL by exploiting a key structural property of many SSMs: exponential forgetting, whereby the influence of initial conditions on the filtering distribution decays over time. T-SNL leverages this property by truncating the factors of the complete model likelihood. This truncation yields a significantly larger and more diverse training dataset from each simulation, vastly improving sample efficiency and training stability. Compared to SNL, T-SNL is able to make more efficient use of each simulation run, reducing the number of simulations required to reach high-quality posterior approximations.
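The sample-efficiency gain from truncation can be sketched as follows. Under exponential forgetting, each factor p(y_t | y_{1:t-1}, theta) of the full likelihood is approximated by p(y_t | y_{t-L:t-1}, theta) for some window length L, so a single simulated trajectory of length T yields T - L (context, target) training windows rather than one long sequence. The snippet below illustrates only this windowing step; the window length and the data are hypothetical, and the neural likelihood model itself is not shown.

```python
import numpy as np

def truncated_windows(y, L):
    """Slice one simulated trajectory into (context, target) training pairs,
    approximating each likelihood factor p(y_t | y_{1:t-1}) by the
    truncated factor p(y_t | y_{t-L:t-1})."""
    T = len(y)
    contexts = np.stack([y[t - L:t] for t in range(L, T)])  # shape (T-L, L)
    targets = y[L:]                                          # shape (T-L,)
    return contexts, targets

rng = np.random.default_rng(1)
y = rng.normal(size=100)        # stand-in for one simulated observation sequence
ctx, tgt = truncated_windows(y, L=5)
print(ctx.shape, tgt.shape)     # (95, 5) (95,)
```

One 100-step simulation thus contributes 95 training examples for a conditional density model of y_t given its recent past, which is the "larger and more diverse training dataset per simulation" that the abstract attributes to T-SNL.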
Moreover, T-SNL is easier to train than SNL, scales naturally to longer temporal sequences, and is amortized. Experimental results show that T-SNL consistently outperforms existing SBI methods and classical inference techniques in both linear and nonlinear SSMs, including stochastic volatility and ecological population models. Together, these contributions advance the state of the art in both state and parameter inference for SSMs. The AGSF provides a unified and adaptive framework for filtering that is capable of robustly interpolating between Gaussian and particle-based methods. T-SNL introduces a principled and scalable approach to simulator-based parameter inference by taking advantage of the temporal structure of SSMs. The thesis concludes with a discussion of future research directions, including the integration of the AGSF into parameter inference frameworks, online learning settings, and applications in real-world domains such as control, neuroscience, and time-series forecasting.
dc.identifier.uri
https://era.ed.ac.uk/handle/1842/44480
dc.identifier.uri
https://doi.org/10.7488/era/6997
dc.language.iso
en
dc.publisher
The University of Edinburgh
en
dc.relation.hasversion
Kostas Tsampourakis and Víctor Elvira. “An Augmented Gaussian Sum Filter through a mixture Decomposition”. In: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2023, pp. 1–5. doi: 10.1109/ICASSP49357.2023.10095899
dc.subject
Bayesian inference
dc.subject
filtering
dc.subject
state-space models
dc.subject
SSMs
dc.subject
simulation-based inference
dc.title
Methodological contributions to state and parameter inference for state-space models
dc.type
Thesis
dc.type.qualificationlevel
Doctoral
dc.type.qualificationname
PhD Doctor of Philosophy

Files

Original bundle

Name:
Tsampourakis2026.pdf
Size:
3.35 MB
Format:
Adobe Portable Document Format
