We develop inference methods that make biophysical modelling fast, robust, and scalable. Our thermodynamic Variational Laplace (Thermo-VL) augments classical VL with annealed likelihoods, low-rank curvature, and smarter noise updates for reliable fits to nonlinear dynamical systems (e.g., neural mass models, agents).
- Adaptive low-rank covariances factorise the Hessian into a rank-k subspace (V D Vᵀ), updated each iteration, capturing dominant curvature while keeping inference tractable.
- Heteroscedastic noise updates estimate a separate observation variance per feature/epoch to prevent over-confidence and absorb structured residuals (e.g., frequency-dependent M/EEG noise).
- Thermodynamic integration (TI) anneals the likelihood with β ∈ [0, 1] to stabilise optimisation and enable path-sampling estimates of the log evidence.
- Solenoidal mixing adds an optional skew-symmetric term to the natural-gradient preconditioner, aligning updates with posterior geometry.
- Laplace-domain transfer functions linearise dynamics with delays to compute cross-spectra efficiently for DCM-style fits.
- Mixture-of-Gaussians features represent data/model features with Gaussian bases for flexible spectral fits and denoising.
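The low-rank covariance idea above can be illustrated with a small self-contained sketch (generic NumPy, not the repository's MATLAB implementation): approximate a Hessian by a rank-k factor plus a diagonal correction, then solve against it cheaply via the Woodbury identity. The matrix sizes and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 5

# Synthetic symmetric positive-definite "Hessian" standing in for the curvature
A = rng.standard_normal((n, n))
H = A @ A.T / n + np.eye(n)

# Rank-k factor from the top-k eigenpairs (equivalent to the SVD step for a
# symmetric H); np.linalg.eigh returns eigenvalues in ascending order.
w, U = np.linalg.eigh(H)
V = U[:, -k:] * np.sqrt(w[-k:])        # V Vᵀ captures the dominant curvature

# Diagonal correction so diag(V Vᵀ + D) reproduces diag(H) exactly
D = np.diag(H) - np.sum(V**2, axis=1)
H_approx = V @ V.T + np.diag(D)

# Solve (diag(D) + V Vᵀ) x = g in O(n·k²) via the Woodbury identity,
# without ever inverting the full n×n matrix.
g = rng.standard_normal(n)
Dinv_g = g / D
Dinv_V = V / D[:, None]
K = np.eye(k) + V.T @ Dinv_V           # k×k capacitance matrix
x = Dinv_g - Dinv_V @ np.linalg.solve(K, V.T @ Dinv_g)

assert np.allclose(np.diag(H_approx), np.diag(H))
assert np.allclose(H_approx @ x, g)
```

The Woodbury solve is what keeps a structured V, D covariance tractable: storage and per-iteration cost grow linearly in the number of parameters rather than quadratically.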
How fitVariationalLaplaceThermo works (brief)
Core loop: refine the mean m, the structured covariance (V, D), and the noise, while annealing β and tracking the ELBO.
// Pseudocode (see MATLAB implementation)
init m = m0, S0 → low-rank V, D; set β schedule; set σ²
repeat:
    // Tempered objective
    Lβ(m) = β·log p(y|m,σ²) + log p(m)
    // Curvature & low-rank update
    H ≈ ∇²Lβ(m); [U,S] = svd(H); V = U(:,1:k)·sqrt(S(1:k,1:k))
    D = diag(diag(H) − rownorm(V)²)
    // Mean update (preconditioned / natural gradient)
    m ← m + step · (V·Vᵀ + D)⁻¹ ∇Lβ(m)
    // Noise & heteroscedastic variance
    σ² ← update_per_feature(residuals)
    // Anneal β and compute ELBO / g_ELBO for line-search
until |ΔELBO| < tol
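The path-sampling estimate of the log evidence that the β annealing enables can be checked on a toy problem. Below is a hedged NumPy sketch (not the repository's code) using a conjugate 1-D Gaussian model, chosen because both the TI integrand and the exact log evidence have closed forms; the numbers y, s0, s are arbitrary illustrations.

```python
import numpy as np

# Conjugate 1-D Gaussian toy model with a known log evidence:
#   prior  theta ~ N(0, s0²),   likelihood  y | theta ~ N(theta, s²)
y, s0, s = 1.3, 2.0, 0.7

# The tempered posterior p_β(θ) ∝ p(θ)·p(y|θ)^β stays Gaussian, so the
# TI integrand E_β[log p(y|θ)] is available in closed form at every β.
betas = np.linspace(0.0, 1.0, 10001)
prec = 1.0 / s0**2 + betas / s**2        # tempered posterior precision
m_b = (betas * y / s**2) / prec          # tempered posterior mean
v_b = 1.0 / prec                         # tempered posterior variance
e_ll = -0.5 * np.log(2 * np.pi * s**2) - ((y - m_b) ** 2 + v_b) / (2 * s**2)

# Path-sampling identity: log Z = ∫₀¹ E_β[log p(y|θ)] dβ  (trapezoidal rule)
log_Z_ti = 0.5 * np.sum((e_ll[1:] + e_ll[:-1]) * np.diff(betas))

# Exact marginal likelihood: y ~ N(0, s0² + s²)
log_Z_exact = (-0.5 * np.log(2 * np.pi * (s0**2 + s**2))
               - y**2 / (2 * (s0**2 + s**2)))

assert abs(log_Z_ti - log_Z_exact) < 1e-6
```

In Thermo-VL the expectation under p_β is taken with respect to the Laplace approximation at each temperature rather than in closed form, but the quadrature over the β schedule works the same way.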
MATLAB source: fitVariationalLaplaceThermo.m ·
Overviews: Comp-Neuro · Neuro-AI