Neural Preconditioning via Krylov Subspace Geometry

Keywords:
Computational learning; Advanced Numerical Methods for Scientific Computing
Code:
13/2026
Title:
Neural Preconditioning via Krylov Subspace Geometry
Date:
Thursday 5th February 2026
Author(s):
Dimola, N.; Coclite, A.; Zunino, P.
Download link:
Abstract:
We propose a geometry-aware strategy for training neural preconditioners tailored to parametrized linear systems arising from the discretization of mixed-dimensional partial differential equations (PDEs). Such systems are typically ill-conditioned due to embedded lower-dimensional structures and are solved with Krylov subspace methods. Our approach learns an approximation of the inverse operator through a two-stage training framework: an initial static pretraining phase based on residual minimization, followed by a dynamic fine-tuning phase that incorporates solver convergence dynamics into the training process via a novel loss functional. This dynamic loss is defined in terms of the principal angles between the residuals and the Krylov subspaces, and it is evaluated using a differentiable implementation of the Flexible GMRES (FGMRES) algorithm, which enables backpropagation through both the Arnoldi process and the Givens rotations. The resulting neural preconditioner is explicitly optimized to enhance early-stage convergence and reduce iteration counts across a family of 3D–1D mixed-dimensional problems exhibiting geometric variability in the 1D domain. Numerical experiments show that our solver-aligned approach significantly improves convergence rate, robustness, and generalization.
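To make the solver-aligned loss concrete, the sketch below shows one way such a principal-angle functional could be evaluated differentiably in PyTorch. It is a minimal illustration under assumptions, not the authors' implementation: the names angle_loss, apply_A, and precond are hypothetical, and instead of backpropagating through Givens rotations as the abstract describes, this sketch uses the equivalent projection identity that the GMRES relative residual ||r_m|| / ||r_0|| equals sin(theta), where theta is the principal angle between the initial residual r_0 and the image A*K_m of the flexible Krylov search space. The exact angles and subspaces entering the paper's loss functional may differ.

import torch

def angle_loss(apply_A, precond, b, x0, m):
    """Differentiable surrogate for early FGMRES convergence.

    Runs m flexible Arnoldi steps with a (neural) preconditioner, then
    measures the principal angle theta between the initial residual r0
    and span{A z_1, ..., A z_m}. For GMRES-type residual minimization,
    sin(theta) = ||r_m|| / ||r_0||, so minimizing sin^2(theta) drives
    down the relative residual reachable after m iterations.
    """
    r0 = b - apply_A(x0)
    beta = torch.linalg.norm(r0)
    V = [r0 / beta]          # orthonormal basis of the flexible Krylov space
    W = []                   # images A z_j of the preconditioned directions
    for j in range(m):
        z = precond(V[j])    # flexible step: preconditioner applied per iteration
        w = apply_A(z)
        W.append(w)
        # Arnoldi step: modified Gram-Schmidt against the current basis
        for q in V:
            w = w - (q @ w) * q
        h = torch.linalg.norm(w)
        if h < 1e-12 * beta:  # lucky breakdown: search space is exhausted
            break
        V.append(w / h)
    # Orthonormal basis U of span{A z_1, ..., A z_m} via differentiable QR
    U, _ = torch.linalg.qr(torch.stack(W, dim=1))
    cos_theta = torch.linalg.norm(U.transpose(0, 1) @ r0) / beta
    return 1.0 - cos_theta.clamp(max=1.0) ** 2  # sin^2(theta)

A toy usage, with a plain linear layer standing in for the neural preconditioner being trained:

# Hypothetical training step on a random well-conditioned system.
n = 64
A = torch.eye(n) + 0.1 * torch.randn(n, n)
net = torch.nn.Linear(n, n)              # stand-in neural preconditioner
b, x0 = torch.randn(n), torch.zeros(n)
loss = angle_loss(lambda v: A @ v, net, b, x0, m=5)
loss.backward()                          # gradients flow through Arnoldi via autograd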
This report, or a modified version of it, has also been submitted to, or published in:
Bollettino dell'Unione Matematica Italiana, 1-23.