Speaker
Mr. Peter Gysbers
Description
The properties of nuclei can be computed from first principles, starting from realistic interactions between nucleons. Using suitable basis functions, the many-body wavefunction is found by diagonalizing the Hamiltonian matrix (i.e., solving the Schrödinger equation).
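As a schematic illustration of the diagonalization step (the matrix below is an arbitrary toy Hamiltonian, not a realistic nuclear one), the ground-state energy is the lowest eigenvalue of the Hamiltonian matrix in the chosen basis:

```python
import numpy as np

# Toy Hamiltonian matrix in a small many-body basis.
# A realistic calculation would build this from nucleon-nucleon interactions
# in a basis of millions of states; here it is an arbitrary symmetric matrix.
H = np.array([
    [-10.0,  1.5,  0.3],
    [  1.5, -7.0,  0.8],
    [  0.3,  0.8, -4.0],
])

# Diagonalize: solve the time-independent Schroedinger equation in this basis.
eigenvalues, eigenvectors = np.linalg.eigh(H)

# The smallest eigenvalue approximates the ground-state energy; the
# corresponding column of `eigenvectors` is the ground-state wavefunction
# expressed in the chosen basis.
E_gs = eigenvalues[0]
psi_gs = eigenvectors[:, 0]
print(f"Ground-state energy (toy model): {E_gs:.3f}")
```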
Due to limited computational resources, only a finite basis size can be used, which is frequently insufficient for complete convergence. The "true value" of a calculated quantity (e.g., the ground-state energy) is therefore predicted by extrapolating to infinite basis size.
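For context, the sketch below shows one commonly used extrapolation approach: fitting an exponential ansatz E(N) = E_inf + a*exp(-b*N) to ground-state energies at increasing basis sizes. The data points are invented for illustration, and the quoted uncertainty reflects only the fit to this assumed form, which is precisely the limitation the constrained-GP approach aims to address:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical ground-state energies (MeV) at increasing basis truncations N.
# In practice these would come from the diagonalizations described above.
N = np.array([2, 4, 6, 8, 10, 12])
E = np.array([-24.1, -27.3, -28.6, -29.2, -29.5, -29.65])

# A commonly used ansatz: exponential convergence toward the infinite-basis value.
def exp_model(n, E_inf, a, b):
    return E_inf + a * np.exp(-b * n)

params, cov = curve_fit(exp_model, N, E, p0=(-30.0, 10.0, 0.3))
E_inf, a, b = params
E_inf_err = np.sqrt(cov[0, 0])

print(f"Extrapolated infinite-basis energy: {E_inf:.2f} +/- {E_inf_err:.2f} MeV")
```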
The functional form of such an extrapolation is unknown, but by the variational principle the signs of its derivatives are known. In this work we use this knowledge of monotonicity and convexity to constrain a Gaussian Process (GP) model and predict ground-state energies.
A GP is a machine-learning tool that generates a distribution over functions consistent with the known data points. It is inexpensive to compute and automatically provides uncertainties on its predictions. However, applying derivative constraints is not simple: it requires iteratively tightening the constraints and updating the GP model via Sequential Monte Carlo.
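The method described above enforces the constraints gradually via Sequential Monte Carlo. As a much simpler (and far less efficient) illustration of the underlying idea, one can draw sample curves from an unconstrained GP posterior and keep only those that are monotonically decreasing and convex. The data, kernel, and tolerances below are illustrative assumptions, not the speaker's; with a plain RBF kernel few samples may survive far beyond the data, which is exactly why an iterative SMC scheme is used instead of outright rejection:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Same hypothetical convergence data as in the previous sketch.
N = np.array([2, 4, 6, 8, 10, 12], dtype=float).reshape(-1, 1)
E = np.array([-24.1, -27.3, -28.6, -29.2, -29.5, -29.65])

# Unconstrained GP fit (kernel choice is illustrative only).
kernel = ConstantKernel(10.0) * RBF(length_scale=5.0)
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-4, normalize_y=True)
gp.fit(N, E)

# Dense grid extending beyond the largest computed basis size.
N_grid = np.linspace(2, 30, 150).reshape(-1, 1)
samples = gp.sample_y(N_grid, n_samples=5000, random_state=0)  # shape (150, 5000)

# Keep only samples whose first derivative is <= 0 (monotonically decreasing)
# and whose second derivative is >= 0 (convex), checked by finite differences.
d1 = np.diff(samples, axis=0)
d2 = np.diff(samples, n=2, axis=0)
ok = (d1 <= 1e-6).all(axis=0) & (d2 >= -1e-6).all(axis=0)
kept = samples[:, ok]
print(f"Accepted {kept.shape[1]} of {samples.shape[1]} samples")

# The surviving samples form a constrained prediction band; their values at the
# largest grid point estimate the infinite-basis energy with an uncertainty.
if kept.shape[1] > 0:
    E_inf_samples = kept[-1, :]
    lo, mid, hi = np.percentile(E_inf_samples, [16, 50, 84])
    print(f"Constrained extrapolation: {mid:.2f} MeV  (68% band: [{lo:.2f}, {hi:.2f}])")
else:
    print("No samples satisfied the constraints; this is why constraints are "
          "tightened iteratively (e.g. via Sequential Monte Carlo) in practice.")
```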
This novel method shows promise in providing more meaningful confidence intervals on theoretical predictions than existing methods, allowing a more useful comparison to experiment.