A Mixed Precision Randomized Preconditioner for the LSQR Solver on GPUs

Monday, May 22, 2023 4:00 PM to 4:25 PM · 25 min. (Europe/Berlin)
Hall 4 - Ground Floor
Research Paper
Mixed Precision Algorithms, Numerical Libraries

Information

Randomized preconditioners for large-scale regression problems have become extremely popular over the past decade. Such preconditioners are known to accelerate large-scale regression solvers both in theory and in practice. In this paper, we present a mixed precision randomized preconditioner for LSQR solvers, focusing on overdetermined, dense least squares problems. We implement and evaluate our method on GPUs and demonstrate that it outperforms the standard double precision version of randomized, preconditioned LSQR by up to 20% on the NVIDIA A100. We present extensive numerical experiments utilizing half-precision and tensor core units to demonstrate that, in many cases, constructing the preconditioner in reduced precision does not affect the convergence of LSQR solvers. This leads to substantial speedups without loss of accuracy.
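
As a rough illustration of the general idea behind randomized, preconditioned LSQR (sketch the tall matrix, factor the small sketch, and use the triangular factor as a right preconditioner), the following NumPy/SciPy snippet builds a Blendenpik-style Gaussian-sketch preconditioner, with the sketch product carried out in float16 as a CPU stand-in for the half-precision/tensor-core GEMM discussed in the abstract. This is a minimal sketch, not the authors' GPU implementation; the function name, sketch size, sketch type, and tolerances are illustrative assumptions.

import numpy as np
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def rand_precond_lsqr(A, b, sketch_size=None, sketch_dtype=np.float16):
    # Illustrative sketch only; not the paper's GPU code.
    # Overdetermined dense least squares: A is m-by-n with m >> n.
    m, n = A.shape
    s = sketch_size or 4 * n  # sketch dimension, n <= s << m

    # 1) Sketch A with a Gaussian map, forming the product in reduced
    #    precision (a stand-in for half-precision / tensor-core GEMM).
    G = np.random.randn(s, m).astype(sketch_dtype)
    SA = (G @ A.astype(sketch_dtype)).astype(np.float64)

    # 2) QR of the small s-by-n sketch; R serves as a right preconditioner.
    R = np.linalg.qr(SA, mode='r')

    # 3) Run LSQR on the implicitly preconditioned operator A @ inv(R).
    AR = LinearOperator(
        (m, n),
        matvec=lambda v: A @ solve_triangular(R, v, lower=False),
        rmatvec=lambda u: solve_triangular(R, A.T @ u, lower=False, trans='T'),
    )
    y = lsqr(AR, b, atol=1e-10, btol=1e-10)[0]

    # Undo the preconditioning: x = inv(R) @ y solves min ||A x - b||.
    return solve_triangular(R, y, lower=False)

Because the sketch is only used to build the preconditioner, errors from the reduced-precision product affect convergence speed rather than the final accuracy of the LSQR solution, which is the behavior the abstract reports.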
Format
On-site, On Demand
Beginner Level: 10%
Intermediate Level: 30%
Advanced Level: 60%