As a research area, randomized numerical linear algebra (RNLA) is as hot as ever. To celebrate the exciting work in this space, I’m starting a new series on my blog celebrating cool recent algorithms in the area. At some future point, I might talk about my own work in this series, but for now I’m hoping to use it to highlight some of the awesome work being done by my colleagues.
Given a tall matrix $A \in \mathbb{R}^{m \times n}$ with $m \ge n$, its (economy-size) QR factorization is a decomposition of the form $A = QR$, where $Q \in \mathbb{R}^{m \times n}$ is a matrix with orthonormal columns and $R \in \mathbb{R}^{n \times n}$ is upper triangular. QR factorizations are used to solve least-squares problems and as a computational procedure to orthonormalize the columns of a matrix.
Here’s an example in MATLAB, where we use QR factorization to orthonormalize the columns of a test matrix. It takes about 2.5 seconds to run.
>> A = randn(1e6, 1e2) * randn(1e2) * randn(1e2); % test matrix
>> tic; [Q,R] = qr(A,"econ"); toc
Elapsed time is 2.647317 seconds.
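As an aside, the factors just computed can also be used to solve a least-squares problem. Here is a minimal sketch with a synthetic right-hand side b (not part of the original experiment): since $A = QR$, the minimizer of $\|Ax - b\|$ is $x = R^{-1}(Q^\top b)$.
>> b = randn(1e6, 1); % synthetic right-hand side
>> x = R \ (Q'*b); % least-squares solution x = R^{-1} (Q^T b)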
The classical algorithm for computing a QR factorization uses Householder reflectors and is exceptionally numerically stable. Since $Q$ has orthonormal columns, $Q^\top Q$ is the identity matrix $I$. Indeed, this relation holds up to a tiny error for the $Q$ computed by Householder QR:
>> norm(Q'*Q - eye(1e2)) % || Q^T Q - I ||
ans =
7.0396e-14
The relative error $\|A - QR\| / \|A\|$ is also small:
>> norm(A - Q*R) / norm(A)
ans =
4.8981e-14
Here is an alternate procedure for computing a QR factorization, known as Cholesky QR:
function [Q,R] = cholesky_qr(A)
R = chol(A'*A); % upper-triangular Cholesky factor of the Gram matrix A'*A
Q = A / R; % Q = A * R^{-1}
end
This algorithm works by forming $A^\top A$, computing its (upper triangular) Cholesky decomposition $A^\top A = R^\top R$, and setting $Q = AR^{-1}$. Cholesky QR is very fast, about five times faster than Householder QR for this example:
>> tic; [Q,R] = cholesky_qr(A); toc
Elapsed time is 0.536694 seconds.
Unfortunately, Cholesky QR is much less accurate and numerically stable than Householder QR. Here, for instance, is the value of $\|Q^\top Q - I\|$, about ten million times larger than for Householder QR:
>> norm(Q'*Q - eye(1e2))
ans =
7.5929e-07
What’s going on? As we’ve discussed before on this blog, forming $A^\top A$ is typically problematic in linear algebraic computations. The “badness” of a matrix is measured by its condition number, defined to be the ratio of its largest and smallest singular values, $\kappa(A) = \sigma_{\max}(A) / \sigma_{\min}(A)$. The condition number of $A^\top A$ is the square of the condition number of $A$, $\kappa(A^\top A) = \kappa(A)^2$, which is at the root of Cholesky QR’s loss of accuracy. Thus, Cholesky QR is only appropriate for matrices that are well-conditioned, i.e., those with a small condition number $\kappa(A)$.
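To see this squaring concretely, here is a small numerical check on a hypothetical, deliberately ill-conditioned test matrix C (separate from the matrix A used in the timing experiments):
>> C = randn(1e3, 50) * diag(logspace(0, 4, 50)); % cond(C) is roughly 1e4
>> cond(C'*C) / cond(C)^2 % ratio is close to 1: forming C'*C squares the condition number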
The idea of randomized Cholesky QR is to use randomness to precondition $A$, producing a matrix $B$ that is well-conditioned. Then, since $B$ is well-conditioned, we can apply ordinary Cholesky QR to it without issue. Here are the steps:
- Draw a sketching matrix $S$ of size $2n \times m$; see these posts of mine for an introduction to sketching.
- Form the sketch $SA$. This step compresses the very tall $m \times n$ matrix $A$ to the much shorter matrix $SA$ of size $2n \times n$.
- Compute a QR factorization $SA = Q_1 R_1$ using Householder QR. Since the matrix $SA$ is small, this factorization will be quick to compute.
- Form the preconditioned matrix $B = A R_1^{-1}$. (A quick numerical check of this preconditioning step appears after this list.)
- Apply Cholesky QR to $B$ to compute $B = Q R_2$.
- Set $R = R_2 R_1$. Observe that $QR = Q R_2 R_1 = B R_1 = (A R_1^{-1}) R_1 = A$, as desired.
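The effect of the preconditioning in steps 1–4 can be sanity-checked on its own. A minimal sketch, reusing the test matrix A and the sparsesign subroutine from the code below:
>> S = sparsesign(2*size(A,2), size(A,1), 8); % sparse sign embedding
>> R1 = qr(S*A, "econ"); % R factor of the sketch SA
>> cond(A / R1) % a small constant: B = A*R_1^{-1} is well-conditioned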
MATLAB code for randomized Cholesky QR is provided below. (Code for the sparsesign subroutine can be found here.)
function [Q,R] = rand_cholesky_qr(A)
S = sparsesign(2*size(A,2),size(A,1),8); % sparse sign embedding
R1 = qr(S*A,"econ"); % sketch and (Householder) QR factorize
B = A / R1; % B = A * R_1^{-1}
[Q,R2] = cholesky_qr(B); % B is well-conditioned, so Cholesky QR is safe
R = R2*R1; % combine the triangular factors
end
Randomized Cholesky QR is still faster than ordinary Householder QR, roughly three times faster in our experiment:
>> tic; [Q,R] = rand_cholesky_qr(A); toc
Elapsed time is 0.920787 seconds.
Randomized Cholesky QR greatly improves on ordinary Cholesky QR in terms of accuracy and numerical stability. In fact, the size of $\|Q^\top Q - I\|$ is even smaller than for Householder QR!
>> norm(Q'*Q - eye(1e2))
ans =
1.0926e-14
The relative error $\|A - QR\| / \|A\|$ is small, too! Even smaller than for Householder QR, in fact:
>> norm(A - Q*R) / norm(A)
ans =
4.0007e-16
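As a final aside, if the sparsesign subroutine is unavailable, a dense Gaussian embedding can stand in for the sparse sign matrix. Below is a hedged sketch of this variant (the function name is mine; note that forming a dense S of size $2n \times m$ costs noticeable memory and is slower to apply than a sparse sign matrix):
function [Q,R] = rand_cholesky_qr_gaussian(A)
d = 2*size(A,2); % sketch dimension, twice the number of columns
S = randn(d, size(A,1)) / sqrt(d); % dense Gaussian embedding
R1 = qr(S*A, "econ"); % sketch and (Householder) QR factorize
B = A / R1; % preconditioned matrix B = A * R_1^{-1}
[Q,R2] = cholesky_qr(B); % safe: B is well-conditioned
R = R2*R1; % combine the triangular factors
end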
Like many great ideas, randomized Cholesky QR was developed independently by a number of research groups. A version of this algorithm was first introduced in 2021 by Fan, Guo, and Lin. Similar algorithms were investigated in 2022 and 2023 by Balabanov, Higgins et al., and Melnichenko et al. Check out Melnichenko et al.’s paper in particular, which shows very impressive results for using randomized Cholesky QR to compute column-pivoted QR factorizations.
References: Primary references are A Novel Randomized XR-Based Preconditioned CholeskyQR Algorithm by Fan, Guo, and Lin (2021); Randomized Cholesky QR factorizations by Balabanov (2022); Analysis of Randomized Householder-Cholesky QR Factorization with Multisketching by Higgins et al. (2023); CholeskyQR with Randomization and Pivoting for Tall Matrices (CQRRPT) by Melnichenko et al. (2023). The idea of using sketching to precondition tall matrices originates in the paper A fast randomized algorithm for overdetermined linear least-squares regression by Rokhlin and Tygert (2008).