QR Householder Method

The QR decomposition (or factorization) writes a given matrix as the product of an orthogonal matrix $Q$ and a right (upper) triangular matrix $R$. In the following we consider two methods for computing it: Householder transformations and Givens rotations. The essential difference is that a Givens rotation zeroes out one entry below the diagonal at a time, while a Householder reflection zeroes out all the subdiagonal entries of a column at once. A QR factorization also gives a convenient way to solve least squares problems: for example, with $A=\begin{bmatrix}3&2\\0&3\\4&4\end{bmatrix}$ and $b=\begin{bmatrix}3\\5\\4\end{bmatrix}$, one can form $A=QR$ and use it to solve $\min_x\|b-Ax\|_2$ (we return to least squares at the end of these notes).
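As a quick illustration of the definition, here is a minimal NumPy sketch using the small example matrix above and the library routine `np.linalg.qr`:

```python
import numpy as np

# The small example matrix from above.
A = np.array([[3.0, 2.0],
              [0.0, 3.0],
              [4.0, 4.0]])

# Reference (thin) QR decomposition: A = Q @ R.
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(Q.shape[1])))  # columns of Q are orthonormal
print(np.allclose(np.triu(R), R))                # R is upper triangular
print(np.allclose(Q @ R, A))                     # the product reproduces A
```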

So far, we have only shown existence. But how do we compute a QR decomposition?

Householder Reflections

A Householder reflection is $H=I-2vv'$, where $\|v\|_2=1$, which reflects over the hyperplane orthogonal to $v$.


Lemma. Householder reflections are orthogonal matrices.

Proof. $~$ HOMEWORK
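Not a substitute for the proof, but a quick numerical check in NumPy with a randomly chosen unit vector $v$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random unit vector v and the corresponding Householder reflection.
v = rng.standard_normal(5)
v /= np.linalg.norm(v)
H = np.eye(5) - 2.0 * np.outer(v, v)

print(np.allclose(H.T @ H, np.eye(5)))  # orthogonal: H'H = I
print(np.allclose(H @ v, -v))           # v itself is reflected to -v
print(np.allclose(H @ H, np.eye(5)))    # H is symmetric, so it is its own inverse
```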

We use a series of Householder reflections to reduce $A\Pi$ (where $\Pi$ is a column permutation) to an upper triangular matrix, and the resulting product of Householder matrices gives us $Q$, i.e. $$\underbrace{H_r\cdots H_1}_{Q'}A\Pi=R$$

First, find $H_1$ that makes every entry below $R_{11}$ zero:
\begin{align*}
H_1a_1&=R_{11}e_1\\
R_{11}e_1&=a_1-2v_1v_1'a_1\\
v_1(\underbrace{2v_1'a_1}_{\text{scalar}})&=a_1-R_{11}e_1\\
\implies~v_1&=\frac{a_1-R_{11}e_1}{\|a_1-R_{11}e_1\|_2}\\
\implies~R_{11}&=\pm\|a_1\|_2\quad\text{(since $H_1$ is orthogonal, $\|H_1a_1\|_2=\|a_1\|_2$)}\\
v_1&=\frac{a_1-\|a_1\|_2\,e_1}{\|a_1-\|a_1\|_2\,e_1\|_2}\quad\text{(taking the $+$ sign)}\\
\implies H_1&=I-\frac{2(a_1-\|a_1\|_2\,e_1)(a_1-\|a_1\|_2\,e_1)'}{\|a_1-\|a_1\|_2\,e_1\|_2^2}
\end{align*}
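A small NumPy sketch of this first step, using the first column of the example matrix from the introduction (it assumes $a_1$ is not already a positive multiple of $e_1$, so the denominator is nonzero):

```python
import numpy as np

def householder_vector(a1):
    """v1 such that (I - 2 v1 v1') a1 = ||a1|| e1 (sign choice as in the derivation above)."""
    e1 = np.zeros_like(a1)
    e1[0] = 1.0
    u = a1 - np.linalg.norm(a1) * e1
    return u / np.linalg.norm(u)

a1 = np.array([3.0, 0.0, 4.0])          # first column of the example matrix
v1 = householder_vector(a1)
H1 = np.eye(3) - 2.0 * np.outer(v1, v1)

print(H1 @ a1)  # approximately [5, 0, 0] = ||a1||_2 e1
```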

Then, we have
$$H_1A\Pi~=~\begin{bmatrix}
R_{11} & \ast & \cdots & \ast\\
0 & & &\\
\vdots & & \tilde{A}_2 &\\
0 & & &
\end{bmatrix}$$
and we repeat the same procedure on the block $\tilde{A}_2$.
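A minimal NumPy sketch of the whole procedure, assuming $A$ has full column rank and no column pivoting ($\Pi=I$); the sign of $\|a\|_2e_1$ is flipped relative to the derivation above, which is the usual choice to avoid cancellation:

```python
import numpy as np

def householder_qr(A):
    """QR via Householder reflections (sketch; assumes A has full column rank)."""
    n, m = A.shape
    R = A.astype(float).copy()
    Q = np.eye(n)
    for k in range(m):
        a = R[k:, k]
        e1 = np.zeros_like(a)
        e1[0] = 1.0
        # Sign flipped relative to the derivation above to avoid cancellation.
        s = 1.0 if a[0] >= 0 else -1.0
        u = a + s * np.linalg.norm(a) * e1
        v = u / np.linalg.norm(u)
        Hk = np.eye(n)
        Hk[k:, k:] -= 2.0 * np.outer(v, v)
        R = Hk @ R    # H_k ... H_1 A
        Q = Q @ Hk    # accumulate Q = H_1 H_2 ... H_k (each H_k is symmetric)
    return Q, R

A = np.array([[3.0, 2.0],
              [0.0, 3.0],
              [4.0, 4.0]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0.0))
```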

Givens Rotations

A Givens rotation $G^{(i,j)}$ is the identity matrix except
- $G^{(i,j)}_{ii}=G^{(i,j)}_{jj}=\lambda$
- $G^{(i,j)}_{ij}=\sigma$
- $G^{(i,j)}_{ji}=-\sigma$

$$\text{for example, }\,G^{(2,4)}=\begin{bmatrix}1&0&0&0\\0&\lambda&0&\sigma\\0&0&1&0\\0&-\sigma&0&\lambda\end{bmatrix}$$
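A small NumPy sketch constructing such a rotation (the helper name `givens` and the angle are arbitrary choices, not from the notes):

```python
import numpy as np

def givens(n, i, j, lam, sig):
    """The n x n Givens rotation G^(i,j) (0-based row indices i < j)."""
    G = np.eye(n)
    G[i, i] = G[j, j] = lam
    G[i, j] = sig
    G[j, i] = -sig
    return G

theta = 0.3                               # arbitrary angle
lam, sig = np.cos(theta), np.sin(theta)   # so lambda^2 + sigma^2 = 1 automatically
G24 = givens(4, 1, 3, lam, sig)           # G^(2,4) in the 1-based notation above

print(G24)
print(np.allclose(G24.T @ G24, np.eye(4)))  # orthogonal
```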

HOMEWORK $~$ Show that a Givens rotation is orthogonal when $\sigma^2+\lambda^2=1$

We can also use Givens rotations to compute the decomposition, so that $$\underbrace{G_T\cdots G_1}_{Q'}A\Pi=R$$

To find $G_1$, we look for a Givens rotation $G^{(1,2)}$ that will make the entry in row 2, column 1 of the product equal to zero. Consider the product with the following matrix $M$ (only the two rows affected by the rotation are shown):
$$\begin{bmatrix}\lambda&\sigma\\-\sigma&\lambda\end{bmatrix}\begin{bmatrix}M_{11}&M_{12}&\cdots&M_{1m}\\M_{21}&M_{22}&\cdots&M_{2m}\end{bmatrix}=\begin{bmatrix}\hphantom{-}\lambda M_{11}+\sigma M_{21}&\hphantom{-}\lambda M_{12}+\sigma M_{22}&\cdots&\hphantom{-}\lambda M_{1m}+\sigma M_{2m}\\-\sigma M_{11}+\lambda M_{21}&-\sigma M_{12}+\lambda M_{22}&\cdots&-\sigma M_{1m}+\lambda M_{2m}\end{bmatrix}$$

To find $G_1$, we solve
$$-\sigma M_{11}+\lambda M_{21}=0\quad\text{s.t.}\quad\sigma^2+\lambda^2=1\quad\implies\quad\lambda=\frac{M_{11}}{\sqrt{M_{11}^2+M_{21}^2}}~~~\text{and}~~~\sigma=\frac{M_{21}}{\sqrt{M_{11}^2+M_{21}^2}}$$
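A minimal sketch of this single rotation in NumPy, with an arbitrary example matrix $M$ and 0-based indexing:

```python
import numpy as np

# Arbitrary example matrix M (0-based indices below, so M[1, 0] is the row-2, column-1 entry).
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# lambda and sigma as derived above, chosen to zero out M[1, 0].
r = np.hypot(M[0, 0], M[1, 0])
lam, sig = M[0, 0] / r, M[1, 0] / r

G12 = np.eye(3)
G12[0, 0] = G12[1, 1] = lam
G12[0, 1] = sig
G12[1, 0] = -sig

print(G12 @ M)  # the (row 2, column 1) entry is now zero (up to round-off)
```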

Now, we can repeat this process to successively make all entries below the diagonal zero, giving us a QR decomposition.
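A minimal sketch of this sweep in NumPy (no column pivoting, i.e. $\Pi=I$; assumes full column rank):

```python
import numpy as np

def givens_qr(A):
    """QR via Givens rotations (sketch; assumes A has full column rank)."""
    n, m = A.shape
    R = A.astype(float).copy()
    Q = np.eye(n)
    for j in range(m):                  # column by column
        for i in range(j + 1, n):       # zero out every entry below the diagonal
            if R[i, j] == 0.0:
                continue
            r = np.hypot(R[j, j], R[i, j])
            lam, sig = R[j, j] / r, R[i, j] / r
            G = np.eye(n)
            G[j, j] = G[i, i] = lam
            G[j, i] = sig
            G[i, j] = -sig
            R = G @ R     # G_T ... G_1 A
            Q = Q @ G.T   # accumulate Q
    return Q, R

A = np.array([[3.0, 2.0],
              [0.0, 3.0],
              [4.0, 4.0]])
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0.0))
```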

HOMEWORK

  1. Determine the computational complexity for QR decomposition using
    • Gram-Schmidt
    • Modified Gram-Schmidt
    • Householder reflections
    • Givens rotations
  2. Compare the complexity of Householder vs Givens for a sparse matrix
  3. Implement QR decomposition using Householder reflections (input: a matrix $A$ of full column rank; output: $Q$, $R$)
  4. Repeat 3 using Givens rotations

$$~$$

'Large' data least squares

$A\in\mathbb{R}^{n\times m}$, where $n\gg m~~$ (which is a different situation than $m\gg n$)

Optional reference: 'Incremental QR Decomposition', by Gentleman.

We can apply the QR decomposition to solve the least squares problem $\min_x\|Ax-b\|_2$, since
\begin{align*}
\hat{x}&=(A'A)^{-1}A'b\\
&=((QR)'(QR))^{-1}(QR)'b\\
&=(R'\cancel{Q'Q}R)^{-1}R'Q'b\\
&=(R'R)^{-1}R'Q'b\\
&=R^{-1}\cancel{(R')^{-1}R'}Q'b\\
&=R^{-1}Q'b
\end{align*}
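A minimal NumPy sketch of this solve, using the example from the introduction and checking against `np.linalg.lstsq`:

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [0.0, 3.0],
              [4.0, 4.0]])
b = np.array([3.0, 5.0, 4.0])

Q, R = np.linalg.qr(A)                 # thin QR: Q is n x m, R is m x m
x_qr = np.linalg.solve(R, Q.T @ b)     # solve R x = Q'b (back-substitution in practice)

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_qr, np.allclose(x_qr, x_ls))   # same least squares solution
```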

We need to find $R^{-1}$ and $Q'b$. To do so, consider just the first $m\!+\!1$ rows of the matrix $\begin{bmatrix}A&b\end{bmatrix}$, and perform a QR decomposition to get $$\begin{bmatrix}\tilde{R}&\tilde{Q}'\tilde{b}\\0&\tilde{s}\end{bmatrix}$$ Then, we add one row at a time to the bottom and perform Givens rotations so that $$\begin{bmatrix}\tilde{R}&\tilde{Q}'\tilde{b}\\0&\tilde{s}\\a_{m+2}&b_{m+2}\end{bmatrix}~~~\implies~~~\begin{bmatrix}\tilde{R}_{m+2}&\tilde{Q}_{m+2}'\tilde{b}_{m+2}\\0&\tilde{s}_{m+2}\\0&0\end{bmatrix}$$
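The sketch below is one illustrative way to implement that row update in NumPy (the function name `qr_append_row` and the random data are assumptions of this sketch, not from the Gentleman reference); it keeps only the $m\times m$ triangle $\tilde R$ and the vector $\tilde Q'\tilde b$, dropping the residual entries $\tilde s$, which is all we need for $\hat x=R^{-1}Q'b$:

```python
import numpy as np

def qr_append_row(R, qtb, a_new, b_new):
    """Update the m x m triangle R and the m-vector Q'b when a new row (a_new, b_new) arrives."""
    m = R.shape[0]
    # Stack the new row under [R | Q'b] and rotate it away, column by column.
    T = np.vstack([np.column_stack([R, qtb]), np.append(a_new, b_new)])
    for j in range(m):
        r = np.hypot(T[j, j], T[m, j])
        if r == 0.0:
            continue
        lam, sig = T[j, j] / r, T[m, j] / r
        row_j, row_new = T[j].copy(), T[m].copy()
        T[j] = lam * row_j + sig * row_new     # updated pivot row
        T[m] = -sig * row_j + lam * row_new    # entry j of the new row is now zero
    return T[:m, :m], T[:m, m]

rng = np.random.default_rng(1)
A = rng.standard_normal((1000, 3))     # "large" n, small m
b = rng.standard_normal(1000)
m = A.shape[1]

# Seed with the first m+1 rows of [A b], as described above.
_, RZ = np.linalg.qr(np.column_stack([A[:m + 1], b[:m + 1]]))
R, qtb = RZ[:m, :m], RZ[:m, m]

# Feed the remaining rows one at a time.
for i in range(m + 1, A.shape[0]):
    R, qtb = qr_append_row(R, qtb, A[i], b[i])

x = np.linalg.solve(R, qtb)            # R x = Q'b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```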
