Symmetry, Transpose

Symmetric matrices

A matrix with A = A^T is symmetric; a matrix with A = -A^T is antisymmetric. All symmetric and antisymmetric matrices are square.
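
As a quick numerical sanity check, here is a minimal NumPy sketch (the matrices are arbitrary examples, not from anywhere in particular):

```python
import numpy as np

# A symmetric example: equal to its own transpose.
S = np.array([[2., 1.],
              [1., 3.]])
print(np.allclose(S, S.T))   # True: S is symmetric

# An antisymmetric example: equal to the negative of its transpose.
K = np.array([[0., -4.],
              [4.,  0.]])
print(np.allclose(K, -K.T))  # True: K is antisymmetric
```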

Symmetric

The defining condition is that A = A^T (this is both necessary and sufficient).

If a matrix is symmetric, this means that it:

  1. Can be decomposed into U\Sigma U^T, where U is the eigenbasis and \Sigma is the diagonal matrix of eigenvalues (spectral theorem)
  1. Has an orthogonal eigenbasis (spectral theorem)
  1. Can be factored into VV^T; if \Sigma is non-negative, then V = U\Sigma^{1/2} is real (derived from the factorization; see the NumPy sketch below)
  1. Has an inverse that is also symmetric (when the inverse exists)

Note that symmetry does not guarantee invertibility, as some of the eigenvalues may be 0.
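
Here is a minimal NumPy sketch of the decomposition and factorization from the list above (the example matrices are mine; np.linalg.eigh handles the symmetric eigendecomposition):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

# Spectral theorem: A = U Sigma U^T with U orthogonal.
evals, U = np.linalg.eigh(A)             # evals ascending: [1., 3.]
Sigma = np.diag(evals)
print(np.allclose(A, U @ Sigma @ U.T))   # True
print(np.allclose(U.T @ U, np.eye(2)))   # True: the eigenbasis is orthogonal

# VV^T factorization: with non-negative eigenvalues, V = U Sigma^(1/2) is real.
V = U @ np.diag(np.sqrt(evals))
print(np.allclose(A, V @ V.T))           # True

# Symmetry does not guarantee invertibility: a zero eigenvalue.
B = np.array([[1., 1.],
              [1., 1.]])                 # symmetric, but rank 1
print(np.linalg.eigvalsh(B))             # ~[0., 2.]: one eigenvalue is 0, so B is singular
```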

Side note: if a matrix is not symmetric, it can still be decomposed through SVD.

Sums of symmetric matrices

A + A^T is symmetric and A - A^T is antisymmetric. Furthermore, because A = \frac{1}{2}(A + A^T) + \frac{1}{2}(A - A^T), we can express any square matrix A as the sum of a symmetric and an antisymmetric matrix!
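
A quick NumPy check of this split (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1., 2.],
              [5., 4.]])

sym  = 0.5 * (A + A.T)   # symmetric part
skew = 0.5 * (A - A.T)   # antisymmetric part

print(np.allclose(sym, sym.T))     # True
print(np.allclose(skew, -skew.T))  # True
print(np.allclose(A, sym + skew))  # True: the parts reassemble A
```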

Transpose

The transpose just flips rows and columns. Theoretically it deals with dual spaces, but practically it's just the flip.

(A^T)_{ij} = A_{ji}
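
A tiny NumPy check of the entrywise rule (example array mine); note the shape flip for a rectangular input:

```python
import numpy as np

A = np.arange(6.).reshape(2, 3)  # shape (2, 3)
print(A.T.shape)                 # (3, 2): rows and columns swap
print(A.T[2, 1] == A[1, 2])      # True: (A^T)_{ij} = A_{ji}
```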

The following properties are important.

Other properties

A matrix is symmetric if A^T = A and skew-symmetric if A^T = -A.

If you transpose a rectangular matrix, think of it as flipping it across the main diagonal.

(AB)^T = B^TA^T
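
A quick numeric check of this rule on random matrices (sizes chosen arbitrarily); note that the reversal of order is forced by the shapes:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

print(np.allclose((A @ B).T, B.T @ A.T))  # True
# The order must reverse: (A @ B).T is (2, 3), and B.T @ A.T is (2, 4) @ (4, 3).
```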

(Ax)^T = x^TA^T = x_1a_1^T + ... + x_na_n^T, where a_k is the k-th column of the original matrix A

Proof: this is really simple. By definition of the matrix-vector product, Ax = x_1a_1 + ... + x_na_n; transposing and applying the transpose rule to each term gives x^TA^T = x_1a_1^T + ... + x_na_n^T.
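
A sketch of that expansion in NumPy (example values mine):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
x = np.array([10., 20.])

# NumPy's 1-D arrays don't distinguish row from column vectors,
# so we just compare the entries of (Ax)^T with the column expansion.
lhs = A @ x
rhs = x[0] * A[:, 0] + x[1] * A[:, 1]  # x_1 a_1^T + x_2 a_2^T, entrywise
print(np.allclose(lhs, rhs))           # True
```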