BSC, HS23 - CheatSheet LinAlg.
2001 - 22-918-18-957
Complex Numbers

Def : z = a + bi ⇔ ℜ(z) = a, Im(z) = b
Def : z = a + bi ⇔ z̄ = a − bi; polar form: z = r·e^{iϕ} = r·(cos ϕ + i·sin ϕ), z̄ = r·e^{−iϕ}
Def : |z| = r = √(x² + y²) = √(z·z̄)
Def : ϕ = arctan(y/x) in the 1st quadrant, arctan(y/x) + π in the 2nd/3rd quadrant, arctan(y/x) + 2π in the 4th quadrant

Operations
Def : z1 ± z2 = (x1 ± x2) + i(y1 ± y2)
Def : z1 · z2 = (x1 + i·y1)·(x2 + i·y2) = r1·r2·e^{i(ϕ1+ϕ2)}
Def : z1 / z2 = (r1/r2)·e^{i(ϕ1−ϕ2)} = z1·z̄2 / |z2|²
Def : n-th root ⁿ√a: zⁿ = a ⇔ |a|·e^{iϕ} = rⁿ·e^{inω} ⇔ r = ⁿ√|a|, ω = (ϕ + 2kπ)/n, k = 0, …, n−1
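A quick Python check of the n-th root formula above (the values of a and n are made-up samples, not from the sheet):

import cmath

a = 8j                       # example complex number (assumed)
n = 3
r, phi = abs(a), cmath.phase(a)
roots = [r**(1/n) * cmath.exp(1j * (phi + 2*k*cmath.pi) / n) for k in range(n)]
print(all(abs(z**n - a) < 1e-9 for z in roots))   # every root satisfies z^n = a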
Polynomials

Definitions
The non-real roots of a polynomial with real coefficients occur in conjugate pairs.
Def : az² + bz + c = 0 ⇔ z = (−b ± √(b² − 4ac)) / (2a)
Def : az^n + c = 0 ⇔ z = ⁿ√(−c/a)

SLE

Basics
Gauss Algorithm: bring the augmented matrix [A | b] into row echelon form with row operations; r denotes the number of pivots.
Compatibility conditions: b_{r+1} = … = b_m = 0
S 1.1: Ax = b has at least one solution ⇔ r = m, or r < m and the compatibility conditions hold. In that case: r = n ⇔ exactly one solution, r < n ⇔ infinitely many solutions.
Cor 1.7: For a quadratic SLE with n equations and n variables, exactly ONE of the following two groups of equivalent statements is true. EITHER
i    Rank(A) = n (A is regular)
ii   for every b there exists at least one solution
iii  for every b there exists exactly one solution
iv   the corresponding homogeneous system has only the trivial solution
OR the following equivalences hold
v    Rank(A) < n (A is singular)
vi   for some b there exists no solution
vii  for no b a unique solution exists
viii for some b infinitely many solutions exist
ix   the corresponding homogeneous system has non-trivial solutions
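A NumPy illustration of the Cor 1.7 dichotomy (the matrix is a made-up example): rank n means a unique solution for every b, rank < n means none or infinitely many depending on b.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                          # example square matrix (assumed)
n = A.shape[0]

if np.linalg.matrix_rank(A) == n:                   # case i: A regular
    x = np.linalg.solve(A, np.array([1.0, 2.0]))    # exactly one solution for every b (case iii)
    print(x)
else:                                               # case v: A singular
    print("no or infinitely many solutions, depending on b")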
Matrices and Vectors

An m × n matrix has m rows (Zeilen) ↓ and n columns (Spalten) →; the element in row i, column j is written a_{i,j} or (A)_{i,j}.
Def nullmatrix: has 0 in every entry
Def diagonalmatrix: has 0 in every entry except on the diagonal: (D)_{ij} = 0 for i ≠ j; one can write Diag(d_{11}, …, d_{nn})
Def identity: the identity is written I_n = Diag(1, …, 1); it holds that AI = IA = A
Def upper triangular matrix: (R)_{ij} = 0 for i > j (Rechtsdreiecksmatrix)
Def lower triangular matrix: (L)_{ij} = 0 for i < j (Linksdreiecksmatrix)
Def matrix set: the set of m × n matrices is written E^{m×n}; for vectors: E^n, where E is R or C
Def matrix multiplication: if C = AB then C_{ij} = (AB)_{ij} = Σ_{k=1}^n (A)_{ik}·(B)_{kj} = Σ_{k=1}^n a_{ik}·b_{kj}
Def zero divisors (Nullteiler): A, B are zero divisors if AB = 0 although A ≠ 0 and B ≠ 0
S 2.1: (αβ)A = α(βA); (A + B) + C = A + (B + C); (αA)B = α(AB); (AB)·C = A·(BC); (α + β)A = αA + βA; (A + B)·C = AC + BC; A·(B + C) = AB + AC; α(A + B) = αA + αB; A + B = B + A
Def transpose: (A^T)_{ij} = A_{ji}
Def conjugate transpose: A^H = (Ā)^T = conj(A^T)
Def symmetric: A^T = A ⇔ A symmetric
Def skew-symmetric: A^T = −A ⇔ A skew-symmetric
Def hermitian: A^H = A ⇔ A hermitian
Rules: (A^H)^H = A; (αA)^H = ᾱ·A^H; (A + B)^H = A^H + B^H; (AB)^H = B^H·A^H
S 2.6: the same rules hold with A^T instead of A^H; ᾱ simplifies to α.
S 2.7: For symmetric matrices A and B: AB = BA ⇔ AB is symmetric. For an arbitrary matrix C, C^T·C and C·C^T are symmetric. The same holds in the hermitian case.

Inverse
Def invertible: ∃A^{−1} ⇔ A^{−1}·A = A·A^{−1} = I
S 2.17: A is invertible ⇔ ∃X : AX = I ⇔ X is unique ⇔ A is regular
S 2.18: If A, B are regular:
A^{−1} is regular and (A^{−1})^{−1} = A
AB is regular and (AB)^{−1} = B^{−1}·A^{−1}
A^H is regular and (A^H)^{−1} = (A^{−1})^H
S 2.19: If A is regular, the SLE Ax = b has the unique solution x = A^{−1}·b.
Finding an inverse: [A | I] → (row operations) → [I | A^{−1}]
For A = (a11 a12; a21 a22) = (a b; c d): det(A) = ad − bc ≠ 0 ⇔ A is invertible, and A^{−1} = 1/(ad − bc) · (d −b; −c a)
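A small NumPy sketch of the [A | I] → [I | A^{−1}] recipe above, with partial pivoting; the example matrix is made up:

import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])                     # example regular matrix (assumed)
n = A.shape[0]
M = np.hstack([A, np.eye(n)])                  # augmented matrix [A | I]

for i in range(n):
    p = i + np.argmax(np.abs(M[i:, i]))        # partial pivoting
    M[[i, p]] = M[[p, i]]
    M[i] /= M[i, i]                            # normalize pivot row
    for j in range(n):
        if j != i:
            M[j] -= M[j, i] * M[i]             # eliminate column i in the other rows

A_inv = M[:, n:]                               # right half is now A^{-1}
print(np.allclose(A @ A_inv, np.eye(n)))
print(A_inv)                                   # compare: 1/(ad-bc) * [[d, -b], [-c, a]] = [[3, -1], [-5, 2]]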
Scalarproduct and Norm

Def euclidian scalarproduct: ⟨x, y⟩ = x^H y = Σ_{k=1}^n x̄_k·y_k; for E = R: ⟨x, y⟩ = x^T y = Σ_{k=1}^n x_k·y_k
S 2.9:
S1  ⟨x, y + z⟩ = ⟨x, y⟩ + ⟨x, z⟩ and ⟨x, αy⟩ = α·⟨x, y⟩ (linear in 2nd factor)
S2  for E = R: ⟨x, y⟩ = ⟨y, x⟩ (symmetric)
S2' for E = C: ⟨x, y⟩ = conj(⟨y, x⟩) (hermitian)
S3  ⟨x, x⟩ > 0, ⟨x, x⟩ = 0 ⇔ x = 0 (positive definite)
S4  for E = R: linear in 1st factor: ⟨w + x, y⟩ = ⟨w, y⟩ + ⟨x, y⟩, ⟨αx, y⟩ = α·⟨x, y⟩
S4' for E = C: conjugate-linear in 1st factor: ⟨w + x, y⟩ = ⟨w, y⟩ + ⟨x, y⟩, ⟨αx, y⟩ = ᾱ·⟨x, y⟩
Def norm: ∥x∥ = √⟨x, x⟩ = √(x^H x) = √(Σ_{k=1}^n |x_k|²); for E = R: √(Σ_{k=1}^n x_k²)
S 2.11: |⟨x, y⟩| ≤ ∥x∥·∥y∥ (Cauchy-Schwarz inequality; "=" holds when y is a multiple of x or vice versa)
Def CBS: CBS is a property of the scalar product; squared it reads |⟨x, y⟩|² ≤ ⟨x, x⟩·⟨y, y⟩
S 2.12: For the euclidian norm it holds:
N1 ∥x∥ > 0, ∥x∥ = 0 ⇔ x = 0 (positive definite)
N2 ∥αx∥ = |α|·∥x∥ (homogeneous)
N3 ∥x ± y∥ ≤ ∥x∥ + ∥y∥ (triangle inequality)
Def : angle ϕ between x, y: ϕ = arccos( ℜ(⟨x, y⟩) / (∥x∥·∥y∥) ); for E = R: ϕ = arccos( ⟨x, y⟩ / (∥x∥·∥y∥) )
Def : x, y are orthogonal: ⟨x, y⟩ = 0 ⇔ x ⊥ y
S 2.13: ∥x ± y∥² = ∥x∥² + ∥y∥² ⇔ x ⊥ y (Pythagoras)
Def p-norm: ∥x∥_p = (|x_1|^p + … + |x_n|^p)^{1/p}

Outer Product and Projections

Def outer product: for an m-vector x and an n-vector y: x·y^T (an m × n matrix)
S 2.14: An m × n matrix has rank 1 if it is the outer product of an m-vector ≠ 0 and an n-vector ≠ 0.
S 2.15: The orthogonal projection P_y·x of the n-vector x onto y is P_y·x = (1/∥y∥²)·y·y^H·x = u·u^H·x, where u = y/∥y∥ and P_u = u·u^H.
Def projection matrix: P_y = (1/∥y∥²)·y·y^H. It has the properties P_y^H = P_y (hermitian/symmetric) and P_y² = P_y (idempotent).
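A minimal NumPy sketch of P_y from S 2.15 (the vectors y and x are made-up examples), checking the two properties listed above:

import numpy as np

y = np.array([1.0, 2.0, 2.0])                       # example direction (assumed)
x = np.array([3.0, 0.0, 4.0])                       # example vector to project

P = np.outer(y, y.conj()) / np.linalg.norm(y)**2    # P_y = y y^H / ||y||^2

print(P @ x)                                        # orthogonal projection of x onto y
print(np.allclose(P, P.conj().T))                   # P^H = P  (hermitian/symmetric)
print(np.allclose(P @ P, P))                        # P^2 = P  (idempotent)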
Orthogonal and unitary matrices

Def unitary/orthogonal: AA^H = I (AA^T = I) ⇔ A is unitary (orthogonal); equivalently, the columns of A are orthonormal.
Cor 2.10: A orthogonal ⇒ det(A) = ±1 (A unitary ⇒ |det(A)| = 1).
S 2.20: Let A and B be unitary (orthogonal). It holds:
A is regular and A^{−1} = A^H (A^T)
AA^H = I_n (AA^T = I_n)
A^{−1} is unitary (orthogonal)
AB is unitary (orthogonal)
S 2.21: Maps given by unitary/orthogonal matrices are conformal (längen- und winkeltreu: length- and angle-preserving).
Def 2d rotation: R(ϕ) = (cos ϕ  −sin ϕ; sin ϕ  cos ϕ)
Def 3d rotation:
R_x(ϕ) = (1 0 0; 0 cos ϕ −sin ϕ; 0 sin ϕ cos ϕ)
R_y(ϕ) = (cos ϕ 0 sin ϕ; 0 1 0; −sin ϕ 0 cos ϕ)
R_z(ϕ) = (cos ϕ −sin ϕ 0; sin ϕ cos ϕ 0; 0 0 1)
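A quick NumPy check (the angle and the vector are made-up values) that a rotation matrix is orthogonal with det +1 and length-preserving, as stated in S 2.21:

import numpy as np

phi = 0.7                                     # example angle (assumed)
Rz = np.array([[np.cos(phi), -np.sin(phi), 0],
               [np.sin(phi),  np.cos(phi), 0],
               [0,            0,           1]])

print(np.allclose(Rz.T @ Rz, np.eye(3)))      # orthogonal: R^T R = I
print(np.isclose(np.linalg.det(Rz), 1.0))     # rotation: det = +1
v = np.array([1.0, 2.0, 3.0])
print(np.isclose(np.linalg.norm(Rz @ v), np.linalg.norm(v)))   # length-preserving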
LU-Decomposition
The LU-decomposition is useful when multiple SLE have the same A:
find P·A = L·R
solve L·c = P·b
solve R·x = c
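A SciPy sketch of the three LU steps above; the system is a made-up example, and note that SciPy's lu returns A = p·L·R, so P = p^T gives the convention P·A = L·R used here:

import numpy as np
from scipy.linalg import lu, solve_triangular

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])             # example system matrix (assumed)
b = np.array([1.0, 2.0, 3.0])

p, L, R = lu(A)                             # A = p @ L @ R
P = p.T                                     # so that P A = L R
c = solve_triangular(L, P @ b, lower=True)  # forward substitution:  L c = P b
x = solve_triangular(R, c, lower=False)     # backward substitution: R x = c
print(np.allclose(A @ x, b))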
Vectorspaces

Definitions
Def : A vectorspace V over K is a non-empty set on which vector addition and scalar multiplication are defined.
Def Axioms:
V1: x + y = y + x
V2: (x + y) + z = x + (y + z)
V3: ∃0 ∈ V : x + 0 = x
V4: ∀x ∃(−x) : x + (−x) = 0
V5: α(x + y) = α·x + α·y
V6: (α + β)x = αx + βx
V7: (αβ)x = α(βx)
V8: 1·x = x
S 4.1: In a vectorspace the following holds for a scalar α and x ∈ V:
i   0·x = 0
ii  α·0 = 0
iii α·x = 0 ⇒ x = 0 ∨ α = 0
iv  (−α)x = α(−x) = −(α·x)
S 4.2: ∀x ∈ V, ∀y ∈ V ∃z ∈ V : x + z = y, where z is unique and z = y + (−x).
Def polynomial space: P_n is defined as the polynomials of degree ≤ n. Further: P = ∪_{n=0}^∞ P_n.

Subspace
Def : A subspace (Unterraum) U is a non-empty subset of V. It is closed under vector addition and scalar multiplication; U contains the zero vector.
S 4.3: Every subspace is a vectorspace.
Def spanning set: The vectors v1, …, vn are a spanning set (erzeugendes System) of V if V = span{v1, …, vn}.
Linear dependency, basis, dimensions

Def linear dependency: Vectors v1, …, vn are linearly independent ⇔ (Σ_{k=1}^n αk·vk = 0 ⇒ α1 = … = αn = 0); otherwise they are linearly dependent.
Def dimension: the dimension of V is dimV = |B| for any basis B of V (dim{0} = 0).
Lem 4.8: Every set {v1, …, vm} ⊂ V with |B| < m (more vectors than a basis B of V) is linearly dependent.
Cor 4.10: In a finite-dimensional vectorspace, a set of n independent vectors is a basis of V if dim(V) = n.
S 4.12: {b1, …, bn} ⊂ V is a basis of V ⇔ every vector x ∈ V can be uniquely represented as x = Σ_{k=1}^n ξk·bk.
Def : The coefficients ξk are the coordinates of x with respect to a basis B; ξ = (ξ1, …, ξn)^T is the coordinate vector.
Def : Two subspaces U, U′ ⊂ V are …
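A quick NumPy check of the definition above (the example vectors are assumed): v1, …, vn are linearly independent exactly when the matrix with the vk as columns has rank n.

import numpy as np

V = np.column_stack([[1.0, 0.0, 2.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 0.0]])      # example vectors as columns (assumed)

rank = np.linalg.matrix_rank(V)
print(rank == V.shape[1])                   # True: the three vectors are linearly independent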
Kernel, Image and Rank

Def linearity: F : V → W is linear: F(v + w) = F(v) + F(w) and α·F(v) = F(α·v).
Def injective: ∀x, x′ ∈ X : f(x) = f(x′) ⇒ x = x′
Def surjective: ∀y ∈ Y ∃x ∈ X : f(x) = y
Def bijective: surjective and injective ⇔ f^{−1} exists
Def isomorphism: F is bijective ⇔ F is an isomorphism
Def automorphism: F is an isomorphism and X = Y ⇔ F is an automorphism
S 5.1: F is an isomorphism ⇔ F^{−1} exists and is an isomorphism (and linear).
Def Kern: kerF = {x ∈ X | F(x) = 0}
Def Image: ImF = {F(x) | x ∈ X}
S 5.6: F injective ⇔ kerF = {0}; F surjective ⇔ imF = Y
Def : The rank of F is equal to dim(im(F)).
S 5.7: dimX − dim(kerF) = dim(imF) = Rank(F)
Cor 5.8:
F : X → Y injective ⇔ Rank F = dim X
F : X → Y surjective ⇔ Rank F = dim Y
F : X → Y bijective (isomorphism) ⇔ Rank F = dim X = dim Y
F : X → Y bijective (automorphism) ⇔ Rank F = dim X and kerF = {0}
Cor 5.10: Rank(G ∘ F) ≤ min(RankF, RankG); G injective ⇒ Rank(G ∘ F) = RankF; F surjective ⇒ Rank(G ∘ F) = RankG
Def columnspace: The columnspace (Spaltenraum) of A is the subspace ℜ(A) = im(A) = span{a1, …, an}, the set of all b such that Ax = b is solvable.
Def nullspace: The nullspace (Nullraum) of A is the subspace N(A) = kerA = L0, the solution set of Ax = 0.
Def : # free variables = dimN(A)
S 5.12: Rank A = r and L0 the solution set of Ax = 0 ⇒ dimL0 = dimN(A) = dim(kerA) = n − r
S 5.13: For A ∈ E^{m×n} (A : E^n → E^m), Rank A = number of pivots in the row echelon form = dim(im(A)) = number of linearly independent columns/rows.
Cor 5.14: RankA^T = RankA^H = RankA
S 5.16: For A ∈ E^{m×n} and B ∈ E^{p×m}:
RankBA ≤ min(RankA, RankB)
RankB = m ≤ p ⇒ RankBA = RankA
RankA = m ≤ n ⇒ RankBA = RankB
Cor 5.17: From S 5.16 it follows for quadratic matrices A ∈ E^{m×m} and B ∈ E^{m×m}:
RankBA ≤ min(RankA, RankB)
RankB = m ⇒ RankBA = RankA
RankA = m ⇒ RankBA = RankB
S 5.18: For a quadratic matrix A ∈ E^{n×n} the following statements are equivalent:
A is regular
RankA = n
the columns are linearly independent
the rows are linearly independent
kerA = N(A) = {0}
A is invertible
ImA = ℜ(A) = E^n
S 5.19: For Ax = b, b ≠ 0, with a particular solution x0 and L0 as above, the solution set is Lb = x0 + L0; it is called an affine subspace (not a real subspace, since 0 ∉ Lb). Furthermore dim(Im(A)) = n − dim(ker(A)) = n − (n − r) = r.

RC Find Basis of Im A = ℜ(A):
1 bring A into row echelon form
2 mark the columns that contain a pivot
3 the marked columns of the original matrix A form a basis
ex: A = (−1 −4 7 3; 3 0 −6 0; −3 4 1 −3; 1 −4 3 3) → (row operations) → (−1 −4 7 3; 0 −12 15 9; 0 0 0 0; 0 0 0 0); pivots in columns 1 and 2 ⇒ Im(A) = span{(−1, 3, −3, 1)^T, (−4, 0, 4, −4)^T}

RC Find Basis of ker A = N(A), A ∈ E^{m×n}:
1 bring A into row echelon form
2 for every column without a pivot (free variable) create a solution vector with n entries
3 solve the SLE Ax = 0, setting one free variable to 1 and the others to 0 at a time
4 write each solution as a vector; together they form a basis of ker A
ex: for the same A as above, rank r = 2, free variables x3 = α, x4 = β ⇒ kerA = span{(2, 5/4, 1, 0)^T, (0, 3/4, 0, 1)^T}
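A SymPy sketch reproducing both recipes for the example matrix above (standard Matrix methods; the matrix is the one from the example):

from sympy import Matrix

A = Matrix([[-1, -4, 7, 3],
            [ 3,  0, -6, 0],
            [-3,  4,  1, -3],
            [ 1, -4,  3, 3]])

R, pivots = A.rref()        # reduced row echelon form and pivot column indices
print(pivots)               # (0, 1): pivots in columns 1 and 2 (0-indexed)
print(A.columnspace())      # basis of Im(A): the pivot columns of the original A
print(A.nullspace())        # basis of ker(A): one vector per free variable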
Matrix representation

Def Maps: Let X, Y be vector spaces with dimX = n, dimY = m and F : X → Y a linear map. With respect to chosen bases:
A : E^n → E^m, ξ ↦ η   *
B : E^n → E^m, ξ′ ↦ η′  *
T : E^n → E^n, ξ ↦ ξ′   **
S : E^m → E^m, η ↦ η′   **
*Abbildungsmatrix (mapping matrix)   **Transformationsmatrix (basis-change matrix)
Let F be a linear map X → Y. One can write F(b_i) ∈ Y as a linear combination of the basis {c_1, …, c_m} of Y: F(b_i) = Σ_{k=1}^m a_{k,i}·c_k. The m × n matrix A with the elements a_{k,i} is the mapping matrix (Abbildungsmatrix) with respect to the bases of X and Y, and F(x) = y ⇔ Aξ = η.
S 5.20: A linear map F with RankF = r has (with respect to suitable bases) the mapping matrix* A = (I_r 0; 0 0).
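A NumPy sketch of the construction above: the i-th column of A holds the coordinates of F(b_i). The linear map F and the standard bases are made-up assumptions for illustration.

import numpy as np

def F(v):
    # example linear map R^3 -> R^2 (assumed for illustration)
    x, y, z = v
    return np.array([x + 2*y, 3*z - y])

basis_X = np.eye(3)                               # standard basis b_1, b_2, b_3 of R^3
A = np.column_stack([F(b) for b in basis_X])      # column i = coordinates of F(b_i)

x = np.array([1.0, 2.0, 3.0])
print(np.allclose(A @ x, F(x)))                   # F(x) = y  <=>  A xi = eta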
Vector spaces with scalar products

Definitions
Def Norm: A norm is a function ∥·∥ : V → R, x ↦ ∥x∥ on a vector space which satisfies:
N1 ∥x∥ > 0, ∥x∥ = 0 ⇔ x = 0 (positive definite)
N2 ∥αx∥ = |α|·∥x∥ (homogeneous)
N3 ∥x ± y∥ ≤ ∥x∥ + ∥y∥ (triangle inequality)
A normed vector space is a vector space with a norm.
Def scalar product: a function ⟨·, ·⟩ : V × V → E, (x, y) ↦ ⟨x, y⟩, which satisfies:
S1 ⟨x, y + z⟩ = ⟨x, y⟩ + ⟨x, z⟩ and ⟨x, αy⟩ = α·⟨x, y⟩ (linear in 2nd factor)
S2 ⟨x, y⟩ = conj(⟨y, x⟩) (symmetric for E = R, hermitian for E = C)
S3 ⟨x, x⟩ > 0, ⟨x, x⟩ = 0 ⇔ x = 0 (positive definite)
Def unitsphere: the set {x ∈ V | ∥x∥ = 1}
Def induced norm: the length of a vector is defined as ∥·∥ : V → R, ∥x∥ = √⟨x, x⟩
Def angle ϕ: ϕ = ∢(x, y), 0 ≤ ϕ ≤ π, is defined by cos ϕ = ℜ⟨x, y⟩ / (∥x∥·∥y∥)
Def orthogonal vectors: two vectors x, y are orthogonal ⇔ ⟨x, y⟩ = 0
Def orthogonal sets: two sets X, Y are orthogonal ⇔ ∀x ∈ X, ∀y ∈ Y : ⟨x, y⟩ = 0
S 6.1: |⟨x, y⟩|² ≤ ⟨x, x⟩·⟨y, y⟩ = ∥x∥²·∥y∥² (Cauchy-Schwarz)
Cor 6.12: if n = dimX = dimY < ∞: ⟨x, y⟩_V = ξ^H·η = ⟨ξ, η⟩ = ξ′^H·η′ = ⟨ξ′, η′⟩ ⇒ T is unitary/orthogonal
RC Least Squares with SVD: …
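The recipe text after this heading is cut off in the source; as a hedged sketch of the usual SVD approach x = V·Σ⁺·U^H·b, on a made-up overdetermined system:

import numpy as np

A = np.random.default_rng(0).normal(size=(6, 3))   # example overdetermined system (assumed)
b = np.random.default_rng(1).normal(size=6)

U, s, Vh = np.linalg.svd(A, full_matrices=False)   # A = U diag(s) V^H
x = Vh.conj().T @ ((U.conj().T @ b) / s)           # x = V Sigma^+ U^H b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))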