Course NMAI058 Linear Algebra II

Exam requirements:

Definitions and theorems marked with ! are VERY IMPORTANT: a NECESSARY (but not sufficient) condition for exam success is to know what ALL of them say and, for theorems, to have at least a basic idea of how they are proved.

1. Signature of a permutation - the definition and basic properties, in particular the theorem on the signature of a composition of two permutations

Definitions: 4.6, 4.7, 4.11!

Theorems: 4.13!, 4.14!, 4.15!, 4.16!, 4.17
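
For illustration only (the standard statements, not quoted from the notes): the signature of a permutation p of {1, ..., n} can be defined as sgn(p) = (-1)^k, where k is the number of inversions of p, i.e. of pairs i < j with p(i) > p(j). The key composition theorem then reads

    sgn(p o q) = sgn(p) * sgn(q),

so in particular every transposition has signature -1 and sgn(p^{-1}) = sgn(p).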

2. Determinants - definition and basic properties, in particular expansion along a row, the proof of the theorem on the determinant of a product of two square matrices, and the influence of elementary operations on the determinant of a matrix

Definitions: 9.1

Theorems: 9.3, 9.4, all of chapter 9.1!, 9.7!, 9.9!, 9.10
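
For illustration only (standard formulas, the numbering of the notes is not implied): the permutation definition of the determinant, the expansion along row i, and the product theorem read

    det(A) = sum over permutations p of sgn(p) * a_{1,p(1)} * ... * a_{n,p(n)},
    det(A) = sum over j of (-1)^{i+j} * a_{i,j} * det(A_{i,j}),
    det(AB) = det(A) * det(B),

where A_{i,j} denotes A with row i and column j deleted. Elementary row operations act as follows: swapping two rows changes the sign of the determinant, multiplying a row by t multiplies the determinant by t, and adding a multiple of one row to another row leaves the determinant unchanged.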

3. Scalar product, norm - axiomatic definitions and basic properties, in particular the theorem of Pythagoras and the Cauchy-Schwarz inequality

Definitions: 8.1, 8.2, 8.4, 8.5, 8.10, 8.11

Theorems: 8.7, 8.8!, 8.9, 8.19, 8.20, 8.21!, 8.22!, 8.23!
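
For illustration only: with the norm induced by a scalar product, ||x|| = sqrt(<x, x>), the two highlighted results read

    <x, y> = 0  implies  ||x + y||^2 = ||x||^2 + ||y||^2    (theorem of Pythagoras),
    |<x, y>| <= ||x|| * ||y||                               (Cauchy-Schwarz inequality),

with equality in the Cauchy-Schwarz inequality exactly when x and y are linearly dependent.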

4. Orthogonal and orthonormal bases, the Gram-Schmidt orthogonalization algorithm. Orthogonal matrices and their properties

Definitions: 8.46

Theorems: 8.47!, 8.48!, 8.50
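
A minimal sketch of Gram-Schmidt orthonormalization, in Python with NumPy, purely as an illustration; the function name gram_schmidt and the use of the standard dot product are choices made here, while the notes state the algorithm for a general scalar product:

    import numpy as np

    def gram_schmidt(vectors):
        # Returns an orthonormal basis of span(vectors) w.r.t. the standard dot product.
        basis = []
        for v in vectors:
            w = np.array(v, dtype=float)
            for q in basis:
                # subtract the projection of w onto the already constructed vector q
                w -= (q @ w) * q
            norm = np.linalg.norm(w)
            if norm > 1e-12:  # skip vectors (numerically) dependent on the previous ones
                basis.append(w / norm)
        return np.array(basis)

    # Columns of Q are orthonormal, so Q^T Q = I (a square Q with this property is orthogonal).
    Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]).T
    print(np.allclose(Q.T @ Q, np.eye(3)))  # True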

5. Eigenvalues and eigenvectors - their equivalent definitions, the characteristic polynomial, basic properties, in particular the theorem on linear independence of eigenvectors. Jordan normal form (without proof)

Definitions: 10.1!, 10.4!, 10.34, 10.35!

Theorems: 10.3!, 10.5!, (10.11), 10.12, 10.29!, 10.36!, 10.38
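
For illustration only: the two equivalent definitions of an eigenvalue lambda of a square matrix A are

    Ax = lambda * x for some x != 0    <=>    det(A - lambda * I) = 0,

i.e. the eigenvalues are exactly the roots of the characteristic polynomial p_A(t) = det(A - tI). For example, for the upper triangular matrix A = [[2, 1], [0, 3]] one gets p_A(t) = (2 - t)(3 - t), so the eigenvalues are 2 and 3.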

6. Similarity of matrices, its geometric motivation (similar matrices as alternative representations of the same linear mapping). Given n linearly independent vectors in an n-dimensional space, tell how to construct a matrix having them as eigenvectors. Under what conditions is such a matrix symmetric? n linearly independent eigenvectors as a sufficient condition for diagonalizability.

Definitions: 10.20! (+ the insert from the English version of the notes), 10.25!

Theorems: 10.21 (+ to know how the matrix S from the definition of similarity transforms eigenvectors of A to eigenvectors of SAS^{-1}), 10.27!, 10.30!, example 10.31
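
For illustration only (the standard construction, not quoted from the notes): if S has the n given linearly independent vectors as its columns and D = diag(lambda_1, ..., lambda_n), then A = S D S^{-1} satisfies A S = S D, so the i-th column of S is an eigenvector of A for the eigenvalue lambda_i. Such an A is symmetric when the columns of S can be chosen orthonormal, since then S^{-1} = S^T and A = S D S^T is symmetric. The transformation of eigenvectors mentioned next to Theorem 10.21: if Ax = lambda * x, then (S A S^{-1})(Sx) = lambda * (Sx).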

7. Similarity of a symmetric real matrix with a diagonal matrix by means of an orthogonal matrix. Generalization to complex matrices.

Definitions: 10.44

Theorems: 10.46!, 10.47!!!, 10.49
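
For illustration only: the highlighted result says that for every symmetric real matrix A there is an orthogonal matrix Q (Q^T Q = I) such that

    Q^T A Q = D with D diagonal,

where the diagonal of D carries the (real) eigenvalues of A and the columns of Q are corresponding orthonormal eigenvectors. In the complex generalization, "symmetric" is replaced by "Hermitian" (A equals its conjugate transpose) and "orthogonal" by "unitary".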

8. Basic methods of computing eigenvalues and the corresponding eigenvectors.

Theorems: 10.49, 10.55, 10.57
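
A minimal sketch of one basic method, the power method, in Python with NumPy; this only illustrates the idea (repeated multiplication by A amplifies the component in the direction of the dominant eigenvector), and the function name power_method and the fixed random seed are choices made here, not taken from the notes:

    import numpy as np

    def power_method(A, iterations=1000):
        # Approximates the dominant eigenvalue and a corresponding unit eigenvector of A.
        x = np.random.default_rng(0).standard_normal(A.shape[0])
        for _ in range(iterations):
            x = A @ x
            x /= np.linalg.norm(x)  # renormalize so the iterates do not overflow/underflow
        return x @ A @ x, x         # Rayleigh quotient of the unit vector x, and x itself

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    lam, v = power_method(A)
    print(lam)  # close to the larger eigenvalue (5 + sqrt(5)) / 2, roughly 3.618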

9. Positive (semi)definite symmetric matrices - definition and basic properties, in particular characterisation by eigenvalues and the theorem on "square root" of a p. d. matrix. P. d. matrices and a scalar product

Definitions: 11.1, a solution of a system of linear equations with a positive definite matrix as a minimum of a functional

Theorems: 11.6, 11.7, 11.8
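
For illustration only: a symmetric matrix A is positive definite iff x^T A x > 0 for every x != 0, which is equivalent to all eigenvalues of A being positive. The "square root" theorem follows from the spectral decomposition A = Q D Q^T by setting

    B = Q D^{1/2} Q^T,    so that B is symmetric and B^2 = A,

where D^{1/2} takes square roots of the (positive) diagonal entries of D. A positive definite A also defines a scalar product via <x, y> = x^T A y.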

10. Cholesky decomposition

Theorems: 11.10!!, 11.11
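
A minimal sketch of the Cholesky decomposition A = L L^T (L lower triangular with positive diagonal) in Python with NumPy, purely as an illustration and assuming A is symmetric positive definite; the function name cholesky is a choice made here:

    import numpy as np

    def cholesky(A):
        # Computes the lower triangular L with positive diagonal such that A = L @ L.T.
        n = A.shape[0]
        L = np.zeros_like(A, dtype=float)
        for i in range(n):
            for j in range(i + 1):
                s = A[i, j] - L[i, :j] @ L[j, :j]
                if i == j:
                    L[i, j] = np.sqrt(s)  # s > 0 for a positive definite A
                else:
                    L[i, j] = s / L[j, j]
        return L

    A = np.array([[4.0, 2.0],
                  [2.0, 3.0]])
    L = cholesky(A)
    print(np.allclose(L @ L.T, A))  # True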

11. Quadratic form - Sylvester's law of inertia, solution of a system of linear equations with a p.d. matrix as a minimum of a quadratic form, Conjugate Gradient Method (without proof)

Definitions: TBA

Theorems: TBA
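
For illustration only of the connection used in this topic: for a symmetric positive definite matrix A, the quadratic form

    f(x) = (1/2) x^T A x - b^T x

has gradient A x - b, so f is minimized exactly at the solution of A x = b. The conjugate gradient method minimizes f along successive A-conjugate directions (p_i^T A p_j = 0 for i != j) and, in exact arithmetic, reaches the exact solution after at most n steps.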


Literature

J. Matoušek: Podrobný minimální sylabus přednášky Lineární algebra I (detailed minimal syllabus of the Linear Algebra I lecture)

J. Rohn: Lineární algebra a optimalizace na slidech (Linear algebra and optimization on slides)

J. Fiala: On-line sbírka úloh z matematiky (online collection of exercises in mathematics)

Uriel Feige: Course on Algorithms and Linear Programming

Hladik (English) Chapter 8

Hladik (English) Chapter 9

Hladik (English) Chapter 10

Hladik (English) Chapter 11

Pictures about positive definite matrices

Conjugate Gradient Method (EN), Metoda konjugovaných gradientů (CZ)

Eigenvalue motivation applet

Skewed ellipsoid