Probability and Statistics 1 (Summer 2026)
This course covers a range of topics around randomness. The first half will be dedicated to probability theory, the second half to statistics.
Prerequisites: "Discrete Mathematics", Analysis 1 (essential). Linear algebra 1 & 2, Analysis 2, Combinatorics & Graphs 1 (recommended).
Organisational remarks
The exam will be written. To sit the exam, a zápočet (course credit) is required; it must be obtained in one of the tutorials attached to this lecture (see below).
Zápočets from previous years, or zápočets obtained in the parallel Czech tutorials, are not valid.
Reading materials:
- Typed lecture notes will be provided.
- In SIS there is a list of recommended books.
- Dr. Feldmann's page from 2022.
- Dr. Šámal's page from 2021. [RS]
- MIT OpenCourseWare videos. [MIT]
- Hans-Otto Georgii. Stochastik: Einführung in die Wahrscheinlichkeitsrechnung und Statistik, 5th Ed. [HG]
Tutorials
The tutorials will be held by Joonas Kisel, William Stowe, and myself.
Syllabus
- Lecture 1 (18.2): Notes lecture 1
Introduction. Two interpretations of probability. Probability theory vs Statistics. Problems with the naïve approach: Bertrand's paradox, Vitali sets (both non-examinable).
Probability spaces, Kolmogorov's axioms. Examples: uniform and geometric distributions. Basic properties of probability measures (monotonicity, inclusion-exclusion, continuity, union bound).
Further references: [RS] Lecture 1. [MIT] Lecture 1. Wikipedia: Bertrand's paradox, Vitali set (see also this video).
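The basic properties of probability measures from this lecture can be checked concretely on a small uniform space. A minimal sketch in Python; the die events below are my own illustration, not taken from the lecture:

```python
from fractions import Fraction

# A uniform probability space on a fair six-sided die:
# Omega = {1, ..., 6}, each outcome with probability 1/6.
Omega = set(range(1, 7))

def P(event):
    """Probability of an event (a subset of Omega) under the uniform measure."""
    return Fraction(len(event & Omega), len(Omega))

A = {2, 4, 6}   # "the roll is even"
B = {4, 5, 6}   # "the roll is at least 4"

# Inclusion-exclusion for two events: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Monotonicity: A ∩ B ⊆ A implies P(A ∩ B) <= P(A)
assert P(A & B) <= P(A)

# Union bound (Boole's inequality): P(A ∪ B) <= P(A) + P(B)
assert P(A | B) <= P(A) + P(B)
```

Using `Fraction` keeps every probability exact, so the identities hold with `==` rather than only up to floating-point error.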
- Lecture 2 (25.2): Notes lecture 2
Conditional probability, independence of two events. Operations with conditional probabilities: chain rule, the law of total probability. Application: Gambler's ruin. Bayes' theorem and its interpretation. Independence for a family of events (vs pairwise independence).
Further references: [RS] Lectures 1 and 2, [MIT] Lectures 2 and 3.
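The interplay of the law of total probability and Bayes' theorem can be illustrated with the standard diagnostic-test example; the numbers below are my own illustration, not from the lecture:

```python
from fractions import Fraction

# Illustrative parameters for a diagnostic test (assumed values):
prior = Fraction(1, 1000)   # P(D): prevalence of the disease
sens  = Fraction(99, 100)   # P(+ | D): sensitivity
fpr   = Fraction(5, 100)    # P(+ | not D): false-positive rate

# Law of total probability:
# P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = sens * prior + fpr * (1 - prior)

# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+)
posterior = sens * prior / p_pos

print(float(posterior))  # ~0.019: a positive test leaves P(D | +) below 2%
```

The punchline is the interpretation: even a fairly accurate test yields a small posterior when the prior is tiny, because most positives come from the large healthy population.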
- Lecture 3 (4.3): Notes lecture 3
Discrete random variables, probability mass function, cumulative distribution function. Common discrete distributions: Bernoulli, Binomial, Geometric, Hypergeometric, Poisson. Poisson as an approximation for Binomial. A heuristic example with meteorites. Operations on random variables: applying univariate and bivariate functions.
Further references: [RS] Lecture 3, [MIT] Lecture 5. An interactive tool for basic distributions (courtesy of Dr. Šámal).
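The Poisson approximation for the Binomial can be seen numerically: for large n and small p with λ = np held fixed, the two pmfs nearly coincide. A sketch with parameters of my own choosing:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.003   # large n, small p; lam = n * p = 3 stays fixed
for k in range(6):
    b, q = binom_pmf(k, n, p), poisson_pmf(k, n * p)
    print(f"k={k}: binomial {b:.5f}   poisson {q:.5f}")
```

For these parameters the pointwise difference is already below 10⁻³ for every k, which is the practical content of the approximation theorem.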
- Lecture 4 (11.3): Notes lecture 4
Expectation of a discrete random variable: introduction and definition. Existence of the expectation: issues with (absolute) convergence. Properties of the expectation: monotonicity, linearity. LOTUS: law of the unconscious statistician. Examples: expectations of Bernoulli, binomial, geometric.
Conditional expectation, law of total expectation.
Further references: [RS] Lecture 4, [MIT] Lectures 5-6.
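The definition of expectation, LOTUS, and linearity can all be verified directly on a finite pmf (no convergence issues arise there). A small sketch on a fair die, my own example:

```python
from fractions import Fraction

# pmf of X, a fair die: uniform on {1, ..., 6}
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def E(g, pmf):
    """LOTUS: E[g(X)] = sum over x of g(x) * P(X = x)."""
    return sum(g(x) * p for x, p in pmf.items())

EX  = E(lambda x: x, pmf)       # E[X]   = 7/2
EX2 = E(lambda x: x * x, pmf)   # E[X^2] = 91/6
assert EX == Fraction(7, 2)

# Linearity of expectation: E[2X + 1] = 2 E[X] + 1
assert E(lambda x: 2 * x + 1, pmf) == 2 * EX + 1
```

For distributions with infinite support (e.g. the geometric), the same sum becomes a series, which is exactly where the absolute-convergence caveat from the lecture matters.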
- Lecture 5 (18.3): Notes lecture 5
The variance: motivation, definition, basic properties. Joint distribution of multiple random variables: definition and a basic example. Marginals, conditionals. Independence of discrete random variables and its consequences for the expectation and variance. The convolution formula.
Further references: [RS] Lectures 5-6, [MIT] Lectures 6-8.
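The convolution formula P(X + Y = z) = Σₓ P(X = x) P(Y = z − x) for independent discrete X, Y can be computed directly; two independent fair dice serve as a standard illustration (my own example, not from the notes):

```python
from fractions import Fraction
from collections import defaultdict

# pmf of a single fair die
die = {x: Fraction(1, 6) for x in range(1, 7)}

def convolve(pX, pY):
    """pmf of X + Y for independent X, Y with pmfs pX, pY (convolution formula)."""
    pZ = defaultdict(Fraction)   # Fraction() == 0, so missing keys start at 0
    for x, px in pX.items():
        for y, py in pY.items():
            pZ[x + y] += px * py
    return dict(pZ)

two_dice = convolve(die, die)
assert two_dice[7] == Fraction(1, 6)        # 7 is the most likely total
assert sum(two_dice.values()) == 1          # the result is again a pmf
```

The triangular shape of `two_dice` (probabilities rising to z = 7 and falling back) is a first glimpse of the smoothing effect of convolution.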