Probability and Statistics 1 (Summer 2025)
This course covers a range of topics concerning randomness. The first half is dedicated to Probability theory, the second half to Statistics.
Prerequisites: Discrete Mathematics, Analysis 1 (essential); Linear Algebra 1 & 2, Analysis 2, Combinatorics & Graphs 1 (recommended).
Organisational remarks
The exam will be written. To sit the exam, a zápočet (course credit) is required; it must be obtained in one of the tutorials assigned to this lecture (see below).
Neither old zápočets nor zápočets obtained in the parallel Czech tutorials are valid.
Reading materials:
Tutorials
The tutorials will be held by Gaurav Kucheriya, Lluís Sabater Rojas, and me.
Syllabus
Lecture 1 (19.2): Notes lecture 1
Introduction. Two interpretations of probability. Probability theory vs Statistics. Problems with the naïve approach: Bertrand's paradox, Vitali sets (both non-examinable).
Probability spaces, Kolmogorov's axioms. Examples: uniform and geometric distributions. Basic properties of probability measures (monotonicity, inclusion-exclusion, continuity, union bound).
Further references: [RS] Lecture 1. [MIT] Lecture 1. Wikipedia: Bertrand's paradox, Vitali set (see also this video).
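For quick reference, Kolmogorov's axioms in their standard formulation: a probability measure P on a sigma algebra F of subsets of a sample space Ω satisfies
\[ P(A) \ge 0 \ \text{for all } A \in \mathcal{F}, \qquad P(\Omega) = 1, \qquad P\Big( \bigcup_{i=1}^{\infty} A_i \Big) = \sum_{i=1}^{\infty} P(A_i) \]
for any sequence of pairwise disjoint events A_1, A_2, ... The union bound drops the disjointness assumption and weakens the equality to an inequality.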
Lecture 2 (26.2): Notes lecture 2
Conditional probability, independence of two events. Operations with conditional probabilities: chain rule, the law of total probability. Application: Gambler's ruin. Bayes' theorem and its interpretation. Independence for a family of events (vs pairwise independence).
Further references: [RS] Lectures 1 and 2, [MIT] Lectures 2 and 3.
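The two identities at the core of this lecture, in their standard form: for events A and B with P(B) > 0,
\[ P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)} \qquad \text{(Bayes' theorem)}, \]
and, for a partition A_1, A_2, ... of the sample space with P(A_i) > 0,
\[ P(B) = \sum_i P(B \mid A_i)\, P(A_i) \qquad \text{(law of total probability)}. \]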
Lecture 3 (5.3): Notes lecture 3
Discrete random variables, probability mass function, cumulative distribution function. Common discrete distributions: Bernoulli, Binomial, Geometric, Hypergeometric, Poisson. Poisson as an approximation for Binomial. A heuristic example with meteorites. Operations on random variables: applying univariate and bivariate functions.
Further references: [RS] Lecture 3, [MIT] Lecture 5.
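The Poisson approximation in its standard form: if X_n ~ Bin(n, λ/n) for a fixed λ > 0, then for every k ≥ 0,
\[ P(X_n = k) = \binom{n}{k} \Big( \frac{\lambda}{n} \Big)^{k} \Big( 1 - \frac{\lambda}{n} \Big)^{n-k} \;\longrightarrow\; e^{-\lambda} \frac{\lambda^{k}}{k!} \qquad (n \to \infty). \]
Informally: Bin(n, p) is well approximated by Poisson(np) when n is large and p is small.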
Lecture 4 (12.3): Notes lecture 4
Expectation of a discrete random variable: introduction and definition. Existence of the expectation: issues with (absolute) convergence. Properties of the expectation: monotonicity, linearity. LOTUS: law of the unconscious statistician. Examples: expectations of the Bernoulli, Binomial, and Geometric distributions.
Conditional expectation, law of total expectation.
Further references: [RS] Lecture 4, [MIT] Lectures 5-6.
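In symbols, the two key formulas of this lecture (standard statements): for a discrete random variable X with probability mass function p_X, a function g, and a partition A_1, A_2, ... of the sample space with P(A_i) > 0,
\[ E[g(X)] = \sum_{x} g(x)\, p_X(x) \qquad \text{(LOTUS)}, \qquad E[X] = \sum_{i} E[X \mid A_i]\, P(A_i) \qquad \text{(total expectation)}, \]
provided the sums converge absolutely.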
Lecture 5 (19.3): Notes lecture 5
The variance: motivation, definition, basic properties. Joint distribution of multiple random variables: definition and a basic example. Marginals, conditionals. Independence of discrete random variables and its consequences for the expectation and variance. The convolution formula.
Further references: [RS] Lectures 5-6, [MIT] Lectures 6-8.
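For reference, the standard identities: Var(X) = E[X^2] - (E[X])^2, and, for independent discrete random variables X and Y,
\[ E[XY] = E[X]\, E[Y], \qquad \operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y), \qquad P(X+Y = z) = \sum_{x} P(X = x)\, P(Y = z - x), \]
the last being the convolution formula.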
Lecture 6 (26.3): Notes lecture 6
Continuous measures and random variables: the Lebesgue sigma algebra, the Lebesgue measure, the Lebesgue integral (facts). The uniform measure on an interval. Common continuous distributions: Exponential, Cauchy, Normal, Gamma.
General random variables: definition. Continuous random variables: pushforward measure, probability density function, cumulative distribution function.
Further references: [RS] Lectures 6-7, [MIT] Lecture 8.
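In the continuous setting, density and distribution function determine one another in the standard way: if X has density f_X, then
\[ F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\, dt. \]
For example, the Exponential(λ) distribution has density f(t) = λ e^{-λt} for t ≥ 0, hence F(x) = 1 - e^{-λx} for x ≥ 0.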
Lecture 7 (2.4): Notes lecture 7
Expectation of continuous rv's: LOTUS, linearity. Variance and its properties. Joint distribution and independence. Examples: multivariate uniform and normal distributions.
Further references: [RS] Lecture 8, [MIT] Lectures 8-9.
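The continuous analogues of the formulas from Lecture 4, in their standard form: for X with density f_X and a function g,
\[ E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx, \]
whenever the integral converges absolutely. As a concrete instance, the N(μ, σ²) density is
\[ f(x) = \frac{1}{\sigma \sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)}. \]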
Lecture 8 (9.4): Notes lecture 8
Covariance and correlation, the probabilistic Cauchy-Schwarz inequality. The convolution formula for continuous rv's. The Markov and Chebyshev inequalities.
Further references: [RS] Lecture 9, [MIT] Lectures 11, 19.
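The two tail bounds from this lecture, in their standard form: for a nonnegative random variable X and any a > 0,
\[ P(X \ge a) \le \frac{E[X]}{a} \qquad \text{(Markov)}, \]
and for any Y with finite variance,
\[ P(|Y - E[Y]| \ge a) \le \frac{\operatorname{Var}(Y)}{a^2} \qquad \text{(Chebyshev)}, \]
the latter following from Markov applied to (Y - E[Y])^2.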
Lecture 9 (16.4): Notes lecture 9
Convergence in probability. The weak law of large numbers. Convergence almost surely. The strong law of large numbers (statement). Applications: the Monte Carlo method for computing volumes; Borel's normal numbers.
Weak convergence. The central limit theorem (statement, discussion). The coin toss example: Chebyshev vs CLT's normal approximation.
Further references: [RS] Lecture 10, [MIT] Lectures 19, 20.
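A minimal sketch of the Monte Carlo method from this lecture (an illustration only, not code from the course; the function name is made up): estimate the area of the unit disc, i.e. π, by sampling the square [-1, 1]^2 uniformly. By the law of large numbers, the fraction of sample points landing in the disc converges to area(disc)/area(square) = π/4.

    import random

    def estimate_disc_area(n_samples=1_000_000):
        # Sample points uniformly from the square [-1, 1]^2 and count
        # how many land inside the unit disc x^2 + y^2 <= 1.
        hits = 0
        for _ in range(n_samples):
            x, y = random.uniform(-1, 1), random.uniform(-1, 1)
            if x * x + y * y <= 1:
                hits += 1
        # hits / n_samples estimates pi / 4, so rescale by the square's area.
        return 4 * hits / n_samples

    print(estimate_disc_area())  # typically prints a value close to 3.1416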
Lecture 10 (23.4): Notes lecture 10
Introduction to Statistics. Estimators and their properties.
Further references: [RS] Lecture 11, [MIT] Lecture 23.
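To fix terminology for the statistics part, the standard definitions: an estimator θ̂_n = θ̂_n(X_1, ..., X_n) of a parameter θ is called unbiased if
\[ E[\hat{\theta}_n] = \theta, \]
and consistent if θ̂_n → θ in probability as n → ∞. For instance, the sample mean (X_1 + ... + X_n)/n is an unbiased estimator of the mean and, by the weak law of large numbers, a consistent one.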