7 Compute and manipulate discrete probabilities
7.1 Use the classical definition of probability
Description
This subsection introduces the simplest notion of probability for discrete (i.e. countable) sample spaces, under the assumption that all elementary events are equally likely. A short Python sketch illustrating such computations follows the objectives below.
(Rosen2019, p.469-475) Introduction to Discrete Probability
(Brehm2020_7_1_1) An Intro to Discrete Probability
(Brehm2020_7_1_2) Discrete Probability Practice
7.1.1. Define the classical definition of probability and event
7.1.2. Define union, intersection and complement of events
7.1.3. Compute probabilities of events and combinations of events
7.1.4. Analyze probabilistic “paradoxes” and surprises
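As an illustration of objectives 7.1.1–7.1.4, here is a minimal Python sketch (not taken from the readings) that computes classical probabilities by enumerating the equally likely outcomes of rolling two fair dice; the dice example and all identifiers are chosen here for illustration only.

```python
from fractions import Fraction
from itertools import product

# Sample space: the 36 equally likely outcomes of rolling two fair dice.
sample_space = set(product(range(1, 7), repeat=2))

def prob(event):
    """Classical definition: P(E) = |E| / |S| when all outcomes are equally likely."""
    return Fraction(len(event), len(sample_space))

# Events are subsets of the sample space.
sum_is_seven = {(a, b) for (a, b) in sample_space if a + b == 7}
has_a_six    = {(a, b) for (a, b) in sample_space if 6 in (a, b)}

print(prob(sum_is_seven))                  # 1/6
print(prob(sum_is_seven | has_a_six))      # union:        5/12
print(prob(sum_is_seven & has_a_six))      # intersection: 1/18
print(prob(sample_space - has_a_six))      # complement:   25/36
```

Representing events as Python sets makes the union, intersection and complement of 7.1.2 literal set operations.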
7.2 Introduce the basic concepts of the probability theory
Description
This subsection builds on those definitions and introduces the fundamental concepts of discrete probability theory. A short Python sketch of the Birthday problem and the binomial distribution follows the objectives below.
(Rosen2019, p.477-491) Probability Theory
(Brehm2020_7_2_1) Probability Theory
(Brehm2020_7_2_2) Random Variables and the Binomial Distribution
7.2.1. Add probabilities of elementary events in various combinations
7.2.2. Use the sum rule for pairwise disjoint events
7.2.3. Use the inclusion/exclusion principle for probabilities
7.2.4. Define and use conditional probabilities
7.2.5. Define and use independent events
7.2.6. Define the Bernoulli distribution
7.2.7. Define the binomial distribution
7.2.8. Define and use the concept of random variable
7.2.9. Introduce and solve the Birthday problem (1st kind of waiting time)
7.2.10. Introduce other types of waiting times
7.2.11. Define probabilistic algorithms and probabilistic simulations
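To connect objectives 7.2.7, 7.2.9 and 7.2.11, the sketch below (an illustration, not taken from the readings) computes the exact Birthday-problem probability, estimates it with a probabilistic simulation, and evaluates the binomial probability mass function; the function names are chosen here for illustration.

```python
import random
from math import comb

def birthday_collision_prob(n, days=365):
    """Exact probability that at least two of n people share a birthday:
    1 - (days * (days - 1) * ... * (days - n + 1)) / days**n."""
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (days - k) / days
    return 1 - p_all_distinct

def birthday_simulation(n, trials=100_000, days=365):
    """Monte Carlo (probabilistic simulation) estimate of the same probability."""
    hits = sum(
        len(set(random.randrange(days) for _ in range(n))) < n
        for _ in range(trials)
    )
    return hits / trials

def binomial_pmf(k, n, p):
    """Binomial distribution: P(X = k) = C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(birthday_collision_prob(23))   # ~0.507: exceeds 1/2 already for 23 people
print(birthday_simulation(23))       # should be close to the exact value
print(binomial_pmf(3, 10, 0.5))      # exactly 3 heads in 10 fair coin tosses
```

Comparing the simulated estimate with the exact value gives a first taste of the probabilistic simulations mentioned in 7.2.11.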
7.3 Apply Bayes’ theorem
Description
This subsection shows a very practical way to express unknown conditional probabilities using directly observable ones, namely the "reversed" conditional probabilities. A short worked spam-filter sketch follows the objectives below.
(Rosen2019, p.494-501) Bayes' Theorem
7.3.1. Formulate and prove Bayes’ Theorem
7.3.2. Formulate some variants and generalizations of Bayes’ Theorem
7.3.3. Show applications of Bayes' Theorem in text clustering, spam filters and Data Leak Prevention
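As a small illustration of 7.3.3, the following Python sketch applies Bayes' Theorem to a one-word spam filter; all numbers (the prior and the word frequencies) are invented for the example and are not taken from the readings.

```python
# Bayes' Theorem with S = "message is spam" and W = "the word occurs":
#   P(S | W) = P(W | S) * P(S) / (P(W | S) * P(S) + P(W | not S) * P(not S))

def posterior_spam(p_spam, p_word_given_spam, p_word_given_ham):
    """Posterior probability that a message is spam, given that the word occurs."""
    numerator = p_word_given_spam * p_spam
    return numerator / (numerator + p_word_given_ham * (1 - p_spam))

# Hypothetical numbers: 20% of mail is spam, the word appears in 50% of spam
# messages and in 2% of legitimate messages.
print(posterior_spam(0.20, 0.50, 0.02))   # ~0.862
```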
7.4 Summarise data with expected value and variance
Description
There are many ways to summarize larger amounts of numeric data (including the median, the inter-quartile range, the mode, etc.), but here we deal only with the two simplest measures. The expected value (also called the mean or expectation) is defined for a random variable and expresses its location. The variance and its square root, the standard deviation, measure the variability of the data, i.e. how far it deviates from the mean value. A short Python sketch of these measures follows the objectives below.
(Rosen2019, p.503-517) Expected Value and Variance
7.4.1. Define and compute the expected value of a random variable
7.4.2. Use the linearity of expected value and other properties
7.4.3. Apply the expected value to find average-case computational complexity
7.4.4. Define the uniform distribution; find its expected value and variance
7.4.5. Define the Poisson distribution and its parameter; find its expected value and variance
7.4.6. Define the geometric distribution and find its expected value (the 2nd type of waiting time)
7.4.7. Define the multinomial distribution and its parameters; find its expected value and variance
7.4.8. Define the hypergeometric distribution and its parameters; find its expected value and variance
7.4.9. Define independence for a set of random variables
7.4.10. Define the variance of a random variable and its standard deviation
7.4.11. Formulate and prove Bienaymé's Formula
7.4.12. Formulate and prove Markov’s Inequality
7.4.13. Formulate and prove Chebyshev’s Inequality
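The sketch below (an illustration with a fair-die example chosen here, not prescribed by the readings) computes the expected value and variance of a finite random variable from its probability mass function, and compares an exact tail probability with the bound given by Chebyshev's Inequality.

```python
def expected_value(pmf):
    """E[X] = sum of x * P(X = x) over a finite pmf given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """V(X) = E[X^2] - (E[X])^2."""
    mu = expected_value(pmf)
    return sum(x * x * p for x, p in pmf.items()) - mu * mu

# Uniform distribution on a fair die: E[X] = 3.5, V(X) = 35/12 ≈ 2.917.
die = {k: 1 / 6 for k in range(1, 7)}
print(expected_value(die), variance(die))

# Chebyshev's Inequality: P(|X - E[X]| >= r) <= V(X) / r**2.
r = 2
tail = sum(p for x, p in die.items() if abs(x - expected_value(die)) >= r)
print(tail, "<=", variance(die) / r**2)   # 1/3 <= 35/48 ≈ 0.729
```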