What it is about
- Randomness
- sample point: ω (a single outcome, e.g. one coin flip result)
- sample space: Ω ({heads, tails})
- event … specific collection of sample points (subset of the sample space)
Discrete Probability
- requires a countable sample space (cannot do half a coin flip)
- P(ω) ≥ 0 for any sample point ω
- ∑_{ω∈Ω} P(ω) = 1 for the entire sample space (sure event)
- P(A) = ∑_{ω∈A} P(ω) ≥ 0 for event A (sum of individual probabilities)
- P(A∪B) = P(A) + P(B) when A∩B = ∅
- … additivity of probability
- P(ω) = 0 is possible (hitting a specific point on a dartboard)
- random variable: a function mapping the result of an experiment to a real number
- e.g. when losing/winning a bet money gets transferred
Expectation
- weighted average of all outcomes according to their probabilities
- in a casino the expectation per unit staked is slightly below 1, e.g. 0.95
- the house always wins
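A minimal sketch of the weighted-average idea; the slot-machine payout table below is invented for illustration:

```python
# Hypothetical payout table per 1 unit staked: payout -> probability.
# The numbers are made up for illustration.
payouts = {0: 0.70, 1: 0.15, 2: 0.10, 5: 0.05}

# weighted average: 0*0.70 + 1*0.15 + 2*0.10 + 5*0.05 = 0.60
expectation = sum(x * p for x, p in payouts.items())
```

An expectation of 0.60 per unit staked means the house keeps 0.40 on average.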
Variance
- how spread out the results are around the expectation
- casino gambling machine (insert 1)
- variance in low cashout region is low (0.2, 0.3, 0.4)
- variance in high cashout region is high (2, 5, 50)
- V(X) = E[X²] − E[X]² is usually the most convenient formula to use
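The formula V(X) = E[X²] − E[X]² can be applied directly to a payout table; the probabilities here are invented for illustration:

```python
# Hypothetical payout table: payout -> probability (invented numbers).
payouts = {0: 0.70, 1: 0.15, 2: 0.10, 5: 0.05}

e_x  = sum(x * p for x, p in payouts.items())      # E[X]
e_x2 = sum(x**2 * p for x, p in payouts.items())   # E[X^2]
variance = e_x2 - e_x**2                           # V(X) = E[X^2] - E[X]^2
```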
Covariance
- Cov(X,Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
- measures how much two random variables deviate from their means together
- the sign is important, not the actual value
- positive → deviations move in the same direction
- negative → deviations move in opposite directions
- computed from the joint distribution function
- first value compared to first, second to second, …
- the order of the pairing matters
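A minimal sketch of a sample covariance, reusing the temperature/ice-cream example from later in these notes; the data points are invented:

```python
# Invented paired samples: daily temperature (deg C) and ice cream sales.
temps = [18, 22, 25, 30, 33]
sales = [20, 35, 50, 70, 90]

n = len(temps)
mean_t = sum(temps) / n
mean_s = sum(sales) / n
# pair first with first, second with second, ... and average the
# products of the deviations from the means
cov = sum((t - mean_t) * (s - mean_s) for t, s in zip(temps, sales)) / n
```

Here cov comes out positive, matching the intuition that both quantities deviate upward together.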
Continuous Probability
- all point-probabilities are 0
- probabilities come from the integral under the probability density curve
- closed/open interval irrelevant (single points carry probability 0)
Bernoulli Distribution - Ber(p)
- result of experiment only has 2 possibilities
- e.g. number is greater than another, coin flip, window is open or not
- E[X] = 1·p + 0·(1−p) = p
- V[X] = E[X²] − E[X]² = p − p² = p(1−p)
Discrete Uniform Distribution
- all results of the experiment are equally likely
- dice toss, coin flip, roulette
- for outcomes 1, …, n:
- E[X] = (n+1)/2
- V(X) = (n²−1)/12
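The two formulas can be checked against direct summation for a fair die (n = 6); this sketch is not part of the original notes:

```python
# Fair die: outcomes 1..6, each with probability 1/6.
n = 6
probs = {k: 1 / n for k in range(1, n + 1)}

e = sum(k * p for k, p in probs.items())            # E[X]
v = sum(k**2 * p for k, p in probs.items()) - e**2  # V(X) = E[X^2] - E[X]^2

assert abs(e - (n + 1) / 2) < 1e-9       # (n+1)/2 = 3.5
assert abs(v - (n**2 - 1) / 12) < 1e-9   # (n^2-1)/12
```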
Binomial Distribution
- repeating a Bernoulli Experiment
- n … number of repetitions
- k … number of successful experiments
- e.g. 3 / 5 heads in coin toss
- E(X) = n·p
- V(X) = n·p·(1−p)
- in R:
  dbinom(0:n, n, p.suc)  # P(X = k) for k = 0, ..., n
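The same pmf can be written out by hand; this Python sketch mirrors what R's dbinom computes:

```python
from math import comb

# P(X = k) for X ~ Bin(n, p): choose k successes out of n trials
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# e.g. exactly 3 heads in 5 fair coin tosses
p_3_of_5 = binom_pmf(3, 5, 0.5)  # 10 * 0.5**5 = 0.3125
```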
Poisson Distribution
- poisson … French for fish
- models counts of events occurring at a given rate (intensity)
- λ … intensity factor (how many fish jump out of the water in the lake during a time period)
- E(X)=Ξ»
- V(X)=Ξ»
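A sketch of the Poisson pmf, P(X = k) = λ^k · e^(−λ) / k!, with a numeric check that the mean comes out as λ:

```python
from math import exp, factorial

# P(X = k) for X ~ Poisson(lam)
def pois_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# sanity check: E(X) = lambda (the truncated tail beyond k = 100 is negligible)
lam = 2.0
mean = sum(k * pois_pmf(k, lam) for k in range(100))
```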
From discrete to continuous probability
- wheel of fortune
- probability of a single point → 0
- probability of landing in one of n equal regions → 1/n
- probabilities come from the integral between two values of the density
- e.g. wheel of fortune landing in the top half → 0.5
Gaussian (Normal) Distribution
- parameters with defaults: ΞΌ = 0 and Ο = 1
- area under the curve (∫_{−∞}^{∞} f(x) dx = 1) is always 1
- when calculating or looking up the value of a probability
- area under each half is 0.5 → symmetry of the curve
- important to know whether the interval extends to the right or left → subtract or add
- when looking up z given a probability
- read the table the other way round → find the probability and read off z
- E[X]=ΞΌ
- V(X)=Ο2
- Z = (X − μ)/σ … shifting by μ and scaling by σ
- Z is now a standard normal variable
- X = σ·Z + μ
- insert the formula with Z into the original probability
- then transform the from and to values according to the Z expression
- look up probability
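The z-table lookup can be emulated with the error function from the standard library; a sketch of the standardization steps above:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF, i.e. the z-table value for z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# P(a < X < b) for X ~ N(mu, sigma^2): standardize both bounds with
# Z = (X - mu) / sigma, then subtract the two table values
def normal_prob(a, b, mu, sigma):
    return phi((b - mu) / sigma) - phi((a - mu) / sigma)

# symmetry check: half the mass lies below the mean
half = normal_prob(float("-inf"), 0, 0, 1)  # 0.5
```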
Log-Normal Distribution
- if the logarithm of the function is normally distributed
- formulas are on the formula sheet
- problem with Gaussian → negative values always get positive probability (50% when μ = 0)
- no negative demand or price in economics β therefore log-normal
- log-normal can only have positive x-values
- then the logarithm is normally distributed
Exponential Distribution
- related to Poisson Distribution
- events occurring at some rate, counting the time between events (e.g. 5 minutes between buses)
- made to measure and calculate waiting time
- E[X] = 1/λ
- V(X) = 1/λ²
- different parameters than Ξ» possible (e.g. Ξ²)
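A small sketch of the waiting-time interpretation, using the survival function P(T > t) = e^(−λt); the bus numbers are invented:

```python
from math import exp

# Buses arrive at rate lam per minute; lam = 1/5 means one bus
# every 5 minutes on average, so E[T] = 1/lam = 5 minutes.
lam = 1 / 5
mean_wait = 1 / lam              # E[T] = 5
p_wait_over_10 = exp(-lam * 10)  # P(T > 10) = e^(-2)
```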
Independent Random Variables
- loosely related to the correlation/causation question
- example: temperature and ice cream consumption
- for independent X, Y: E[X·Y] = E[X]·E[Y]
- independence requires Cov(X,Y) = 0
- but Cov(X,Y) = 0 does not imply independence
- for n independent random variables, all pairwise covariances Cov(X_i, X_j), i ≠ j, are 0
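The classic counterexample for the last point: Y is completely determined by X, yet the covariance is 0. A sketch:

```python
# X uniform on {-1, 0, 1}, Y = X**2 (so Y depends entirely on X)
xs = [-1, 0, 1]
e_x  = sum(xs) / 3                    # E[X]  = 0
e_y  = sum(x**2 for x in xs) / 3      # E[Y]  = 2/3
e_xy = sum(x * x**2 for x in xs) / 3  # E[XY] = E[X^3] = 0
cov = e_xy - e_x * e_y                # 0, despite full dependence
```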
Properties of Expectation, Variance, Convergence
- playing around with formulas and proving parts of the formula sheet
- expectation: E[aX + bY] = a·E[X] + b·E[Y] … linear function
- variance (X, Y dependent): V(aX + bY) = a²·V(X) + 2ab·Cov(X,Y) + b²·V(Y)
- variance (X, Y independent): V(aX + bY) = a²·V(X) + b²·V(Y)
- X, Y independent … Cov(X,Y) = 0
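The independent-case variance formula can be verified exactly on two independent fair coins (a sketch; the coefficients a = 2, b = 3 are arbitrary):

```python
# Two independent fair coins X, Y in {0, 1}; joint pmf is the product 0.5 * 0.5.
a, b = 2, 3
joint = [(x, y, 0.25) for x in (0, 1) for y in (0, 1)]

e  = sum((a * x + b * y) * p for x, y, p in joint)
e2 = sum((a * x + b * y)**2 * p for x, y, p in joint)
v = e2 - e**2                  # V(aX + bY) computed directly

v_coin = 0.25                  # variance of one fair coin: p(1-p) with p = 0.5
assert abs(v - (a**2 * v_coin + b**2 * v_coin)) < 1e-9  # a^2 V(X) + b^2 V(Y)
```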