Quick Summary (Probability)

JC Mathematics, Mathematics, University Mathematics

University is starting for some students who took the A-levels in 2016, and one of my ex-students asked me to share/summarise the things to know for probability at university level. Hopefully this helps. H2 Further Mathematics students will also find some of these helpful.


Random Variables

Suppose X is a random variable which can take values x \in \chi.

X is a discrete r.v. if \chi is countable.
\Rightarrow p(x) is the probability of the value x and is called the probability mass function.

X is a continuous r.v. if \chi is uncountable.
\Rightarrow f(x) is the probability density function and can be thought of, loosely, as the relative likelihood of the value x.

Probability Mass Function

For a discrete r.v. the probability mass function (PMF) is

p(a) = P(X=a), where a \in \mathbb{R}.

Probability Density Function

If B = (a, b)

P(X \in B) = P(a \le X \le b) = \int_a^b f(x) ~dx.

And strictly speaking,

P(X = a) = \int_a^a f(x) ~dx = 0.

Intuitively, for a small \delta > 0,

f(a) \delta \approx P(a \le X \le a + \delta),

so the density f(a) measures how likely X is to be near a, rather than the probability that X equals a.

Properties of Distributions

For a discrete r.v.,
p(x) \ge 0 ~ \forall x \in \chi.
\sum_{x \in \chi} p(x) = 1.

For a continuous r.v.,
f(x) \ge 0 ~ \forall x \in \chi.
\int_{x \in \chi} f(x) ~dx = 1.
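
As a quick sanity check, here is a minimal sketch (assuming Python with numpy is available; the fair die and the standard normal density are my own illustrative choices) verifying these two properties numerically.

import numpy as np

# Discrete case: a fair six-sided die with p(x) = 1/6 for x = 1, ..., 6
p = {x: 1/6 for x in range(1, 7)}
print(all(prob >= 0 for prob in p.values()))  # True: p(x) >= 0 for all x
print(sum(p.values()))                        # 1.0 (up to floating-point rounding)

# Continuous case: standard normal density f(x) = exp(-x^2/2) / sqrt(2*pi)
x = np.linspace(-10, 10, 100001)
f = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
print(np.all(f >= 0))                         # True: f(x) >= 0 for all x
print(np.trapz(f, x))                         # approximately 1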

Cumulative Distribution Function

For a discrete r.v., the Cumulative Distribution Function (CDF) is
F(a) = P(X \le a) = \sum_{x \le a} p(x).

For a continuous r.v., the CDF is
F(a) = P(X \le a) = \int_{- \infty}^a f(x) ~dx.

Expected Value

For a discrete r.v. X, the expected value is
\mathbb{E} (X) = \sum_{x \in \chi} x p(x).

For a continuous r.v. X, the expected value is
\mathbb{E} (X) = \int_{x \in \chi} x f(x) ~dx.

If Y = g(X), then

For a discrete r.v. X,
\mathbb{E} (Y) = \mathbb{E} [g(X)] = \sum_{x \in \chi} g(x) p(x).

For a continuous r.v. X,
\mathbb{E} (Y) = \mathbb{E} [g(X)] = \int_{x \in \chi} g(x) f(x) ~dx.
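
For a concrete discrete example (a small sketch in Python; the fair die is an illustrative choice of mine), the two sums above give:

# Fair six-sided die: E(X) = 3.5 and E(X^2) = 91/6
p = {x: 1/6 for x in range(1, 7)}

E_X = sum(x * px for x, px in p.items())        # sum of x * p(x)
E_gX = sum(x ** 2 * px for x, px in p.items())  # E[g(X)] with g(x) = x^2

print(E_X)   # 3.5
print(E_gX)  # 15.1666..., i.e. 91/6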

Properties of Expectation

For random variables X and Y and constants a, b \in \mathbb{R}, the expected value has the following properties (applicable to both discrete and continuous r.v.s):

\mathbb{E}(aX + b) = a \mathbb{E}(X) + b

\mathbb{E}(X + Y) = \mathbb{E}(X) + \mathbb{E}(Y)

Realisations of X, denoted by x, may be larger or smaller than \mathbb{E}(X). If you observe many realisations of X, \mathbb{E}(X) is roughly the average of the values you would observe.

For a continuous r.v., the first property follows directly from the definition:

\mathbb{E} (aX + b)
= \int_{- \infty}^{\infty} (ax+b)f(x) ~dx
= \int_{- \infty}^{\infty} axf(x) ~dx + \int_{- \infty}^{\infty} bf(x) ~dx
= a \int_{- \infty}^{\infty} xf(x) ~dx + b \int_{- \infty}^{\infty} f(x) ~dx
= a \mathbb{E} (X) + b

Variance

Generally speaking, variance is defined as

Var(X) = \mathbb{E}[(X- \mathbb{E}(X))^2] = \mathbb{E}[X^2] - \mathbb{E}[X]^2

If X is discrete:

Var(X) = \sum_{x \in \chi} ( x - \mathbb{E}[X])^2 p(x)

If X is continuous:

Var(X) = \int_{x \in \chi} ( x - \mathbb{E}[X])^2 f(x) ~dx

Using the properties of expectations, we can show Var(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2.

Var(X)
= \mathbb{E} [(X - \mathbb{E}[X])^2]
= \mathbb{E} [X^2 - 2X \mathbb{E}[X] + \mathbb{E}[X]^2]
= \mathbb{E}[X^2] - 2\mathbb{E}[X]\mathbb{E}[X] + \mathbb{E}[X]^2
= \mathbb{E}[X^2] - \mathbb{E}[X]^2
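
Continuing the fair-die sketch (again an illustrative example of mine, assuming Python), both forms of the variance agree:

# Fair six-sided die
p = {x: 1/6 for x in range(1, 7)}

E_X = sum(x * px for x, px in p.items())
E_X2 = sum(x ** 2 * px for x, px in p.items())

var_definition = sum((x - E_X) ** 2 * px for x, px in p.items())  # E[(X - E[X])^2]
var_shortcut = E_X2 - E_X ** 2                                    # E[X^2] - E[X]^2

print(var_definition, var_shortcut)  # both equal 35/12 = 2.9166...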

Standard Deviation

The standard deviation is defined as

std(X) = \sqrt{Var(X)}

Covariance

For two random variables X and Y, the covariance is generally defined as

Cov(X, Y) = \mathbb{E}[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])]

Note that Cov(X, X) = Var(X), and that

Cov(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X] \mathbb{E}[Y].

Properties of Variance

Given random variables X and Y, and constants a, b, c \in \mathbb{R},

Var(aX \pm bY + c) = a^2 Var(X) + b^2 Var(Y) \pm 2ab ~ Cov(X, Y)

The proof of the above follows from the definitions of expectation and variance.

Properties of Covariance

Given random variables W, X, Y and Z and constants a, b \in \mathbb{R},

Cov(X, a) = 0

Cov(aX, bY) = ab Cov(X, Y)

Cov(W+X, Y+Z) = Cov(W, Y) + Cov(W, Z) + Cov(X, Y) + Cov(X, Z)
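
Here is a small sketch (assuming Python; the joint distribution is an arbitrary toy example of mine) that checks the second property, Cov(aX, bY) = ab Cov(X, Y), exactly on a finite joint pmf.

# Toy joint pmf over (x, y), chosen arbitrarily so that it sums to 1
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def E(g):
    # Expectation of g(x, y) under the joint pmf
    return sum(g(x, y) * p for (x, y), p in joint.items())

def cov(g1, g2):
    # Cov(g1, g2) = E[g1 * g2] - E[g1] * E[g2]
    return E(lambda x, y: g1(x, y) * g2(x, y)) - E(g1) * E(g2)

a, b = 2.0, -3.0
lhs = cov(lambda x, y: a * x, lambda x, y: b * y)  # Cov(aX, bY)
rhs = a * b * cov(lambda x, y: x, lambda x, y: y)  # ab Cov(X, Y)
print(lhs, rhs)  # equal up to floating-point rounding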

Correlation

Correlation is defined as

Corr(X, Y) = \dfrac{Cov(X, Y)}{Std(X) Std(Y)}

By the Cauchy–Schwarz inequality, -1 \le Corr(X, Y) \le 1.

The properties of correlations of sums of random variables follow from those of covariance and standard deviations above.

Probability Question #4

JC Mathematics

A gambler bets on one of the integers from 1 to 6. Three fair dice are then rolled. If the gambler’s number appears k times (k = 1, 2, 3), he wins $k. If his number fails to appear, he loses $1. Calculate the gambler’s expected winnings.
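
If you want to check your answer afterwards, here is a quick brute-force sketch (assuming Python) that enumerates all 216 equally likely outcomes:

from itertools import product
from fractions import Fraction

bet = 1  # the gambler's chosen number; by symmetry the answer is the same for any choice
expected = Fraction(0)
for roll in product(range(1, 7), repeat=3):  # all 6^3 = 216 equally likely outcomes
    k = roll.count(bet)
    winnings = k if k > 0 else -1            # win $k if the number appears k times, else lose $1
    expected += Fraction(winnings, 6 ** 3)
print(expected, float(expected))             # the exact and decimal expected winnings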

Question of the Day #18

JC Mathematics

Here is a very very interesting question involving probability that a student saw in her tutorial and asked me. Here it is 🙂

A student is concerned about her car and does not like dents. When she drives to school, she has a choice of parking it on the street in one space, parking it on the street and taking up two spaces, or parking in the lot.
If she parks on the street in one space, her car gets dented with probability 0.1.
If she parks on the street and takes two spaces, the probability of a dent is 0.02 and the probability of a $15 ticket is 0.3.
Parking in a lot costs $5, but the car will not get dented.
If her car gets dented, she can have it repaired, in which case it is out of commission for 1 day and costs her $50 in fees and cab fares. She can also drive her car dented, but she feels that the resulting loss of value and pride is equivalent to a cost of $9 per school day.
She wishes to determine the optimal policy for where to park and whether to repair the car when dented, in order to minimise her (long-run) expected average cost per school day. What should the student do to maximise her utility (minimise her cost)?

This is an interesting question. I guess it’s good to know some JCs are trying to introduce the decision-making process in teaching probability.

I’ll post a solution here soon. But to start off, we observe that we have two states here and the student has four decisions. Have fun! 🙂

2002 A-level H2 Mathematics Paper 2 Question 30 Suggested Solutions

JC Mathematics

All solutions here are SUGGESTED. Mr. Teng will hold no liability for any errors. Comments are entirely personal opinions.

(a)
(i)
\frac{7 \times 6 \times 5}{7 \times 7 \times 7} = \frac{30}{49}
Alternatively, \frac{^7 P_3}{7^3}

(ii)
When n=4, required probability = \frac{^7 P_4}{7^4} = \frac{7 \times 6 \times 5 \times 4}{7 \times 7 \times 7 \times 7} = \frac{120}{343}

(b)
Key GC with either sequence or use table functions.
We want to solve \frac{^{12} C_n \times n!}{12^n} < \frac{1}{2}
Plot y_1 = \frac{^{12} C_x \times x!}{12^x} and check for the x that gives a corresponding y_1 value that is less than \frac{1}{2}
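
Away from the GC, a short sketch (assuming Python 3.8+ for math.perm) can search for the smallest n for which this probability drops below one half:

from math import perm

for n in range(1, 13):
    prob = perm(12, n) / 12 ** n  # probability that all n outcomes (from 12 equally likely ones) are different
    if prob < 0.5:
        print(n, prob)            # the smallest such n, and the corresponding probability
        break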

(c)
When n = 21, probability that all 21 people have different birthdays = \frac{^{365} P_{21}}{365^{21}} = 0.55631

When n = 23, probability that all 23 people have different birthdays = \frac{^{365} P_{23}}{365^{23}} = 0.4927

Thus, when n = 23, probability that at least two of the people share the same birthday = 1 - 0.4927 = 0.5073 (more than \frac{1}{2})
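
The same figures can be reproduced with a short sketch (assuming Python 3.8+ for math.prod):

from math import prod

def p_all_distinct(n):
    # probability that all n people have different birthdays, with 365 equally likely days
    return prod((365 - i) / 365 for i in range(n))

print(p_all_distinct(21))      # approximately 0.5563
print(1 - p_all_distinct(23))  # approximately 0.5073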

KS Comments:

A very interesting probability question here. This shows how counter-intuitive probability can be! The result shows that in a room of 23 people, more than half the time we can expect to find two people who share a birthday.

Financial Engineering (I)

University Mathematics

Before attempting to read what we have here, students should revise their basic probability and linear algebra first.

Financial Engineering (I) #1 – Overview
Financial Engineering (I) #2 – Introduction to No Arbitrage
Financial Engineering (I) #3 – Interest rates and fixed income instruments
Financial Engineering (I) #4 – Floating Rate Bonds and Term Structure of Interest Rates
Financial Engineering (I) #5 – Forward Contracts
Financial Engineering (I) #6 – Swaps
Financial Engineering (I) #7 – Futures
Financial Engineering (I) #8 – Options
Financial Engineering (I) #9 – Options Pricing
Financial Engineering (I) #10 – The 1-Period Binomial Model
Financial Engineering (I) #11 – Option Pricing in the 1-Period Binomial Model
Financial Engineering (I) #12 – The Multi-Period Binomial Model
Financial Engineering (I) #13 – Pricing American Options
Financial Engineering (I) #14 – Replicating Strategies
Financial Engineering (I) #15 – Dividends, Pricing in the Binomial Model
Financial Engineering (I) #16 – Black-Scholes Model
Financial Engineering (I) #17 – Introduction to Term Structure Lattice Models
Financial Engineering (I) #18 – Cash Account and Pricing Zero-Coupon Bonds
Financial Engineering (I) #19 – Fixed Income Derivatives (1)
Financial Engineering (I) #20 – Fixed Income Derivatives (2)
Financial Engineering (I) #21 – The Forward Equation
Financial Engineering (I) #22 – Model Calibration
Financial Engineering (I) #23 – Pricing in a Black-Derman Toy Model
Financial Engineering (I) #24 – Modelling and Pricing Default-able bonds
Financial Engineering (I) #25 – Credit Default Swaps and Pricing Credit Default Swaps
Financial Engineering (I) #26 – Mortgage Mathematics and Mortgage-Backed Securities
Financial Engineering (I) #27 – Prepayment Risks and Pass-Throughs
Financial Engineering (I) #28 – Principal-Only and Interest Only Mortgaged-Backed Securities
Financial Engineering (I) #29 – Collateralised Mortgage Obligations
Financial Engineering (I) #30 – Pricing Mortgage-Backed Securities

Geometric Brownian Motion

JC Mathematics, University Mathematics

This is really important for anyone interested in financial modelling. As the movie The Wolf of Wall Street puts it:

Fugazi. Source: AZ Quotes

They are referring to a geometric Brownian motion.

Firstly, we will begin with the definitions.

We say that a random process, X_t, is a geometric Brownian motion (GBM) if for all t \ge 0,
X_t = X_0 e^{(\mu - \frac{\sigma^2}{2})t + \sigma W_t}
where W_t is a standard Brownian motion.
Here \mu is the drift and \sigma is the volatility. We write X_t \sim GBM (\mu, \sigma).

Also note that
X_{t+s}
= X_0 e^{(\mu - \frac{\sigma^2}{2})(t+s) + \sigma W_{t+s}}
= X_0 e^{(\mu - \frac{\sigma^2}{2})t + \sigma W_{t} + (\mu - \frac{\sigma^2}{2})s + \sigma (W_{t+s} - W_t)}. This is a common technique for working with such expectations.
= X_t e^{(\mu - \frac{\sigma^2}{2})s + \sigma (W_{t+s} - W_t)}. This is very useful for simulating security prices, as in the sketch below.
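
Here is a minimal simulation sketch (assuming Python with numpy; the drift, volatility and step size are illustrative values of mine) that uses the last identity to step a GBM path forward:

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.05, 0.2          # illustrative drift and volatility
X0, T, steps = 100.0, 1.0, 252
dt = T / steps

X = np.empty(steps + 1)
X[0] = X0
for i in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt))  # increment W_{t+dt} - W_t ~ N(0, dt)
    X[i + 1] = X[i] * np.exp((mu - 0.5 * sigma ** 2) * dt + sigma * dW)

print(X[-1])  # one simulated terminal price X_T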

Consider \mathbb{E}_t [X_{t+s}]

\mathbb{E}_t [X_{t+s}]
= \mathbb{E}_t [X_{t}e^{(\mu - \frac{\sigma^2}{2})s + \sigma(W_{t+s} - W_t)}]. Notice this expansion is the same as before.
= X_t e^{(\mu - \frac{\sigma^2}{2})s} \mathbb{E}_t [e^{\sigma (W_{t+s} - W_t)}]
= X_t e^{(\mu - \frac{\sigma^2}{2})s} e^{\frac{\sigma^2}{2}s}, since W_{t+s} - W_t \sim \mathrm{N}(0, s) and so \mathbb{E}_t[e^{\sigma (W_{t+s} - W_t)}] = e^{\frac{\sigma^2}{2}s}
= e^{\mu s} X_t
This result tells us that the expected growth rate of X_t is \mu.
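
This growth-rate result can be checked by Monte Carlo (a sketch under the same illustrative parameters as above, taking t = 0):

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, X0, s = 0.05, 0.2, 100.0, 1.0  # illustrative parameters
n = 200_000

W_s = rng.normal(0.0, np.sqrt(s), size=n)  # W_s ~ N(0, s)
X_s = X0 * np.exp((mu - 0.5 * sigma ** 2) * s + sigma * W_s)

print(X_s.mean())            # Monte Carlo estimate of E[X_s]
print(X0 * np.exp(mu * s))   # the claimed value X_0 e^{mu s}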

We now extend the defining properties of Brownian motion introduced earlier to geometric Brownian motion.
1. Fix t_1, t_2, \ldots , t_n. Then \frac{X_{t_2}}{X_{t_1}}, \frac{X_{t_3}}{X_{t_2}}, \ldots, \frac{X_{t_n}}{X_{t_{n-1}}} are mutually independent.
2. Paths of X_t are continuous as a function of t, meaning they do not jump.
3. For s > 0, \mathrm{log}(\frac{X_{t+s}}{X_t}) \sim \mathrm{N}((\mu - \frac{\sigma^2}{2})s, \sigma^2 s)

So now let's try to model stock prices as a geometric Brownian motion.

Suppose X_t \sim GBM(\mu, \sigma). Clearly
1. X_t > 0 \Rightarrow X_{t+s} > 0 for any s > 0
This tells us that the limited liability of stock price is not violated.
2. The distribution of \frac{X_{t+s}}{X_t} depends only on s and not on X_t.
We will look at the Black-Scholes option formula next time and will come back to review the geometric Brownian motion as the underlying model.

Introduction to Brownian Motion

JC Mathematics, University Mathematics

Let's look at Brownian motion now. And yes, it's the same as the random motion of particles that our high school teachers taught us about. Here, we attempt to give it a proper structure and definition to work with.

A Brownian Motion is a random process \{ X_t : t \ge  0 \} with parameters (\mu, \sigma) if
For 0 < t_1 < t_2 < \ldots < t_{n-1} < t_n, (X_{t_2} - X_{t_1}), (X_{t_3} - X_{t_2}), \ldots, (X_{t_n} - X_{t_{n-1}}) are mutually independent. This is often called the independent increments property.
For s > 0, X_{t+s} - X_t \sim \mathrm{N} ( \mu s, \sigma^2 s)
X_t is a continuous function of t.
We say that X_t is a \mathrm{B} (\mu, \sigma) Brownian motion with drift \mu and volatility \sigma.

For the special case of \mu = 0 and \sigma = 1, we have a standard Brownian motion. We denote it by W_t and assume that W_0 = 0.
If X_t \sim \mathrm{B}(\mu, \sigma) and X_0 = x, then X_t = x + \mu t + \sigma W_t where W_t is a standard Brownian motion. Thus, X_t \sim \mathrm{N}(x+\mu t, \sigma^2 t).

Random Paths of Brownian Motion. Source: Columbia University

The next concept is important in finance, that is, Information Filtrations.
For any random process, we will use \mathcal{F}_t to denote the information available at time t.
– the set \{\mathcal{F}_t\}_{t \ge 0} is then the information filtration.
\mathbb{E}[\cdot | \mathcal{F}_t] denotes an expectation conditional on the information available at time t.

Note: The independent increment property of Brownian Motion implies that any function of W_{t+s} - W_t is independent of \mathcal{F}_t and that (W_{t+s}-W_t) \sim \mathrm{N}(0,s).

So let us do a bit of math to obtain \mathbb{E}_0[W_{t+s}W_s] for instance.

Using the conditional expectation identity (tower property), we have
\mathbb{E}_0 [W_{t+s}W_s]
= \mathbb{E}_0 [(W_{t+s} - W_s + W_s)W_s]
= \mathbb{E}_0 [(W_{t+s}-W_s)W_s] + \mathbb{E}_0 [{W_s}^2]
= \mathbb{E}_0 [\mathbb{E}_s[(W_{t+s} - W_s)W_s]] + s, since W_s \sim \mathrm{N}(0, s) gives \mathbb{E}_0[{W_s}^2] = s
= \mathbb{E}_0 [W_s \mathbb{E}_s[(W_{t+s} - W_s)]] + s
= \mathbb{E}_0 [W_s \cdot 0] + s, as the increment W_{t+s} - W_s is independent of \mathcal{F}_s and has mean 0
= 0 + s
= s
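
As a quick numerical check (a sketch assuming Python with numpy; the times s and t are arbitrary illustrative choices), simulating many pairs (W_s, W_{t+s}) recovers the value s:

import numpy as np

rng = np.random.default_rng(2)
s, t = 1.5, 2.0  # illustrative times
n = 500_000

W_s = rng.normal(0.0, np.sqrt(s), size=n)         # W_s ~ N(0, s)
W_ts = W_s + rng.normal(0.0, np.sqrt(t), size=n)  # add the independent increment over (s, t+s]
print(np.mean(W_ts * W_s))                        # approximately s = 1.5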