Questions For Unit 5 RM
1 Inverse Transform Method
Theory: A random variable X with CDF F(x) can be generated by transforming a uniform random variable U ∼ Uniform(0, 1) through the inverse of its CDF:
X = F^{−1}(U).
Example: Generate samples from the exponential distribution with rate λ = 2.
1. Write the CDF of the exponential distribution:
F(x) = 1 − e^{−λx}, x ≥ 0.
2. Set F(X) = U and solve for X, giving X = −ln(1 − U)/λ (see the sketch below).
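A minimal NumPy sketch of this example, assuming nothing beyond the formulas above (the function name sample_exponential is mine, not from the notes):

```python
import numpy as np

def sample_exponential(lam, n, rng=None):
    """Inverse-transform sampling: X = F^{-1}(U) = -ln(1 - U) / lam."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(0.0, 1.0, size=n)   # U ~ Uniform(0, 1)
    return -np.log(1.0 - u) / lam       # invert F(x) = 1 - e^{-lam x}

samples = sample_exponential(lam=2.0, n=100_000)
print(samples.mean())  # should be close to 1/lambda = 0.5
```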
2 Accept-Reject Algorithm
• Concept: Generate samples from a complex target distribution f (x) using a simpler proposal distribution g(x).
• Steps:
1. Choose a proposal distribution g(x) and a constant M such that f (x) ≤ M g(x) for all x.
2. Generate X ∼ g(x) and U ∼ Uniform(0, 1).
3. Accept X if U ≤ f(X) / (M g(X)); otherwise, reject X and repeat.
• Applications:
– Sampling from distributions without closed-form inverses.
– Example: Generating standard normal variates using exponential proposals.
Theory: Used when direct sampling is difficult. A simpler distribution g(x) serves as a proposal distribution.
Example: Sample from f (x) = 2x on [0, 1] using g(x) = 1 (Uniform distribution on [0, 1]).
1. Find M such that f (x) ≤ M g(x). Here M = 2.
2. Generate X ∼ g(x) ∼ Uniform(0, 1).
3. Generate U ∼ Uniform(0, 1) and accept X if U ≤ f(X) / (M g(X)).
4. Problem: Suppose the generated values are X = 0.6 and U = 0.5. Should X be accepted?
5. Solution:
f(0.6) = 2 × 0.6 = 1.2, M g(0.6) = 2 × 1 = 2.
Compute:
f(0.6) / (M g(0.6)) = 1.2 / 2 = 0.6.
- Since U = 0.5 ≤ 0.6, accept X = 0.6.
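The worked example translates directly into code; a sketch (the function name accept_reject_2x is illustrative):

```python
import numpy as np

def accept_reject_2x(n, rng=None):
    """Sample from f(x) = 2x on [0, 1] using g(x) = 1 and M = 2."""
    rng = np.random.default_rng() if rng is None else rng
    out = []
    while len(out) < n:
        x = rng.uniform()            # X ~ g = Uniform(0, 1)
        u = rng.uniform()            # U ~ Uniform(0, 1)
        if u <= (2.0 * x) / 2.0:     # accept if U <= f(X) / (M g(X))
            out.append(x)
    return np.array(out)

samples = accept_reject_2x(50_000)
print(samples.mean())  # E[X] = 2/3 for f(x) = 2x on [0, 1]
```

With M = 2, the expected acceptance rate is 1/M = 1/2, so roughly two proposals are needed per accepted sample.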
3 Metropolis Algorithm
• Concept: A Markov Chain Monte Carlo (MCMC) method to generate samples from a target distribution π(x).
• Steps:
1. Start with an initial value X0 .
2. Propose a new state X ∗ using a symmetric proposal distribution q(X ∗ |Xt ) (e.g., Gaussian).
3. Compute the acceptance probability:
α = min(1, π(X*) / π(Xt)).
4. Accept X ∗ with probability α; otherwise, set Xt+1 = Xt .
5. Repeat for a large number of iterations.
• Applications:
– Sampling from high-dimensional distributions.
– Bayesian inference.
Theory: Samples from a target distribution π(x) using a symmetric proposal distribution.
Example: Sample from π(x) ∝ e^{−x^2} (a Gaussian density with variance 1/2, up to normalization).
1. Start with X0 = 0.
2. Propose X ∗ ∼ Uniform(Xt − 0.5, Xt + 0.5).
3. Compute acceptance probability:
α = min(1, π(X*) / π(Xt)).
4. Generate U ∼ Uniform(0, 1). Accept X ∗ if U ≤ α; otherwise, set Xt+1 = Xt .
5. Problem: Perform one iteration starting with X0 = 0, propose X ∗ = 0.3, and use U = 0.4.
6. Solution:
π(0.3) = e^{−0.3^2} = e^{−0.09} ≈ 0.91393, π(0) = e^{−0^2} = 1.
Compute:
α = min(1, 0.91393 / 1) = 0.91393.
- Since U = 0.4 ≤ 0.91393, accept X ∗ = 0.3.
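A sketch of the full Metropolis loop for this example; π is used unnormalized, which is all the ratio needs (the function name metropolis is mine):

```python
import numpy as np

def metropolis(n_iter, x0=0.0, half_width=0.5, rng=None):
    """Metropolis sampler targeting pi(x) proportional to exp(-x^2),
    with the Uniform(x - 0.5, x + 0.5) proposal from the example."""
    rng = np.random.default_rng() if rng is None else rng
    pi = lambda x: np.exp(-x**2)    # normalizing constant cancels in the ratio
    chain = np.empty(n_iter)
    x = x0
    for t in range(n_iter):
        x_star = rng.uniform(x - half_width, x + half_width)  # symmetric proposal
        alpha = min(1.0, pi(x_star) / pi(x))
        if rng.uniform() <= alpha:  # accept with probability alpha
            x = x_star
        chain[t] = x                # on rejection the old state is repeated
    return chain

chain = metropolis(100_000)
print(chain.var())  # exp(-x^2) has variance 1/2
```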
4 Gibbs Sampling
• Concept: MCMC method where sampling is performed conditionally for each variable in a multivariate distribution.
• Steps:
1. Start with an initial vector X = (X1 , X2 , . . . , Xd ).
2. Sequentially sample Xi from its conditional distribution:
Xi ∼ P(Xi | X_{−i}),
where X_{−i} denotes all variables except Xi.
3. Repeat the above step for several iterations.
• Applications:
– Bayesian hierarchical models.
– Spatial statistics and image processing.
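The notes give no worked example here, so the sketch below uses a standard bivariate normal target with correlation ρ = 0.8 (my choice); its full conditionals are X1 | X2 = x2 ∼ N(ρ x2, 1 − ρ^2), and symmetrically for X2:

```python
import numpy as np

def gibbs_bivariate_normal(n_iter, rho=0.8, rng=None):
    """Gibbs sampler for a bivariate normal with zero means, unit
    variances, and correlation rho, alternating the two conditionals."""
    rng = np.random.default_rng() if rng is None else rng
    sd = np.sqrt(1.0 - rho**2)
    x1, x2 = 0.0, 0.0                  # initial vector
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        x1 = rng.normal(rho * x2, sd)  # sample X1 | X2
        x2 = rng.normal(rho * x1, sd)  # sample X2 | X1
        chain[t] = (x1, x2)
    return chain

chain = gibbs_bivariate_normal(50_000)
print(np.corrcoef(chain[:, 0], chain[:, 1])[0, 1])  # approx. 0.8
```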
5 Monte Carlo Integration
• Concept: Approximation of integrals using random samples.
• Steps:
1. To compute I = ∫_a^b h(x) f(x) dx, generate X1, X2, . . . , Xn ∼ f(x).
2. Estimate the integral as:
Î = (1/n) ∑_{i=1}^{n} h(Xi).
3. For multidimensional integrals, draw vectors X ∼ f(x) and apply the same averaging approach.
• Applications:
– Estimating expectations and probabilities that lack closed forms.
– High-dimensional integrals where quadrature is impractical.
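A sketch of the estimator, applied to I = ∫_0^1 x^2 dx with f = Uniform(0, 1) (exact value 1/3; the helper name mc_integrate is mine):

```python
import numpy as np

def mc_integrate(h, sampler, n):
    """Estimate I = E_f[h(X)] by averaging h over n draws from f."""
    x = sampler(n)
    return np.mean(h(x))

rng = np.random.default_rng()
i_hat = mc_integrate(lambda x: x**2, lambda n: rng.uniform(0.0, 1.0, n), 1_000_000)
print(i_hat)  # close to 1/3
```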
6 MCMC Principle
• Key Idea: Create a Markov chain that has the desired target distribution π(x) as its stationary distribution.
• Steps:
1. Define transition probabilities ensuring irreducibility, aperiodicity, and positive recurrence.
2. Use methods like Metropolis or Gibbs sampling to generate a sequence of samples.
3. After a burn-in period, use the samples to approximate expectations:
E[h(X)] ≈ (1/n) ∑_{i=1}^{n} h(Xi).
• Applications:
– Bayesian inference.
– High-dimensional optimization.
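A short sketch of the post-processing step, assuming a one-dimensional chain such as the metropolis() output above (the helper name approx_expectation is mine):

```python
import numpy as np

def approx_expectation(chain, h, burn_in):
    """Approximate E[h(X)] from MCMC output, discarding the burn-in draws."""
    kept = np.asarray(chain)[burn_in:]
    return np.mean(h(kept))

# Usage with any 1-D chain, e.g. the metropolis() sketch above:
# chain = metropolis(100_000)
# print(approx_expectation(chain, lambda x: x**2, burn_in=1_000))  # E[X^2] = 1/2
```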
7 Metropolis-Hastings Algorithm
• Concept: Generalization of the Metropolis algorithm for non-symmetric proposals.
• Steps:
1. Start with an initial value X0.
2. Propose a new state X* from a proposal distribution q(X*|Xt), which need not be symmetric.
3. Compute the acceptance probability:
α = min(1, [π(X*) q(Xt|X*)] / [π(Xt) q(X*|Xt)]).
4. Accept X* with probability α; otherwise, set Xt+1 = Xt.
5. Repeat for a large number of iterations.
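The Hastings correction is easiest to see with an asymmetric proposal. The sketch below (my example, not from the notes) uses a log-normal random walk, for which q(Xt | X*) / q(X* | Xt) reduces to X*/Xt, to target the unnormalized Gamma(2, 1) density π(x) ∝ x e^{−x}:

```python
import numpy as np

def metropolis_hastings(n_iter, x0=1.0, sigma=0.5, rng=None):
    """Metropolis-Hastings with an asymmetric log-normal random-walk
    proposal, targeting pi(x) = x * exp(-x) (Gamma(2, 1), unnormalized)."""
    rng = np.random.default_rng() if rng is None else rng
    pi = lambda x: x * np.exp(-x)
    chain = np.empty(n_iter)
    x = x0
    for t in range(n_iter):
        x_star = np.exp(rng.normal(np.log(x), sigma))  # X* ~ LogNormal(ln x, sigma^2)
        # Hastings ratio q(x | x*) / q(x* | x) reduces to x* / x for this proposal
        alpha = min(1.0, pi(x_star) / pi(x) * (x_star / x))
        if rng.uniform() <= alpha:
            x = x_star
        chain[t] = x
    return chain

chain = metropolis_hastings(100_000)
print(chain.mean())  # Gamma(2, 1) has mean 2
```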
8 Bootstrap Method
• Concept: A resampling technique to estimate the distribution of a statistic.
• Steps:
1. Given a dataset X = {x1 , x2 , . . . , xn }, generate B bootstrap samples by sampling with replacement.
2. For each bootstrap sample, compute the statistic of interest T_b^*.
3. Use the empirical distribution of {T_1^*, T_2^*, . . . , T_B^*} to estimate the standard error, confidence intervals, etc.
• Applications:
– Non-parametric confidence intervals.
– Hypothesis testing.
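A sketch of the procedure (the helper name bootstrap_se is mine), estimating the standard error of the sample mean and a 95% percentile interval:

```python
import numpy as np

def bootstrap_se(data, statistic, n_boot=2000, rng=None):
    """Nonparametric bootstrap: resample with replacement B times and
    return the estimated standard error plus the bootstrap distribution."""
    rng = np.random.default_rng() if rng is None else rng
    data = np.asarray(data)
    n = len(data)
    stats = np.array([statistic(data[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    return stats.std(ddof=1), stats

rng = np.random.default_rng(0)
x = rng.normal(10.0, 2.0, size=50)
se, dist = bootstrap_se(x, np.mean, rng=rng)
print(se, np.percentile(dist, [2.5, 97.5]))  # SE of the mean and a 95% CI
```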