Proposal Distribution

The proposal discusses the challenges of direct sampling from complex distributions and introduces a simpler proposal distribution q(x) for efficient sampling. It outlines the Rejection Sampling algorithm, its limitations, and the transition to Importance Sampling, which assigns weights to samples to improve efficiency. The document emphasizes the importance of selecting appropriate q(x) and envelope M for optimal performance in high-dimensional spaces.

PROPOSAL DISTRIBUTION
Machine Learning
III Year CSE, IDP and IDDMP

Reference: Stephen Marsland
Motivation
1. Direct sampling from p(x) is often computationally expensive or infeasible.

2. We can evaluate an un-normalised version p'(x), but cannot draw samples from it easily.

3. Introduce a simpler, easy-to-sample "proposal" distribution q(x).

4. Goal: use samples from q(x) to approximate samples from p(x) without needing the normalisation constant.
Proposal Distribution
1. We assume

   p(x) = p'(x) / Zp,

with unknown normalisation constant Zp.

2. Choose q(x) such that for all x:

   p'(x) ≤ M q(x),

where M is a finite constant (the "envelope" factor).

3. Sampling procedure: draw x* ~ q(x), draw u ~ Uniform(0, M q(x*)), and accept x* if u < p'(x*).
The Rejection Sampling
Algorithm
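The algorithm itself appears as an image in the original slides. The steps above can be sketched in Python; the two-component Gaussian mixture target and the uniform proposal on [-6, 6] are illustrative assumptions, not the exact example from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    """Illustrative unnormalised target p'(x): a two-component Gaussian mixture."""
    return np.exp(-0.5 * (x - 1) ** 2) + 0.5 * np.exp(-0.5 * (x + 2) ** 2)

def rejection_sample(n, q_sample, q_pdf, M):
    """Draw n samples from p(x) ∝ p'(x) using proposal q and envelope factor M."""
    samples = []
    while len(samples) < n:
        x_star = q_sample()                    # 1. draw x* ~ q(x)
        u = rng.uniform(0, M * q_pdf(x_star))  # 2. draw u ~ Uniform(0, M q(x*))
        if u < p_tilde(x_star):                # 3. accept x* if u falls under p'(x*)
            samples.append(x_star)
    return np.array(samples)

# Uniform proposal on [-6, 6]: q(x) = 1/12, so we need p'(x) <= M/12 everywhere.
# max p'(x) is about 1.0, so M = 13 is a valid (if loose) envelope.
q_pdf = lambda x: 1 / 12
q_sample = lambda: rng.uniform(-6, 6)
xs = rejection_sample(1000, q_sample, q_pdf, M=13)
```

A looser envelope M still gives correct samples, but the acceptance rate drops in proportion, which is exactly the inefficiency discussed on the next slide.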
Rejection Sampling Example & Transition to
Importance Sampling
Example (mixture of Gaussians; see the graph on the next slide):
Proposal q(x) is uniform (dotted line).
With M = 0.8, roughly 50% of samples are rejected; with M = 2, roughly 85% are rejected.
Issues:
A) High rejection rates waste computation.
B) The curse of dimensionality exacerbates the inefficiency.
Two remedies:
Develop more sophisticated exploration methods.
Bias sampling toward high-probability regions.
Importance Sampling:
Assign each draw x(i) ~ q(x) a weight

   w(i) = p'(x(i)) / q(x(i)).

This allows estimation of expectations without discarding any samples.
Resampling:
Use the weights in a Sampling-Importance-Resampling (SIR) scheme to obtain a representative set of points.
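The weighted-expectation idea can be sketched as follows; the target p'(x) (a Gaussian mixture) and the wide normal proposal are illustrative assumptions. Because Zp is unknown, the weights are self-normalised:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_tilde(x):
    """Illustrative unnormalised target p'(x) (same assumed mixture as before)."""
    return np.exp(-0.5 * (x - 1) ** 2) + 0.5 * np.exp(-0.5 * (x + 2) ** 2)

# Proposal q(x): a zero-mean normal with inflated scale so its tails cover p'(x).
q_scale = 3.0
x = rng.normal(0.0, q_scale, size=20000)  # x(i) ~ q(x)
q_pdf = np.exp(-0.5 * (x / q_scale) ** 2) / (q_scale * np.sqrt(2 * np.pi))

w = p_tilde(x) / q_pdf   # importance weights w(i) = p'(x(i)) / q(x(i))
w_norm = w / w.sum()     # self-normalise, since Zp is unknown

# Weighted estimate of E_p[x] -- every draw contributes, none are rejected.
mean_est = np.sum(w_norm * x)
```

Note that no samples are discarded here; a poor q(x) shows up instead as a few draws carrying most of the weight, which inflates the estimator's variance.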
Rejection Sampling Example & Transition to
Importance Sampling
(Graph: mixture-of-Gaussians target with uniform proposal; figure-only slide.)
The Sampling-Importance-Resampling
Algorithm
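The SIR algorithm slide is an image in the original. A minimal sketch of the three steps (sample from q, weight, resample in proportion to the weights), reusing the same assumed mixture target and normal proposal as in the earlier examples:

```python
import numpy as np

rng = np.random.default_rng(2)

def p_tilde(x):
    """Illustrative unnormalised target p'(x) (assumed Gaussian mixture)."""
    return np.exp(-0.5 * (x - 1) ** 2) + 0.5 * np.exp(-0.5 * (x + 2) ** 2)

def sir(n, m=10000):
    """Sampling-Importance-Resampling: n approximate draws from p(x) ∝ p'(x)."""
    # 1. Sample m points from the proposal q(x) = N(0, 3) (an assumption).
    x = rng.normal(0.0, 3.0, size=m)
    q = np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2 * np.pi))
    # 2. Compute normalised importance weights.
    w = p_tilde(x) / q
    w /= w.sum()
    # 3. Resample n points with probability proportional to the weights.
    return rng.choice(x, size=n, replace=True, p=w)

samples = sir(1000)
```

The resampled set approximates unweighted draws from p(x); taking m much larger than n keeps the duplication introduced by resampling mild.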
Summary and Conclusion
Challenge: Direct sampling from a complex p(x) is often infeasible.
Rejection Sampling:
Uses a simpler proposal q(x) and an envelope M q(x) to filter samples.
Easy to implement, but can discard many draws, especially in high dimensions.
Improvements:
Better exploration strategies (local moves, MCMC).
Importance sampling to weight rather than reject.
Importance Sampling:
Assigns importance weights to all samples, avoiding waste.
Enables weighted estimation and resampling (SIR).
Takeaway:
The choice of q(x) and M is critical for efficiency.
