Question on Importance Sampling (Monte Carlo method)

  • Context: Graduate 
SUMMARY

This discussion focuses on the application of Importance Sampling in Monte Carlo methods, specifically addressing the adjustment of weights when estimating a target distribution, p, from samples drawn from a proposal distribution, q. The standard approach involves computing unnormalized weights as p(s_i)/q(s_i) and normalizing them. The user questions whether it is justified to modify these weights by incorporating prior weights, u_i, resulting in a new formulation of the weights as p(s_i)u_{i}/q(s_i). The consensus indicates that altering the weights requires careful consideration of the sampling method to maintain the integrity of the empirical distribution approximation.

PREREQUISITES
  • Understanding of Monte Carlo methods
  • Familiarity with Importance Sampling techniques
  • Knowledge of probability distributions and their properties
  • Experience with empirical distribution functions
NEXT STEPS
  • Research the derivation of Importance Sampling weights in Monte Carlo simulations
  • Study the change of measure formula in probability theory
  • Explore advanced sampling techniques to improve empirical distribution approximations
  • Learn about the implications of weighted sampling on convergence and accuracy
USEFUL FOR

Statisticians, data scientists, and researchers involved in probabilistic modeling and Monte Carlo simulations who seek to enhance their understanding of Importance Sampling methodologies.

kasraa
Hi,

Suppose I have N iid samples from a distribution q, and I want to estimate another distribution, p, using those samples (Importance Sampling).

By "standard importance sampling", I mean the case where the samples (prior samples, i.e. samples from q) have equal weights ([tex]w_i = 1/N[/tex]).

In the case of "standard importance sampling", I should perform these steps:

1) compute (unnormalized) weights for those samples according to [tex]p(s_i)/q(s_i)[/tex] ([tex]s_{i}[/tex] is the i-th sample from q)
2) normalize those weights
3) then an estimate of p would be this:
[tex]\hat{p}(s) = \sum_{i=1}^N w_{i} \delta(s - s_{i})[/tex]

(the [tex]w_i[/tex] are the normalized weights computed at step 2, and [tex]\delta(s - s_{i})[/tex] is the Dirac delta function centered at [tex]s_i[/tex])
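The three steps above can be sketched in Python. The specific densities (a standard normal target p and a wider normal proposal q) are illustrative choices, not taken from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice of densities (hypothetical, not from the thread):
# target p = N(0, 1), proposal q = N(1, 2^2).
def p_pdf(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def q_pdf(x):
    return np.exp(-0.5 * ((x - 1.0) / 2.0)**2) / (2.0 * np.sqrt(2.0 * np.pi))

N = 100_000
s = rng.normal(1.0, 2.0, size=N)   # iid samples from q

w_un = p_pdf(s) / q_pdf(s)         # step 1: unnormalized weights p(s_i)/q(s_i)
w = w_un / w_un.sum()              # step 2: normalize so the weights sum to 1

# Step 3: p-hat is the weighted empirical measure sum_i w_i * delta(s - s_i),
# so an expectation E_p[f(X)] is estimated as sum_i w_i * f(s_i).
mean_est = np.sum(w * s)           # estimate of E_p[X]; true value here is 0
```

With these choices the weighted sample mean should land close to 0, the mean of the target p.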


Now consider the case where the samples (prior samples, i.e. samples from q) are weighted (with different, normalized weights, say [tex]u_i[/tex]).

Is it enough (justified) to change the (unnormalized) weights (computed at step 1) to [tex]p(s_i)u_{i}/q(s_i)[/tex]?
(multiplying prior weights and "standard importance sampling" weights together?)
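One hypothetical scenario in which the multiplied weights would be justified: if the weighted samples themselves came from an earlier importance-sampling step from some proposal [tex]q_0[/tex] targeting q (so [tex]u_i \propto q(s_i)/q_0(s_i)[/tex]), then [tex]p(s_i)u_i/q(s_i) \propto p(s_i)/q_0(s_i)[/tex], which is exactly the correct weight for targeting p from [tex]q_0[/tex]. A sketch of that assumed case, with illustrative normal densities:

```python
import numpy as np

rng = np.random.default_rng(1)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2.0 * np.pi))

N = 200_000
# Assumed scenario: the weighted prior samples arose from an earlier
# importance-sampling step.  s_i ~ q0 = N(0, 3^2), weighted so that
# (u_i, s_i) represents q = N(1, 2^2).
s = rng.normal(0.0, 3.0, size=N)
u_un = norm_pdf(s, 1.0, 2.0) / norm_pdf(s, 0.0, 3.0)   # u_i ∝ q(s_i)/q0(s_i)
u = u_un / u_un.sum()

# The proposed combined weight p(s_i) u_i / q(s_i).  In this scenario it
# collapses (up to normalization) to p(s_i)/q0(s_i), the correct importance
# weight for targeting p = N(0, 1) with samples from q0.
w_un = norm_pdf(s, 0.0, 1.0) * u / norm_pdf(s, 1.0, 2.0)
w = w_un / w_un.sum()

mean_est = np.sum(w * s)   # estimate of E_p[X]; true value here is 0
```

Whether this multiplication is valid in general depends on what the weighted sample set [tex](u_i, s_i)[/tex] actually represents, which is the point raised in the reply below.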


Thanks in advance.
 
kasraa said:
... Is it enough (justified) to change the (unnormalized) weights (computed at step 1) to [tex]p(s_i)u_{i}/q(s_i)[/tex]?
(multiplying prior weights and "standard importance sampling" weights together?)

Possibly not, because any Monte Carlo simulation (including importance sampling) is essentially based on approximating the (cumulative) distribution by the empirical distribution, i.e.

[tex]P(x) = Prob[X\le x] = E_P[I[X\le x]] \approx \frac{1}{N}\sum_{i=1}^N I[X_i\le x][/tex]

where I is the Boolean indicator function and the [tex]X_i[/tex] are taken from distribution P. To change the weights from (1/N) to other numbers you'd need to change the sampling method to ensure that the "weighted" empirical distribution remains a good approximation to the CDF.

However if you do find a way to overcome that, the new importance sampling formula would easily follow from the change of measure formula, with

[tex]P(x) = E_P[I[X\le x]] = E_Q\left[I[X\le x]\frac{dP}{dQ}\right] = E_Q\left[I[X\le x]\frac{p(X)}{q(X)}\right] \approx \sum_{i=1}^N w_i I[X_i\le x]\frac{p(X_i)}{q(X_i)}[/tex]

where the [tex]X_i[/tex] are samples such that

[tex]\sum_{i=1}^N w_i I[X_i\le x][/tex]

closely approximates Q(x).
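The two approximations in this reply can be checked side by side numerically: with equal weights [tex]w_i = 1/N[/tex], the weighted empirical CDF approximates Q(x), and multiplying each indicator by the likelihood ratio p(X_i)/q(X_i) turns it into an estimate of P(x). The normal densities below are illustrative choices, not from the thread:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2.0 * np.pi))

def norm_cdf(x, mu, sigma):   # exact CDF, used only to check the estimates
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Illustrative choice: P = N(0, 1), Q = N(1, 2^2); evaluate the CDFs at x = 0.5.
N = 100_000
x = 0.5
X = rng.normal(1.0, 2.0, size=N)       # iid samples from Q
w = np.full(N, 1.0 / N)                # equal weights

q_emp = np.sum(w * (X <= x))           # sum_i w_i I[X_i <= x]  ~  Q(x)
lr = norm_pdf(X, 0.0, 1.0) / norm_pdf(X, 1.0, 2.0)   # dP/dQ at the samples
p_est = np.sum(w * (X <= x) * lr)      # change of measure  ~  P(x)
```

Here `q_emp` should sit close to the exact Q(0.5) and `p_est` close to the exact P(0.5), illustrating the change-of-measure formula above.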
 
Thanks.
 
