Metropolis-Hastings algorithm

In summary, the thread discusses the target densities [itex]\pi(x)[/itex] and [itex]\pi(y)[/itex] and the jump distribution [itex]q(y,x)[/itex] in the acceptance rule [itex]\alpha(x,y) = \min \left( 1,\frac{\pi (y)q(y,x)}{\pi (x)q(x,y)} \right)[/itex]. The question asks how to write out [itex]q(y,x)[/itex] when the proposal is normal(0,1), and the replies stress keeping the algebraic placeholders x and y distinct from the parameters of the normal distribution.
  • #1
kulimer
I dug up this old thread, but it is closed, so I'll repost with my question.
https://www.physicsforums.com/showthread.php?t=74004&highlight=metropolis

Given the target density values [itex]\pi(x)[/itex] and [itex]\pi(y)[/itex], and the jump distribution [itex]q(y,x)[/itex], in the relation:
[itex]\alpha(x,y)= \min \left( 1,\frac{\pi (y)q(y,x)}{\pi (x)q(x,y)} \right)[/itex]

Say my jump distribution (aka transition probability) is normal(0,1). How do you write out [itex]q(y,x)[/itex]? Is it [itex]\frac{1}{\sqrt{2\pi }\sigma }{{e}^{-\frac{{{(x-0)}^{2}}}{2{{\sigma }^{2}}}}}[/itex]?

But this doesn't make sense, because it doesn't involve y: [itex]q(y,x)[/itex] means the transition probability of getting x given y. We are supposed to relate y to x in the equation.
 
  • #2
Variables x, y, z in algebra are placeholders.

Writing q(y,x) = normal(0,1) confuses the placeholders with the parameters of the distribution: normal(0,1) means mean 0 and standard deviation 1, not y = 0 and x = 1. For a random-walk proposal, it is the *increment* x − y that is N(0,1), so [itex]q(y,x)=\frac{1}{\sqrt{2\pi }}{{e}^{-\frac{{{(x-y)}^{2}}}{2}}}[/itex], which does involve both x and y.

Algebra is an incomplete story told with placeholders. Be careful where you plug in the values.
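A minimal sketch of this point, assuming the "normal(0,1)" in the question is read as a random-walk proposal whose increment x − y is N(0,1) (one common choice; the thread does not pin it down):

```python
import math

def q(y, x, sigma=1.0):
    """Random-walk proposal density: x = y + N(0, sigma^2) noise,
    so q(y, x) is the N(y, sigma^2) density evaluated at x."""
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

# The density depends on both arguments through x - y, and it is
# symmetric, so q(y, x) == q(x, y) and the q's cancel in the
# acceptance ratio (plain Metropolis).
```

Because this proposal is symmetric, the acceptance rule collapses to [itex]\alpha(x,y)=\min\left(1,\frac{\pi(y)}{\pi(x)}\right)[/itex].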
 

1. What is the Metropolis-Hastings algorithm?

The Metropolis-Hastings algorithm is a computational method commonly used in statistics and applied mathematics to generate samples from a probability distribution that is difficult to sample directly.

2. How does the Metropolis-Hastings algorithm work?

The Metropolis-Hastings algorithm works by proposing a new sample from the target distribution based on the current sample, and then accepting or rejecting the proposed sample based on a defined acceptance probability. This process is repeated multiple times to generate a sequence of samples that approximate the target distribution.
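The propose/accept/repeat loop described above can be sketched as follows. This is a minimal random-walk Metropolis sampler (symmetric proposal, so the q-ratio cancels); the function names and the standard-normal example target are illustrative choices, not from the thread:

```python
import math
import random

def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0, seed=0):
    """Random-walk Metropolis: propose y ~ N(x, step^2), accept with
    probability min(1, pi(y)/pi(x)), working in log space for stability."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)               # propose
        log_alpha = log_target(y) - log_target(x)  # symmetric q cancels
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = y                                  # accept; else keep x
        samples.append(x)
    return samples

# Example: sample a standard normal target (log density up to a constant)
log_std_normal = lambda x: -0.5 * x * x
draws = metropolis_hastings(log_std_normal, 20000)
mean = sum(draws) / len(draws)
```

Note that the rejected proposals still contribute a sample (the chain repeats its current state); dropping them would bias the estimate.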

3. What are the advantages of using the Metropolis-Hastings algorithm?

The Metropolis-Hastings algorithm is a versatile and widely used algorithm that can be applied to a variety of problems involving complex probability distributions. It also allows for the incorporation of prior knowledge or constraints into the sampling process.

4. What are the limitations of the Metropolis-Hastings algorithm?

One limitation of the Metropolis-Hastings algorithm is that it may require a large number of iterations to generate a sufficient number of samples for accurate estimation of the target distribution. It also requires careful tuning of the proposal distribution to achieve efficient sampling.
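The tuning issue above can be made concrete: for a random-walk proposal, a tiny step size accepts almost everything but explores slowly, while a huge step size is almost always rejected. A small sketch (standard-normal target and the specific step sizes are illustrative assumptions):

```python
import math
import random

def acceptance_rate(step, n=5000, seed=1):
    """Fraction of accepted moves for random-walk Metropolis
    on a standard normal target, at a given proposal step size."""
    rng = random.Random(seed)
    log_target = lambda x: -0.5 * x * x
    x, accepted = 0.0, 0
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_target(y) - log_target(x))):
            x = y
            accepted += 1
    return accepted / n

small = acceptance_rate(0.1)   # tiny steps: almost always accepted
large = acceptance_rate(25.0)  # huge steps: almost always rejected
```

Neither extreme samples efficiently; a common rule of thumb for random-walk Metropolis is to tune the step toward an acceptance rate of roughly 0.2–0.5.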

5. In what fields is the Metropolis-Hastings algorithm commonly used?

The Metropolis-Hastings algorithm is commonly used in fields such as statistics, machine learning, physics, and biology. It is particularly useful for Bayesian inference, where it can be used to estimate posterior distributions in a variety of applications.
