Bayesian probability question

In summary: this is called fitting a quantile function or applying a probability integral transform. If you are willing to assume that each parking time is independently distributed and that the distribution for any particular time has the same shape, then a Bayesian approach can work; the key issue is choosing a prior distribution for the parameters of the parking-time distribution, which the poster does not yet have. The poster is building a model that simulates the travel patterns of electric cars, using a dataset to build the probability density functions and generating a parking-time distribution conditioned on factors like time of day and location, starting from a uniform prior.
  • #1
bradyj7
Hello,

I am building a model that simulates the travel patterns of electric cars using a series of iterative conditional distributions. I have a dataset from which to build the pdfs.

In one part of the model I generate a parking time from a conditional distribution.

I create a parking time distribution given, for example, the time of day and location.

I am using a Bayesian approach because, under certain conditions, the dataset sometimes returns no observations (none were recorded), so no distribution can be created and the simulation stops.

So first of all, I assume a uniform prior distribution.

https://dl.dropbox.com/u/54057365/All/prior.JPG

Secondly, I query the data from the database given the conditions and create the likelihood function.

https://dl.dropbox.com/u/54057365/All/likelihood.JPG

Then I combine the prior distribution and the likelihood function to form the posterior distribution, and I generate a value from it.

https://dl.dropbox.com/u/54057365/All/posterior.JPG

My question is as follows: whenever no observations are returned, the likelihood is 0, so the posterior distribution is flat like the prior distribution.

Instead of using a uniform prior I want to use an informed prior.

I have set all the hyperparameters of the bins to 1 in the prior distribution, but can I assign the hyperparameters according to some distribution instead?

How would I do this?

Thanks
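To make the setup concrete, here is a minimal sketch of the binned approach described above, assuming one bin per minute of parking time, a Dirichlet prior with every hyperparameter set to 1, and a handful of hypothetical observed parking times (all numbers are illustrative, not from the poster's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

n_bins = 300                    # one bin per minute of parking time (assumed)
alpha_prior = np.ones(n_bins)   # uniform prior: every hyperparameter set to 1

# Hypothetical parking times (minutes) returned for one time-of-day/location condition
observations = [30, 30, 45, 60, 30, 90]
counts = np.bincount(observations, minlength=n_bins)[:n_bins]

# Conjugate Dirichlet-multinomial update: posterior hyperparameters = prior + counts
alpha_post = alpha_prior + counts

# Posterior predictive probability of each bin
p = alpha_post / alpha_post.sum()

# Generate a parking time from the posterior predictive distribution
parking_time = rng.choice(n_bins, p=p)
```

With this conjugate update, an empty query (no observations) simply leaves the posterior equal to the prior, which is exactly the flat-posterior behaviour described; an informed prior would replace `np.ones(n_bins)` with larger hyperparameters in the bins believed more probable.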
 
  • #2
bradyj7 said:
So first of all I assume a uniform Prior distribution.

Distribution of what? In a Bayesian approach, you begin with a prior distribution for the parameters of another distribution. If you assume a uniform distribution of parking times, then I don't see how this is a distribution for the parameters of another distribution.

What you need is a model where parking times are distributed according to some family of distributions (an exponential family, for example) and each distribution in the family is defined by a particular value of some parameters. (It's simplest to use as few parameters as possible. For example, an exponential distribution is defined by one parameter, lambda.) Then you assume a prior distribution for the parameters.
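As a sketch of the low-parameter route: if parking times were modeled as exponential with rate lambda, a Gamma prior on lambda is conjugate, so the posterior is available in closed form. The prior hyperparameters and the observations below are purely illustrative:

```python
# Gamma(a, b) prior on the exponential rate lambda (values chosen for illustration)
a_prior, b_prior = 2.0, 60.0        # prior mean rate a/b, i.e. about one event per 30 min

observations = [25.0, 40.0, 31.0]   # hypothetical parking times in minutes

# Conjugate update for exponential data: Gamma(a + n, b + sum of observations)
a_post = a_prior + len(observations)
b_post = b_prior + sum(observations)

# Posterior mean rate; its reciprocal is a point estimate of the mean parking time
lam_mean = a_post / b_post
mean_parking_time = 1.0 / lam_mean
```

Because the prior contributes pseudo-observations, this posterior stays informative even when a query returns no data at all.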

The only way I can make sense of your work is that you assume parking times come from a family of distributions and that each distribution in the family is defined by about 300 parameters lambda1, lambda2,... , lambda300, where lambdaN gives the probability for parking for exactly N minutes.

You can't assume these parameters are jointly and independently uniformly distributed over the interval [0,1]: since the parameters are probabilities, they must add to 1. You can, however, assume they are jointly uniformly distributed subject to the condition that they add to 1.
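The standard way to express "uniform over probability vectors that sum to 1" is a Dirichlet distribution with all concentration parameters equal to 1. A quick sketch (300 bins assumed, matching the earlier description):

```python
import numpy as np

rng = np.random.default_rng(0)

# Dirichlet(1, 1, ..., 1) is uniform over the simplex: each draw is a vector of
# bin probabilities guaranteed to sum to 1, unlike 300 independent uniforms on [0, 1]
probs = rng.dirichlet(np.ones(300))
```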

There are various methods for fitting a smooth distribution to discrete data. There is no single best or correct way that works for all situations. (Likewise, to use an informed prior, you actually need to have some information or be willing to assume some.) Accounting for imprecision in measurements is a natural way to produce smoother distributions. For example, if a parking time is recorded as 30 minutes, the method of measurement might produce that value from, say, any true time between 29.5 and 30.5 minutes. One smoothing technique is to replace the observation of 30 with a set of observations uniformly distributed between 29.5 and 30.5. (The general technique is called using a smoothing "kernel".) You could even argue that an observation of 30 is evidence for a wider range of possibilities. For example, if someone went shopping and returned after 30 minutes, random factors such as delays in checkout lines might lengthen or shorten that time if the same shopping trip were repeated.
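The boxcar-kernel idea above can be sketched as follows, spreading each recorded time over a one-minute window; the observations and the 20-points-per-observation choice are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

observations = np.array([30.0, 30.0, 45.0])   # hypothetical recorded parking times

# Boxcar kernel of width 1 minute: replace each recorded time with points spread
# uniformly over +/- 0.5 minutes, reflecting rounding in the measurement
jittered = np.concatenate(
    [obs + rng.uniform(-0.5, 0.5, size=20) for obs in observations]
)
```

A histogram of `jittered` is a smoothed version of the raw frequency histogram; a wider window or a Gaussian kernel would smooth more aggressively.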

It is common to see distributions fitted to data by fitting them to the cumulative histogram instead of the frequency histogram. The cumulative histogram (psychologically) often looks less erratic than the frequency histogram and people are more confident about guessing a family of distributions for it.
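Fitting against the cumulative histogram can be sketched like this: compute the empirical CDF, propose a candidate family, and measure the gap between the two. The data and the exponential candidate below are assumptions for illustration:

```python
import numpy as np

observations = np.array([12.0, 25.0, 30.0, 30.0, 47.0, 61.0])  # hypothetical times

# Empirical CDF: sorted data against cumulative fraction
x = np.sort(observations)
ecdf = np.arange(1, len(x) + 1) / len(x)

# Candidate: exponential CDF with rate set by the method of moments
lam = 1.0 / x.mean()
model_cdf = 1.0 - np.exp(-lam * x)

# Mean absolute gap between empirical and model CDF as a rough goodness-of-fit measure
gap = np.abs(ecdf - model_cdf).mean()
```

Comparing CDFs avoids the binning choices that make frequency histograms look erratic, which is why eyeballing a family of distributions is usually easier on the cumulative plot.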
 

What is Bayesian probability?

Bayesian probability is a mathematical framework for calculating the likelihood of an event occurring based on prior knowledge or beliefs about the event. It involves updating beliefs as new evidence is presented, resulting in a posterior probability that takes into account both prior beliefs and new information.

How is Bayesian probability different from traditional probability?

Traditional probability is based on a frequentist approach, where the probability of an event is determined by the number of times it occurs in a large number of trials. Bayesian probability, on the other hand, takes into account prior beliefs and updates them with new evidence, resulting in a more personalized and flexible approach to probability.

What is a prior probability?

A prior probability is the initial belief or probability assigned to an event based on prior knowledge or assumptions. It serves as the starting point for Bayesian probability calculations and is updated with new evidence to create a posterior probability.

How do you calculate a posterior probability?

A posterior probability is calculated by combining the prior probability with the likelihood of the observed data, using Bayes' theorem. This involves multiplying the prior probability by the likelihood and dividing it by the total probability of the observed data.
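The calculation described above amounts to a few lines for a two-hypothesis case. The numbers here are a standard illustrative diagnostic-test example, not from the thread:

```python
# Discrete Bayes' theorem: posterior = prior * likelihood / evidence
prior = 0.01             # P(condition)
likelihood = 0.95        # P(positive test | condition)
false_positive = 0.05    # P(positive test | no condition)

# Total probability of observing a positive test
evidence = prior * likelihood + (1 - prior) * false_positive

# Probability of the condition given a positive test
posterior = prior * likelihood / evidence
```

Even with a highly accurate test, the posterior stays modest because the prior is small, which is the core intuition behind Bayesian updating.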

What are some real-world applications of Bayesian probability?

Bayesian probability has many applications, including in fields such as medical diagnosis, weather forecasting, and machine learning. It is also commonly used in decision-making, risk analysis, and artificial intelligence.
