
Is Markov process a Brownian process?

  1. Sep 11, 2012 #1
    Hi all,

    I know that a Brownian process can be shown to be a Markov process, but is the converse possible? I mean, can we show that a Markov process is a Brownian process?

    Thanks in advance.
  3. Sep 11, 2012 #2


    Science Advisor
    Gold Member

    There are discrete processes with the Markov property, but Brownian motion is continuous, so not every Markov process is a Brownian motion.
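
    A toy sketch of this point (my own illustration, not from the thread): a two-state Markov chain has the Markov property, but its state space is just {0, 1}, so it cannot be a Brownian motion, which takes values in a continuum. The transition matrix below is made up for the example.

    ```python
    import numpy as np

    # Toy example (my own, not from the thread): a two-state Markov chain.
    # The next state depends only on the current state (Markov property),
    # but the chain only ever visits the discrete states 0 and 1.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])  # row i = transition probabilities out of state i

    rng = np.random.default_rng(0)
    state, path = 0, [0]
    for _ in range(1000):
        state = rng.choice(2, p=P[state])
        path.append(state)

    print(set(path))  # only the two discrete states ever appear
    ```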
  4. Sep 11, 2012 #3
    Sorry, I didn't understand.
    The book I am reading shows that Brownian motion is a Markov process.
  5. Sep 11, 2012 #4


    Science Advisor

    Not sure what you mean by a Brownian process, but if you mean a Wiener process, then there are many Markov processes that are not Wiener processes. For instance, in finance, geometric Brownian motions are commonly used to model securities prices.
    Last edited: Sep 11, 2012
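
    A minimal numerical sketch of that example (parameter values are my own, purely illustrative): geometric Brownian motion is a Markov process, but its paths stay strictly positive, so it cannot be a Wiener process, which crosses zero.

    ```python
    import numpy as np

    # Illustrative sketch (my own parameters): geometric Brownian motion
    #   S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma dW),
    # a standard model for a security price. It is Markov, but its paths
    # stay strictly positive, so it is not a Wiener process.
    rng = np.random.default_rng(42)
    mu, sigma, dt, n = 0.05, 0.2, 1 / 252, 252  # one "trading year" of steps
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    log_increments = (mu - 0.5 * sigma**2) * dt + sigma * dW
    S = 1.0 * np.exp(np.cumsum(log_increments))  # exact GBM discretization

    print(S.min() > 0)  # True: a GBM path never crosses zero
    ```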
  6. Sep 11, 2012 #5



    Staff: Mentor

    I believe that at a large enough scale Brownian motion is indistinguishable from (continuous) diffusion, but on the microscopic scale it is just a random walk.

    Edit: but then perhaps I am thinking about a real process, and you are thinking about a mathematical model.
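
    That walk-to-diffusion idea can be sketched numerically (the numbers below are my own toy illustration): after n steps of a ±1 random walk, the endpoint variance is about n, which is exactly the linear-in-time variance of a diffusion; Donsker's theorem makes the convergence precise.

    ```python
    import numpy as np

    # Sketch (my own toy numbers): many independent +-1 random walks.
    # Each step has variance 1, so after n_steps the endpoint variance
    # is about n_steps -- the linear-in-time spreading of a diffusion.
    rng = np.random.default_rng(0)
    n_walks, n_steps = 5000, 400
    steps = rng.choice([-1, 1], size=(n_walks, n_steps))
    endpoints = steps.sum(axis=1)

    print(endpoints.var())  # close to n_steps = 400
    ```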
  7. Sep 11, 2012 #6


    Science Advisor

    In general, Brownian motion in mathematics is not necessarily continuous: sample paths are only continuous almost surely. There is, however, a version of it in which the paths are continuous.

    As far as real processes are concerned, you do not know whether they are continuous or not, since you never have anything except discrete samples of them. For instance, even though stock prices are recorded in discrete jumps, the underlying process may still be continuous, or at least a continuous-time process.
  8. Sep 11, 2012 #7

    D H

    Staff Emeritus
    Science Advisor

    You misunderstood what mathman wrote. Brownian motion is a simple example of a Markov process. He picked one example of a Markov process that is not a Wiener process.

    That all Ys are Xs does not necessarily mean that all Xs are Ys.
  9. Sep 11, 2012 #8
    Sorry, I meant Wiener processes (I am assuming that the mathematical treatment of Brownian motion is the Wiener process). The text shows that Brownian motion (with Gaussian distribution and zero mean) is a martingale, and that a local martingale is a Brownian motion (with Gaussian distribution and zero mean).
    Similarly, the text shows that Brownian motion (with Gaussian distribution and zero mean) is a Markov process. Now my question is: is it necessary for a Markov process to be a Brownian motion (with Gaussian distribution and zero mean)? I believe that in your reply you said it is not necessary, and then you gave the example of geometric Brownian motions. Am I correct?
  10. Sep 11, 2012 #9
    What are real processes?
  11. Sep 11, 2012 #10


    Science Advisor

    I think Borek is talking about a process that actually exists in nature, as opposed to one that is a mathematical curiosity or representation on paper (that either doesn't exist or at least hasn't yet been observed).
  13. Sep 11, 2012 #12
    Thanks everyone for the very good explanations.
  14. Sep 11, 2012 #13


    Science Advisor

    Correct. More generally, there are Markov processes called Ito diffusions that are not Brownian motions. Infinitesimally they look like a Brownian motion multiplied by a function, plus a drift term.
    Last edited: Sep 11, 2012
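
    A minimal sketch of such a diffusion (function names and parameter values are my own, for illustration): the Euler-Maruyama scheme discretizes dX = b(X) dt + sigma(X) dW. With a mean-reverting drift b(x) = -theta*x this gives an Ornstein-Uhlenbeck process, a Markov process that is not itself a Brownian motion.

    ```python
    import numpy as np

    # Euler-Maruyama sketch (names/parameters are my own illustration) for
    # an Ito diffusion dX = b(X) dt + sigma(X) dW. With b(x) = -theta*x this
    # is an Ornstein-Uhlenbeck process: Markov, but not Brownian motion,
    # because the drift keeps pulling the path back toward zero.
    def euler_maruyama(b, sigma, x0, dt, n_steps, rng):
        x = np.empty(n_steps + 1)
        x[0] = x0
        for i in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
            x[i + 1] = x[i] + b(x[i]) * dt + sigma(x[i]) * dW
        return x

    rng = np.random.default_rng(0)
    path = euler_maruyama(b=lambda x: -2.0 * x,   # mean-reverting drift
                          sigma=lambda x: 0.5,    # constant noise intensity
                          x0=3.0, dt=0.01, n_steps=1000, rng=rng)

    # Mean reversion: the path ends much nearer zero than it started.
    print(abs(path[-1]) < abs(path[0]))
    ```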
  15. Sep 12, 2012 #14
    Sir, thank you so much for the great help.

    Can you please tell me what geometric Brownian motion is? I have tried to read about it on Google and the wiki, but unfortunately I can't understand the concept.