Optimal control

Optimal control theory is a branch of mathematical optimization that deals with finding a control for a dynamical system over a period of time such that an objective function is optimized. It has numerous applications in science, engineering, and operations research. For example, the dynamical system might be a spacecraft with controls corresponding to rocket thrusters, and the objective might be to reach the Moon with minimum fuel expenditure. Or the dynamical system could be a nation's economy, with the objective of minimizing unemployment; the controls in this case could be fiscal and monetary policy. A dynamical system may also be introduced to embed operations research problems within the framework of optimal control theory.

Optimal control is an extension of the calculus of variations, and is a mathematical optimization method for deriving control policies. The method is largely due to the work of Lev Pontryagin and Richard Bellman in the 1950s, after contributions to the calculus of variations by Edward J. McShane. Optimal control can be seen as a control strategy in control theory.
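The kind of problem described above can be made concrete with a small numerical example. Below is a minimal sketch (not drawn from the text) of a finite-horizon linear-quadratic regulator (LQR), the textbook optimal-control problem that Bellman-style dynamic programming solves exactly via a backward Riccati recursion. The double-integrator model and all cost matrices here are illustrative assumptions.

```python
import numpy as np

def lqr_gains(A, B, Q, R, Qf, horizon):
    """Backward Riccati recursion for x[t+1] = A x[t] + B u[t] with
    stage cost x'Qx + u'Ru and terminal cost x'Qf x. Returns feedback
    gains K[t] such that the optimal control is u[t] = -K[t] @ x[t]."""
    P = Qf
    gains = []
    for _ in range(horizon):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # reorder so gains[t] applies at step t

# Illustrative system: a double integrator (position and velocity),
# with the control acting as an acceleration, discretized at dt = 0.1.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.eye(2)           # penalize state deviation
R = np.array([[0.1]])   # penalize control effort
gains = lqr_gains(A, B, Q, R, Qf=10 * np.eye(2), horizon=50)

# Simulate the closed loop from an initial position offset; the optimal
# feedback drives the state toward the origin over the horizon.
x = np.array([1.0, 0.0])
for K in gains:
    x = A @ x - B @ (K @ x)
```

The backward recursion is the discrete-time Bellman principle in action: the cost-to-go from any state at step `t` is quadratic, `x' P[t] x`, and each gain is the minimizer of one stage of the recursion.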
