Dynamical systems: From explicit to implicit equations

In summary, the thread discusses the difference between the explicit and the implicit equations of a dynamical system: Leibniz' formula for differentiating an integral is used to derive the classical implicit form of the state equation (a differential equation) from the explicit state transition function, by differentiating the explicit form.
  • #1
pupoz
Hi! This is my first post... I have a little question about a mathematical issue I encountered
in the passage from the explicit to the implicit equations of a dynamical system.

How can one demonstrate the following?

http://pixhost.eu/show_big.php?/share/2007-01-19/doi.jpg

Thanks to all
 
  • #2
Explicit to implicit equations? Can you be more specific?
 
  • #3
All I see there is Leibniz' formula for the differentiation of an integral:
[tex]\frac{d}{dx}\int_{\alpha (x)}^{\beta (x)} f(x,t)dt= \frac{d\beta (x)}{dx}f(x,\beta (x))- \frac{d\alpha (x)}{dx}f(x,\alpha (x))+ \int_{\alpha (x)}^{\beta (x)}\frac{\partial f(x,t)}{\partial x}dt[/tex]

There is a discussion of it here:
http://www.chass.utoronto.ca/~osborne/MathTutorial/ECR.HTM
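A quick numerical sanity check of the formula (just a sketch: the integrand and the limits below are arbitrary smooth choices, and it assumes NumPy and SciPy are available):

Code:
import numpy as np
from scipy.integrate import quad

# Arbitrary smooth test case: f(x,t) = sin(x*t), alpha(x) = x, beta(x) = x**2
def f(x, t):
    return np.sin(x * t)

def dfdx(x, t):          # partial derivative of f with respect to x
    return t * np.cos(x * t)

def I(x):                # I(x) = integral of f(x,t) dt from alpha(x) to beta(x)
    return quad(lambda t: f(x, t), x, x ** 2)[0]

x0, h = 1.3, 1e-5

# Left-hand side: centered finite difference of I at x0
lhs = (I(x0 + h) - I(x0 - h)) / (2 * h)

# Right-hand side: Leibniz' formula, with beta'(x) = 2x and alpha'(x) = 1
rhs = (2 * x0) * f(x0, x0 ** 2) - 1 * f(x0, x0) \
      + quad(lambda t: dfdx(x0, t), x0, x0 ** 2)[0]

print(lhs, rhs)  # the two values should agree to several decimal places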
 
  • #4
Many thanks!

This formula is used to derive the classical form of the state equation of a dynamical system (the implicit form),

dx(t)/dt = A(t)x(t) + B(t)u(t)

from the state transition function (the explicit form), which expresses the state x(t) openly as a function of the time t, the initial time t0, the initial condition x(t0), and the input function u(·):

x(t) = phi(t, t0, x(t0), u(·))

For a linear system this becomes

[tex]x(t)=\varphi(t,t_0,x(t_0),u(\cdot))=\Phi(t,t_0)x(t_0)+\int_{t_0}^{t}K(t,\tau)u(\tau)\,d\tau[/tex]

You can obtain the implicit form by differentiating the explicit one.
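Concretely, applying Leibniz' formula above to the explicit form (assuming the usual properties of the transition matrix, namely that ∂Φ(t,t0)/∂t = A(t)Φ(t,t0) and K(t,τ) = Φ(t,τ)B(τ), so that K(t,t) = B(t) and ∂K(t,τ)/∂t = A(t)K(t,τ)) gives

[tex]\frac{dx(t)}{dt}=\frac{\partial \Phi(t,t_0)}{\partial t}x(t_0)+K(t,t)u(t)+\int_{t_0}^{t}\frac{\partial K(t,\tau)}{\partial t}u(\tau)\,d\tau[/tex]

and substituting the properties above,

[tex]\frac{dx(t)}{dt}=A(t)\left[\Phi(t,t_0)x(t_0)+\int_{t_0}^{t}K(t,\tau)u(\tau)\,d\tau\right]+B(t)u(t)=A(t)x(t)+B(t)u(t)[/tex]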
 
  • #5
Pupoz, be aware that dynamical systems is a huge field, and there are many, many things you can call a dynamical system besides a system of coupled nonlinear ODEs. For example, in my dissertation I (following Conway and de Bruijn) studied the space of Penrose tilings as a tiling dynamical system; as was first recognized (independently) by Conway and de Bruijn, such dynamical systems can fairly be characterized as a kind of geometric realization of number-theoretic phenomena in the theory of simultaneous rational approximation!

You might want to spend some time with the wonderful "picture book" by E. Atlee Jackson, Perspectives of Nonlinear Dynamics (two volumes), plus the fine textbook by Hilborn, Chaos and Nonlinear Dynamics. Together these should provide a solid appreciation of the scope of modern dynamical systems theory.

I could add numerous references on ergodic theory and symbolic dynamics, the most abstract branch of the field of dynamical systems theory. These are mostly at the graduate level, but if you want to understand Markov chains (you mentioned "transition functions") at a combinatorial level, symbolic dynamics is the way to go, and you'll need ergodic theory in order to understand the interaction between combinatorial, topological, and probabilistic structure in a Markov chain. Well, let me mention one undergraduate textbook: the first half of Lind and Marcus, Introduction to Symbolic Dynamics and Coding should give a good introduction, but there's a lot more to this.
 
  • #6
My prof. of System Theory gave us a short theoretical introduction to dynamical systems. The aim of this introduction was to justify the classical mathematical form of linear systems, and in particular of those linear systems that are stationary (time invariant).
My engineering graduation objective is to use H-∞ (H-infinity) optimal control theory to solve many practical problems in the robust control of MIMO (Multi-Input, Multi-Output) systems.
My prof. told us that this is the state of the art in the control engineering field (maybe he's wrong :smile: ).

Of course, as you said, real dynamical systems are a bit far from linear time-invariant models.
Therefore I will follow your tip and read some of the books you recommended.
Do you know of any recent theoretical developments on the control of nonlinear and chaotic systems?
 
  • #7
Hi, again, Pupoz,

pupoz said:
Do you know of any recent theoretical developments on the control of nonlinear and chaotic systems?

I probably shouldn't try to answer that in detail, because there are approximately 300 books in my local research library on the topic of chaotic dynamical systems, and dozens of new papers seem to appear every day (see http://www.arxiv.org/list/math.DS/recent, http://www.arxiv.org/list/math.OC/recent, and http://www.arxiv.org/list/nlin/recent for some indication of what I mean). In addition, control of nonlinear chaotic systems happens to be an area I know comparatively little about.

For some initial orientation, you might try Edward Ott, Tim Sauer, and James A. Yorke, Coping with Chaos, Wiley, 1994. Next, you can look on the web for some recent international conferences in this area. These often feature keynote addresses from which you can get a clue "what's hot". You should also do a literature search for recent review articles, e.g. http://arxiv.org/find/grp_physics,g...XACT+control_theory+abs:+review/0/1/0/all/0/1
You can also Google on the string
Code:
chaos "control theory" group:sci.nonlinear

You should probably also scan the last year or so of issues of journals like Ergodic Theory and Dynamical Systems, Phys. Rev. E, and IEEE Proceedings on Control Theory and Applications. After "doing some homework" of this nature, you might try posting a question in sci.nonlinear asking for more references or tips.

Last but not least, don't forget the obvious: you can ask your professor!
 

1. What is a dynamical system?

A dynamical system is a mathematical model that describes the behavior of a system over time. It consists of a set of variables and a set of equations that govern how those variables change over time.
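For instance (a toy illustration, not tied to the discussion above), the logistic map is a dynamical system with a single state variable and a single update rule:

Code:
# Logistic map x_{n+1} = r * x_n * (1 - x_n): one variable, one update equation.
r = 3.7   # growth parameter (chaotic regime for many values of r near 4)
x = 0.2   # initial condition
for n in range(10):
    x = r * x * (1 - x)
    print(n + 1, x)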

2. What is the difference between explicit and implicit equations in dynamical systems?

An explicit equation gives the state directly, for example as a function of time, the initial condition, and the input, while an implicit equation only constrains the state indirectly, for example through a differential equation that its trajectory must satisfy. In dynamical systems, explicit equations are typically easier to evaluate and provide a direct picture of the system's behavior, while implicit equations may be harder to solve but are often easier to write down and can capture more subtle dynamics.
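A minimal scalar illustration (just an example, not from the thread above): the same system can be written implicitly as a differential equation or explicitly as its solved trajectory,

[tex]\frac{dx(t)}{dt}=a\,x(t)\qquad\Longleftrightarrow\qquad x(t)=e^{a(t-t_0)}\,x(t_0)[/tex]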

3. How are dynamical systems used in scientific research?

Dynamical systems are used to model and analyze a wide range of natural and artificial systems, such as biological systems, physical systems, and social systems. They can help scientists understand how these systems behave over time and make predictions about their future behavior.

4. What types of problems can be solved using dynamical systems?

Dynamical systems can be used to solve problems related to predicting the behavior of a system over time, finding equilibrium points, and analyzing stability and sensitivity to initial conditions. They can also be used to optimize control strategies for a system.
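As a small illustration of the stability analysis mentioned above (a sketch only; the system matrix is an arbitrary example and NumPy is assumed to be available):

Code:
import numpy as np

# Linear time-invariant system dx/dt = A x; the origin x = 0 is an equilibrium point.
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])

# For a linear system the equilibrium is asymptotically stable exactly when
# every eigenvalue of A has negative real part.
eigenvalues = np.linalg.eigvals(A)
print("eigenvalues:", eigenvalues)
print("asymptotically stable:", bool(np.all(eigenvalues.real < 0)))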

5. What are some real-world applications of dynamical systems?

Dynamical systems have numerous applications in fields such as physics, biology, economics, engineering, and social sciences. They are used to study the behavior of weather systems, population dynamics, chemical reactions, stock market trends, and many other complex systems. They are also used in the design of control systems for various technologies, such as spacecraft and robots.
