Definition:Dynamical Systems

From ProofWiki

Definition

Dynamical systems is the branch of mathematics which studies the long-term behavior of a state space whose points evolve in time according to a prescribed rule.

A dynamical system can be viewed as a tuple $(T, X, f)$, where $T$ is a semigroup that represents time, $X$ is the underlying state space, and $f: T \times X \to X$ is a semigroup action that describes the rule.

Typically, $T$ is commutative, $X$ is a topological space, and $f$ is a continuous map.
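As an illustration of the tuple $(T, X, f)$, the following sketch takes $T = \N$ under addition, $X = \closedint 0 1 \subseteq \R$, and $f$ the action generated by iterating a single map. The choice of the logistic map $x \mapsto 4 x \paren {1 - x}$ as the generator is purely illustrative and is not part of the definition above.

```python
def logistic(x: float) -> float:
    """One step of the rule: the generator of the action on X = [0, 1]."""
    return 4.0 * x * (1.0 - x)


def action(t: int, x: float) -> float:
    """The semigroup action f : T x X -> X, given by t-fold iteration."""
    for _ in range(t):
        x = logistic(x)
    return x


# The semigroup action property f(s + t, x) = f(s, f(t, x)) holds by
# construction, since both sides iterate the generator s + t times:
x0 = 0.3
assert action(5, x0) == action(2, action(3, x0))
assert action(0, x0) == x0  # the identity of T acts trivially
```

Any discrete-time system generated by a single self-map of $X$ arises this way; for continuous time one would instead take $T = \R_{\ge 0}$ and $f$ a flow.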




