Dynamics, Robotics and Control

Research in dynamics and control focuses on pattern formation and on the dynamics of a range of complex systems. These systems include unmanned aerial vehicles, spacecraft, unmanned underwater vehicles, robotic systems, flow control, and smart indoor environments. Theoretical, numerical and experimental investigations are aimed at modeling the dynamics to understand basic underlying phenomena, and at using these models for estimation and control of such systems.

Dynamics

Dynamics is an old science that describes the time evolution of the states of a system, which could be a mechanical or aerospace system. A good understanding of the dynamics of a system helps to identify phenomena and create mathematical models that capture them. The dynamics may be represented by ordinary or partial differential equations with time as an independent variable, and these differential equations may be linear or nonlinear. The study of phenomena like critical points, equilibria and relative equilibria, limit cycles, stability, bifurcation and chaos helps in understanding the global behavior of a system over time, as system states, parameters and initial conditions change. Mathematical models that describe these phenomena can be used, in turn, for feedback control of these systems.
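As a small illustration of these ideas, the sketch below simulates a damped pendulum, a classic nonlinear mechanical system. The pendulum has a stable equilibrium hanging down and an unstable one upright; starting near the upright position, damping dissipates energy and the trajectory settles onto the stable equilibrium. The parameter values and the fixed-step Runge-Kutta integrator are illustrative choices, not tied to any particular study described above.

```python
import numpy as np

# Damped pendulum: theta_ddot = -(g/L) sin(theta) - c * theta_dot
# State x = [theta, theta_dot]; equilibria at theta = k*pi, theta_dot = 0.
G_OVER_L = 9.81  # g/L, illustrative value
DAMPING = 0.5    # damping coefficient c, illustrative value

def pendulum(x):
    """Right-hand side of the nonlinear ODE x_dot = f(x)."""
    theta, omega = x
    return np.array([omega, -G_OVER_L * np.sin(theta) - DAMPING * omega])

def rk4_step(f, x, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(x0, dt=0.01, steps=5000):
    """Integrate the pendulum for steps*dt seconds from initial state x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = rk4_step(pendulum, x, dt)
    return x

# Start near the upright (unstable) equilibrium theta = pi; the damped
# pendulum falls and settles to the stable equilibrium theta = 0.
x_final = simulate([3.0, 0.0])
```

Changing the initial condition or removing the damping term changes the long-term behavior qualitatively (e.g., sustained oscillation instead of convergence), which is exactly the kind of global behavior that equilibrium and stability analysis characterizes.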

Control Systems

A control system is usually understood to mean a feedback control system, where information on some or all state(s) of a dynamical system is used to control the system to a desired operating point or trajectory. System properties, including stability, controllability and observability, are important aspects of control theory and practice. Controllability is a system property that defines whether the system can be steered from a given state to another state. Observability is a system property that defines whether the system states can be reconstructed from observed (measured) variables. Both these properties are defined, in general, for a local neighborhood of a given state of a nonlinear system. For linear systems, and in special cases for certain nonlinear systems, these properties can be defined globally over the entire state space. However, for the vast majority of mechanical and aerospace systems, these notions can only be defined locally.
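For linear time-invariant systems, controllability and observability can be checked globally with the classical Kalman rank conditions. The sketch below does this for a double integrator (position and velocity states, force input, position measurement); the system matrices are an illustrative example, not taken from the text above.

```python
import numpy as np

# Double integrator: x_dot = A x + B u,  y = C x (position measured only).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

def controllability_matrix(A, B):
    """Kalman controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def observability_matrix(A, C):
    """Kalman observability matrix [C; CA; ...; C A^(n-1)]."""
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

n = A.shape[0]
controllable = np.linalg.matrix_rank(controllability_matrix(A, B)) == n
observable = np.linalg.matrix_rank(observability_matrix(A, C)) == n
```

Here both rank tests pass: the force input can steer both position and velocity, and both states can be reconstructed from position measurements alone. For nonlinear systems no such single global rank test exists in general, which is why the corresponding notions are typically local.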

Nonlinear control and nonlinear estimation, using the framework of geometric mechanics, are active research topics in this area. This framework describes the dynamics of maneuverable robotic systems and unmanned vehicles without using local representations of the motion. Recent studies have looked at the design of continuous-time nonlinear control and estimation schemes that have guaranteed stability properties and are robust to measurement noise and disturbances. These continuous-time schemes are then discretized for computer implementation, such that the stability and robustness properties are retained in discrete time.
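A minimal sketch of the last step, discretizing a stabilizing continuous-time feedback law for computer implementation, is shown below for a linear double integrator under zero-order-hold sampling. The gains, sample period, and plant are hypothetical illustrations; the geometric-mechanics schemes referred to above involve considerably more structure, but the stability-preservation check is the same in spirit: the discretized closed loop must have all eigenvalues inside the unit circle.

```python
import numpy as np

DT = 0.01  # sample period in seconds (illustrative)

# Exact zero-order-hold discretization of the double integrator
# x_dot = [[0,1],[0,0]] x + [[0],[1]] u over one sample period.
Ad = np.array([[1.0, DT],
               [0.0, 1.0]])
Bd = np.array([[0.5 * DT**2],
               [DT]])

# Continuous-time PD gains (hypothetical) placing both poles at s = -1.
K = np.array([[1.0, 2.0]])

# Sampled-data closed loop: x[k+1] = (Ad - Bd K) x[k].
A_cl = Ad - Bd @ K

# Discrete-time stability test: spectral radius strictly less than one.
spectral_radius = max(abs(np.linalg.eigvals(A_cl)))

# Simulate the discrete closed loop from an initial offset; the state
# should decay toward the origin, mirroring the continuous-time design.
x = np.array([[1.0], [0.0]])
for _ in range(2000):
    x = A_cl @ x
```

For a fast enough sample rate the discrete eigenvalues sit near e^(s·DT) for the continuous poles s, so stability carries over; as DT grows this guarantee erodes, which is why discretization schemes that provably retain stability and robustness properties are a research topic in their own right.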

Applied Control

  • Unmanned and robotic systems
  • Spacecraft dynamics and control
  • Noise control in high speed flows
  • Intelligent indoor environments – control of parameters aimed at realizing human comfort