Teacher: MINUCCI Simone

(syllabus)
1. INTRODUCTION • Dynamic Systems. Control of dynamic systems: formulation and first examples. Architectures of control systems (open-loop, closed-loop).
2. DYNAMICAL SYSTEMS • Models of fundamental systems. Dynamic linear systems in the time domain. • Laplace Transform. Transfer Function: definition, properties and use. Poles, Zeros and Gain. • Equivalence transformation and duality transformation. • Stability analysis of dynamic linear systems. Stability criteria. • Block diagrams. • Free response and forced response. Canonical responses of first- and second-order systems. • Frequency Response: definition and relationship with the transfer function. Graphical representation of the frequency response: Bode diagram, Nyquist diagram, Nichols diagram. First- and second-order filters. Time-frequency relationships.
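The frequency-response topics above can be made concrete with a short sketch. The snippet below is a minimal illustration, not part of the course material: the first-order plant G(s) = 1/(τs + 1) and the choice τ = 1 are assumptions made for the example. It evaluates the transfer function at s = jω and returns the two quantities plotted on a Bode diagram:

```python
import cmath
import math

def freq_response(tau, omega):
    """Evaluate G(s) = 1/(tau*s + 1) at s = j*omega.

    Returns (magnitude in dB, phase in degrees), the quantities
    drawn on the magnitude and phase plots of a Bode diagram.
    """
    G = 1.0 / (tau * 1j * omega + 1.0)      # substitute s = j*omega
    mag_db = 20.0 * math.log10(abs(G))      # gain in decibels
    phase_deg = math.degrees(cmath.phase(G))
    return mag_db, phase_deg

# At the corner frequency omega = 1/tau, a first-order low-pass filter
# shows the textbook landmarks: gain about -3 dB and phase -45 degrees.
mag, phase = freq_response(tau=1.0, omega=1.0)
```

At ω = 1/τ the magnitude is about -3 dB and the phase is -45°, the standard landmarks of a first-order Bode diagram.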
3. STATE FEEDBACK CONTROL • Controllability of dynamic linear systems. • State feedback of dynamic linear systems. Design of a state feedback regulator. • Observability of dynamic linear systems. • State feedback of dynamic linear systems by state estimation. Design of an asymptotic observer.
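One way to see the state-feedback design step listed above is on the double integrator, whose controllable canonical form makes the gain computation explicit. The sketch below is illustrative: the plant x1' = x2, x2' = u and the desired poles -2 and -3 are assumptions for the example, not values from the syllabus. The gain is found by matching the closed-loop characteristic polynomial against the desired one:

```python
import math

def place_double_integrator(p1, p2):
    """Gain (k1, k2) so that u = -k1*x1 - k2*x2 places the poles of
    the double integrator x1' = x2, x2' = u at p1 and p2.

    Closed-loop characteristic polynomial: s^2 + k2*s + k1.
    Desired polynomial: (s - p1)(s - p2) = s^2 - (p1 + p2)*s + p1*p2.
    Matching coefficients term by term gives the gains directly.
    """
    k1 = p1 * p2        # constant term
    k2 = -(p1 + p2)     # coefficient of s
    return k1, k2

def closed_loop_poles(k1, k2):
    """Roots of s^2 + k2*s + k1, assuming they are real and distinct."""
    d = math.sqrt(k2 * k2 - 4.0 * k1)
    return (-k2 - d) / 2.0, (-k2 + d) / 2.0

k1, k2 = place_double_integrator(-2.0, -3.0)   # -> k1 = 6.0, k2 = 5.0
```

`closed_loop_poles(6.0, 5.0)` recovers (-3.0, -2.0), confirming the placement; the asymptotic-observer design in the same section is the dual computation, applied to the transposed pair.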
4. FEEDBACK CONTROL • Formalization of a simple control problem. Classification of control systems. • Feedback control systems: features and properties. • Stability: Nyquist and Bode criteria. • Static performance: steady-state error. Dynamic performance: response speed, bandwidth, degree of stability. • Stability margins. Relationship between feedback and feedforward control systems. • Design of a controller: requirements. Static and dynamic design. Compensation networks. PID regulators.
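To connect the PID topic with the steady-state-error discussion above, here is a minimal closed-loop simulation. Everything in it is an illustrative assumption rather than course-prescribed material: the plant x' = -x + u, the PI gains, the unit-step setpoint, and the forward-Euler integration step. It shows the integral action driving the step-tracking error to zero:

```python
def simulate_pi(Kp=2.0, Ki=1.0, r=1.0, dt=0.01, t_end=10.0):
    """Close a PI regulator around the first-order plant x' = -x + u
    and return the output after t_end seconds of forward-Euler steps.

    Plant, gains, setpoint and step size are illustrative assumptions.
    """
    x = 0.0          # plant state (also the measured output)
    integral = 0.0   # integrator state of the regulator
    t = 0.0
    while t < t_end:
        e = r - x                     # tracking error
        integral += e * dt            # accumulate the error integral
        u = Kp * e + Ki * integral    # PI control law
        x += (-x + u) * dt            # Euler step of the plant
        t += dt
    return x

y_final = simulate_pi()   # close to the setpoint r = 1
```

With only proportional action (`Ki=0.0`) the same loop settles at x = Kp/(1 + Kp), i.e. a nonzero steady-state error; the integrator removes it, which is the "type" argument behind the static-performance item above.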
(reference books)
1. F. White, Principles of Control Engineering, Elsevier
2. L. Keviczky, R. Bars, J. Hetthéssy, C. Bányász, Control Engineering, Springer
3. L. Keviczky, R. Bars, J. Hetthéssy, C. Bányász, Control Engineering: MATLAB Exercises, Springer