"This book is a very clear and comprehensive exposition of several aspects of linear control theory connected with the stabilizability problems. The first chapter introduces the basic notions of stability and the problem of stabilization by means of feedback. The interest in the linear case is motivated by the theorem of stability for the first approximation. Chapter 2 is devoted to finite-dimensional, time-continuous, time-invariant linear systems. First, the authors discuss some classical concepts, like controllability, stabilizability, observability, detectability and their relationship. Moreover, optimality and stabilization are related by means of an interesting version of the Kalman-Lurie-Yakubovich-Popov equation. Finally, the authors consider state estimators and stabilization with disturbance attenuation. A similar theory is developed in Chapter 3, for systems with two time scales (singularly perturbed systems) that is systems with fast and slow components. Chapter 4 deals with high-gain stabilization of minimum phase systems, while Chapter 5 is concerned with adaptive stabilization and identification. In the final chapter, the authors study stabilization of systems where the feedback is implemented by means of a sampling technique (digital control)." --Zentralblatt Math

One of the main problems in control theory is the stabilization problem, which consists of finding a feedback control law that ensures stability.
Springer Book Archives

Product details

ISBN
9781461271970
Published
2012-11-02
Publisher
Springer-Verlag New York Inc.
Height
235 mm
Width
155 mm
Audience
Research
Language
English
Format
Paperback