Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions


Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. The optimal control is characterized by a set of differential equations describing the paths of the control variables that minimize the cost functional. This book, Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions, considers different classes of systems with quadratic performance criteria. It then attempts to find the optimal co…
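As a point of reference for the quadratic performance criterion mentioned above, here is a minimal sketch of the standard continuous-time linear-quadratic regulator (LQR) problem, solved via the algebraic Riccati equation rather than the orthogonal-function approach the book develops; the system matrices A, B and weights Q, R below are illustrative assumptions, not examples from the book.

```python
# Minimal sketch: continuous-time LQR with quadratic cost
#   J = integral of (x'Qx + u'Ru) dt
# solved via the continuous algebraic Riccati equation (CARE).
# The matrices below are made-up illustrations, not from the book.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, -0.5]])   # example plant dynamics x' = Ax + Bu
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                 # state weighting in the quadratic cost
R = np.array([[1.0]])         # control weighting

# Solve A'P + PA - P B R^{-1} B' P + Q = 0 for P
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain: u = -Kx with K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

print("Riccati solution P:\n", P)
print("Optimal gain K:\n", K)
```

The orthogonal-function methods treated in the book (e.g., expanding states and controls in a basis of orthogonal functions) aim at the same kind of quadratic-cost problems but convert the differential equations into algebraic ones; the Riccati-based sketch above is only the conventional baseline formulation.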

Link: http://sharpbook.net/books/continuous-time-dynamical-systems
