
Continuous Time Dynamical Systems

State Estimation and Optimal Control with Orthogonal Functions

By B.M. Mohan, S.K. Kar

CRC Press – 2012 – 247 pages

Purchasing Options:

  • Hardback: $159.95
    978-1-4665-1729-5
    October 24th 2012

Description

Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. The solution is a set of differential equations describing the paths of the control variables that minimize the cost functional.
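For concreteness, this kind of problem can be written down in its most common linear-quadratic form. The sketch below is a generic statement, not the book's own notation: x is the state, u the control, and S, Q, R are the usual weighting matrices.

```latex
% Generic linear-quadratic optimal control problem (illustrative notation):
% find the control u(t) on [0, t_f] that minimizes the quadratic cost
\[
J = \tfrac{1}{2}\,x^{T}(t_f)\,S\,x(t_f)
  + \tfrac{1}{2}\int_{0}^{t_f}\!\left[x^{T}(t)\,Q\,x(t) + u^{T}(t)\,R\,u(t)\right]dt,
\]
% subject to the linear state equation
\[
\dot{x}(t) = A\,x(t) + B\,u(t), \qquad x(0) = x_0,
\]
% with S, Q positive semidefinite and R positive definite.
```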

This book, Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions, considers different classes of systems with quadratic performance criteria. It then derives the optimal control law for each class of systems using orthogonal functions to optimize the given performance criterion.
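The mechanism behind the orthogonal-function approach is easy to state in generic terms and is worth keeping in mind while reading the topic list below: signals are expanded in a truncated orthogonal basis, and integration of the basis is replaced by multiplication with a constant operational matrix, so differential equations reduce to algebraic ones. The notation here (φ, c, P, m) is illustrative, not taken from the book.

```latex
% Truncated expansion of a signal f(t) in m orthogonal basis functions:
\[
f(t) \approx \sum_{i=0}^{m-1} c_i\,\phi_i(t) = c^{T}\phi(t),
\qquad
c_i = \frac{\langle f, \phi_i\rangle}{\langle \phi_i, \phi_i\rangle}.
\]
% Integration of the basis vector is approximated by a constant
% operational matrix P, turning calculus into linear algebra:
\[
\int_{0}^{t}\phi(\tau)\,d\tau \approx P\,\phi(t).
\]
```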

Illustrated throughout with detailed examples, the book covers topics including:

  • Block-pulse functions and shifted Legendre polynomials (a numerical block-pulse sketch follows this list)
  • State estimation of linear time-invariant systems
  • Linear optimal control systems incorporating observers
  • Optimal control of systems described by integro-differential equations
  • Linear-quadratic-Gaussian control
  • Optimal control of singular systems
  • Optimal control of time-delay systems with and without reverse time terms
  • Optimal control of second-order nonlinear systems
  • Hierarchical control of linear time-invariant and time-varying systems
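As a numerical illustration of the block-pulse case mentioned in the first topic above, the Python sketch below builds the BPF expansion of f(t) = t on [0, 1] and integrates it with the standard BPF operational matrix of integration. All names are illustrative; the matrix formula is the textbook BPF result, not code reproduced from this book.

```python
import numpy as np

# Block-pulse expansion of f(t) = t on [0, 1] with m basis functions.
m, T = 8, 1.0
h = T / m                        # width of each block-pulse function

# BPF coefficient c_i is the average of f over [i*h, (i+1)*h];
# for f(t) = t this is just the midpoint value of the block.
t_mid = (np.arange(m) + 0.5) * h
c = t_mid.copy()

# Standard BPF operational matrix of integration:
#   integral_0^t phi(tau) dtau  ~=  P phi(t),
#   with P = (h/2) * (I + 2U), U the strictly upper-triangular ones.
U = np.triu(np.ones((m, m)), k=1)
P = (h / 2.0) * (np.eye(m) + 2.0 * U)

# If f ~= c^T phi(t), then integral of f ~= (P^T c)^T phi(t):
c_int = P.T @ c

# Compare against the exact integral t^2/2 at the block midpoints;
# the error is O(h^2) and shrinks as m grows.
print(np.max(np.abs(c_int - t_mid**2 / 2.0)))
```

It is this reduction of integration to a single matrix multiplication that allows differential and integro-differential system equations to be converted into algebraic ones.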

Reviews

"… provides a good introduction of using orthogonal function approaches for state estimation and optimal control problems. … the first book I’ve seem that puts it all together in one text. … The authors provide several detailed examples that clearly explain how the shown theory can be applied. This makes it much easier to understand the basic algorithms."

—John L. Crassidis, University at Buffalo, State University of New York

"The approach and selection of topics are very appropriate, because the book has considered all the important components of optimal control problems using orthogonal functions. … Overall, the book is quite good and comprehensive."

—Anish Deb, University of Calcutta, India

Contents

Introduction

Optimal Control Problem

Historical Perspective

Organization of the Book

Orthogonal Functions and Their Properties

Introduction

Block-Pulse Functions (BPFs)

Legendre Polynomials (LPs)

Shifted Legendre Polynomials (SLPs)

Nonlinear Operational Matrix

Rationale for Choosing BPFs and SLPs

State Estimation

Introduction

Inherent Filtering Property of OFs

State Estimation

Illustrative Examples

Conclusion

Linear Optimal Control Systems Incorporating Observers

Introduction

Analysis of Linear Optimal Control Systems Incorporating Observers

Illustrative Example

Conclusion

Optimal Control of Systems Described by Integro-Differential Equations

Introduction

Optimal Control of LTI Systems Described by Integro-Differential Equations

Illustrative Example

Conclusion

Linear-Quadratic-Gaussian Control

Introduction

LQG Control Problem

Unified Approach

Illustrative Example

Recursive Algorithms

Conclusion

Optimal Control of Singular Systems

Introduction

Recursive Algorithms

Unified Approach

Illustrative Examples

Conclusion

Optimal Control of Time-Delay Systems

Introduction

Optimal Control of Multi-Delay Systems

Optimal Control of Delay Systems with Reverse Time Terms

Conclusion

Optimal Control of Nonlinear Systems

Introduction

Computation of the Optimal Control Law

Illustrative Examples

Conclusion

Hierarchical Control of Linear Systems

Introduction

Hierarchical Control of LTI Systems with Quadratic Cost Functions

Solution of Hierarchical Control Problem via BPFs

Extension to Linear Time-Varying Systems

Computational Algorithm

Illustrative Examples

Conclusion
