Optimal Control and the Calculus of Variations (Revised)

by
Edition: 1st
Format: Paperback
Pub. Date: 1995-10-19
Publisher(s): Oxford

    Free Shipping on all Orders Over $35!*

    *excludes Marketplace items.

List Price: $104.53

Buy New

Usually Ships in 8 - 10 Business Days.
$104.01

Rent Textbook


Rent Digital

Online: 180 Days access / Downloadable: 180 Days: $71.66
Online: 365 Days access / Downloadable: 365 Days: $82.69
Online: 1460 Days access / Downloadable: Lifetime Access: $110.24

Used Textbook

We're Sorry
Sold Out

This item is being sold by an Individual Seller and will not ship from the Online Bookstore's warehouse. The Seller must confirm the order within two business days. If the Seller refuses to sell or fails to confirm within this time frame, then the order is cancelled.

Please be sure to read the Description offered by the Seller.

Summary

Optimal control is a modern development of the calculus of variations and classical optimization theory. For that reason, this introduction to the theory of optimal control starts by considering the problem of minimizing a function of many variables. It then moves through an exposition of the calculus of variations to the optimal control of systems governed by ordinary differential equations. This approach should enable students to see the essential unity of important areas of mathematics, and also allow optimal control and the Pontryagin maximum principle to be placed in their proper context. A good knowledge of analysis, algebra, and mathematical methods is assumed. All the theorems are carefully proved, and there are many worked examples and exercises. Although this book is written for the advanced undergraduate mathematician, engineers and scientists who regularly rely on mathematics will also find it a useful text.
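The progression the summary describes, from finite-dimensional optimization through the calculus of variations to optimal control, can be sketched in standard notation (these are the conventional textbook formulations, not quotations from the book itself):

```latex
% 1. Finite-dimensional optimization: minimize a function of many variables.
\[
  \min_{x \in \mathbb{R}^n} f(x),
  \qquad \nabla f(x^*) = 0 \ \text{at an interior minimizer.}
\]

% 2. Calculus of variations: minimize a functional over curves x(t).
\[
  J[x] = \int_{t_0}^{t_1} L\bigl(t, x(t), \dot{x}(t)\bigr)\,dt,
  \qquad
  \frac{\partial L}{\partial x}
  - \frac{d}{dt}\,\frac{\partial L}{\partial \dot{x}} = 0
  \quad \text{(Euler--Lagrange equation).}
\]

% 3. Optimal control: minimize J over controls u subject to dynamics
%    and a control constraint.
\[
  \dot{x} = f\bigl(x(t), u(t), t\bigr), \qquad u(t) \in U,
\]
% The Pontryagin maximum principle asserts that an optimal control
% maximizes the associated Hamiltonian pointwise along the optimal path.
```

Each problem strictly generalizes the one before it: the calculus of variations is optimal control with unconstrained "control" $u = \dot{x}$, and finite-dimensional optimization is the degenerate case with no dynamics at all.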

Table of Contents

Introduction
The maxima and minima of functions
The calculus of variations
Optimal control
Optimization in ℝⁿ
Functions of one variable
Critical points, end-points, and points of discontinuity
Functions of several variables
Minimization with constraints
A geometrical interpretation
Distinguishing maxima from minima
The calculus of variations
Problems in which the end-points are not fixed
Finding minimizing curves
Isoperimetric problems
Sufficiency conditions
Fields of extremals
Hilbert's invariant integral
Semi-fields and the Jacobi condition
Optimal Control I: Theory
Introduction
Control of a simple first-order system
Systems governed by ordinary differential equations
The optimal control problem
The Pontryagin maximum principle
Optimal control to target curves
Optimal Control II: Applications
Time-optimal control of linear systems
Optimal control to target curves
Singular controls
Fuel-optimal controls
Problems where the cost depends on x(t₁)
Linear systems with quadratic cost
The steady-state Riccati equation
The calculus of variations revisited
Proof of the Maximum Principle of Pontryagin
Convex sets in ℝⁿ
The linearized state equations
Behaviour of H on an optimal path
Sufficiency conditions for optimal control
Table of Contents provided by Publisher. All Rights Reserved.

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online, as well as download it, for the chosen number of days.

A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.