Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization.
Chapters 1 and 2 focus on describing systems and evaluating their performance. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of Chapters 4 and 5, and Chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.
Optimal Control Theory: An Introduction (Dover Books on Electrical Engineering)
Book Details
Author(s): Donald E. Kirk
Publisher: Dover Publications
ISBN / ASIN: 0486434842
ISBN-13: 9780486434841
Availability: In Stock.
Sales Rank: 136,765
Category: Technology & Engineering
Marketplace: United States 🇺🇸
More Books in Technology & Engineering
- Carpentry & Building Construction, Student Edition, 20…
- The Electronics Dictionary for Technicians
- Electronic Devices and Circuits (Merrill's Internation…
- 8086/8088, 80286, 80386 and 80486 Assembly Language Pr…
- Digital and Analog Communication Systems
- Introduction to Robotics
- The Technology of Metallurgy
- An Introduction to Transport Phenomena in Materials En…
- Engineering graphics