Stochastic Optimal Control of Structures

Author: Yongbo Peng

Publisher: Springer

Published: 2019-06-27

Total Pages: 322

ISBN-10: 9811367647

This book proposes, for the first time, a basic formulation for structural control that takes into account the stochastic dynamics induced by engineering excitations that are non-stationary and non-Gaussian in nature. Further, it establishes the theory of and methods for stochastic optimal control of randomly excited engineering structures in the context of probability density evolution methods, such as physically based stochastic optimal (PSO) control. By logically integrating randomness into the control gain, the book helps readers design elegant control systems, mitigate risks in civil engineering structures, and avoid the dilemmas posed by the methods predominantly applied in current practice, such as deterministic control and classical linear quadratic Gaussian (LQG) control associated with nominal white noise.
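
For readers new to the probability density evolution framework referenced above, its core object is the generalized density evolution equation. A common one-dimensional form (standard in the literature, written in generic notation rather than the book's) is sketched below.

```latex
% Generalized probability density evolution equation (GDEE), one-dimensional form.
% Z(\Theta,t): response quantity of interest; \Theta: random source parameters;
% p_{Z\Theta}(z,\theta,t): joint density of (Z,\Theta). Notation is generic, not the book's.
\[
  \frac{\partial p_{Z\Theta}(z,\theta,t)}{\partial t}
  + \dot{Z}(\theta,t)\,\frac{\partial p_{Z\Theta}(z,\theta,t)}{\partial z} = 0,
  \qquad
  p_{Z}(z,t) = \int_{\Omega_{\Theta}} p_{Z\Theta}(z,\theta,t)\,\mathrm{d}\theta .
\]
```

Solving this equation over a set of representative points of the random parameters yields the evolving response density, on which probabilistic control criteria can then be formulated.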


Stochastic Optimal Control in Infinite Dimension

Author: Giorgio Fabbri

Publisher: Springer

Published: 2017-06-22

Total Pages: 928

ISBN-10: 3319530674

Providing an introduction to stochastic optimal control in infinite dimension, this book gives a complete account of the theory of second-order HJB equations in infinite-dimensional Hilbert spaces, focusing on its applicability to associated stochastic optimal control problems. It features a general introduction to optimal stochastic control, including basic results (e.g. the dynamic programming principle) with proofs, and provides examples of applications. A complete and up-to-date exposition of the existing theory of viscosity solutions and regular solutions of second-order HJB equations in Hilbert spaces is given, together with an extensive survey of other methods and a full bibliography. In particular, Chapter 6, written by M. Fuhrman and G. Tessitore, surveys the theory of regular solutions of HJB equations arising in infinite-dimensional stochastic control, via BSDEs. The book is of interest to both pure and applied researchers working in the control theory of stochastic PDEs, and in PDEs in infinite dimension. Readers from other fields who want to learn the basic theory will also find it useful. The prerequisites are: standard functional analysis, the theory of semigroups of operators and its use in the study of PDEs, some knowledge of the dynamic programming approach to stochastic optimal control problems in finite dimension, and the basics of stochastic analysis and stochastic equations in infinite-dimensional spaces.
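
As a point of reference for the description above, the following is the formal second-order HJB equation attached to a controlled stochastic evolution equation in a Hilbert space, written in simplified generic notation rather than the book's.

```latex
% Formal second-order HJB equation for a controlled stochastic evolution equation in a
% Hilbert space H:  dX(t) = [A X(t) + b(X(t),a(t))] dt + \sigma(X(t),a(t)) dW(t),
% with cost  E[ \int_t^T l(X(s),a(s)) ds + g(X(T)) ]  minimized over controls a(\cdot) in \Lambda.
% The notation is simplified and suppresses the domain and trace-class issues treated in the book.
\[
  \partial_t v(t,x)
  + \langle Ax, Dv(t,x) \rangle
  + \inf_{a \in \Lambda} \Big\{ \langle b(x,a), Dv(t,x) \rangle
  + \tfrac{1}{2}\operatorname{Tr}\!\big[\sigma(x,a)\sigma(x,a)^{*} D^{2}v(t,x)\big]
  + l(x,a) \Big\} = 0,
  \qquad v(T,x) = g(x).
\]
```

The unbounded operator A and the trace term acting on the second derivative are precisely what call for the viscosity-solution and regular-solution (including BSDE-based) approaches surveyed in the book.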


Deterministic and Stochastic Optimal Control

Author: Wendell H. Fleming

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 231

ISBN-10: 1461263808

This book may be regarded as consisting of two parts. In Chapters I-IV we present what we regard as essential topics in an introduction to deterministic optimal control theory. This material has been used by the authors for one-semester graduate-level courses at Brown University and the University of Kentucky. The simplest problem in calculus of variations is taken as the point of departure, in Chapter I. Chapters II, III, and IV deal with necessary conditions for an optimum, existence and regularity theorems for optimal controls, and the method of dynamic programming. The beginning reader may find it useful first to learn the main results, corollaries, and examples. These tend to be found in the earlier parts of each chapter. We have deliberately postponed some difficult technical proofs to later parts of these chapters. In the second part of the book we give an introduction to stochastic optimal control for Markov diffusion processes. Our treatment follows the dynamic programming method, and depends on the intimate relationship between second-order partial differential equations of parabolic type and stochastic differential equations. This relationship is reviewed in Chapter V, which may be read independently of Chapters I-IV. Chapter VI is based to a considerable extent on the authors' work in stochastic control since 1961. It also includes two other topics important for applications, namely, the solution to the stochastic linear regulator and the separation principle.
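
To make the stochastic linear regulator and the separation principle mentioned above concrete, here is a brief sketch, with illustrative matrices that are not taken from the book, of how the certainty-equivalent LQR gain and the steady-state Kalman filter gain are obtained from two separate algebraic Riccati equations using SciPy.

```python
# Minimal LQG sketch (illustrative matrices, not taken from the book): the separation
# principle lets the optimal controller be assembled from (i) the deterministic LQR
# state-feedback gain and (ii) a Kalman filter providing the state estimate.
import numpy as np
from scipy.linalg import solve_continuous_are

# Plant: dx = A x dt + B u dt + dw (process-noise intensity W),  y = C x + v (intensity V)
A = np.array([[0.0, 1.0], [-2.0, -0.3]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
W = 0.1 * np.eye(2)       # process-noise intensity (illustrative)
V = np.array([[0.01]])    # measurement-noise intensity (illustrative)

# Quadratic cost E[ integral of x'Qx + u'Ru dt ]
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])

# (i) Control Riccati equation  A'P + PA - P B R^{-1} B' P + Q = 0  ->  LQR gain K
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # feedback law: u = -K x_hat

# (ii) Dual (filter) Riccati equation  ->  steady-state Kalman gain L
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)           # estimator: dx_hat = (A x_hat + B u) dt + L (y - C x_hat) dt

print("LQR gain K:", K)
print("Kalman filter gain L:", L.ravel())
```

By the separation principle, the resulting LQG controller applies u = -K x_hat, where x_hat is produced by the Kalman filter with gain L; the control design and the estimator design are carried out independently of one another.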


Stochastic Linear-Quadratic Optimal Control Theory: Open-Loop and Closed-Loop Solutions

Author: Jingrui Sun

Publisher: Springer Nature

Published: 2020-06-29

Total Pages: 129

ISBN-10: 3030209229

This book gathers the most essential results, including recent ones, on linear-quadratic optimal control problems, which represent an important aspect of stochastic control. It presents the results in the context of finite and infinite horizon problems, and discusses a number of new and interesting issues. Further, it precisely identifies, for the first time, the interconnections among three well-known, relevant issues: the existence of optimal controls, the solvability of the optimality system, and the solvability of the associated Riccati equation. Although the content is largely self-contained, readers should have a basic grasp of linear algebra, functional analysis and stochastic ordinary differential equations. The book is mainly intended for senior undergraduate and graduate students majoring in applied mathematics who are interested in stochastic control theory. However, it will also appeal to researchers in other related areas, such as engineering, management, finance/economics and the social sciences.
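
For context, the Riccati equation referred to above arises, in a standard finite-horizon formulation written in generic notation rather than the authors', as follows.

```latex
% Generic finite-horizon stochastic LQ problem (notation illustrative, not the authors'):
% state  dX = (A X + B u)\,dt + (C X + D u)\,dW,
% cost   J(u) = E\big[ \langle G X(T), X(T) \rangle + \int_0^T \big( \langle Q X, X \rangle + \langle R u, u \rangle \big)\,dt \big].
% The associated differential Riccati equation and closed-loop feedback law are:
\[
  \dot{P} + P A + A^{\top} P + C^{\top} P C + Q
  - \big(P B + C^{\top} P D\big)\big(R + D^{\top} P D\big)^{-1}\big(B^{\top} P + D^{\top} P C\big) = 0,
  \qquad P(T) = G,
\]
\[
  \bar{u}(t) = -\big(R + D^{\top} P(t) D\big)^{-1}\big(B^{\top} P(t) + D^{\top} P(t) C\big)\, X(t).
\]
```

When this Riccati equation admits a suitable solution, the feedback law above is optimal; relating that solvability to the existence of open-loop and closed-loop optimal controls and to the solvability of the optimality system is, as the description notes, one of the book's central themes.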


Optimal Stochastic Control Schemes within a Structural Reliability Framework

Author: Bernt J. Leira

Publisher: Springer Science & Business Media

Published: 2013-09-07

Total Pages: 102

ISBN-10: 3319014056

The book addresses the topic of on-line implementation of structural and mechanical design criteria as an explicit part of optimal control schemes. The intention of this research monograph is to reflect recent developments within this area. Examples of the application of relevant control algorithms are included to illustrate their practical implementation. These examples are mainly taken from the area of marine technology, with the multi-component external loading varying in time and with magnitudes represented as statistical quantities. The relevant target group is mechanical and structural engineers concerned with “smart components and structures”, where optimal design principles and control actuators are combined. The book is also relevant for engineers involved in, for example, mechatronics and control applications.


Stochastic Optimal Control Theory with Application in Self-Tuning Control

Author: Kenneth J. Hunt

Publisher: Springer

Published: 1989-02-06

Total Pages: 324

ISBN-13:

This book merges two major areas of control: the design of control systems and adaptive control. Original contributions are made in the polynomial approach to stochastic optimal control, and the resulting control laws are then manipulated into a form suitable for application in the self-tuning control framework. A major contribution is the derivation of both scalar and multivariable optimal controllers for the rejection of measurable disturbances using feedforward. A powerful feature of the book is the presentation of a case study in which the LQG self-tuner was tested on the pressure control loop of a power station. The broad coverage of the book should appeal not only to research workers, teachers and students of control engineering, but also to practicing industrial control engineers.
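
A self-tuning controller of the kind described couples an on-line parameter estimator with a control-law redesign at each sampling instant. The sketch below shows only the estimation half, a recursive least-squares update with a forgetting factor; it is a generic illustration, not the book's algorithm.

```python
# Recursive least squares (RLS) with forgetting factor: the on-line identification step
# used by typical self-tuning controllers to track the parameters of an ARX-type plant model
#   y(t) = phi(t)' theta + noise,
# after which the current estimate theta is fed into the chosen control design.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step. theta: parameter estimate, P: covariance, phi: regressor, y: new output."""
    phi = phi.reshape(-1, 1)
    denom = lam + float(phi.T @ P @ phi)
    K = (P @ phi) / denom                      # gain vector
    err = y - float(phi.T @ theta)             # prediction error
    theta = theta + K * err                    # parameter update
    P = (P - K @ phi.T @ P) / lam              # covariance update with forgetting
    return theta, P

# Illustrative use: identify y(t) = a*y(t-1) + b*u(t-1) + e(t) on simulated data.
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
theta = np.zeros((2, 1))
P = 1e3 * np.eye(2)
y_prev, u_prev = 0.0, 0.0
for t in range(200):
    u = rng.normal()                                        # excitation signal
    y = a_true * y_prev + b_true * u_prev + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u_prev]), y)
    y_prev, u_prev = y, u
print("estimated [a, b]:", theta.ravel())                   # should approach [0.8, 0.5]
```

In a full self-tuner, each updated estimate would be passed through the chosen control design, such as the LQG/polynomial design treated in the book, to refresh the controller before the next sample.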