1 Introduction
1.1 Historical Perspective
1.2 Basic Concepts
1.3 Systems Description
1.4 Design, Modeling, and Analysis
1.5 Text Outline
2 Modeling of Physical Systems
2.1 Introduction
2.2 Mechanical Systems
2.3 Electrical Systems
2.4 Electromechanical Systems
2.5 Thermal Systems
2.6 Hydraulic Systems
2.7 System Components
2.8 Summary
2.9 References
2.10 Problems
3 Models for Control Systems
3.1 Introduction
3.2 System Impulse and Step Responses
3.3 The Transfer Function
3.4 Differential Equation Representation
3.5 Block Diagram Analysis
3.6 State Equation Representation
3.7 Relationship Between System Representations
3.8 Small Disturbance of Nonlinear Systems
3.9 Summary
3.10 References
3.11 Problems
4 Time Response - Classical Method
4.1 Introduction
4.2 Transient Response
4.3 Steady State Response
4.4 Response to Periodic Inputs
4.5 Approximate Transient Response
4.6 Summary
4.7 References
4.8 Problems
5 Time Response - State Equation Method
5.1 Introduction
5.2 Solution of the State Equation
5.3 Eigenvalues of Matrix A and Stability
5.4 Two Examples
5.5 Controllability and Observability
5.6 Summary
5.7 References
5.8 Problems
6 Performance Criteria
6.1 Introduction
6.2 Control System Specification
6.3 Dynamic Performance Indices
6.4 Steady State Performance
6.5 Sensitivity Functions and Robustness
6.6 Summary
6.7 References
6.8 Problems
7 Assessing Stability and Performance
7.1 Introduction
7.2 Stability via Routh-Hurwitz Criterion
7.3 Frequency Response Method
7.4 Root Locus Method
7.5 Dynamic Response Performance Measures
7.6 Summary
7.7 References
7.8 Problems
8 Control Strategies and Plant Sizing
9 System Compensation
10 Discrete Time Control Systems
11 Nonlinear Control Systems
12 Systems with Stochastic Inputs
13 Adaptive Control Systems
A Laplace and Z-Transforms
B Symbols, Units, and Analogous Systems
C Fundamentals of Matrix Theory
D Computer Software for Control
Index