Abstract: This chapter focuses on polynomials and exponentials of matrices. It discusses the importance of polynomials and exponentials in calculus and differential equations. In using matrices to solve linear differential equations, one needs the basic concepts of polynomials and exponentials, as well as techniques for calculating matrix functions. Polynomials and exponentials of matrices play an equally important role in matrix calculus and matrix differential equations. Because a matrix commutes with itself, many of the properties of polynomials, such as addition, subtraction, multiplication, and factoring (but not division), remain valid for polynomials of a matrix. A sequence {B_k} of matrices, B_k = [b_ij^(k)], is said to converge to a matrix B = [b_ij] if the elements b_ij^(k) converge to b_ij for each i and j. In general, it is very difficult to compute functions of matrices from their definition as infinite series; one exception is the diagonal matrix. The Cayley–Hamilton theorem, however, provides a starting point for the development of an alternate, straightforward method for calculating these functions. A very important function in matrix calculus is e^{At}, where A is a square constant matrix, that is, all of its entries are constants, and t is a variable.
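The abstract's two key computational points can be sketched in code: for a diagonal matrix, e^{At} is obtained simply by exponentiating the diagonal entries, and this agrees with the infinite-series definition e^{At} = Σ (At)^k / k!; separately, the Cayley–Hamilton theorem states that a matrix satisfies its own characteristic polynomial, illustrated below for the 2×2 case. This is a minimal illustrative sketch (the function names and the sample matrices are my own, not from the chapter):

```python
import math

def expm_diag(diag, t):
    """e^{At} for a diagonal matrix A = diag(a_1, ..., a_n):
    the result is diagonal with entries e^{a_i t}."""
    n = len(diag)
    return [[math.exp(diag[i] * t) if i == j else 0.0 for j in range(n)]
            for i in range(n)]

def expm_series(A, t, terms=30):
    """Truncated power series e^{At} = sum_{k>=0} (At)^k / k!
    (works for any square A; used here only as a cross-check)."""
    n = len(A)
    At = [[A[i][j] * t for j in range(n)] for i in range(n)]
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]          # current term (At)^k / k!
    for k in range(1, terms):
        term = [[sum(term[i][m] * At[m][j] for m in range(n)) / k
                 for j in range(n)] for i in range(n)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

# Diagonal case: closed form matches the series definition.
A = [[1.0, 0.0], [0.0, 2.0]]
exact = expm_diag([1.0, 2.0], 0.5)   # diag(e^{0.5}, e^{1.0})
approx = expm_series(A, 0.5)

# Cayley–Hamilton for a 2x2 matrix B: B^2 - tr(B) B + det(B) I = 0.
B = [[1.0, 2.0], [3.0, 4.0]]
tr = B[0][0] + B[1][1]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
B2 = [[sum(B[i][m] * B[m][j] for m in range(2)) for j in range(2)]
      for i in range(2)]
I2 = [[1.0, 0.0], [0.0, 1.0]]
residual = [[B2[i][j] - tr * B[i][j] + det * I2[i][j] for j in range(2)]
            for i in range(2)]       # should be the zero matrix
```

The Cayley–Hamilton identity is what allows e^{At} to be reduced from an infinite series to a finite polynomial in A, which is the "alternate, straightforward method" the abstract refers to.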
Publication Year: 1991
Publication Date: 1991-01-01
Language: en
Type: book-chapter
Indexed In: ['crossref']