Math 240 log, spring 2018

Monday April 2, 2018
Introduction to linear autonomous systems.

Friday March 30, 2018
Matrix exponential and the solution of systems of linear ODEs with constant coefficients, continued; perturbation of systems whose coefficient matrix is not diagonalizable.

Wednesday March 28, 2018
Discussion of problem 3 on the 2nd in-class exam. Homogeneous systems of linear first-order ODEs with constant coefficients; solution by exponentiating the coefficient matrix.

Monday March 26, 2018
2nd in-class exam.

Friday March 23, 2018
Review: reduction of order and variation of parameters.

Monday March 19, 2018
Reduction of order, continued.

Friday March 16, 2018
Review: resonance. Variation of parameters revisited. Introduction to reduction of order.

Wednesday March 14, 2018
Forced oscillation, with or without damping.

Monday March 12, 2018
Harmonic oscillators; damped oscillations (no force on the system).

Friday March 2, 2018
Variation of parameters, continued.

Wednesday Feb 28, 2018
Summary of undetermined coefficients and variation of parameters.

Monday Feb 26, 2018
Undetermined coefficients: solving non-homogeneous linear ODEs with constant coefficients.

Friday Feb 23, 2018
How to solve linear ODEs with constant coefficients; commutation relations of d/dx and (multiplication by) g(x); the method of undetermined coefficients.

Wednesday Feb 21, 2018
Linear ODEs with constant coefficients: existence and uniqueness of solutions with given initial conditions; general solution of homogeneous linear ODEs with constant coefficients; shape of general solutions of non-homogeneous linear ODEs.

Monday Feb 19, 2018
Computing exp(tA): an example illustrating the general method explained on 2/16, where the characteristic polynomial of A is (x-1)^2 (x-3)^3, dim(Ker(A-1))=1, dim(Ker(A-3))=2.
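The recipe from the entries above, solving x'(t) = A x(t) with x(0) = x0 by the matrix exponential x(t) = exp(tA) x0, can be sketched numerically. This is a minimal illustration, not material from the lectures: the 2-by-2 matrix A and the initial condition are made up, and exp(tA) is computed through diagonalization, since A = C D C^{-1} gives exp(tA) = C exp(tD) C^{-1}.

```python
import numpy as np

# Made-up example system x' = A x; characteristic polynomial (x+1)(x+2),
# so A has distinct eigenvalues -1 and -2 and is diagonalizable.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])

evals, C = np.linalg.eig(A)    # eigenvalues and an eigenvector matrix C

def x(t):
    """x(t) = exp(tA) x0, computed as C exp(tD) C^{-1} x0."""
    expD = np.diag(np.exp(t * evals))
    return (C @ expD @ np.linalg.inv(C) @ x0).real

# Sanity check: the numerical derivative of x(t) should equal A x(t).
t, h = 0.5, 1e-6
deriv = (x(t + h) - x(t - h)) / (2 * h)
print(np.allclose(deriv, A @ x(t), atol=1e-4))  # True
```

The same computation works for any diagonalizable coefficient matrix; the non-diagonalizable case (the 3/30 perturbation discussion) needs the Jordan form instead.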
Friday Feb 16, 2018
Generalized eigenvectors for an eigenvalue; how to compute the number of Jordan blocks for a given eigenvalue; a procedure to compute the matrix exponential exp(tA) when you know the factorization of the characteristic polynomial.

Wednesday Feb 14, 2018
Jordan canonical form: statement. Example of the matrix exponential: when the matrix has only one eigenvalue.

Monday Feb 12, 2018
1st in-class midterm.

Friday Feb 9, 2018
Examples of the matrix exponential: (a) review of the case of diagonalizable matrices; (b) use of the Jordan form; (c) a formula/algorithm for the matrix exponential.

Wednesday Feb 7, 2018
Sufficient conditions for a square matrix to be diagonalizable: (a) no multiple root in the characteristic polynomial; (b) real symmetric matrices, real skew-symmetric matrices, orthogonal matrices; (c) Hermitian, skew-Hermitian, and unitary matrices. Cayley-Hamilton theorem. A 2-by-2 example of the Jordan form.

Monday Feb 5, 2018
More about the matrix representation of a linear transformation: examples of matrices which are not diagonalizable; conjugation preserves matrix addition and multiplication; sufficient conditions for a matrix to be diagonalizable.

Friday Feb 2, 2018
How to diagonalize a square matrix (if possible), and the motivation for eigenvalues and eigenvectors. The process of solving a diagonalization problem for an n-by-n square matrix A: (1) compute the characteristic polynomial; (2) find the roots of the characteristic polynomial; they are the eigenvalues; (3) for each eigenvalue \lambda, find a basis of the null space of A - \lambda I; (4) combine the bases for all the eigenvalues; they are necessarily linearly independent. If the total number of these basis elements is equal to n, then A is diagonalizable: AC = CD, where D is the diagonal matrix with the eigenvalues on the diagonal and the columns of C are the elements of the bases of the various null spaces you have computed.
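The four-step diagonalization procedure from the 2/02 entry can be checked numerically. The 2-by-2 matrix below is a made-up illustration; numpy carries out steps (1)-(3) internally and returns the eigenvalues together with an eigenvector for each.

```python
import numpy as np

# Made-up example; characteristic polynomial (x-2)(x-5), so two
# distinct eigenvalues and hence diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Steps (1)-(3): roots of the characteristic polynomial (the
# eigenvalues) and a basis eigenvector for each eigenspace.
evals, C = np.linalg.eig(A)
D = np.diag(evals)

# Step (4): two independent eigenvectors for a 2x2 matrix, so A is
# diagonalizable and A C = C D (eigenvectors as the columns of C,
# eigenvalues on the diagonal of D).
print(np.allclose(A @ C, C @ D))  # True
```

Note the order of the factors: the eigenvector matrix C multiplies A on the right and D on the right as well, i.e. AC = CD, equivalently A = C D C^{-1}.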
Wednesday 1/31/2018
Illustration of the following concepts: eigenvalues, eigenvectors, eigenspaces, matrix representation under a change of basis, diagonalization, and the matrix exponential, in two examples: (a) the reflection about the line {x=y} in the plane, and (b) the rotation by 90 degrees about the origin of the plane.

Monday 1/29/2018
Linear transformations and examples (including d/dt on the vector space of smooth functions on the real line, d/dt on the space of vectors whose entries are smooth functions on the real line, and the fact that a polynomial in a linear operator on a vector space is again a linear operator); matrix representation of a linear transformation.

Friday 1/26/2018
Relation between determinants, invertible matrices, and whether the rows of a square matrix are linearly independent / whether the columns of a square matrix are linearly independent. Review: linear span, linear independence, basis, and dimension in the context of abstract vector spaces. Change of basis--without memorization.

Wednesday 1/24/2018
Review of determinants, including the fact that the product of the cofactor matrix of a square matrix A with A is equal to the determinant of A times the identity matrix. This fact is one formulation of Cramer's rule.

Monday 1/22/2018
How to compute a basis of a vector subspace of \RR^n: how to compute a basis of the linear span of a subset of a vector space; how to compute a basis of the null space of a matrix; how to compute a basis of the linear span of the rows/columns of a matrix.

Friday 1/19/2018
Review of the definition of a vector subspace of \RR^n or \CC^n. Definition of an abstract vector space; function spaces as examples of infinite-dimensional vector spaces. The linear span of a subset of \RR^n or \CC^n; basis of a vector subspace of \RR^n or \CC^n; dimension of a vector subspace of \RR^n or \CC^n; linear independence.

Wednesday 1/17/2018
Inverse matrix and how to compute the inverse of a square matrix by Gaussian elimination.
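The 1/17 method, computing A^{-1} by Gaussian elimination, can be sketched as follows: row-reduce the augmented matrix [A | I] until the left half becomes I, and the right half is then A^{-1}. The helper function and the 2-by-2 matrix are made-up illustrations; row swaps are included only so that a non-zero pivot is always available.

```python
import numpy as np

def inverse_by_elimination(A):
    """Gauss-Jordan elimination on [A | I]; returns the right half."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # the augmented matrix [A | I]
    for j in range(n):
        # Swap in a row with a non-zero (largest) pivot in column j.
        p = j + np.argmax(np.abs(M[j:, j]))
        M[[j, p]] = M[[p, j]]
        M[j] /= M[j, j]                # scale the pivot row so the pivot is 1
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]  # clear column j in every other row
    return M[:, n:]                     # M is now [I | A^{-1}]

# Made-up example with det = 1, so the inverse is [[3, -1], [-5, 2]].
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
print(np.allclose(inverse_by_elimination(A) @ A, np.eye(2)))  # True
```

This mirrors the 1/12 description of Gaussian elimination as left-multiplication by elementary and permutation matrices: each row operation above is one such left multiplication applied to [A | I].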
Rank of a matrix; row rank = column rank; the linear span of a subset of \RR^n or \CC^n; vector subspaces of \RR^n or \CC^n.

(Monday 1/15/2018 is a holiday.)

Friday 1/12/2018
Gaussian elimination = successively multiplying a matrix equation on the left by suitable elementary matrices and permutation matrices. How to use Gaussian elimination to find the general solution of a linear system of equations: reduced echelon forms; finding a basis of the solution space of a homogeneous system of linear equations. After reducing a given linear system of equations A \cdot \vec{x} = \vec{b}, where A is an m-by-n matrix, to reduced echelon form:
(a) There exists a solution if and only if, among the equations in reduced echelon form, there is no equation of the form 0 = a non-zero number.
(b) The variables where no pivots are located are the "free variables"; the number of free variables is equal to the dimension of the null space of the matrix A.
(c) The non-free variables are solved for/expressed in terms of the free variables.
(d) Notice that the general solution can be written as \vec{x} = t_1 \vec{x}_1 + \cdots + t_s \vec{x}_s + \vec{x}_0, where t_1, \ldots, t_s are the free variables, r = rank(A) = number of pivots = n - s, \vec{x}_0 is a "particular solution" of the given system of equations, and \vec{x}_1, \ldots, \vec{x}_s form a basis of the null space of A.

Wednesday 1/10/2018
Definition of matrix multiplication, and how to think of matrix multiplication in a structural way: for instance, if A is m-by-n and B is n-by-l, then the k-th column of the matrix product A \cdot B = A \cdot (the k-th column of B) = b_{1k} (1st column of A) + \cdots + b_{nk} (n-th column of A), for k = 1, ..., l. [Note the difference between "l" (ell) and "1" (one).]
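The structure of the general solution described in (a)-(d) of the 1/12 entry can be illustrated on a small made-up system, using sympy's exact row reduction. The 2-by-3 system below is not from the course; it is chosen to have one free variable, so s = 1 and the general solution is x = t_1 x_1 + x_0.

```python
import sympy as sp

# Made-up system A x = b with n = 3 unknowns and one free variable.
A = sp.Matrix([[1, 2, 1],
               [2, 4, 0]])
b = sp.Matrix([3, 2])

# (a)-(b): reduced echelon form of the augmented matrix [A | b].
# The pivots land in columns 0 and 2, so the middle variable is free.
rref, pivot_cols = A.row_join(b).rref()
r = sum(1 for p in pivot_cols if p < A.cols)  # rank(A) = number of pivots
s = A.cols - r                                # number of free variables

# (d): a null-space basis x_1, ..., x_s, and a particular solution x_0
# obtained by setting the free variable to 0 and reading the non-free
# variables off the last column of the rref.
null_basis = A.nullspace()
x0 = sp.Matrix([rref[0, 3], 0, rref[1, 3]])

print(s == len(null_basis))   # free variables = dim Null(A), as in (b)
print(A * x0 == b)            # x_0 is a particular solution
```

Every solution of this system is then x_0 plus a multiple of the single null-space basis vector, matching the formula in (d) with s = 1.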