Math 3A Winter 2019

Today we reviewed for the final. We answered several questions people had on topics like change of basis, orthogonal projections, and the matrix associated to a linear transformation. Make sure you review all the homework, quizzes, lectures, exams, and relevant chapters in the book to prepare for the final. Good luck!

We talked more about orthogonal subspaces and also introduced the projection map. Given a subspace W and an orthogonal basis for it, we can compute the projection of a vector onto the subspace.
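
As a quick numerical sketch of the projection formula (the vectors below are made up just for illustration, using NumPy, which is not part of the course):

```python
import numpy as np

# {w1, w2} is an orthogonal basis for the subspace W = span{w1, w2}.
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([1.0, -1.0, 0.0])   # note w1 · w2 = 0
v = np.array([3.0, 1.0, 5.0])

# Projection formula for an orthogonal basis:
#   proj_W(v) = (v·w1 / w1·w1) w1 + (v·w2 / w2·w2) w2
proj = (v @ w1) / (w1 @ w1) * w1 + (v @ w2) / (w2 @ w2) * w2

# The leftover part v - proj is orthogonal to everything in W.
residual = v - proj
```

Here proj works out to (3, 1, 0), and the residual (0, 0, 5) is orthogonal to both basis vectors.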

Today we introduced distance and angles between vectors. We defined what it means for two vectors to be orthogonal using the dot product. At the end of class, we introduced the orthogonal complement of a subspace.
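
A small sketch of these definitions on made-up vectors (using NumPy):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([2.0, -1.0])

dot = u @ v                        # u · v = 1*2 + 2*(-1) = 0, so u ⊥ v
dist = np.linalg.norm(u - v)       # distance between u and v
cos_theta = dot / (np.linalg.norm(u) * np.linalg.norm(v))  # cosine of the angle
```

Since the dot product is 0, these two vectors are orthogonal and the angle between them is 90°.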

We talked about complex eigenvalues today and linear algebra over the complex numbers ℂ. Pretty much everything worked the same as before. At the end of class, we talked about how complex eigenvalues of 2x2 matrices mean the matrix has a rotational component.
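
One standard example of this phenomenon (the matrix is chosen for illustration; NumPy handles the complex arithmetic):

```python
import numpy as np

# A matrix of the form [[a, -b], [b, a]] rotates by the angle of a + bi
# and scales by sqrt(a^2 + b^2); its eigenvalues are the pair a ± bi.
a, b = 1.0, 1.0
A = np.array([[a, -b],
              [b, a]])

eigvals = np.linalg.eigvals(A)     # the complex conjugate pair 1 + i, 1 - i
```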

Today there was a sub and you covered matrices of linear transformations with respect to different bases. This is a generalization of what we did earlier with finding the matrix associated to a linear transformation.

We talked about diagonalization of matrices. A matrix is diagonalizable if it is similar to a diagonal matrix. If we write A = PDP⁻¹, with D diagonal, then the columns of P are eigenvectors for A. We used this fact to come up with a method to diagonalize a matrix if possible.
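
A sketch of the diagonalization check on a small example matrix (chosen for illustration):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])         # eigenvalues 4 and 2, so diagonalizable

eigvals, P = np.linalg.eig(A)      # columns of P are eigenvectors of A
D = np.diag(eigvals)

# Verify the factorization A = P D P^{-1}.
recon = P @ D @ np.linalg.inv(P)
```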

Today we mostly discussed similar matrices. Two matrices A, B are similar if we can write A = PBP⁻¹. We saw that similar matrices share a lot of properties. For example, they have the same characteristic polynomial.
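
A quick numerical check of that fact (A and P below are made up; np.poly returns the coefficients of a matrix's characteristic polynomial):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # any invertible matrix works here

B = np.linalg.inv(P) @ A @ P        # then A = P B P^{-1}, so A and B are similar

# Same characteristic polynomial for both:
pA = np.poly(A)                     # coefficients of det(A - xI), up to sign
pB = np.poly(B)
```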

Today we continued talking about eigenvalues. We went over the general method for finding eigenvalues, which included finding the characteristic polynomial f(x)=det(A-xI). We also introduced the "algebraic" and "geometric" multiplicities of an eigenvalue.
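
A classic example where the two multiplicities differ (the matrix is chosen for illustration; the geometric multiplicity is the dimension of the eigenspace, computed here as n − rank(A − λI)):

```python
import numpy as np

# f(x) = det(A - xI) = (2 - x)^2, so λ = 2 has algebraic multiplicity 2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

# Geometric multiplicity = dim Nul(A - λI) = n - rank(A - λI).
geo_mult = 2 - np.linalg.matrix_rank(A - lam * np.eye(2))
```

Here the geometric multiplicity is only 1, even though the algebraic multiplicity is 2.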

We introduced eigenvectors, eigenvalues, and eigenspaces. All these concepts come from the equation Av=λv, or (A-λI)v=0. We can find eigenvalues by solving det(A-λI)=0 for λ.
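
A small numerical sketch (the matrix is made up; NumPy finds the eigen-data for us):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])          # eigenvalues 3 and -1

eigvals, eigvecs = np.linalg.eig(A)

# Each column v of eigvecs satisfies A v = λ v for the matching λ.
lam0, v0 = eigvals[0], eigvecs[:, 0]
```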

Today was the exam. Good luck everyone!

We reviewed for the exam today. We went over several questions people had that came from various sources, including the review questions that were emailed out earlier.

Today there was a sub and you should have finished covering properties of the determinant. Remember that Monday is a holiday and there is no class.

We talked about more properties of the determinant today. The two big ones were: det(AB) = det(A)det(B) and det(A) = 0 if and only if A is not invertible. We also talked about more tricks to use when calculating a determinant, like using row reduction. Remember that there is no class on Monday 2/18, so there is plenty of time to review the material so far and come up with questions.
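
Both properties are easy to check numerically on small made-up matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# det(AB) = det(A) det(B)
lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)

# A non-invertible matrix has determinant 0:
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # second row = 2 × first row, not invertible
det_S = np.linalg.det(S)
```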

Today we introduced the determinant of a square matrix. We saw how the determinant measures how a linear transformation scales area. Later we will see how the determinant determines whether a matrix is invertible. We will also see some more ways to calculate a determinant.
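
A quick sketch of the area interpretation (the example matrix is chosen for illustration):

```python
import numpy as np

# A maps the unit square (spanned by e1, e2) to the parallelogram spanned
# by the columns of A; the area of that parallelogram is |det(A)|.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

area_scale = abs(np.linalg.det(A))  # |2*3 - 1*0| = 6
```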

We covered coordinate systems using a basis. That is, given a vector v we saw how to write v with respect to different bases. Given a subspace H, the dimension of H is defined to be the number of vectors in any basis for H. Toward the end of class we covered a very important theorem: the rank-nullity theorem. It says that for any matrix A, we have rank(A) + nullity(A) = # columns of A.
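
A quick check of the rank-nullity theorem on a made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # second row is a multiple of the first

rank = np.linalg.matrix_rank(A)     # dimension of the column space
nullity = A.shape[1] - rank         # rank-nullity: rank + nullity = # columns
```

Here rank(A) = 1 and nullity(A) = 2, which add up to the 3 columns of A.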

Today we talked about subspaces and did several examples. A subspace H of ℝⁿ satisfies three conditions: it contains 0, is closed under addition, and is closed under scaling. The homework will have lots of great problems on what subspaces look like. Toward the end of class we defined a basis for a subspace.

We covered invertible linear transformations today, and related it to invertibility of matrices. We did a few examples, including looking at the differential operator as a linear transformation on the space of polynomials. Exams are still being graded but you should get them back by Thursday or next Tuesday at the latest.

Today we had our first midterm.

Today we talked about inverses. We defined an n×n matrix A to be "invertible" if there exists another matrix A⁻¹ such that AA⁻¹ = A⁻¹A = I. We also went over a general method to find the inverse A⁻¹ by building an augmented matrix of the form [A I], and then row reducing it until the left side is I. The result is [I A⁻¹]. Remember that the exam is on Friday, so keep up the studying!
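
The row-reduction method can be sketched in code. This is a teaching sketch only (no safeguards beyond picking a nonzero pivot), with a made-up example matrix:

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Row reduce [A | I] until the left side is I; the right side is then A^{-1}."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # the augmented matrix [A I]
    for col in range(n):
        # Swap up the row with the largest entry in this column as the pivot.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                     # scale the pivot to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]    # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = inverse_by_row_reduction(A)               # [[1, -1], [-1, 2]]
```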

In class we went over the definition of "one-to-one" and "onto". These terms are about how many solutions there are to the equation T(x) = b for a linear transformation T. After a few examples, we moved on to define matrix multiplication, and how it relates to composition of linear transformations. The important thing is that if S: ℝᵖ → ℝⁿ and T: ℝᵐ → ℝᵖ, then their composition is denoted by T ∘ S and is given by (T ∘ S)(x) = T(S(x)). If S has matrix B and T has matrix A, then S(x) = Bx and T(y) = Ay, so (T ∘ S)(x) = (AB)x.
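
The composition fact is easy to verify numerically (matrices made up for illustration):

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])          # S: R^2 -> R^3, S(x) = Bx
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])     # T: R^3 -> R^2, T(y) = Ay

x = np.array([1.0, 1.0])

composed = A @ (B @ x)              # T(S(x))
product = (A @ B) @ x               # (AB)x, the same vector
```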

Today we learned that all linear transformations are given by some matrix. We also learned how to find the associated matrix using the "standard basis vectors". At the end of class we talked about how rotation in the xy-plane is a linear transformation and we found the corresponding matrix.
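
The rotation matrix from class can be written out directly (a sketch using NumPy):

```python
import numpy as np

def rotation_matrix(theta):
    # The columns are the images of the standard basis vectors e1 and e2
    # under rotation by theta.
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R = rotation_matrix(np.pi / 2)      # rotate by 90 degrees
v = R @ np.array([1.0, 0.0])        # e1 rotates to (0, 1)
```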

We introduced linear transformations. These are transformations T:ℝⁿ→ℝᵐ that satisfy T(u+v) = T(u) + T(v) and T(cu) = cT(u). These are very important in all fields of math. Our first example of a linear transformation was "multiplication by a matrix", i.e. T(x) = Ax for some matrix A.
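
The two defining properties can be checked numerically for T(x) = Ax (A, u, v, and c below are all made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
T = lambda x: A @ x                  # our first example of a linear map

u = np.array([1.0, -1.0])
v = np.array([2.0, 5.0])
c = 7.0

additive = np.allclose(T(u + v), T(u) + T(v))   # T(u+v) = T(u) + T(v)
homogeneous = np.allclose(T(c * u), c * T(u))   # T(cu) = c T(u)
```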

Today you had a sub again. You introduced the notion of "linear independence" and did several examples. A linearly independent set is one where no vector is a linear combination of the others.

Today you had a sub. They went over section 1.5 in the book which is about solution sets to systems of linear equations and what they look like. This is helpful to enahnce your geometric intuition of linear algebra.

Today we talked about matrix equations of the form Ax = b. We talked about how to multiply a matrix and a vector. We also introduced the dot product and used it to give another way to think about matrix-vector multiplication. At the end of class, we talked about a geometric interpretation of what solutions to equations of the form Ax = b look like.
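
Both ways of thinking about Ax can be spelled out on a small made-up example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, 1.0])

Ax = A @ x

# Row view: each entry of Ax is the dot product of a row of A with x.
row_view = np.array([A[0] @ x, A[1] @ x])

# Column view: Ax is a linear combination of the columns of A.
col_view = x[0] * A[:, 0] + x[1] * A[:, 1]
```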

We introduced the notion of the "span" of a set of vectors in ℝⁿ. This is the collection of all linear combinations of those vectors. We tried to visualize a few examples which gave things like a plane and a line.

Today we covered reduced row echelon form. It's a canonical way to represent systems of linear equations. Also, it's much easier to determine whether a system is consistent when it's in REF or RREF. The first step in solving a system of equations is often putting the associated augmented matrix into REF.
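
As a sketch, SymPy can put an augmented matrix into RREF for us (assuming SymPy is installed; the system below is made up):

```python
from sympy import Matrix

# Augmented matrix for the system:
#    x + 2y = 3
#   2x + 5y = 8
M = Matrix([[1, 2, 3],
            [2, 5, 8]])

rref, pivots = M.rref()   # the RREF and the pivot column indices
```

The RREF here is [[1, 0, -1], [0, 1, 2]], i.e. the unique solution x = -1, y = 2.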

Welcome to Math 3A! In this course we will learn about systems of linear equations, ordinary differential equations, linear transformations, and more. Today we covered systems of linear equations, and how to represent them with augmented matrices. We also talked about elementary row operations.