LECTURE: MATH 3A, SPRING 2016, M-W-F: 9-9:50 @ MSTB 124
INSTRUCTOR: NAM TRANG
OFFICE HOURS: W 2-3, F 10-12 @ RH 410N

DISCUSSION: M-W: 12-12:50 @ SH 174
TA: LEVY, M.
OFFICE HOURS:


COURSE SYLLABUS AND POLICIES     UCI Academic Honesty Policies     UCI Student Resources

HOMEWORK ASSIGNMENTS:     HW1     HW2     HW3     HW4     HW5    HW6    HW7

FINAL REVIEW AND SAMPLE FINAL:     Final review    Sample final

FINALS WEEK: Monday: Review @ RH 306, 4-6 pm     Tuesday: Office hours: 2-4:30.

QUIZZES:
Wed Apr 19: Sections 1.5-1.7
Wed May 4: Sections 2.1-2.3
Wed May 11: Sections 2.2, 2.3 (invertible matrix theorem), 2.8, 2.9
Wed May 18: Sections 3.1, 3.2, 3.3 (focusing on calculations of areas, volumes, and how linear transformations change areas/volumes)
Wed May 25: Sections 5.1, 5.2, 5.3

COURSE PROGRESS

Week 1:
M: Went over the syllabus, introduced systems of linear equations, matrices associated with systems of linear equations (with examples), defined equivalent systems and row operations.
W: Gave examples of systems that are inconsistent, consistent with unique solution, and consistent with infinitely many solutions; drew pictures that illustrate the situations for each system (parallel lines, lines that intersect at exactly one point, and planes whose intersection is a line).
F: Defined and gave examples of row echelon form (REF) and reduced row echelon form (RREF) and pivots, and explained how these concepts relate to solving linear systems; introduced vectors.
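The row operations can be sketched in a few lines of NumPy (an illustration only, with a made-up 2x2 system; the course carries these reductions out by hand):

```python
import numpy as np

# Augmented matrix [A | b] for the made-up system: x + 2y = 5, 3x + 4y = 11
M = np.array([[1., 2., 5.],
              [3., 4., 11.]])

# One forward-elimination step, R2 <- R2 - 3*R1, gives an REF.
M[1] = M[1] - 3 * M[0]      # [[1, 2, 5], [0, -2, -4]]

# Continue toward RREF: scale the second pivot to 1, then clear above it.
M[1] = M[1] / M[1, 1]       # [[1, 2, 5], [0, 1, 2]]
M[0] = M[0] - 2 * M[1]      # [[1, 0, 1], [0, 1, 2]]
```

Each row assignment is a single elementary row operation; the final matrix is the RREF, from which the solution x = 1, y = 2 can be read off.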
Week 2:
M: Introduced vectors, vector operations (addition and scalar multiplication), spanning, general method for determining whether a vector b is in span(v_1,..., v_k) (by solving the system whose augmented matrix is [v_1,...,v_k | b])
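The span-membership test can be sketched numerically on made-up vectors (`np.linalg.lstsq` stands in here for the row reduction done by hand in class):

```python
import numpy as np

# Is b in span(v1, v2)?  Solve the system with augmented matrix [v1 v2 | b].
v1 = np.array([1., 0., 1.])
v2 = np.array([0., 1., 1.])
b  = np.array([2., 3., 5.])

A = np.column_stack([v1, v2])
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

# b is in the span exactly when the system is consistent,
# i.e. the least-squares "solution" actually reproduces b.
in_span = np.allclose(A @ coeffs, b)
```

Here b = 2*v1 + 3*v2, so the test succeeds with coefficients (2, 3).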
W: Introduced vector equations, matrix equations, and properties of multiplication of an mxn matrix by a vector in R^n (e.g. A(u+v) = Au + Av and A(cu) = c(Au)); discussed the equivalence of deciding whether b is in span(v_1,...,v_k) with deciding whether the system whose augmented matrix is [v_1, ... ,v_k | b] is consistent.
F: Proved the equivalence of consistency of Ax=b with the statement that A has a pivot in every row; explained the truth tables of AND, OR, NOT, IF ... THEN, IF AND ONLY IF; defined homogeneous systems.
Week 3:
M: Described the solution set of Ax=b as x = p + a v, a in R, where p is a particular solution to Ax = b and {a v | a in R} is the solution set of the homogeneous equation Ax=0 (here Ax = 0 has one free variable); in general the solution set looks like x = p + a_1 v_1 + ... + a_k v_k, where {a_1 v_1 + ... + a_k v_k | a_1,...,a_k in R} is the solution set of Ax=0 with k free variables. Defined linear dependence/independence and gave examples.
W: Gave several examples of linearly independent/dependent sets; proved the theorem that a set {v_1,...,v_p} is linearly dependent iff some vector v_i is a linear combination of the other vectors. Went over an application that computes equilibrium prices for the sectors of an economy.
F: Introduced functions, one-to-one, onto functions, domain,codomain, and range of a function; defined linear transformations and gave various examples; one important example is given by a matrix: given an mxn matrix A, there is an associated linear transformation T_A: R^n --> R^m defined as: T_A(x) = Ax.
Week 4:
M: Gave examples of linear transformations, showed how to get a matrix representation for a linear transformation T:R^n-->R^m, examples of matrix representation of dilation, rotation, projection.
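As a quick sketch, the standard matrix of a rotation is built from the images of the standard basis vectors, T(e1) and T(e2) (hypothetical angle, not an example from lecture):

```python
import numpy as np

# Standard matrix of "rotate R^2 counterclockwise by angle t":
# its columns are T(e1) = (cos t, sin t) and T(e2) = (-sin t, cos t).
t = np.pi / 2
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

rotated = R @ np.array([1., 0.])   # e1 rotated 90 degrees should be e2
```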
W: Proved the theorem characterizing one-to-one linear maps: T is one-to-one iff A_T x = 0 has only the trivial solution, iff the columns of A_T are linearly independent; proved that T is onto iff the columns of A_T span R^m. Defined matrix addition, scalar multiplication of a matrix, and matrix multiplication.
F: Defined matrix multiplication, gave examples of matrix multiplication, demonstrated that matrix multiplication is not commutative (A.B need not be B.A even if both are defined) and does not have the cancellation property (A.B = A.C does not imply B = C); defined power of a square matrix and transpose.
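Both failures can be checked on small made-up matrices:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 0]])
B = np.array([[1, 0],
              [1, 0]])
C = np.array([[0, 0],
              [2, 0]])

AB = A @ B   # [[2, 0], [0, 0]]
BA = B @ A   # [[1, 1], [1, 1]]  -- not equal to AB
AC = A @ C   # [[2, 0], [0, 0]]  -- equals AB even though B != C
```

So AB != BA (no commutativity), and AB = AC with B != C (no cancellation).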
Week 5:
M: Defined the matrix inverse and discussed its properties (Ax=b has a unique solution for every b if A is invertible, (AB)^-1 = B^-1 A^-1, (A^-1)^-1 = A, (A^T)^-1 = (A^-1)^T); gave an algorithm (with an example) that decides whether a matrix A is invertible and, if so, computes its inverse.
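The [A | I] algorithm, sketched for a made-up 2x2 matrix (the course does this by hand; NumPy here only holds the rows):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])

# Row reduce [A | I]; if the left block becomes I, the right block is A^{-1}.
M = np.hstack([A, np.eye(2)])
M[1] -= 0.5 * M[0]       # R2 <- R2 - (1/2) R1
M[1] /= M[1, 1]          # scale second pivot to 1
M[0] -= M[0, 1] * M[1]   # clear the entry above the second pivot
M[0] /= M[0, 0]          # scale first pivot to 1

A_inv = M[:, 2:]         # [[1, -1], [-1, 2]]
```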
W: Defined elementary matrices; showed that A is invertible if and only if A is row equivalent to the identity matrix I_n; showed why the inverse is unique; discussed a list of 12 statements equivalent to "A is invertible".
F: MIDTERM
Week 6:
M: Defined subspaces of R^n; gave examples of subspaces of R^1 (there are two subspaces: {0} and R^1), R^2 ({0}, R^2, lines through the origin); showed why certain sets are subspaces and certain sets are not subspaces; stated lemma which says that the set span{v_1,..., v_k} is a subspace, defined basis for a subspace.
W: Defined the column space, row space, and null space of a matrix A; gave an example computing the null space of a matrix A and finding bases for its column space and null space.
F: Completed the example of finding bases for the null space, column space, and row space of a matrix A; defined coordinates of a vector relative to a basis B of a subspace H; stated the rank-nullity theorem; continued the invertible matrix theorem.
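The rank-nullity theorem (rank + nullity = number of columns) can be checked on a made-up matrix:

```python
import numpy as np

# Second row is twice the first, so the rank is 1.
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank   # rank-nullity: nullity = (# columns) - rank = 2
```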
Week 7:
M: Defined the determinant (cofactor expansion along the first row); stated the Laplace expansion theorem (cofactor expansion along any row or any column); gave examples of computing determinants of 2x2 and 3x3 matrices; proved that the determinant of a triangular matrix is the product of its diagonal entries and that det(A^T) = det(A); stated theorems on the determinants of matrices obtained from A by elementary row operations and det(AB) = det(A).det(B)
W: Applications: computed determinants using row operations (row reduce A to an REF B, compute det(B), and recover det(A) from the row operations used); proved det(A)det(B) = det(AB) and that A is invertible iff det(A) is nonzero.
F: Finished the proof of det(AB) = det(A)det(B). More applications: used determinants to compute areas/volumes of parallelograms/parallelepipeds and other regions (triangles, etc.), and used linear transformations to compute areas/volumes of general shapes.
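Two of these determinant facts, checked on made-up matrices:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[0., 1.],
              [1., 1.]])

# Multiplicativity: det(AB) = det(A) det(B).
detA = np.linalg.det(A)   # 1*4 - 2*3 = -2
detB = np.linalg.det(B)   # 0*1 - 1*1 = -1

# |det| is the area of the parallelogram spanned by the columns:
# here the columns (3, 0) and (1, 2) span a parallelogram of area 6.
area = abs(np.linalg.det(np.array([[3., 1.],
                                   [0., 2.]])))
```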
Week 8:
M: Introduced eigenvalues, eigenvectors, discussed how to compute eigenvalues (by solving the equation det(A-\lambda I)=0) and how to compute eigenspaces; gave examples of each.
W: Gave an example of computation of eigenvalues/eigenvectors/finding basis for eigenspaces of a 3x3 matrix; defined similarity of two matrices and what it means for a matrix to be diagonalizable.
F: Characterized diagonalizability: A is diagonalizable iff A = PDP^{-1} where D is a diagonal matrix whose diagonal entries are A's eigenvalues and P is the matrix whose columns are the corresponding eigenvectors; gave examples of matrices that are not diagonalizable (not enough independent eigenvectors because one eigenvalue has algebraic multiplicity 2 but geometric multiplicity 1) and matrices that are diagonalizable (one with 3 distinct eigenvalues; one with an eigenvalue of algebraic multiplicity 1 and an eigenvalue of algebraic multiplicity 2 whose geometric multiplicity is also 2).
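A diagonalization sketch for a made-up triangular matrix (its eigenvalues, 2 and 3, sit on the diagonal):

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])

# np.linalg.eig returns the eigenvalues and a matrix P whose
# columns are the corresponding eigenvectors.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Since A has 2 distinct eigenvalues, it is diagonalizable: A = P D P^{-1}.
reconstructed = P @ D @ np.linalg.inv(P)
```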
Week 9:
M: Proved that eigenvectors belonging to different eigenspaces are linearly independent; gave two applications: a) computing the (complex) eigenvalues/eigenvectors of the rotation matrix [0 -1 | 1 0]; b) solving initial value problems of the form x'(t) = A x(t), x(0) = y, where x(t) is a vector-valued (in R^n) function of t, y is a vector in R^n, and A is an nxn matrix.
W: Defined (inner) dot products, length, angle between two vectors, normalizing a vector to a unit vector, defined distance between two vectors and the formula u.v = |u||v|cos(\phi) where \phi is the angle between u and v.
F: Defined orthogonal vectors and the orthogonal complement of a subspace W (v is in the orthogonal complement of W if v is orthogonal to every vector in W); showed that the orthogonal complement of Row(A) is Nul(A) and the orthogonal complement of Col(A) is Nul(A^T); defined orthogonal sets/orthogonal bases; showed that an orthogonal set of nonzero vectors is linearly independent and gave a formula (with an example) for computing the coefficients in y = c_1 u_1 + ... + c_k u_k where {u_1,...,u_k} is an orthogonal basis.
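The coefficient formula c_i = (y . u_i)/(u_i . u_i) for an orthogonal basis, checked on a made-up basis of R^2:

```python
import numpy as np

# {u1, u2} is an orthogonal basis of R^2 (u1 . u2 = 0).
u1 = np.array([1., 1.])
u2 = np.array([1., -1.])
y  = np.array([3., 1.])

# For an orthogonal basis, each coefficient is a simple dot-product quotient.
c1 = (y @ u1) / (u1 @ u1)   # 4/2 = 2
c2 = (y @ u2) / (u2 @ u2)   # 2/2 = 1
```

Indeed y = 2 u1 + 1 u2 = (3, 1); no system needs to be solved.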
Week 10:
M: Holiday.
W: Computed orthogonal projections of a vector v onto a subspace W (with an orthogonal basis {v_1, ..., v_k}) and stated the least distance approximation theorem.
F: Gave an example of computing the orthogonal projection of a vector v onto a subspace W; showed that to compute least-squares solutions to Ax = b, one solves the normal equations A^T A x = A^T b.
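The normal equations, sketched on a made-up inconsistent 3x2 system:

```python
import numpy as np

# Ax = b has no exact solution (b is not in Col(A)).
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
b = np.array([1., 1., 0.])

# The least-squares solution solves the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # [1/3, 1/3]
```

This agrees with the minimizer of |Ax - b| over all x.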