Nicholas G. Vlamis
mathematician | nvlamis@gc.cuny.edu

Math 231: Linear Algebra I

Fall 2025

(Last updated: Wednesday, Dec. 10 at 5:35pm)

Course Information
  • Instructor: Nicholas G. Vlamis
  • Office: 507 Kiely Hall
  • Email: nicholas.vlamis@qc.cuny.edu
  • Class Meeting: Monday/Wednesday 10:05–11:55am
  • Class Room: 242 Kiely Hall
  • Syllabus
  • Textbook: Linear Algebra and Its Applications, sixth edition by David C. Lay, Steven R. Lay, and Judi J. McDonald.
Exam/Test Dates
  • Exam 1: Wednesday, October 15 (Solutions)
  • Exam 2: Wednesday, November 19 (Solutions)
  • Final Exam: Monday, December 22, 11am–1pm
Quiz Solutions
Homework

Weekly homework will be assigned, but it will not be collected. Instead, there will be a weekly quiz based on the prior week's assignment and lectures.

Week 15
  • Reading: Sections 6.4, 6.5, and 6.6
  • Final Exam Information
    • The exam is on Monday, December 22, 11am–1pm in Kiely 242 (our standard room).
    • The final exam is cumulative. However, about 50% of the exam will be on the material covered since Exam 2.
    • In the textbook, we have covered the following sections: 1.1–9, 2.1–3, 2.8–9, 3.1–2, 5.1–3, 6.1–5. Additionally, we discussed Markov chains and stochastic matrices (this is discussed in 10.1–2).
    • You may bring two sheets of notes to the exam. Your notes may contain definitions and theorems, but not worked-out problems (just as on prior exams).
  • Monday, December 8 (Day 26): Introduced the Gram–Schmidt process for constructing an orthogonal basis for a subspace out of a given basis. Defined orthogonal matrices and gave a characterization in terms of their columns forming an orthonormal basis. Proved the QR Decomposition theorem.
  • Wednesday, December 10 (Day 27): Explained how to use the QR-decomposition theorem to approximate eigenvalues. Introduced the notion of a least-squares solution to a linear system and proved they always exist. Proved a theorem guaranteeing the uniqueness of a least-squares solution whenever the matrix has linearly independent column vectors.
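For anyone who wants to experiment with Day 26's material, here is a short Python sketch of the Gram–Schmidt process using plain lists; the input vectors are an invented example, not taken from the homework.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Turn a basis into an orthogonal basis by subtracting projections."""
    ortho = []
    for x in basis:
        v = list(x)
        for u in ortho:
            c = dot(x, u) / dot(u, u)   # projection coefficient
            v = [vi - c * ui for vi, ui in zip(v, u)]
        ortho.append(v)
    return ortho

# Example: an orthogonal basis for the span of (1,1,0) and (1,0,1).
v1, v2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(dot(v1, v2))  # 0, up to floating point
```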
Week 14
  • Reading: Sections 6.2 and 6.3
  • There is a quiz on Wednesday on HW11.
  • Monday, December 1 (Day 24): Introduced the notion of orthogonal sets. Proved that orthogonal sets are linearly independent. Introduced the notion of an orthogonal basis. Established a formula for the coordinates of a vector with respect to an orthogonal basis.
  • Wednesday, December 3 (Day 25): Proved the orthogonal projection theorem. Gave a brief overview of mathematical induction. Defined the notion of orthogonal projection onto a subspace. Gave a formula for the projection of a vector onto a subspace with an orthogonal basis.
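The projection formula from Day 25 is easy to check numerically. The sketch below implements proj_W(y) as the sum of (y·u_i / u_i·u_i) u_i over an orthogonal basis u_1, ..., u_k of W; the vectors are illustrative, not from class.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(y, ortho_basis):
    """Project y onto the subspace spanned by an orthogonal basis."""
    p = [0.0] * len(y)
    for u in ortho_basis:
        c = dot(y, u) / dot(u, u)
        p = [pi + c * ui for pi, ui in zip(p, u)]
    return p

# Project y = (3, 4, 5) onto the xy-plane, with orthogonal basis e1, e2.
p = project([3, 4, 5], [[1, 0, 0], [0, 1, 0]])
print(p)  # [3.0, 4.0, 0.0]
```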
Week 13
  • Reading: Section 6.1
  • In lieu of a quiz for homework 10, you are required to turn in a complete solution to Exercise 2 on HW10. You can either do this in person on Wednesday or you can email me a pdf version of your homework. To generate a pdf from your work on paper, you can use a smartphone to scan it, see for instance the following link from howtogeek.
  • Monday, November 24 (Day 22): Finished our discussion of Markov chains, stochastic matrices, and Google's PageRank. Defined the dot product of two vectors and the norm of a vector. In dimension two, related angles to dot products using the law of cosines.
  • Wednesday, November 26 (Day 23): Defined distances and angles between vectors in R^n using the dot product. Discussed an application to embedding words and documents into R^n (students interested in learning more can search for information about word embeddings, vector embeddings, word2vec, doc2vec, etc.). Proved the Cauchy–Schwarz inequality and the triangle inequality. Discussed the Pythagorean theorem. Defined the notion of orthogonal vectors.
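Here is a small Python check of Day 23's angle formula, cos(theta) = u·v / (|u| |v|); the two vectors are chosen so the angle comes out to exactly 45 degrees.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle between u and v, via cos(theta) = u.v / (|u| |v|)."""
    return math.acos(dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))))

theta = angle([1, 0], [1, 1])
print(math.degrees(theta))  # 45 degrees, up to floating point
```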
Week 12
  • Reading: Sections 10.1 and 10.2 (Chapter 10 is an online appendix, which can be found at http://bit.ly/2nj1HhO)
  • Monday, November 17 (Day 20): Introduced Markov chains and stochastic matrices. Proved that the absolute value of an eigenvalue of a stochastic matrix is bounded above by 1 and that 1 is an eigenvalue of every stochastic matrix. Stated the Perron–Frobenius theorem in the context of stochastic matrices.
  • Wednesday, November 19 (Day 21): Exam 2
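The Markov chain material from Day 20 is fun to play with numerically. The sketch below repeatedly applies an invented column-stochastic matrix to a probability vector; the iterates approach an eigenvector for the eigenvalue 1, consistent with the Perron–Frobenius discussion in class.

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Columns sum to 1 (column-stochastic); the entries are made up for the demo.
P = [[0.9, 0.5],
     [0.1, 0.5]]
x = [1.0, 0.0]      # initial probability vector
for _ in range(100):
    x = matvec(P, x)
print(x)  # approaches the steady state [5/6, 1/6]
```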
Week 11
  • Reading: Sections 5.1, 5.2, and 5.3
  • Exam 2 is on Wednesday, November 19. It will cover sections
    • 2.1–3, 8, 9
    • 3.1, 2
    • 5.1, 2
    This corresponds to homework assignments 5 through 9 (HW9 was just posted above).
  • Monday, November 10 (Day 18): Proved a formula for the determinant of an inverse matrix. Discussed a consequence of Cramer's rule as an aside, but skipped the details of Cramer's rule. Introduced the notion of eigenvalue, eigenvector, and eigenspace. Discussed how to find eigenvalues as roots of the characteristic polynomial. Discussed how to find eigenspaces associated to a given eigenvalue.
  • Wednesday, November 12 (Day 19): Showed that a square matrix is invertible if and only if zero is not an eigenvalue of the matrix. Introduced the notion of similar matrices. Defined diagonalizability for a square matrix and proved that being diagonalizable for an n-by-n matrix is equivalent to the matrix having n linearly independent eigenvectors.
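For a 2x2 matrix, Day 18's recipe is concrete: the characteristic polynomial is lambda^2 - (trace) lambda + det, so the eigenvalues come from the quadratic formula. A sketch, with an invented symmetric example (so the eigenvalues are real):

```python
import math

def eigenvalues_2x2(A):
    """Eigenvalues of a 2x2 matrix via the characteristic polynomial.
    Assumes the discriminant is nonnegative (real eigenvalues)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # [1.0, 3.0]
```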
Week 10
  • Reading: Sections 3.1 and 3.2
  • Monday, Nov. 3 (Day 16): Proved the basis theorem for subspaces. Defined the determinant. Introduced upper triangular matrices, realized the determinant of an upper triangular matrix as the product of its diagonal entries, and showed that an upper triangular matrix is invertible if and only if its determinant is nonzero. Stated properties of the determinant under elementary row operations.
  • Wednesday, Nov. 5 (Day 17): Explained how to use the Gauss–Jordan elimination algorithm to compute the determinant of a square matrix. Proved that a square matrix is invertible if and only if its determinant is nonzero. Proved that the determinant of a product of matrices is the product of their respective determinants.
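Here is a Python sketch of Day 17's method: compute det(A) by forward elimination, tracking row swaps and multiplying the resulting diagonal entries. The example matrices are invented, and the code is a simplified illustration rather than a robust implementation.

```python
def det(A):
    """Determinant via forward elimination with row swaps."""
    M = [row[:] for row in A]
    n, sign = len(M), 1
    for i in range(n):
        # find a nonzero pivot in column i, swapping rows if needed
        p = next((r for r in range(i, n) if M[r][i] != 0), None)
        if p is None:
            return 0.0                # a zero column => determinant zero
        if p != i:
            M[i], M[p] = M[p], M[i]
            sign = -sign              # a row swap flips the sign
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    prod = sign
    for i in range(n):
        prod *= M[i][i]               # product of the diagonal
    return prod

print(det([[0, 2], [3, 4]]))  # -6.0 (one swap, then diagonal 3 * 2)
```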
Week 9
  • Reading: Sections 2.9 and 3.1
  • Monday, Oct. 27 (Day 14): Recalled the definitions of subspace, column space, null space, and basis. Proved that the pivot columns of a matrix form a basis for the column space of the matrix. Proved that any two bases of a subspace have the same cardinality, and used this to define the dimension of a subspace. Defined rank and nullity of a matrix and proved the Rank-Nullity Theorem.
  • Wednesday, Oct. 29 (Day 15): Extended the inverse matrix theorem to contain statements about the rank and nullity of a matrix. Introduced coordinates for a subspace and established basic properties. Stated the basis theorem.
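The Rank-Nullity Theorem from Day 14 can be illustrated numerically: for an m x n matrix, rank + nullity = n. Below, rank is computed as the number of pivot rows after forward elimination; the example matrix is invented.

```python
def rank(A):
    """Rank = number of pivots found during forward elimination."""
    M = [[float(x) for x in row] for row in A]
    m, n = len(M), len(M[0])
    r = 0  # current pivot row
    for c in range(n):
        p = next((i for i in range(r, m) if abs(M[i][c]) > 1e-12), None)
        if p is None:
            continue                  # no pivot in this column
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, m):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [2, 4, 6],     # = 2 * (row 1), so the rank drops to 2
     [1, 0, 1]]
n = len(A[0])
print(rank(A), n - rank(A))  # rank 2, nullity 1, and 2 + 1 = n = 3
```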
Week 8
  • Reading: Sections 2.3 and 2.8
  • We have class on Friday, Oct. 24, as CUNY is on a Monday schedule.
  • Wednesday, Oct. 22 (Day 12): Produced an algorithm for determining whether a square matrix is invertible, and in the case it is invertible, finding the inverse. The algorithm relies on Gauss–Jordan elimination. Discussed the Inverse Matrix Theorem, going over several of its components. Discussed invertibility of linear transformations in terms of their standard matrices.
  • Friday, Oct. 24 (Day 13): Introduced the notion of a subspace and discussed several examples. Gave two subspaces associated to a matrix, namely the column space (something we've seen before) and the null space. Introduced the notion of a basis and showed, via an example, that the null space of a matrix A always has a basis with the same number of vectors as there are free variables in the matrix equation Ax=0.
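Day 12's inversion algorithm can be sketched in a few lines of Python: row reduce the augmented matrix [A | I], and if the left block reaches I, the right block is A^{-1}. This simplified version assumes A is invertible (so a nonzero pivot is always available); the example matrix is invented.

```python
def inverse(A):
    """Invert A by row reducing the augmented matrix [A | I]."""
    n = len(A)
    M = [[float(x) for x in row] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for i in range(n):
        p = next(r for r in range(i, n) if M[r][i] != 0)  # nonzero pivot
        M[i], M[p] = M[p], M[i]
        piv = M[i][i]
        M[i] = [x / piv for x in M[i]]            # scale pivot row to 1
        for r in range(n):
            if r != i and M[r][i] != 0:           # clear the column
                f = M[r][i]
                M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    return [row[n:] for row in M]                 # right block is A^{-1}

print(inverse([[2, 0], [0, 4]]))  # [[0.5, 0.0], [0.0, 0.25]]
```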
Week 7
  • Reading: Sections 2.2 and 2.3
  • Exam 1 is on Wednesday, October 15. See Week 6 notes for details.
  • No class Monday, October 20, but we have a make-up class on Friday, October 24.
  • HW5 has been posted. Quiz 5 will be next Wednesday, Oct. 22.
  • Tuesday, Oct. 14 (Day 10): Defined elementary matrix. Explained how elementary row operations can be accomplished via left multiplication of an elementary matrix. Proved that a square matrix is invertible if and only if it is row equivalent to an identity matrix. The proof gives an algorithm for verifying invertibility and computing the inverse, which we will discuss in detail next time.
  • Wednesday, Oct. 15 (Day 11): Exam 1
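Day 10's main point, that an elementary matrix performs its row operation via left multiplication, is quick to verify. In the invented example below, E encodes the operation "add 2 times row 1 to row 2."

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

E = [[1, 0],
     [2, 1]]       # elementary matrix for R2 <- R2 + 2*R1
A = [[1, 3],
     [4, 5]]
print(matmul(E, A))  # [[1, 3], [6, 11]]: row 2 became (4,5) + 2*(1,3)
```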
Week 6
  • Reading: Sections 2.1 and 2.2
  • Exam 1 is on Wednesday, October 15. You may bring one sheet of notes—containing only statements of theorems and definitions—to the exam.
  • Exam 1 will cover the following sections of the book. From Chapter 1, the exam will cover Sections 1, 2, 3, 4, 5, 7, 8, 9, and from Chapter 2 only Section 1. This covers the first six weeks of class and the first four homework assignments.
  • There is no class on Monday, Oct. 13, but we do have class on Tuesday, Oct. 14 at the usual time and place.
  • Monday, Oct. 6 (Day 8): Began with expressing two-dimensional rotations as matrices. Proved the angle-sum formula from trigonometry. Began discussing matrix operations, including addition, scalar multiplication, and the definition of matrix multiplication.
  • Wednesday, Oct. 8 (Day 9): Derived a formula for matrix multiplication. Discussed properties of matrix multiplication. Defined invertible matrix. Defined the determinant for a 2x2 matrix and gave a condition for the invertibility of a 2x2 matrix in terms of the determinant, as well as a formula for the inverse when it exists.
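Day 9's 2x2 inverse formula, [[a, b], [c, d]]^{-1} = (1/det) [[d, -b], [-c, a]] when det = ad - bc is nonzero, translates directly into code; the example entries are made up.

```python
def inv2(A):
    """Inverse of a 2x2 matrix via the explicit determinant formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

print(inv2([[2, 1], [1, 1]]))  # [[1.0, -1.0], [-1.0, 2.0]]
```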
Week 5
  • Reading: Sections 1.8 and 1.9
  • Monday, Sept. 29 (Day 7): Introduced linear and matrix transformations and proved they are one and the same! Gave some standard examples from geometry of linear transformations.
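Day 7's theorem has a concrete consequence worth seeing in code: a linear transformation is determined by its values on the standard basis, and those values are the columns of its standard matrix. The transformation below (reflection across the line y = x) is an invented example.

```python
def T(v):
    return [v[1], v[0]]   # swap coordinates: reflection across y = x

# The columns of the standard matrix are T(e1) and T(e2).
cols = [T([1, 0]), T([0, 1])]
A = [[cols[j][i] for j in range(2)] for i in range(2)]
print(A)  # [[0, 1], [1, 0]], the standard matrix of the reflection
```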
Week 4
  • Reading: Sections 1.5, 1.7
  • No office hours on Wednesday.
  • To simplify some of HW3, you might find it helpful to use a calculator (or another source) to automate Gauss–Jordan elimination. Here is a link to a video of someone showing how to use a TI-84 to row reduce a matrix: https://youtu.be/zhypLK9nK80
  • There are no classes Monday through Wednesday next week at CUNY, so we will not meet again until Monday, Sept. 29.
  • Monday, Sept. 15 (Day 5): Established several equivalent conditions for the matrix equation Ax=b to be consistent for all vectors b. Introduced homogeneous linear systems and showed how to write their solutions as a span of vectors. Proved that the solution set of Ax=b, when consistent, is a translation of the solution set of Ax=0.
  • Wednesday, Sept. 17 (Day 6): Showed how to write the solution of a linear system of the form Ax=b as a translation of a span of vectors. Introduced the notion of linear independence and established several basic facts.
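Day 5's translation theorem is easy to verify numerically: if A x_p = b and A x_h = 0, then A(x_p + x_h) = b, so solutions of Ax = b are translates of solutions of the homogeneous system. The matrix and vectors below are an invented example.

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 1, 0],
     [0, 1, 1]]
b = [2, 3]
xp = [0, 2, 1]     # a particular solution: A xp = b
xh = [1, -1, 1]    # a homogeneous solution: A xh = 0
x = [p + h for p, h in zip(xp, xh)]
print(matvec(A, x))  # [2, 3], i.e. b again
```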
Week 3
  • Reading: Sections 1.2–5
  • Our first quiz will be on Wednesday. The quiz will be given during the final 15 minutes of class and will be based on HW1.
  • Monday, Sept. 8 (Day 3): Discussed how to write the general solution of a linear equation using the reduced row echelon form of the associated augmented matrix, leading and free variables, and parameters. Introduced vectors, linear combinations, and vector equations. Observed that vector equations encode a linear system.
  • Wednesday, Sept. 10 (Day 4): Introduced the notion of the span of a set of vectors, introduced the dot product, and defined the notion of multiplication of an m-by-n matrix and an n-dimensional vector. We then discussed the equivalence of linear systems, vector equations, and matrix equations. Introduced the column space of a matrix and showed that the matrix equation Ax=b has a solution if and only if b is in the column space of A.
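A quick numerical companion to Day 4: the product Ax is exactly the linear combination of the columns of A with weights from x, which is why Ax = b is solvable precisely when b lies in the column space. The example data is invented.

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 0],
     [0, 1],
     [1, 1]]
x = [2, 3]
print(matvec(A, x))  # [2, 3, 5] = 2*(first column) + 3*(second column)
```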
Week 2
  • Reading: Sections 1.1 and 1.2
  • Office hours on Wednesday, September 3 will be moved to 3:30–4:30pm (due to a department meeting).
  • Wednesday, Sept. 3 (Day 2): Introduced augmented matrices and the Gauss–Jordan elimination algorithm.
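Here is a compact Python sketch of the Gauss–Jordan algorithm from Day 2, run on an augmented matrix [A | b]. This simplified version assumes a square system with a unique solution; the 2x2 system below is an invented example.

```python
def solve(aug):
    """Row reduce [A | b] to reduced row echelon form and read off x."""
    M = [[float(x) for x in row] for row in aug]
    n = len(M)
    for i in range(n):
        p = next(r for r in range(i, n) if M[r][i] != 0)  # nonzero pivot
        M[i], M[p] = M[p], M[i]
        piv = M[i][i]
        M[i] = [x / piv for x in M[i]]   # scale pivot to 1
        for r in range(n):
            if r != i:                   # clear the rest of the column
                f = M[r][i]
                M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    return [row[-1] for row in M]

# Solve x + y = 3, x - y = 1  ->  x = 2, y = 1
print(solve([[1, 1, 3], [1, -1, 1]]))  # [2.0, 1.0]
```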
Week 1
  • Reading: Section 1.1
  • The first homework assignment will be assigned next week.
  • Wednesday, Aug. 27 (Day 1): Went over two motivating examples: image compression and Google PageRank. Defined a linear system, did an example, and visualized its solution set.