📘 Linear Algebra Done Right
This course provides a rigorous introduction to the fundamental concepts of linear algebra, with an emphasis on the theory of vector spaces and linear maps. Following Sheldon Axler’s Linear Algebra Done Right, the course shifts focus away from rote matrix calculations toward a deeper conceptual understanding of linear transformations, eigenvalues, and the structure of linear operators.
Students will learn how linear algebra serves as a unifying language across mathematics, the sciences, and engineering, while also appreciating its abstract beauty and logical structure.
🎯 Learning Objectives
By the end of the course, students will be able to:
- Understand and apply the axioms of vector spaces and subspaces.
- Use the concepts of linear independence, span, basis, and dimension to analyze vector spaces.
- Define, analyze, and compute with linear transformations and their associated matrices.
- Understand the role of eigenvalues and eigenvectors in simplifying linear operators.
- Diagonalize and decompose operators, with attention to the spectral theorem.
- Communicate mathematical arguments with precision, including proofs of fundamental theorems in linear algebra.
📖 Topics Covered (Axler’s Structure)
- Vector Spaces: Definitions, subspaces, direct sums, bases, and dimension.
- Linear Maps: Kernel and range, isomorphisms, and matrix representations relative to bases.
- Polynomials: Use of minimal and characteristic polynomials to study operators.
- Eigenvalues, Eigenvectors, and Diagonalization: Conditions for diagonalizability, Jordan form (optional, advanced).
- Inner Product Spaces: Orthonormal bases, Gram–Schmidt process, adjoint operators.
- Operators on Real and Complex Vector Spaces: Normal and self-adjoint operators, the spectral theorem, unitary and orthogonal operators.
🧰 Pedagogical Approach
- Emphasis on proofs: Students develop the ability to construct and critique rigorous arguments.
- Conceptual focus: Rather than reducing linear algebra to computational matrix manipulation, students learn the structure and behavior of linear operators on abstract vector spaces.
- Minimal reliance on determinants: In keeping with Axler’s philosophy, determinants are introduced late (or omitted), after students have a deeper understanding of linear maps.
- Applications: Selected applications in differential equations, data science, physics, and computer science are highlighted to connect theory to practice.
📊 Assessment Methods
- Problem sets (proofs + conceptual exercises).
- Quizzes/exams (testing both reasoning and key computations).
- Presentations or projects (optional, for applying linear algebra to real-world or theoretical contexts).
👉 In short, this course develops both the abstract theory and the practical reasoning skills that make linear algebra an essential foundation for advanced mathematics and applied fields.
📘 Linear Algebra — 16-Week Syllabus (Axler, Linear Algebra Done Right)
Part I: Vector Spaces and Linear Maps
Week 1 – Introduction to Vector Spaces
- Read: Chapter 1.A (Complex Numbers), 1.B (Vector Spaces)
- Topics: Complex numbers, axioms of vector spaces, examples.
- Exercises: 1.A–1.B selected problems.
Week 2 – Subspaces and Direct Sums
- Read: Chapter 1.C (Subspaces), 1.D (Sums and Direct Sums)
- Topics: Subspaces, intersections, sums, direct sums.
- Proof emphasis: Showing uniqueness of decomposition.
Week 3 – Linear Independence & Span
- Read: Chapter 1.E (Linear Dependence and Linear Independence)
- Topics: Definitions, spanning sets, properties of linear independence.
- Quiz 1 (Weeks 1–2).
Week 4 – Basis and Dimension
- Read: Chapter 1.F (Bases), 1.G (Dimension)
- Topics: Basis construction, dimension of subspaces, exchange lemma.
- First Midterm Exam Review.
Week 5 – Linear Maps: Kernel and Range
- Read: Chapter 2.A (Linear Maps), 2.B (Null Spaces and Ranges)
- Topics: Definitions of linear transformations, kernel, range, rank-nullity theorem.
- Midterm Exam 1 (Chapters 1–2.B).
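The rank-nullity theorem covered this week can be checked numerically. The sketch below (a NumPy illustration, not part of the course materials) treats a matrix as a linear map $T\colon \mathbb{R}^4 \to \mathbb{R}^3$ and confirms that $\dim \operatorname{range} T + \dim \operatorname{null} T$ equals the dimension of the domain; the example matrix is chosen arbitrarily with a dependent row.

```python
import numpy as np

# A linear map T: R^4 -> R^3, written as a matrix acting by x -> A x.
# The third row equals the sum of the first two, so the rank is less than 3.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)   # dim range T
nullity = A.shape[1] - rank       # dim null T, by the rank-nullity theorem

# rank + nullity = 4 = dim of the domain R^4
print(rank, nullity)
```

Here the rank is 2 (two independent rows), so the null space is 2-dimensional, and the two dimensions sum to 4 as the theorem requires.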
Part II: Operators, Polynomials, and Eigenvalues
Week 6 – Matrix Representations & Isomorphisms
- Read: Chapter 2.C (Matrices), 2.D (Isomorphisms)
- Topics: Change of basis, representation of maps, isomorphic vector spaces.
Week 7 – Invertibility and Duality
- Read: Chapter 2.E (Invertibility and Isomorphisms), 2.F (Dual Spaces)
- Topics: Inverse linear maps, dual space, dual basis.
- Quiz 2 (Weeks 6–7).
Week 8 – Polynomials & Operators on Complex Vector Spaces
- Read: Chapter 3.A (Polynomials), 3.B (Operators on Complex Vector Spaces)
- Topics: Division algorithm, minimal polynomial, eigenvalues and eigenvectors.
Week 9 – Invariant Subspaces & Triangular Matrices
- Read: Chapter 3.C (Invariant Subspaces), 3.D (Upper-Triangular Matrices)
- Topics: Reducing operators, existence of eigenvalues over $\mathbb{C}$, Schur’s theorem (triangularization).
- Midterm Exam 2 (Chapters 2.C–3.D).
Week 10 – Eigenvalues & Diagonalization
- Read: Chapter 5.A (Diagonalizability), 5.B (Invariant Subspaces Revisited)
- Topics: Diagonalizable operators, Jordan form (optional/advanced).
Part III: Inner Products and the Spectral Theorem
Week 11 – Inner Product Spaces
- Read: Chapter 6.A (Inner Products), 6.B (Orthonormal Bases)
- Topics: Definitions, properties, Gram–Schmidt process.
- Quiz 3 (Weeks 10–11).
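The Gram–Schmidt process from this week can be sketched in a few lines. The NumPy snippet below (an illustration, not course code) implements the classical version: each vector has its projections onto the previously built orthonormal vectors subtracted, and the remainder is normalized.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each orthonormal vector built so far.
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))  # normalize the remainder
    return basis

e = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
# e[0] and e[1] are orthonormal and span the same plane as the inputs.
```

(The modified Gram–Schmidt variant is numerically more stable; the classical version above matches the textbook derivation.)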
Week 12 – Orthogonal Projections & Adjoints
- Read: Chapter 6.C (Orthogonal Projections), 6.D (Adjoints)
- Topics: Orthogonal complements, projection theorem, adjoint operators.
Week 13 – Normal and Self-Adjoint Operators
- Read: Chapter 7.A (Normal and Self-Adjoint Operators), 7.B (Spectral Theorem)
- Topics: Spectral theorem, diagonalization of self-adjoint operators.
- Application highlight: Quantum mechanics, PCA in data science.
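The real spectral theorem can be seen concretely with NumPy (an illustration, not course code): a real symmetric matrix represents a self-adjoint operator, and `numpy.linalg.eigh` returns real eigenvalues together with an orthonormal basis of eigenvectors, so the matrix diagonalizes as $A = Q\,\Lambda\,Q^{T}$ with $Q$ orthogonal.

```python
import numpy as np

# A self-adjoint operator on R^3, represented by a symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh is NumPy's routine for symmetric/Hermitian matrices:
# it returns real eigenvalues and orthonormal eigenvector columns.
eigenvalues, Q = np.linalg.eigh(A)

# Spectral theorem: Q is orthogonal and A = Q diag(eigenvalues) Q^T.
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(Q @ np.diag(eigenvalues) @ Q.T, A)
```

This orthonormal eigenbasis is exactly what PCA exploits: the covariance matrix of a dataset is symmetric, so its eigenvectors give orthogonal principal directions.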
Week 14 – Unitary and Orthogonal Operators
- Read: Chapter 7.C (Unitary and Orthogonal Operators), 7.D (Operators on Real Inner Product Spaces)
- Topics: Properties of unitary/orthogonal operators, real spectral theorem.
- Quiz 4 (Weeks 12–14).
Part IV: Wrap-up and Applications
Week 15 – Determinants (Optional, Late Introduction)
- Read: Chapter 10.A (Determinants), 10.B (Determinants and Eigenvalues)
- Topics: Determinants defined via volume scaling, connection to eigenvalues.
- Review session.
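The connection between determinants and eigenvalues named above can be verified numerically. The NumPy sketch below (an illustration, not course code) uses an upper-triangular matrix, whose eigenvalues are its diagonal entries, and checks that the determinant, the volume-scaling factor of the map $x \mapsto Ax$, equals the product of the eigenvalues.

```python
import numpy as np

# Upper-triangular matrix: its eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

det = np.linalg.det(A)        # volume-scaling factor of the map x -> A x
eigs = np.linalg.eigvals(A)

# det(A) equals the product of the eigenvalues (here 2 * 3 = 6).
assert np.isclose(det, np.prod(eigs))
```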
Week 16 – Final Exam Week
- Comprehensive Final Exam (Chapters 1–7, plus selected material from Chapter 10).
- Student presentations (optional) on applications of linear algebra (machine learning, differential equations, geometry).
🧾 Notes
- Homework: Weekly, mix of proofs and computational exercises.
- Quizzes: Short checks every 2–3 weeks.
- Exams: Two midterms + comprehensive final.
- Optional applications: SVD (singular value decomposition) could be included as enrichment.
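As a pointer for that enrichment topic: the SVD factors any matrix, square or not, into orthogonal maps and nonnegative scalings. The NumPy sketch below (an illustration, not course code) computes $A = U\,\Sigma\,V^{T}$ and checks the factorization.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

# Singular value decomposition: A = U @ diag(s) @ Vt,
# with U and Vt orthogonal and s the nonnegative singular values.
U, s, Vt = np.linalg.svd(A)

assert np.allclose(U @ np.diag(s) @ Vt, A)
# NumPy returns the singular values sorted in decreasing order.
assert s[0] >= s[1] >= 0
```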

- Teacher: Jayrold Arcede