Harvard Extension School

Mathematics E-21b - Spring 2024

Linear Algebra




Linear Algebra Toolkit

Math E-21b syllabus (2024)


Canvas Site

Recommended Course Textbook

The recommended text is Otto Bretscher's Linear Algebra with Applications (links for the best deals are on the course website). The older 2nd edition may also be used: the material is fundamentally the same in all editions, and all homework assignments will be made available as printable PDFs. Additional supplements on various topics in differential equations will also be made available during the course.

A key matching HW exercises in different editions is available on request.

A copy of the text will be put on reserve in the Grossman Library in Sever Hall.

Announcements:

Final Exam grading update: I have now graded all 84 submitted exams (as of 11:39pm Wednesday). Exam scores are now posted on Canvas. Course grades must be submitted to the Harvard Extension School no later than Thursday, May 16. Course grades will be available online (not on Canvas) on Tuesday, May 21.


Total points on exam was 50. Median score was 44 (88%).
Mean score was 41.4 (82.7%). Standard deviation was 8.0.
Scores and graded exams are posted on the Canvas site.
 
46+ A 31+ C+
43+ A– 28+ C
40+ B+ 25+ C–
37+ B 22+ D
34+ B– 0-21 E

Total points on exam was 50. Median score was 46 (92%).
Mean score was 42.8 (85.6%). Standard deviation was 7.6.
 
46+ A 31+ C+
43+ A– 28+ C
40+ B+ 25+ C–
37+ B 22+ D
34+ B– 0-21 E

Total points on exam was 80. Median score was 66 (82.5%).
Mean score was 62.5 (78.1%). Standard deviation was 14.0.
 
70+ A 50+ C+
66+ A– 46+ C
62+ B+ 42+ C–
58+ B 38+ D
54+ B– 0-37 E

The Harvard University Extension School Registrar’s Office sent the following notice today (Sat, May 11):

Today is the last academic day of the term. All coursework should be submitted before the end of the day today, unless the student has submitted an Extension of Time Request or Make-up Final Exam Appeal that has been approved or is pending review.

The two-hour Final Exam will take place Thursday, May 9 in Canvas/Proctorio (plus an extra 15 minutes for the usual downloading, printing, scanning, and uploading). Total time 135 minutes. The exam window will open at 12:00am and close at 11:59pm on Thursday (May 9).

No alternate exam dates will be permitted except as approved by the Harvard Extension School Office.

The exam will cover topics from throughout the course with added emphasis on more recent topics not covered on previous exams.

Calculators will be permitted, but no other software or notes. Non-credit students are not permitted to take the Final Exam.

Exam #2 will take place online in Proctorio Apr 18-19. You should do the Proctorio Setup Quiz in Canvas (under Quizzes) to make sure that your Chrome browser is up-to-date and that the Proctorio extension is properly installed and working. The exam will cover topics from Lectures #5-#11 of the course (Chapters 4-7 of the text). The exam window will open at 11:00pm Eastern Time on Thurs, April 18 and close at 11:59pm on Fri, April 19.

Homework: Total points and median/mean scores

HW       1     2     3     4     5     6     7     8     9    10    11    12    13
Total   50    50    50    50    50    50    50    50    50    50    50    50    50
Median  48    49    48    47    47    48    47    48    47    47    43    47    47
Mean  46.0  47.1  44.0  44.6  44.8  46.9  43.9  47.0  44.2  45.3  40.1  44.5  41.9

Your lowest HW score will be dropped when determining course grades.

Extra Credit Problem - Four Fundamental Subspaces, Pseudoinverses, Least Squares. After you have submitted HW7, if you're still eager to do more, here's an opportunity. This is purely optional. To receive any extra credit you must answer all questions completely; no partial credit. The deadline for submission is the same as HW7.

There was no class on Thursday, March 14 due to Harvard Spring Break.

Exam #1 will take place online in Proctorio Feb 29 - Mar 1. You should do the Proctorio Setup Quiz in Canvas (under Quizzes) to make sure that your Chrome browser is up-to-date and that the Proctorio extension is properly installed and working. The exam will cover topics from the first five lectures of the course (Chapters 1-4 of the text). The exam window will open at 11:00pm EST on Thurs, February 29 and close at 11:59pm EST on Fri, March 1.

Math E-21b Course Information and Syllabus (Spring 2024) | PDF version (Spring 2024)

There are currently 102 students registered in the course (Jan 17, 4:41pm).

Course meetings: The class meets weekly on Thursdays, 8:00pm to 10:00pm [in person (Harvard 101) or via Zoom] or on-demand in Canvas beginning Thurs, Jan 25, 2024. Optional problem sessions conducted by our Teaching Assistants are now scheduled and are subject to change as the need arises. An optional session with the instructor may also be scheduled at a day and time to be determined.

Optional TA Section meetings (open to all, take place via Canvas/Zoom):

  • Kris Lokere: Tuesdays 3:30pm-5:00pm, Saturdays 10:30am-12:00pm (subject to change)
  • Jeremy Marcq: Mondays 6:00pm-7:00pm, Friday mornings 9:00am-10:00am (subject to change)
  • Renée Chipman: Monday and Wednesday afternoons, 4:00pm-5:30pm (subject to change)

Weekly schedule:

  Monday: 4:00pm-5:30pm (Renée Chipman), 6:00pm-7:00pm (Jeremy Marcq)
  Tuesday: 3:30pm-5:00pm (Kris Lokere)
  Wednesday: 4:00pm-5:30pm (Renée Chipman)
  Thursday: 8:00pm-10:00pm (class meeting)
  Friday: 9:00am-10:00am (Jeremy Marcq)
  Saturday: 10:30am-12:00pm (Kris Lokere)

These times may change or other times added as the need arises. The duration of meetings may be extended at the discretion of the TAs. Section meetings are accessed via Canvas/Zoom/Gather.

Prerequisites : Math E-16, or equivalent knowledge of algebra and calculus. You should be able to solve simple systems of equations and find the roots of polynomials. Also, you should be able to set up and solve simple differential equations. Math E-21a (or its equivalent) is not specifically necessary in order to take Math E-21b, but it will be very helpful if you have some familiarity with the algebra and geometry of lines and planes in R 2 , R 3 , and possibly R n , and the dot product of two vectors.


A word about calculators:

Though you can do this course without a matrix-capable calculator or mathematical software, it's certainly easier if you have an electronic servant to handle the drudge work. I use a TI-85 (no longer sold, but a good buy if you can find a used one) and I've been very happy with it. Ideally, you'll want a calculator that can find the reduced row echelon form of a matrix (RREF). You might also want one that can calculate determinants, eigenvalues, and eigenvectors, but that's a lesser priority. The TI-83 Plus , the TI-84 Plus , the TI-84 Plus Silver Edition , the TI-86 (also discontinued), and the TI-89 calculators can handle these operations. You don't need anything fancier than this.

One feature that I find very handy is the ability to display fractions and convert a decimal expression (for a rational number) to a fraction. That's useful when translating the results of an RREF calculation into parametric equations for a solution to a system of linear equations.
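If you'd rather use free software than a dedicated calculator, the same fraction-friendly RREF computation can be done in a few lines of Python with the SymPy library. SymPy is not mentioned in the course materials; this is just one option, shown with illustrative values:

```python
from sympy import Matrix

# Augmented matrix [A | b] for a small linear system (illustrative values)
M = Matrix([[2, 3, 5, 12],
            [-3, -2, 4, -2],
            [1, 1, -2, 8]])

# rref() uses exact rational arithmetic, so entries come back as fractions,
# ready to be read off as parametric equations for the solution set.
rref_matrix, pivot_columns = M.rref()
print(rref_matrix)
print(pivot_columns)
```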

Here's a link that gives a comparison of the various TI calculators. Other manufacturers also produce calculators that will work well with this course.

Linear Algebra Toolkit - an excellent online collection of tools that will not only do the calculations but also walk you through the steps. (The Perl scripts are written by Przemyslaw Bogacki.)

Questions, questions, questions..... and some answers

1) I usually post announcements and assignments initially on the working course website: http://math.rwinters.com/E21b/ I also then post them on the Canvas Site which is linked from there, but it's best to first check on this (non-Canvas) site.

2) All of the lectures will be live-streamed and recorded and made available within about one day. All of the lectures will be available via a link on the course's Canvas Site for the duration of the course and for a few weeks thereafter.

3) The main materials of the course are the lectures, the weekly Lecture Notes, the Bretscher text (best to get an inexpensive one online - links for best deals on the course website ), and, of course, the homework assignments.

4) Students will be able to submit HW as a single scanned PDF (easily readable and with a reasonable file size, i.e. not scanned at an unnecessarily high resolution). Since classes are on Thursday and the recorded lectures are available afterwards, I generally have assignments due no later than the following Saturday night. All students must submit their assignments online.

5) We will have two midterm exams and a final exam. Two midterm exams will take place online via Proctorio during 24-hour windows approx. on Feb 29-Mar 1 and April 18-19 . There will be a two-hour final exam online via Proctorio on May 9 . Your course grade will be computed according to the following scheme, subject to minor modification:

0.25 (homework) + 0.40 (midterm exams) + 0.35 (final exam)
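As a hypothetical illustration, a student averaging 90% on homework, 84% on the midterm exams, and 80% on the final would earn 0.25(90) + 0.40(84) + 0.35(80) = 84.1% before any such modification.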

6) Since Linear Algebra is largely independent of the Calculus sequence, any prerequisites other than general mathematical competence are really just recommendations or suggestions. Even the Placement Test is really for your own use to see if you're ready for the course. It is not required for registration.

7) The “Graduate” credit option is for students enrolled in or planning to be enrolled in the Extension School’s “Math for Teaching” program. All other students (including high school students) should register for the “Undergraduate” option or the Noncredit option (if you will not be submitting homework or taking exams). Students registered for “Graduate” credit will be asked to complete additional work on supplemental topics.

8) Noncredit students will not be permitted to take the Final Exam. The Harvard Extension School will not permit students to change their registration from Noncredit to Undergraduate after the course is well underway, e.g. after any exams have occurred.

The Harvard Extension School is committed to providing an accessible academic community . The Accessibility Office offers a variety of accommodations and services to students with documented disabilities.

Please visit https://www.extension.harvard.edu/resources-policies/resources/disability-services-accessibility for more information.

Students should submit their assignments as a single scanned PDF file . A basic standard is that you should not scan at resolution greater than about 120dpi . Anything beyond that is not necessary for handwritten work and dramatically increases file size. Black & White scans are preferred unless there is a good reason to scan using color. Please also make sure that the contrast is adjusted properly so that all work is clearly legible .

Based on class size and practical limits, we will select a subset of each assignment for grading. Solutions to all problems will be posted after the due date. You are responsible for understanding Harvard Extension School policies on academic integrity ( https://www.extension.harvard.edu/resources-policies/student-conduct/academic-integrity ) and how to use sources responsibly. Not knowing the rules, misunderstanding the rules, running out of time, submitting the wrong draft, or being overwhelmed with multiple demands are not acceptable excuses. There are no excuses for failure to uphold academic integrity. To support your learning about academic citation rules, please visit the Harvard Extension School Tips to Avoid Plagiarism ( https://www.extension.harvard.edu/resources-policies/resources/tips-avoid-plagiarism ), where you'll find links to the Harvard Guide to Using Sources and two free online 15-minute tutorials to test your knowledge of academic citation policy. The tutorials are anonymous open-learning tools.

In particular : “To avoid any suggestions of improper behavior during an exam, students should not communicate with other students during the exam . Neither should they refer to any books, papers, or use electronic devices during the exam without the permission of the instructor or proctor.”

“Breaches of academic integrity are subject to review by the Administrative Board and may be grounds for disciplinary action, up to and including requirement to withdraw from the Extension School and suspension of registration privileges .”

All homework is due online via the Canvas site, typically by 8:00pm Eastern Time on the Friday or Saturday following the lecture. Late assignments will generally not be accepted unless cleared in advance with one of the Teaching Assistants (who will be doing the grading) or the Instructor. Your lowest homework grade will not factor into your course grade.

Please write up your solutions neatly and show all work. If you use a calculator or other technology as part of your solution, this should be clearly indicated. When asked to “prove” or “justify” a statement, you must write clear verbal explanations in addition to doing out the math; the idea behind a proof is to convince the reader that the statement is true. Most problems will be marked out of 5 points (with the exception of some longer problems, which may be 10 points).


It is always best to attempt every problem and to turn in each homework assignment even if there are some errors or omissions. Please always include work you have done that led to the final solution; this way we can point out where you have made an error. However, please don't include pages and pages of extraneous work.

Homework assignments submitted online will generally be graded with only minimal comment, so you should consult the posted solutions to better understand whatever errors you may have made.

If there are any questions on these policies, please let us know. We look forward to working with you all!

Important Dates - Harvard University Extension School - Spring 2024


Official Academic Calendar

A letter to the New England Courant , dated May 14, 1722, and actually written by Benjamin Franklin under the pseudonym "Silence Dogood." This was one of 14 letters by Silence Dogood and concerns Harvard University.

posted with the permission of Bill Griffith

Useful Links:

  • Harvard Extension School
  • Harvard University
  • Harvard University Dept. of Mathematics

Download your free Adobe Acrobat Reader for reading and printing PDF formatted documents.

Please send comments to Robert Winters .

URL : http://math.rwinters.com/E21b

Last modified: Thursday, May 16, 2024, 12:17 AM

Math 113: Linear Algebra and Matrix Theory

FINAL EXAM: Friday Dec 14, 8:30-11:30 am, room 380-Y

A sample final exam can be found here, with sample final solutions here and midterm solutions here; homework will be posted every Monday here:

  • Problem Set 1
  • Problem Set 2
  • Problem Set 3
  • Problem Set 4
  • Problem Set 5
  • Problem Set 6
  • Problem Set 7
  • Optional Problem Set 8
  • vector spaces, linear independence, basis, span, dimension
  • linear maps, matrices, nullspace and range, invertibility and isomorphism
  • products and quotients of vector spaces, duality
  • volume and determinants
  • eigenvalues, eigenvectors, characteristic polynomial
  • inner product, norm, orthogonal and orthonormal bases
  • nilpotent operators, generalized eigenvectors, the Jordan normal form
  • Homework constitutes 30% of the grade.
  • The midterm exam constitutes 30% of the grade.
  • The final exam constitutes 40% of the grade.

MA 26500, Fall 2024 Linear Algebra

Credit Hours: 3.00. Introduction to linear algebra. Systems of linear equations, matrix algebra, vector spaces, determinants, eigenvalues and eigenvectors, diagonalization of matrices, applications. Not open to students with credit in MA 26200, 27200, 35000 or 35100.

Course Information

  • Course Syllabus (pdf)

Assignment Information

  • Assignments
  • Handwritten Homework

Exam Information

  • Exam 1 (Sections 1.1 - 3.3) Wednesday, 10/2, 8:00 PM in ELLT 116
  • Exam 2 (Sections 4.1 - 5.7) Tuesday, 11/5, 6:30 PM in ELLT 116
  • Final Exam - There is a two hour comprehensive common final exam given during final exam week. The time and location will be announced later.

Instructor Info

Sections (section number, room, time, days):

101 SMTH 208 11:30AM MWF MATH 620
102 SMTH 208 12:30PM MWF MATH 620
153 SMTH 208 1:30PM TR
154 SMTH 208 10:30AM TR
205 AR 101 3:30PM MWF
206 BRNG 1268 11:30AM MWF
357 STON 215 11:30AM MWF
410 HAMP 3153 2:30PM MWF
451 FRNY 1043 10:30AM MWF MATH 639
501 BRNG B222 11:30AM MWF
502 BRNG B222 3:30PM MWF
600 HAAS G066 8:30AM MWF MATH 643
650 LILY G401 8:30AM MWF
701 BRNG B222 12:30PM MWF
702 PHYS 333 12:30PM MWF
704 HAAS G066 4:30PM MWF
705 HAAS G066 2:30PM MWF
706 HAAS G066 3:30PM MWF
707 LILY G401 1:30PM MWF
708 PHYS 333 11:30AM MWF
709 KNOY B033 11:30AM MWF MATH 802
710 KNOY B033 12:30PM MWF MATH 802

Course Materials

Section Type Title Author
Section: ALL. Type: SUPP. MyLab Math access for online homework. ISBN 9780135851159 (Lay, MyLab Math with Pearson eText Access Card for Linear Algebra and Its Applications, 6e, 18 weeks); ISBN 9780135851258 (Lay, Linear Algebra and Its Applications, 6e, rental edition). Students who want a print book can get the print/rental edition at the bookstore or purchase it from inside MyLab Math.

Important Notes

  • ADA policies: please see our ADA Information page for more details
  • In the event of a missed exam, see your instructor/professor as soon as possible.
  • See the online course evaluation page for more information on how we collect course feedback from students

Department of Mathematics, Purdue University, 150 N. University Street, West Lafayette, IN 47907-2067

Phone: (765) 494-1901 - FAX: (765) 494-0548   Contact Us


Thomas Church

  • Math 120 | S18
  • Math 210A | F17
  • Math 120 | F15
  • Math 113 | F15
  • Math 51 | S15
  • Math 113 | W13
  • Math 283 | F12
  • Math 51 | F11
  • Math 175 | F10
  • Math 113 | W10
  • Math 112 | F09
  • Math 196 | S09
  • Math 153 | W09
  • Math 152 | F08
  • Math 161–3 | 07–08

Math 113: Linear Algebra and Matrix Theory

In Fall 2015 I taught Math 113 at Stanford University. The course assistant was Guanyang Wang. For questions about the material and class discussions, we used the Math 113 Piazza page .

  • Homework 1 , due September 30 ( solutions )
  • Homework 2 , due October 7 ( solutions )
  • Homework 3 , due October 14 ( solutions )
  • Homework 4 , due October 21 ( solutions )
  • Homework 5 , due October 28 ( solutions )
  • Homework 6 , due November 4 ( solutions )
  • Homework 7 , due November 11 ( solutions )
  • Homework 8 , due November 18 ( solutions )
  • Homework 9 , due December 2 ( solutions )

The midterm exam (solutions), the practice midterm (solutions), and the final exam.

  • Week 1: Chapter 1A, 1B, 1C
  • Online assignment #1 (due Sunday 9/27): read 2A
  • Week 2: Chapter 2A, 2B, 2C, 3A
  • Online assignment #2 (due Sunday 10/4): read 3A, 3B, 3C
  • Week 3: Chapter 3B, 3C, 3D
  • Online assignment #3 (due Sunday 10/11): read 3E, 3F, 5A
  • Week 4: Chapter 5A, 3F (no annihilators), 3E (quotients only)
  • Online assignment #4 (due Sunday 10/18): read 5B, 5C, and p262–267 on minimal polynomial (skip 8.48); review p122–125 on polynomials if necessary
  • Week 5: Chapter 5A, 5B, 5C, 8.C (no characteristic polynomial)
  • Week 6: Chapter 5C, 6A, 6B
  • Online assignment #6 (due Sunday 11/1): review Chapter 6A through p169, read 6B, 6C
  • Week 7: Chapter 6B, 6C, 7A
  • Online assignment #7 (due Sunday 11/8): read 7A and statements of Complex/Real Spectral Theorems in 7B
  • Week 8: Chapter 7A, 7B, summary of 7C
  • Online assignment #8 (due Tuesday 11/17): read Section 1 (The space of k-wedges) and Section 2 (Wedge Dependence and Independence) of these notes
  • Week 9: Sections 1, 2, 3 of Notes on k-wedges and determinants
  • Week 10: Jordan form; SVD (image recovery slides: 1 , 2 , 3 ); Bell's inequality and linear algebra in quantum mechanics.
  • Notes on wedge products, determinants, and characteristic polynomials (15 pages)
  • Midterm exam , given February 11
  • Take-home final exam , due March 18

The full syllabus for the course is available here .

Course description: Math 113 is a course on linear algebra, the study of vector spaces and linear maps. The emphasis will be quite theoretical: we will study abstract properties of vector spaces and linear maps as well as their geometric interpretation, mostly ignoring the computational aspects. If you are more interested in applications of linear algebra, you should consider taking Math 104 instead.

Besides studying linear algebra, an important goal of the course is to learn how to write mathematics. In class we will give rigorous proofs, emphasizing proper mathematical language and notation. Through the homework assignments, you will learn to apply mathematical reasoning and write clear, compelling and correct proofs yourself. Your homework and exams will be judged accordingly. No background in linear algebra or proofs is assumed, and there are no formal prerequisites for the course; Math 113 is appropriate for students who have already seen some linear algebra in Math 51.

All homework assignments for the course will be available here, once they are posted. The final exam is Monday, March 18 from 12:15–3:15pm.

  • Statement from the Registrar concerning students with documented disabilities: "Students who may need an academic accommodation based on the impact of a disability must initiate the request with the Student Disability Resource Center (SDRC) located within the Office of Accessible Education (OAE). SDRC staff will evaluate the request with required documentation, recommend reasonable accommodations, and prepare an Accommodation Letter for faculty dated in the current quarter in which the request is being made. Students should contact the SDRC as soon as possible since timely notice is needed to coordinate accommodations. The OAE is located at 563 Salvatierra Walk (phone: 723-1066)."
  • Stanford's Honor Code and Fundamental Standard .



C1W2_Assignment.md

Programming Assignment: Gaussian Elimination

Welcome to the programming assignment on Gaussian Elimination! In this assignment, you will implement the Gaussian elimination method, a foundational algorithm for solving systems of linear equations.

Linear algebra is fundamental to machine learning, serving as the basis for numerous algorithms. Gaussian elimination, while not the most advanced method used today, is a classical and essential technique for solving systems of linear equations. It provides valuable insights into the core principles of linear algebra and lays the groundwork for more advanced numerical methods.

Why should you care?

  • Foundational Skills : Strengthen your understanding of key linear algebra concepts.
  • Programming Practice : Enhance your coding skills by implementing a classical mathematical algorithm.
  • Historical Significance : Gaussian elimination, though not the most cutting-edge method today, is historically significant and provides a solid starting point for understanding the evolution of linear algebra techniques.

1 - Introduction

2 - Necessary imports
3.1 - Function swap rows
3.2 - Finding the first non-zero value in a column starting from a specific value
3.3 - Find the pivot for any row
3.4 - Moving one row to the bottom
3.5 - Constructing the augmented matrix
4.1 - Row echelon form
4.2 - Worked example
7 - Test with any system of equations

Gaussian Elimination Algorithm

Gaussian elimination offers a systematic approach to solving systems of linear equations by transforming an augmented matrix into row-echelon form, thereby enabling the determination of variables. The algorithm comprises several essential steps:

Step 1: Augmented Matrix

Consider a system of linear equations:

$$ \begin{align*} 2x_1 + 3x_2 + 5x_3&= 12 \\ -3x_1 - 2x_2 + 4x_3 &= -2 \\ x_1 + x_2 - 2x_3 &= 8 \\ \end{align*} $$

Create the augmented matrix ([A | B]), where (A) represents the coefficient matrix and (B) denotes the column vector of constants:

$$ A = \begin{bmatrix} \phantom{-}2 & \phantom{-}3 & \phantom{-}5 \\ -3 & -2 & \phantom-4 \\ \phantom{-}1 & \phantom{-}1 & -2 \\ \end{bmatrix} $$

$$ B = \begin{bmatrix} \phantom{-}12 \\ -2 \\ \phantom{-}8 \end{bmatrix} $$

Thus, ([A | B]) is represented as:

$$ \begin{bmatrix} \phantom{-}2 & \phantom{-}3 & \phantom{-}5 & | & \phantom{-}12 \\ -3 & -2 & \phantom-4 & | & -2 \\ \phantom{-}1 & \phantom{-}1 & -2 & | & \phantom{-}8 \\ \end{bmatrix} $$

Note: For this assignment, matrix $A$ is always square, accommodating scenarios with $n$ equations and $n$ variables.

Step 2: Transform Matrix into Row Echelon Form

Initiate row operations to convert the augmented matrix into row echelon form. The objective is to introduce zeros below the leading diagonal. Each of the three operations below is a one-line array manipulation in NumPy; see the sketch after this list.

  • Row Switching: Rearrange rows to position the leftmost non-zero entry at the top.
  • Row Scaling: Multiply a row by a non-zero scalar.
  • Row Replacement: Substitute a row with the sum of itself and a multiple of another row.
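A minimal NumPy sketch of the three elementary row operations, with illustrative values (this cell is not part of the original notebook):

```python
import numpy as np

M = np.array([[0., 2., 1.],
              [1., 1., 1.]])

M[[0, 1]] = M[[1, 0]]     # row switching: exchange rows 0 and 1
M[0] = 0.5 * M[0]         # row scaling: multiply row 0 by a non-zero scalar
M[1] = M[1] - 2.0 * M[0]  # row replacement: subtract a multiple of another row
print(M)
```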

Step 3: Solution Check

Examine for a row of zeros in the square matrix (excluding the augmented part).

Consider the following cases:

  • If no row comprises zeros, a unique solution exists.
  • If one row contains zeros with a non-zero augmented part, the system has no solutions .
  • If every row of zeros has a zero augmented part, the system boasts infinitely many solutions .

Special attention is required to address conditions 2 and 3. A matrix might contain one row of zeros with an augmented part of 0 and another row with zeros and a non-zero augmented part, making the system impossible.

Step 5: Back Substitution

After attaining the reduced row-echelon form, solve for variables starting from the last row and progressing upwards.

Remember, the aim is to simplify the system for easy determination of solutions!

Step 6: Compile the Gaussian Elimination Algorithm

Combine each function related to the aforementioned steps into a single comprehensive function.

The next code block imports the libraries needed to run this assignment. Please do not add or remove anything there.
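The import cell itself is not reproduced in this document; a minimal sketch of what it presumably contains (everything below relies on NumPy):

```python
import numpy as np
```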

3 - Auxiliary functions

This section introduces five auxiliary functions that are crucial for this assignment. They have already been coded, so you don't need to worry about their implementation; however, it's essential to examine them carefully to grasp their appropriate usage.

Note: In Python, indices commence at $0$ rather than $1$ . Therefore, a matrix with $n$ rows is indexed as $0, 1, 2, \ldots, n-1$ .

This function takes a NumPy array and two indices as input and swaps the corresponding rows. It does not change the original matrix, but returns a new one.

Let's practice with some examples. Consider the following matrix $M$ .

Swapping row $0$ with row $2$:
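The notebook's code cells are not included in this document, so the following is a plausible reconstruction of `swap_rows` (name taken from the section title), together with an illustrative matrix $M$; the assignment's exact values may differ:

```python
import numpy as np

def swap_rows(M, row_index_1, row_index_2):
    """Return a copy of M with rows row_index_1 and row_index_2 exchanged."""
    M = M.copy()  # copy first, so the original matrix is left untouched
    M[[row_index_1, row_index_2]] = M[[row_index_2, row_index_1]]
    return M

M = np.array([[1, 3, 6],
              [0, -5, 2],
              [-4, 5, 8]])
print(swap_rows(M, 0, 2))
# [[-4  5  8]
#  [ 0 -5  2]
#  [ 1  3  6]]
```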

This function becomes essential when encountering a $0$ value during row operations. It determines whether a non-zero value exists below the encountered zero, allowing for potential row swaps. Consider the following scenario within a square matrix (non-augmented):

Let's say, during a specific step of the row-echelon form process, you've successfully reduced the first 2 rows, but you encounter a zero pivot (highlighted in red) in the third row. The task is to search, solely in entries below the pivot , for a potential row swap.

$$ \begin{bmatrix} 6 & 4 & 8 & 1 \\ 0 & 8 & 6 & 4 \\ \color{darkred}0 & \color{darkred}0 & \color{red}0 & \color{darkred}3 \\ 0 & 0 & 5 & 9 \end{bmatrix} $$

Performing a row swap between indexes 2 and 3 (remember, indexing starts at 0!), the matrix transforms into:

$$ \begin{bmatrix} 6 & 4 & 8 & 1 \\ 0 & 8 & 6 & 4 \\ 0 & 0 & 5 & 9 \\ 0 & 0 & 0 & 3 \end{bmatrix} $$

Resulting in the matrix achieving the row-echelon form.

Let's practice with this function. Consider the following matrix.

If you search for a value in the first column starting at the first row, the function should return None.

Searching for the first non-zero value in the last column starting from the row with index 2, it should return 3 (the index corresponding to the value 7).
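The notebook's matrix is not shown in this document; the sketch below (function name per the section title) uses an illustrative matrix chosen to match the outputs described above:

```python
def get_index_first_non_zero_value_from_column(M, column, starting_row):
    """Row index of the first non-zero entry in `column`, looking at
    `starting_row` and below; None if all such entries are zero."""
    column_array = M[starting_row:, column]
    for i, value in enumerate(column_array):
        if not np.isclose(value, 0, atol=1e-5):
            return i + starting_row  # convert back to an index in the full matrix
    return None

M = np.array([[0, 5, -3, 6, 8],
              [0, -2, -8, 8, 1],
              [0, 0, 0, 0, 0],
              [0, 0, 0, 0, 7],
              [0, 2, 1, 0, 4]])
print(get_index_first_non_zero_value_from_column(M, column=0, starting_row=0))   # None
print(get_index_first_non_zero_value_from_column(M, column=-1, starting_row=2))  # 3
```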

This function aids in locating the pivot within a designated row of a matrix. It identifies the index of the first non-zero element in the desired row. If no non-zero value is present, it returns None.

Let's practice with the same matrix as before:

Looking for the first non-zero index in row $2$ must return None, whereas in row $3$ the value returned must be $3$ (the index of the value $1$ in that row).
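Again the notebook's matrix isn't reproduced here; this reconstruction (name and the `augmented` flag are assumptions) uses an illustrative matrix consistent with the stated outputs:

```python
def get_index_first_non_zero_value_from_row(M, row, augmented=False):
    """Column index of the first non-zero entry in `row`; None if the row
    is entirely zero. With augmented=True the constants column is ignored."""
    if augmented:
        M = M[:, :-1]  # drop the last (augmented) column before searching
    for i, value in enumerate(M[row]):
        if not np.isclose(value, 0, atol=1e-5):
            return i
    return None

N = np.array([[0, 5, -3, 6, 8],
              [0, 6, 3, 8, 1],
              [0, 0, 0, 0, 0],
              [0, 0, 0, 1, 0],
              [0, 8, 7, 1, 5]])
print(get_index_first_non_zero_value_from_row(N, 2))  # None
print(get_index_first_non_zero_value_from_row(N, 3))  # 3
```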

This function facilitates the movement of a specific row to the bottom. Such an operation becomes necessary when confronted with a row entirely populated by zeroes. In reduced row-echelon form, rows filled with zeroes must be positioned at the bottom.

One small example:
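The example cell is missing from this document; a plausible reconstruction of the function (name per the section title) with a small illustrative matrix:

```python
def move_row_to_bottom(M, row_index):
    """Return a copy of M with the given row moved to the last position."""
    row = M[row_index].copy()
    M = np.delete(M, row_index, axis=0)  # remove the row from its place...
    return np.vstack([M, row])           # ...and append it at the bottom

M = np.array([[1, 2, 3],
              [0, 0, 0],
              [0, 0, 5]])
print(move_row_to_bottom(M, 1))
# [[1 2 3]
#  [0 0 5]
#  [0 0 0]]
```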

This function constructs the augmented matrix by combining a square matrix of size $n \times n$ , representing $n$ equations with $n$ variables each, with an $n \times 1$ matrix that denotes its constant values. The function concatenates both matrices to form the augmented matrix and returns the result.
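A sketch of this construction (NumPy's `hstack` does the concatenation), using the lecture example system that appears later in section 4.2:

```python
def augmented_matrix(A, B):
    """Stack the n x n coefficient matrix A and the n x 1 column B side by side."""
    return np.hstack((A, B))

A = np.array([[0, 2, 1],
              [1, 1, 1],
              [1, 2, 1]])
B = np.array([[3], [6], [12]])
print(augmented_matrix(A, B))
# [[ 0  2  1  3]
#  [ 1  1  1  6]
#  [ 1  2  1 12]]
```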

4 - Row echelon form and reduced row echelon form

As discussed in the lectures, a matrix in row echelon form adheres to the following conditions:

  • Rows consisting entirely of zeroes should be positioned at the bottom.
  • Each non-zero row must have its left-most non-zero coefficient (termed a pivot) located to the right of the pivot of any row above it. Consequently, all elements below a pivot within the same column must be 0.

This form ensures a structured arrangement facilitating subsequent steps in the Gaussian elimination process.

Example of matrix in row echelon form

$$M = \begin{bmatrix} 7 & 2 & 3 \\ 0 & 9 & 4 \\ 0 & 0 & 3 \\ \end{bmatrix} $$

Examples of matrices that are not in row echelon form

$$ A = \begin{bmatrix} 1 & 2 & 2 \\ 0 & 5 & 3 \\ 1 & 0 & 8 \\ \end{bmatrix} $$

$$B = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 4 \\ 0 & 0 & 7 \\ \end{bmatrix} $$

Matrix $A$ fails to satisfy the criteria for row echelon form as there exists a non-zero element below the first pivot (located in row 0). Similarly, matrix $B$ does not meet the requirements as the second pivot (in row 1 with a value of 4) has a non-zero element below it.

In this section, you'll revisit an example from the lecture to facilitate the implementation of an algorithm. If you feel confident in proceeding with the algorithm, you may skip this section.

Consider matrix $M$ given by:

$$ M = \begin{bmatrix} * & * & * \\ 0 & \text{pivot} & * \\ 0 & \text{value} & * \end{bmatrix} $$

Here, the asterisk (*) denotes any number. To nullify the value in the last row (row $2$) below the pivot, two steps are required:

  • Scale $R_1$ by the inverse of the pivot:

$$ \text{Row 1} \rightarrow \frac{1}{\text{pivot}} \cdot \text{Row 1} $$

Resulting in the updated matrix with the pivot for row $1$ set to $1$ :

$$ M = \begin{bmatrix} * & * & * \\ 0 & 1 & * \\ 0 & \text{value} & * \end{bmatrix} $$

Next, to eliminate the value below the pivot in row $1$ , apply the following formula:

$$ \text{Row 2} \rightarrow \text{Row 2} - \text{value}\cdot \text{Row 1} $$

This transformation yields the modified matrix:

$$ M = \begin{bmatrix} * & * & * \\ 0 & 1 & * \\ 0 & 0 & * \end{bmatrix} $$

Note that the square matrix $A$ needs to be in reduced row-echelon form. However, every row operation conducted must also affect the augmented (constant) part. This ensures that you are effectively preserving the solutions for the entire system!

Let's review the example covered in the lecture.

$$ \begin{align*} 2x_2 + x_3 &= 3 \\ x_1 + x_2 + x_3 &= 6 \\ x_1 + 2x_2 + x_3 &= 12 \end{align*} $$

Consequently, the square matrix $A$ is formulated as:

$$ A = \begin{bmatrix} 0 & 2 & 1 \\ 1 & 1 & 1 \\ 1 & 2 & 1 \end{bmatrix} $$

The column vector (a matrix of size $n \times 1$ ) is represented by:

$$ B = \begin{bmatrix} 3\\ 6\\ 12 \end{bmatrix} $$

Combining matrices $A$ and $B$ yields the augmented matrix $M$ :

$$ M = \begin{bmatrix} 0 & 2 & 1 & | & 3 \\ 1 & 1 & 1 & | & 6 \\ 1 & 2 & 1 & | & 12 \end{bmatrix} $$

Commencing with row $0$ : The initial candidate for the pivot is always the value in the main diagonal of the matrix. Denoting row $0$ as $R_0$ :

$$R_0= \begin{bmatrix} 0 & 2 & 1 & | & 3 \end{bmatrix}$$

The value in the main diagonal is the element $M[0,0]$ (the entry in the first row and first column). The first row can be accessed by performing $M[0]$, i.e., $M[0] = R_0$.

The first row operation involves scaling by the pivot's inverse . Since the value in the main diagonal is $0$ , necessitating a non-zero value for scaling by its inverse, you must switch rows in this case. Note that $R_1$ has a value different from $0$ in the required index. Consequently, switching rows $0$ and $1$ :

$$R_0 \rightarrow R_1$$ $$R_1 \rightarrow R_0$$

Resulting in the updated augmented matrix:

$$ M = \begin{bmatrix} 1 & 1 & 1 & | & 6 \\ 0 & 2 & 1 & | & 3 \\ 1 & 2 & 1 & | & 12 \end{bmatrix} $$

Now, the pivot is already $1$ , eliminating the need for row scaling. Following the formula:

$$ R_1 \rightarrow R_1 - 0 \cdot R_0 = R_1$$

Therefore, the second row remains unchanged. Moving to the third row ( $R_2$ ), the value in the augmented matrix below the pivot from $R_0$ is $M[2,0]$ , which is $1$ .

$$R_2 = R_2 - 1 \cdot R_0 = \begin{bmatrix} 0 & 1 & 0 & | & 6 \end{bmatrix}$$

Resulting in the modified augmented matrix:

$$ M = \begin{bmatrix} 1 & 1 & 1 & | & 6 \\ 0 & 2 & 1 & | & 3 \\ 0 & 1 & 0 & | & 6 \end{bmatrix} $$

Progressing to the second row ( $R_1$ ), the value in the main diagonal is $2$ , different from zero. Scaling it by $\frac{1}{2}$ :

$$R_1 = \frac{1}{2}R_1$$

Resulting in the augmented matrix:

$$ M = \begin{bmatrix} 1 & 1 & 1 & | & 6 \\ 0 & 1 & \frac{1}{2} & | & \frac{3}{2} \\ 0 & 1 & 0 & | & 6 \end{bmatrix} $$

Now, there's only one row below it for row replacement. The value just below the pivot is located at $M[2,1]$ , which is $1$ . Thus:

$$R_2 = R_2 - 1 \cdot R_1 = \begin{bmatrix} \phantom{-}0 & \phantom{-}0 & -\frac{1}{2} & | & \phantom{-}\frac{9}{2} \end{bmatrix} $$

$$ M = \begin{bmatrix} \phantom{-}1 & \phantom{-}1 & \phantom{-}1 & | & \phantom{-}6 \\ \phantom{-}0 & \phantom{-}1 & \phantom{-}\frac{1}{2} & | & \phantom{-}\frac{3}{2} \\ \phantom{-}0 & \phantom{-}0 & -\frac{1}{2} & | & \phantom{-}\frac{9}{2} \end{bmatrix} $$

Thus, the matrix is now in row echelon form.

4.3 Handling Non-Diagonal Pivots

In some cases, matrices may lack pivots exclusively in the main diagonal, necessitating special handling within the algorithm. To simplify, let's focus solely on the square matrix, disregarding the augmented part:

$$ \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 7 \\ 0 & 0 & 5 \end{bmatrix} $$

The process initiates typically, beginning with $R_0 = M[0]$ and the pivot as $M[0,0]$ . Notably, the pivot is already $1$ , and all values below it are already $0$ , indicating that after row normalization and reduction, the outcome remains unchanged. Moving to $R_1 = M[1]$ , the pivot candidate, $M[1,1]$ , is $0$ , rendering the standard procedure impossible due to division by zero during row scaling. Thus, the following steps are necessary:

Explore the rows below the current row to identify one with a non-zero value in the same column as $M[1,1]$. In the given example, there's only one such row; examining the value right below it, $M[2,1]$, reveals another $0$, so a row swap is impossible. Consequently, the next step becomes vital.

Given the failure of step 1, search within the row itself for the first non-zero number, which becomes the new pivot. If no such number exists, the row is entirely populated by zeroes and must be shifted to the matrix's last row (recall that in row echelon form, rows filled with 0's reside at the bottom). In the current case, the first non-zero value in $R_1$ is $7$, in position $2$ of that row, so the new pivot index lies beyond the diagonal, at $M[1,2]$. Normalizing $R_1$ by the new pivot:

$$R_1 =\frac{1}{7} R_1$$

Resulting in the matrix after normalization:

$$ M = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 1 \\ 0 & 0 & 5 \end{bmatrix} $$

Next, scrutinize every value below this new pivot. In row $2$ , the considered value is $M[2,2]$ , derived from the pivot position in the preceding step:

$$R_2 = R_2 - 5 \cdot R_1$$

Leading to:

$$ M = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} $$

Hence, the matrix is now in row echelon form.

Now you are ready to go! You will implement this algorithm in the following exercise.

This exercise involves implementing the elimination method to convert a matrix into row echelon form. As discussed in lectures, the primary approach involves inspecting the values along the diagonal: if a diagonal value equals $0$, attempt to swap rows to obtain a non-zero pivot.
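A sketch of one way to implement this exercise, reusing the helper functions above. The function name and the "Singular system" early return are assumptions; the graded notebook may differ in details. Under the non-singularity check, a zero diagonal value can always be cured by a swap with a row below:

```python
def row_echelon_form(A, B):
    """Reduce the augmented matrix [A|B] to row echelon form with unit pivots.
    Assumes A is square; bails out if A is singular."""
    if np.isclose(np.linalg.det(A), 0, atol=1e-5):
        return "Singular system"
    A = A.astype(float)
    B = B.astype(float)
    M = augmented_matrix(A, B)
    num_rows = len(A)
    for row in range(num_rows):
        pivot_candidate = M[row, row]  # first candidate: the diagonal value
        if np.isclose(pivot_candidate, 0, atol=1e-5):
            # zero pivot: swap with the first row below holding a non-zero value
            swap_with = get_index_first_non_zero_value_from_column(M, row, row)
            M = swap_rows(M, row, swap_with)
        pivot = M[row, row]
        M[row] = M[row] / pivot  # scale the row so the pivot becomes 1
        for j in range(row + 1, num_rows):
            value_below = M[j, row]
            M[j] = M[j] - value_below * M[row]  # zero out the entry below the pivot
    return M
```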

5 - Finding solutions

5.1 - Determining the existence of solutions

Before proceeding to find the solutions from the matrix in row echelon form, it's crucial to determine whether the linear system has viable solutions at all. In the process of transforming an augmented matrix to row echelon form, a row of zeros may appear within the coefficient matrix; this is the case to watch for.

If this row of zeros extends across the matrix of coefficients (excluding the augmented column), the system is termed singular . This singularity implies the likelihood of either having no solutions or an infinite number of solutions. The distinction lies in the values within the augmented column.

Consider two examples:

$$ A = \begin{bmatrix} 1 & 3 & 1 & | & 7 \\ 0 & 1 & 8 & | & 5 \\ 0 & 0 & 0 & | & 0 \end{bmatrix} $$

This system has infinitely many solutions.

$$ B = \begin{bmatrix} 1 & 3 & 1 & 5 & | & 7 \\ 0 & 1 & 8 & 4 & | & 5 \\ 0 & 0 & 0 & 0 & | & 0 \\ 0 & 0 & 0 & 0 & | & 8 \end{bmatrix} $$

Unlike $A$ , the system related to matrix $B$ has no solutions. In this case, even though the third row contains all zeros with a zero in the augmented column, the last row has a non-zero value in the augmented column.

It's crucial to handle cases like matrix $B$ in code implementation, considering scenarios where multiple rows contain zeros but have a non-zero value in their augmented columns.

In this exercise you will implement a function to check whether an augmented matrix in row echelon form has a unique solution, no solutions, or infinitely many solutions.
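A sketch of such a check (the function name is an assumption). The early return on a contradictory row is what handles matrices like $B$ above, where an "infinitely many" row appears before a "no solution" row:

```python
def check_solution(M):
    """Classify an augmented matrix in row echelon form: 'unique solution',
    'no solution', or 'infinitely many solutions'."""
    num_rows = M.shape[0]
    coefficients = M[:, :-1]  # the square part
    constants = M[:, -1]      # the augmented column
    result = "unique solution"
    for row in range(num_rows):
        if np.allclose(coefficients[row], 0, atol=1e-5):
            if np.isclose(constants[row], 0, atol=1e-5):
                result = "infinitely many solutions"
            else:
                return "no solution"  # one contradictory row settles it
    return result
```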

5.2 Back substitution

The final step of the algorithm involves back substitution, a crucial process in obtaining solutions for the linear system. As discussed in the lectures, this method starts from the bottom and moves upwards. Utilizing elementary row operations, it converts every element above each pivot into zero (the entries below each pivot are already zero because the matrix is in row echelon form). The formula employed is:

$$\text{Row above} \rightarrow \text{Row above} - \text{value} \cdot \text{Row pivot}$$

In this equation, $\text{value}$ denotes the entry above the pivot (each pivot itself already equals $1$). To illustrate this process, consider the following matrix:

$$ M = \begin{bmatrix} \phantom{-}1 & -\frac{1}{2} & \phantom{-}\frac{1}{2} & | & \phantom{-}\frac{1}{2} \\ \phantom{-}0 & \phantom{-}1 & \phantom{-}1 & | & -1 \\ \phantom{-}0 & \phantom{-}0 & \phantom{-}1 & | & -1 \end{bmatrix} $$

Starting from bottom to top:

  • $R_1 = R_1 - 1 \cdot R_2 = \begin{bmatrix} 0 & 1 & 0 & | & 0 \end{bmatrix}$
  • $R_0 = R_0 - \frac{1}{2} \cdot R_2 = \begin{bmatrix} 1 & -\frac{1}{2} & 0 & | & 1 \end{bmatrix}$

The resulting matrix is then

$$ M = \begin{bmatrix} \phantom{-}1 & -\frac{1}{2} & \phantom{-}0 & | & \phantom{-}1 \\ \phantom{-}0 & \phantom{-}1 & \phantom{-}0 & | & \phantom{-}0 \\ \phantom{-}0 & \phantom{-}0 & \phantom{-}1 & | & -1 \end{bmatrix} $$

Moving to $R_1$ :

  • $R_0 = R_0 - \left(-\frac{1}{2} R_1 \right) = \begin{bmatrix} 1 & 0 & 0 & | & 1 \end{bmatrix}$

And the final matrix is

$$ M = \begin{bmatrix} \phantom{-}1 & \phantom{-}0 & \phantom{-}0 & | & \phantom{-}1 \\ \phantom{-}0 & \phantom{-}1 & \phantom{-}0 & | & \phantom{-}0 \\ \phantom{-}0 & \phantom{-}0 & \phantom{-}1 & | & -1 \end{bmatrix} $$

Note that after back substitution, the solution is just the values in the augmented column! In this case,

$$x_0 = 1, \quad x_1 = 0, \quad x_2 = -1$$

In this exercise you will implement a function to perform back substitution on an augmented matrix with a unique solution. You may assume that all solution checks have already been done.
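A sketch of back substitution under those assumptions (unit pivots, unique solution), reusing the row-search helper sketched earlier; the name is again taken from the section title:

```python
def back_substitution(M):
    """Zero out the entries above each pivot, bottom row first, then read
    the solution off the augmented column. Assumes unit pivots."""
    M = M.copy()
    num_rows = M.shape[0]
    for row in reversed(range(num_rows)):
        pivot_column = get_index_first_non_zero_value_from_row(M, row, augmented=True)
        for row_above in range(row):
            value = M[row_above, pivot_column]
            # Row above -> Row above - value * Row pivot
            M[row_above] = M[row_above] - value * M[row]
    return M[:, -1]  # the augmented column now holds the solution
```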

6 - The Gaussian Elimination

6.1 - Bringing it all together

Your task now is to integrate all the steps achieved thus far. Start with a square matrix $A$ of size $ n \times n$ and a column matrix $B$ of size $n \times 1$ and transform the augmented matrix $[A | B]$ into reduced row echelon form. Subsequently, verify the existence of solutions. If solutions are present, proceed to perform back substitution to obtain the values. In scenarios where there are no solutions or an infinite number of solutions, handle and indicate these outcomes accordingly.

In this exercise you will combine every function you just wrote to finish the Gaussian Elimination algorithm.
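Putting the sketches together (again, the names and the string returned for singular systems are assumptions; the graded version may instead call the explicit check from section 5.1). The lecture example from section 4.2 makes a convenient test:

```python
def gaussian_elimination(A, B):
    """Solve A x = B: row echelon form, then back substitution."""
    M = row_echelon_form(A, B)
    if isinstance(M, str):  # "Singular system" from the determinant check
        return M
    # The determinant check already rules out 'no solution' and
    # 'infinitely many solutions', so the solution is unique.
    return back_substitution(M)

A = np.array([[0, 2, 1],
              [1, 1, 1],
              [1, 2, 1]])
B = np.array([[3], [6], [12]])
print(gaussian_elimination(A, B))  # [ 9.  6. -9.]
```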

The code below will allow you to write any equation in the format given below (any unknown lowercase variables are accepted, in any order) and transform it into its respective augmented matrix, so you can solve it using the functions you just wrote in this assignment!

You just need to change the equations variable, always keeping * to indicate the product between coefficients and unknowns, with one equation on each line!

Congratulations! You have finished the first assignment of this course! You built from scratch a linear system solver!

NPTEL :: Mathematics

NOC: Linear Algebra (Video)
  • Coordinated by: IIT Kanpur
  • Available from: 2020-05-06
  • Intro Video
  • Notations, Motivation and Definition
  • Matrix: Examples, Transpose and Addition
  • Matrix Multiplication
  • Matrix Product Recalled
  • Matrix Product Continued
  • Inverse of a Matrix
  • Introduction to System of Linear Equations
  • Some Initial Results on Linear Systems
  • Row Echelon Form (REF)
  • LU Decomposition - Simplest Form
  • Elementary Matrices
  • Row Reduced Echelon Form (RREF)
  • Row Reduced Echelon Form (RREF) Continued
  • RREF and Inverse
  • Rank of a matrix
  • Solution Set of a System of Linear Equations
  • System of n Linear Equations in n Unknowns
  • Determinant
  • Permutations and the Inverse of a Matrix
  • Inverse and the Cramer's Rule
  • Vector Spaces
  • Vector Subspaces and Linear Span
  • Linear Combination, Linear Independence and Dependence
  • Basic Results on Linear Independence
  • Results on Linear Independence Continued...
  • Basis of a Finite Dimensional Vector Space
  • Fundamental Spaces associated with a Matrix
  • Rank - Nullity Theorem
  • Fundamental Theorem of Linear Algebra
  • Definition and Examples of Linear Transformations
  • Results on Linear Transformations
  • Rank-Nullity Theorem and Applications
  • Isomorphism of Vector Spaces
  • Ordered Basis of a Finite Dimensional Vector Space
  • Ordered Basis Continued
  • Matrix of a Linear transformation
  • Matrix of a Linear transformation Continued...
  • Matrix of Linear Transformations Continued...
  • Similarity of Matrices
  • Inner Product Space
  • Inner Product Continued
  • Cauchy Schwartz Inequality
  • Projection on a Vector
  • Results on Orthogonality
  • Results on Orthogonality.
  • Gram-Schmidt Orthonormalization Process
  • Orthogonal Projections
  • Gram-Schmidt Process: Applications
  • Examples and Applications on QR-decomposition
  • Recapitulate ideas on Inner Product Spaces
  • Motivation on Eigenvalues and Eigenvectors
  • Examples and Introduction to Eigenvalues and Eigenvectors
  • Results on Eigenvalues and Eigenvectors
  • Results on Eigenvalues and Eigenvectors.
  • Diagonalizability
  • Diagonalizability Continued...
  • Schur's Unitary Triangularization (SUT)
  • Applications of Schur's Unitary Triangularization
  • Spectral Theorem for Hermitian Matrices
  • Cayley Hamilton Theorem
  • Quadratic Forms
  • Sylvester's Law of Inertia
  • Applications of Quadratic Forms to Analytic Geometry
  • Examples of Conics and Quartics
  • Singular Value Decomposition (SVD)
  • Live Session 20-10-2020

