Applied Linear Algebra

charlie

I'm making this thread to organize my thoughts and my notes regarding Applied Linear Algebra.
I will begin with a list of questions.

  • What is a vector space?
  • What makes a matrix invertible?
  • What makes a matrix singular/non-singular?
  • What's the trivial space? What's a non-trivial space?
  • What's the application of knowing the properties of these matrices?
  • What is linear dependence and independence?
  • What's a Euclidean norm?
  • Where would you use an L5 norm?
  • What's a triangular matrix?
  • What's a diagonal matrix? (determinants, properties etc)
  • What's a determinant?
  • What's a symmetric/asymmetric matrix?
  • What makes a basis?
  • What defines independence?
  • What is full rank?
  • What is functional analysis?
  • What is mean variance portfolio optimization?
  • What is the variance quadratic form?
This summarizes most of the topics I've been looking into over the last few weeks, and once I'm able to explain each of these, I'll be in a better place.
 
I'll begin with an exhortation from an Applied Linear Algebra text:

"First, make sure you understand what is being said in the case of ordinary Euclidean space. Once this is grasped, the next important case to consider is an elementary function space, e.g., the space of continuous scalar functions. With the two most important cases firmly in hand, the leap to the general abstract formulation should not be too painful."
 
Gaussian Elimination
"Perhaps the most fundamental method for solving linear algebraic systems.
Gaussian Elimination can be profitably reinterpreted as a certain matrix factorization, known as the (permuted) LU decomposition, which provides valuable insight into the solution algorithms."
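To make the factorization idea concrete, here's a minimal sketch of LU without pivoting (the "unpermuted" case), assuming every pivot encountered is nonzero; the function name `lu_no_pivot` and the example matrix are mine, not from the text:

```python
import numpy as np

def lu_no_pivot(A):
    """Naive LU factorization (no row swaps) -- a sketch that
    assumes no zero pivots are ever encountered."""
    A = A.astype(float)
    n = A.shape[0]
    L = np.eye(n)      # unit lower triangular: multipliers go here
    U = A.copy()       # will be reduced to upper triangular form
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]     # elimination multiplier
            U[i, :] -= L[i, k] * U[k, :]    # zero out entry below pivot
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
# L is unit lower triangular, U is upper triangular, and L @ U recovers A
```

The multipliers recorded in `L` are exactly the numbers used during elimination, which is why elimination "is" the factorization. With row swaps the same idea gives the permuted version, A = P L U.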
 
  • What's a triangular matrix?
  • What's a diagonal matrix?
A triangular matrix is a matrix in which all the entries on one side of the diagonal are 0. If all the entries below the diagonal are 0, it is an upper triangular matrix; if all the entries above the diagonal are 0, it is a lower triangular matrix; and if all the entries both above and below the diagonal are 0, it is a diagonal matrix.
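These definitions are easy to check numerically; here's a small sketch using numpy's `triu`/`tril` (which zero out everything below/above the diagonal), with helper names of my own choosing:

```python
import numpy as np

U = np.array([[1, 2, 3],
              [0, 4, 5],
              [0, 0, 6]])   # zeros below the diagonal -> upper triangular
L = U.T                     # zeros above the diagonal -> lower triangular
D = np.diag([1, 4, 6])      # zeros on both sides -> diagonal

def is_upper(M):
    return np.array_equal(M, np.triu(M))   # unchanged by zeroing below diagonal

def is_lower(M):
    return np.array_equal(M, np.tril(M))   # unchanged by zeroing above diagonal

def is_diagonal(M):
    return is_upper(M) and is_lower(M)     # diagonal = both at once
```

Note how `is_diagonal` falls directly out of the two triangular conditions, matching the definition above.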

If you have a square coefficient matrix for some system of linear equations (i.e. you have the same number of equations/rows as unknowns), then the system is in (upper) triangular form if the first equation may involve all the variables, the second involves all but the first, and so on, until the last equation involves only the last variable. Reducing such a coefficient matrix, or its associated augmented matrix, to upper triangular form is the first phase of Gaussian Elimination. Once triangular form is reached, the system can readily be solved by back substitution.
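The two phases described above can be sketched as follows; this is a minimal version that assumes every pivot is nonzero (so no row swaps are needed), and the function name is mine:

```python
import numpy as np

def solve_by_elimination(A, b):
    """Reduce the system to upper triangular form, then back-substitute.
    A sketch that assumes every pivot is nonzero (no row swaps)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # forward elimination: zero out the entries below each pivot
    for k in range(n):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # back substitution: solve from the last equation upward
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# 2x + y = 3,  x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = solve_by_elimination(A, b)
```

The back-substitution loop is where the triangular form pays off: each equation, read from the bottom up, has only one unknown left.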

The word triangular is reminiscent of triangular numbers (e.g. 1 = 1, 1 + 2 = 3, 1 + 2 + 3 = 6, ...)
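The connection is more than a resemblance: an n x n upper triangular matrix has at most n(n+1)/2 nonzero entries, which is exactly the nth triangular number. A quick check (my own observation, not from the text):

```python
import numpy as np

def triangular(n):
    """nth triangular number: 1, 3, 6, 10, ..."""
    return n * (n + 1) // 2

# the nonzero pattern of an n x n upper triangular matrix
# forms a triangle with triangular(n) entries
for n in range(1, 6):
    count = np.count_nonzero(np.triu(np.ones((n, n))))
    assert count == triangular(n)
```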
 
I have decided to switch gears.... I do not have the proper foundation to address or think about these things yet.

I am now working through Gil Strang's lectures on YouTube and I'm finding them much more helpful than my own lectures lately. I appreciate how he carries various conceptions of the topics through each lecture (row picture, column picture, algebra, etc.)

It is very helpful to think of matrix multiplication as taking linear combinations of either rows or columns.
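Here is a small numerical sketch of both pictures (the matrices are my own toy example): column j of A@B is a combination of A's columns weighted by B's column j, and row i of A@B is a combination of B's rows weighted by A's row i.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])
C = A @ B

# column picture: column 0 of A@B is a linear combination of
# A's columns, with weights taken from B's column 0
col0 = B[0, 0] * A[:, 0] + B[1, 0] * A[:, 1]

# row picture: row 0 of A@B is a linear combination of
# B's rows, with weights taken from A's row 0
row0 = A[0, 0] * B[0, :] + A[0, 1] * B[1, :]
```

Both `col0` and `row0` match the corresponding column and row of `C`, which is the point Strang keeps returning to: the same product, read two ways.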
 