This page contains notes on G.E. Shilov's *Linear Algebra*.

# Determinants

## Fields

A **field** (Shilov uses the term *number field*) is a set $K$ on which addition and multiplication are defined and satisfy the usual arithmetical properties: associativity, commutativity, the existence of a "0" and a "1", the existence of negatives and (for nonzero elements) reciprocals, and distributivity.

The most useful examples of fields for practical applications of linear algebra are the field of real numbers $\bf R$ and the field of complex numbers $\bf C$. Examples of other kinds of fields include the field of rational numbers and the field of integers modulo a prime number.
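As a quick sanity check on that last example, arithmetic modulo a prime really does admit multiplicative inverses, which is what makes it a field. A minimal Python sketch (the prime 5 and the helper name `inv_mod` are my own choices for illustration, not Shilov's):

```python
p = 5  # a prime; the residues {0, 1, ..., p-1} form a field under mod-p arithmetic

def inv_mod(a, p):
    """Multiplicative inverse of a modulo the prime p, via Fermat's little theorem."""
    return pow(a, p - 2, p)

# Every nonzero element has an inverse, so "division" is always possible:
for a in range(1, p):
    assert (a * inv_mod(a, p)) % p == 1
```

(For a non-prime modulus this fails: modulo 4, the element 2 has no inverse, so the integers mod 4 are not a field.)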

## Systems of Linear Equations

A system of linear equations is an expression of the form

$\begin{aligned} a_{11}x_1 + a_{12}x_2 + \ldots + a_{1n} x_n &= b_1 \\ a_{21}x_1 + a_{22} x_2 + \ldots + a_{2n} x_n &= b_2 \\ &\;\,\vdots \\ a_{k1}x_1 + a_{k2} x_2 + \ldots + a_{kn} x_n &= b_k \end{aligned}$

The $a$'s (*coefficients*) and $b$'s (*constant terms*) are given quantities, and the $x$'s are "unknown" quantities. Solving this system of equations means finding the set of possible values of the $x$'s (if any) that satisfy every equation. The values of all quantities, given and unknown, are assumed to belong to some given field $K$.

In attempting to solve such equations, it may turn out that no solution exists; such a system is said to be *incompatible*. If a solution exists, the system is *compatible*, and it is further classified as *determinate* if that solution is unique and *indeterminate* if there is more than one solution.
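For a $2\times 2$ system the three cases are easy to see concretely. A small sketch (the function name `classify_2x2` is mine; it quietly uses the $2\times 2$ determinant $a_{11}a_{22}-a_{12}a_{21}$, which the next section defines in general):

```python
from fractions import Fraction

def classify_2x2(a11, a12, b1, a21, a22, b2):
    """Classify the system a11*x + a12*y = b1, a21*x + a22*y = b2."""
    d = a11 * a22 - a12 * a21
    if d != 0:
        # Unique solution: the system is determinate.
        x = Fraction(b1 * a22 - b2 * a12, d)
        y = Fraction(a11 * b2 - a21 * b1, d)
        return "determinate", (x, y)
    # Coefficient rows are proportional; check whether the constants match too
    # (cross-product test; degenerate all-zero rows aside).
    if a11 * b2 == a21 * b1 and a12 * b2 == a22 * b1:
        return "indeterminate", None
    return "incompatible", None

# x + y = 2, x - y = 0 has the unique solution x = y = 1:
print(classify_2x2(1, 1, 2, 1, -1, 0))
# x + y = 1, x + y = 2 has no solution:
print(classify_2x2(1, 1, 1, 1, 1, 2))
```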

## Calculating Determinants

A **square matrix** of order $n$ is a square arrangement of $n^2$ numbers $a_{ij}\in K$ ($i,j=1,\ldots,n$):

$\begin{matrix} a_{11} & a_{12} & \ldots & a_{1n} \\ \vdots & & & \vdots \\ a_{n1} & a_{n2} & \ldots & a_{nn} \end{matrix}$

(More formally, a square matrix can be thought of as a $K$-valued function defined on the set of index pairs $(i, j)$.)

For defining determinants, Shilov introduces an auxiliary function $N$ that takes a tuple of integers and outputs a natural number. It is defined combinatorially: $N(\alpha_1, \alpha_2, \ldots, \alpha_n)$ is the number of pairs $(i, j)$ with $i < j$ such that $\alpha_i > \alpha_j$. That is, it counts the number of "inversions" in the sequence.
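This definition transcribes directly into Python; the name `N` below mirrors Shilov's notation, and the quadratic pair-scan is a straightforward sketch rather than an efficient inversion counter:

```python
def N(alpha):
    """Number of inversions: pairs (i, j) with i < j but alpha[i] > alpha[j]."""
    n = len(alpha)
    return sum(1 for i in range(n) for j in range(i + 1, n) if alpha[i] > alpha[j])

# The identity ordering has no inversions; a full reversal has the most.
assert N((1, 2, 3)) == 0
assert N((2, 1, 3)) == 1
assert N((3, 2, 1)) == 3
```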

With this notation, the determinant $D$ of the previously written square matrix is defined as

$D = \sum (-1)^{N(\alpha_1, \ldots, \alpha_n)} \prod_{i=1}^{n} a_{\alpha_i i}.$

Here, the summation is over all sequences $\alpha_1,\ldots,\alpha_n$ drawn from the set $\{1,\ldots,n\}$ such that $\alpha_i \neq \alpha_j$ whenever $i \neq j$; that is, over all $n!$ permutations of $\{1,\ldots,n\}$.

The determinant of the matrix $A$ can be denoted as either $\det A$ or as $|A|$.
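The defining sum can be evaluated directly for small matrices. A brute-force sketch (exponential in $n$, so purely illustrative; the names `det_by_definition` and `inversions` are mine, and indices are 0-based where the text's are 1-based):

```python
from itertools import permutations

def inversions(alpha):
    """Shilov's N: count pairs (i, j) with i < j and alpha[i] > alpha[j]."""
    n = len(alpha)
    return sum(1 for i in range(n) for j in range(i + 1, n) if alpha[i] > alpha[j])

def det_by_definition(A):
    """Sum of (-1)^N(alpha) * a[alpha_1][1] * ... * a[alpha_n][n] over all permutations."""
    n = len(A)
    total = 0
    for alpha in permutations(range(n)):
        term = (-1) ** inversions(alpha)
        for col in range(n):
            term *= A[alpha[col]][col]  # row alpha_col, column col
        total += term
    return total

# The familiar 2x2 formula a11*a22 - a21*a12 falls out as a special case:
assert det_by_definition([[1, 2], [3, 4]]) == -2
```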

## Properties of Determinants

We list some properties of determinants. Let $A$ be an $n \times n$ square matrix with elements $a_{ij}$. Then:

- A square matrix with two identical columns has a zero determinant.
- A square matrix with a column consisting of zeroes has a zero determinant.
- Let $b_1,\ldots,b_n$ and $c_1,\ldots,c_n$ be sequences of numbers and let $\lambda$ be an arbitrary number such that $a_{ij}=b_{i}+\lambda c_{i}$ for all $i$ and some fixed $j$. Let $B$ be the square matrix obtained from $A$ by replacing the $j$th column with the column of numbers $b_i$. Similarly, let the square matrix $C$ be obtained from $A$ by replacing the $j$th column with the column of numbers $c_{i}$. Then $\det A = \det B + \lambda \det C$.
- The determinant of a matrix is not changed by adding a multiple of one column to another column. That is to say, replacing $a_{i j}$ with $a_{i j} + \lambda a_{i k}$ for all $i$, for some fixed $j$ and $k\neq j$, does not alter the determinant.
- Interchanging the rows and columns of a square matrix (an operation called **transposition** of the matrix) does not alter the value of the determinant.

As a consequence of the last property, the previously mentioned properties hold true if one were to replace "columns" with "rows".
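These properties can be spot-checked numerically against the brute-force permutation-sum definition from the previous section. A small sketch on one sample matrix (the helper name `det` and the test matrix are my own choices):

```python
from itertools import permutations
from math import prod

def det(A):
    """Determinant via the permutation-sum definition (illustration only)."""
    n = len(A)
    return sum(
        (-1) ** sum(1 for i in range(n) for j in range(i + 1, n) if p[i] > p[j])
        * prod(A[p[c]][c] for c in range(n))
        for p in permutations(range(n))
    )

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]

# Transposition leaves the determinant unchanged:
At = [[A[i][j] for i in range(3)] for j in range(3)]
assert det(At) == det(A)

# Adding lambda times column k to column j leaves it unchanged (j=0, k=2, lambda=7):
B = [[A[i][0] + 7 * A[i][2], A[i][1], A[i][2]] for i in range(3)]
assert det(B) == det(A)

# Two identical columns force a zero determinant:
C = [[A[i][0], A[i][0], A[i][2]] for i in range(3)]
assert det(C) == 0
```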

## Minors and Cofactors

TODO

## Cramer's Rule

TODO