# Vector Space

Vector spaces are used to model a collection of objects – vectors – that can be combined in a linear way to form more vectors. Vector spaces may also be called linear spaces.

### Definition

Formally, a vector space $V$ over a field $K$ of "scalars" (with $K$ usually being either $\mathbf R$ or $\mathbf C$) is a set together with two operations:

1. Vector Addition. The sum of two vectors $u$ and $v$ is written as $u + v$. Its properties are as follows for all vectors $u$, $v$ and $w$:

• $(u + v) + w = u + (v + w)$
• There exists a $\mathbf 0\in V$ such that $v+ \mathbf 0=\mathbf 0 + v= v$.
• $u+v=v+u$.
2. Scalar multiplication. Combining the scalar $\lambda$ with the vector $v$ results in a vector $\lambda v$. Scalar multiplication satisfies the following properties for all scalars $\lambda,\mu$ and vectors $u,v$:

• $(\lambda \mu)v = \lambda (\mu v)$
• $(\mu + \lambda) v = \mu v + \lambda v$
• $\lambda(u+ v)=\lambda u + \lambda v$
• $1 v=v$

### Vector Subspaces

A vector subspace $W$ is a subset of some vector space $V$ that is itself a vector space when inheriting vector addition and scalar multiplication from $V$. Equivalently, a nonempty subset $W$ is a subspace whenever it is closed under vector addition and scalar multiplication.

### Linear Independence and Dimensionality

A linear combination of vectors $v_1, \ldots, v_n$ is an expression of the form $\lambda_1 v_1 + \ldots + \lambda_nv_n$ for some sequence of scalars $\lambda_1,\ldots,\lambda_n$. These vectors are said to be linearly independent if the only linear combination equal to zero is the one with $\lambda_1=\ldots=\lambda_n=0$.

The span of a finite set of vectors is the set of all linear combinations of those vectors. Spans are themselves vector spaces.

For a given vector space $V$, a basis of the space is a sequence of linearly independent vectors whose span is equal to $V$. If such a basis exists (and the aforementioned sequence is finite), then $V$ is said to be finite dimensional with a dimension equal to the number of vectors making up the basis. The dimensionality of a vector space, when defined this way, is independent of the choice of basis.

A vector space is said to be infinite dimensional if it is not finite dimensional.
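As a concrete check, linear independence and the dimension of a span in $\mathbf R^m$ can be computed via matrix rank. A minimal numpy sketch (the helper names are illustrative):

```python
import numpy as np

def are_independent(vectors):
    """Vectors are linearly independent iff the rank of the matrix
    they form (one vector per row) equals the number of vectors."""
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

def span_dimension(vectors):
    """The dimension of the span is the rank of the same matrix."""
    return int(np.linalg.matrix_rank(np.array(vectors, dtype=float)))

v1, v2, v3 = [1, 0, 0], [0, 1, 0], [1, 1, 0]
print(are_independent([v1, v2]))       # True
print(are_independent([v1, v2, v3]))   # False: v3 = v1 + v2
print(span_dimension([v1, v2, v3]))    # 2
```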

### Examples

Some common examples of vector spaces include:

• $K^n$, where $K$ is a field and $n$ a natural number. This is the set of all $n$-tuples of elements of the field $K$. Vector addition and scalar multiplication are defined component-wise. Special examples include $\mathbf R^n$ and $\mathbf C^n$.
• The set of all polynomial functions, possibly up to a maximal degree.
• The set of all $n$-times differentiable functions ($n=0$ for continuous) over the set $U$ forms the vector space $C^{n}(U)$.
• The set of linear maps between two vector spaces is itself a vector space. In particular, the set of all linear maps from a vector space $V$ (defined over the field $K$) to the field $K$ is the dual space $V^{\star}$.

# Algebraic Dual Space

The (algebraic) dual space $V^{\star}$ of the vector space $V$ over the scalar field $K$ consists of all linear mappings of the form $V \to K$. $V^\star$ is itself a vector space over $K$. Elements of $V^\star$ are referred to as linear functionals or covectors.

If $V$ is finite-dimensional, then $V^\star$ has the same dimension. That is, $V$ and $V^\star$ are isomorphic. Moreover, for any non-degenerate bilinear form $\langle\cdot, \cdot\rangle: V \times V \to K$, the map $v \mapsto \langle v, \cdot\rangle$ is an isomorphism between $V$ and $V^\star$. Thus, a finite-dimensional inner product space has a "natural" isomorphism to its dual.
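In coordinates, a functional on $K^n$ can be identified with a row vector acting by matrix multiplication, and the dual basis of a given basis is obtained by matrix inversion. A minimal numpy sketch (illustrative, for $K=\mathbf R$, $n=2$):

```python
import numpy as np

# Identify functionals on R^n with row vectors: f(v) = row @ v.
# For a basis given by the columns of B, the dual basis consists of the
# rows of B^{-1}, since B^{-1} B = I gives f_i(b_j) = delta_ij.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # basis vectors b_1 = (1, 0), b_2 = (1, 1)
dual = np.linalg.inv(B)         # rows are the dual-basis functionals

for i in range(2):
    for j in range(2):
        f_i_of_b_j = dual[i] @ B[:, j]
        assert np.isclose(f_i_of_b_j, 1.0 if i == j else 0.0)
print("dual basis satisfies f_i(b_j) = delta_ij")
```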

### Double Duals

Regardless of the dimensionality of $V$, there is a natural homomorphism from $V$ to the dual of its dual, $V^{\star \star} = (V^\star)^\star$. This is because the "evaluation map" corresponding to $v\in V$, defined by $f \mapsto f(v)$, is clearly an element of $V^{\star\star}$, and the map sending a vector to its evaluation map is clearly linear. Moreover, this homomorphism is an isomorphism for finite-dimensional $V$. Because this isomorphism is natural, double duals and other "higher-order duals" are usually ignored.

# Group Homomorphism

A homomorphism is a mapping that preserves algebraic structure. In the context of group theory, a homomorphism between groups $(G, \cdot)$ and $(H, \star)$ is a mapping $\phi: G \to H$ with the following property for all $g,g'\in G$:

$\phi(g\cdot g') = \phi(g)\star\phi(g').$

$\phi$ may be categorized as a special "type" of homomorphism according to additional properties it may hold:

• $\phi$ is an isomorphism if it is bijective and its inverse is a homomorphism. In this case, $(G, \cdot)$ and $(H, \star)$ are isomorphic. An isomorphism from a group to itself is called an automorphism.
• $\phi$ is an epimorphism if it is surjective. A homomorphism from a group to itself is called an endomorphism.
• $\phi$ is a monomorphism if it is injective.
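As a concrete illustration, the parity map from $(\mathbf Z, +)$ to $(\{1,-1\}, \times)$ is a homomorphism; the sketch below verifies the defining property on a sample of elements:

```python
# The parity map phi: (Z, +) -> ({1, -1}, *), phi(n) = (-1)^n.
def phi(n):
    return 1 if n % 2 == 0 else -1

# Verify phi(g + g') == phi(g) * phi(g') on a sample of elements.
for g in range(-10, 11):
    for g2 in range(-10, 11):
        assert phi(g + g2) == phi(g) * phi(g2)

# phi is surjective onto {1, -1}, hence an epimorphism; it is not
# injective (phi(0) == phi(2)), so it is not a monomorphism.
print("phi is a homomorphism")
```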

# Topological Space

A topological space is a set equipped with a topological structure. Essentially, a topological space is defined so that the "limit" of a sequence of "points" in the space may be defined.

### Topology

A topological space may be defined as a set $\Omega$ together with a topology $\mathcal T$, which consists of subsets of $\Omega$. Elements of this topology are called open sets. A topology satisfies the following axioms:

1. The empty set belongs to $\mathcal T$.
2. $\Omega \in \mathcal T$
3. If $O_1,\ldots,O_n$ is a finite sequence of open sets, then their intersection $O_1 \cap\ldots\cap O_n$ is also open.
4. If $\{O_{\alpha}\}$ is a (potentially infinite) set of open sets (with $\alpha$ being some indexing parameter), then their union $\bigcup_{\alpha} O_{\alpha}$ is also open.
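On a finite set these axioms can be checked by brute force, since closure under pairwise unions and intersections implies closure under all unions and intersections of a finite family. A minimal sketch (names are illustrative):

```python
# Brute-force check that a family of subsets is a topology on a finite set.
omega = frozenset({1, 2, 3})
T = {frozenset(), frozenset({1}), frozenset({1, 2}), omega}

def is_topology(omega, T):
    if frozenset() not in T or frozenset(omega) not in T:
        return False           # axioms 1 and 2
    for A in T:
        for B in T:
            if A & B not in T:  # closed under finite intersections
                return False
            if A | B not in T:  # closed under unions
                return False
    return True

print(is_topology(omega, T))    # True
```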

A set $C\subseteq\Omega$ is said to be closed if its complement $\Omega - C$ is open. The set of closed sets uniquely specifies the space's topology.

### Neighborhoods

For a given point $x\in\Omega$, a neighborhood of $x$ with respect to a topology on $\Omega$ is a set that contains an open set containing $x$; the neighborhood system $\mathcal N(x)$ is the collection of all such sets. The collection of neighborhood systems uniquely specifies a space's topological structure.

Using neighborhoods, it is possible to define topological spaces without first referring to topologies. Consider a mapping of the form $\mathcal N: \Omega\to2^{\Omega}$ satisfying the following properties for all $x\in\Omega$:

1. If $N\in\mathcal N(x)$, then $x\in N$.
2. If $N\in\mathcal N(x)$, then any superset of $N$ (contained in $\Omega$) also belongs in $\mathcal N(x)$.
3. If $N,N'\in\mathcal N(x)$, then $N\cap N'\in\mathcal N(x)$.
4. If $N\in\mathcal N(x)$, there exists an $M\in\mathcal N(x)$ such that $N\in\mathcal N(y)$ for all $y\in M$.

If these properties are specified, then $\mathcal N$ maps $x$ to its neighborhood $\mathcal N (x)$ with respect to some topology. That is, the above enumerated properties are an alternative and equivalent axiomatization of a topological space.

### Construction of Topological Spaces

Useful topological spaces are usually not defined by explicitly listing the open sets. Instead, topological structures are typically generated from simpler constructions:

• Topologies generated by a base or subbase (TODO)
• Topologies generated by a metric (TODO)
• Topologies generated by a map with a topological domain (TODO)
• Topologies generated by a product of other topologies (TODO)
• The power set is a topology on the underlying set. This is the "finest" possible topological structure on any given set. Such topologies are called discrete topologies.

# Bilinear Form

A bilinear form over a vector space $V$ (over some scalar field $K$) is a mapping $V \times V \to K$ that is linear in each argument.

Bilinear forms may be characterized by additional properties they possess. Let $B$ be a bilinear form. Then $B$ is:

• non-degenerate if $B(w,v)=0$ for all $v\in V$ only when $w=0$,

• symmetric if $B(v,w)=B(w,v)$ for all $w, v\in V$,

• skew-symmetric if $B(v,w)+B(w,v)=0$ for all $w,v \in V$,

• alternating if $B(v, v) = 0$ for all $v \in V$ (alternating implies skew-symmetric).

• reflexive if $B(v, w)=0$ implies that $B(w, v)=0$.

Bilinear forms on finite-dimensional vector spaces may be represented as matrices. Let $V=K^{n}$. Then every bilinear form can be written in the following fashion:

\begin{aligned} B(u,v) &= \sum_{i,j} u_{i} B_{ij} v_{j} \\ &= \mathbf{u}^{T} \mathbf B \mathbf{v}, \end{aligned}

where $\mathbf B$ is the matrix representation of $B$, and $u$ and $v$ are treated as column vectors.
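The matrix representation can be checked numerically. The sketch below (illustrative names, random matrix) verifies linearity in each argument and the decomposition of any bilinear form into symmetric and skew-symmetric parts:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(3, 3))      # matrix representation of a bilinear form
form = lambda u, v: u @ B @ v    # B(u, v) = sum_ij u_i B_ij v_j

u, v, w = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)
lam = 2.5

# linearity in each argument
assert np.isclose(form(lam * u + w, v), lam * form(u, v) + form(w, v))
assert np.isclose(form(u, lam * v + w), lam * form(u, v) + form(u, w))

# every B splits into a symmetric and a skew-symmetric part
S, A = (B + B.T) / 2, (B - B.T) / 2
assert np.allclose(S, S.T) and np.allclose(A, -A.T) and np.allclose(S + A, B)
print("B is bilinear; B = symmetric part + skew part")
```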

# Group

A group is a simple algebraic structure that represents a composable set of permutations. There are two popular definitions of a group: one as a set of permutations and one as a set equipped with a binary operator. Both definitions are essentially equivalent by Cayley's Theorem.

### Permutation Group

One formulation of a group is as a set of permutations. That is, a permutation group is a set $G$ of bijections of some set $X$ with the following closure properties:

1. The identity map belongs to $G$,
2. If $g\in G$, then $g^{-1}\in G$, and
3. If $g,h\in G$, then $g\circ h\in G$.
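As a concrete example, the sketch below represents bijections of $X=\{0,1,2\}$ as tuples and verifies the three closure properties for the full symmetric group $S_3$ (names are illustrative):

```python
from itertools import permutations

# Represent a bijection p of X = {0, 1, 2} as a tuple with p[i] = image of i.
X = (0, 1, 2)
G = list(permutations(X))          # all 6 bijections of X: the group S_3

identity = X
def inverse(p):
    q = [0] * len(p)
    for i, pi in enumerate(p):
        q[pi] = i
    return tuple(q)
def compose(p, q):                 # (p o q)(i) = p(q(i))
    return tuple(p[q[i]] for i in range(len(q)))

# The three closure properties of a permutation group:
assert identity in G
assert all(inverse(p) in G for p in G)
assert all(compose(p, q) in G for p in G for q in G)
print("S_3 satisfies the permutation-group axioms")
```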

### Abstract Group

An abstract group is a set $G$ together with a mapping $\cdot:G\times G \to G$ called a binary operator. We write $\cdot (g, h)$ as $g \cdot h$ or simply as $gh$.

An abstract group must satisfy the following properties:

1. There exists an identity element $e \in G$ such that $eg=ge=g$ for all $g\in G$.
2. For all $g \in G$, there exists an inverse element $g^{-1} \in G$ such that $g g^{-1}= g^{-1} g = e$.
3. For all $f,g,h \in G$, $(fg)h=f(gh)$.

It is clear that a permutation group is an abstract group by taking composition as the binary operator. Cayley's theorem states that the reverse is true in a sense.

Note: For notational convenience, the binary operator is usually assumed and a group is identified by the underlying set. So, for example, claiming that "$g$ is an element of the group $G$" means that "$g$ is an element of the underlying set of the group $G$".

### Subgroup

If $H \subseteq G$, then $(H, \cdot)$ is a subgroup of $(G, \cdot)$ if $(H, \cdot)$ is itself a group. Equivalently, a nonempty $H$ forms a subgroup whenever it is closed under inversion and composition.

### Abelian Group

A group is abelian if its binary operator is commutative. That is, whenever $gh=hg$ for all $g,h\in G$.

# Inner Product Space

An inner product space is a vector space $V$ over a scalar field $K\in \{\mathbf R, \mathbf C\}$ together with a map $\langle\cdot ,\cdot\rangle: V \times V \to K$, called an inner product (bilinear when $K=\mathbf R$; sesquilinear, i.e. conjugate-linear in one argument, when $K=\mathbf C$), that satisfies the following properties:

• Positive-definite: $\langle v, v\rangle >0$ for all non-zero $v\in V$.
• Conjugate-symmetric: $\langle u, v\rangle = \overline{\langle v, u\rangle}$

The function $\lVert \cdot \rVert: V\to \mathbf R$ defined by $\lVert v \rVert = \sqrt{\langle v, v \rangle}$ satisfies the properties of a norm (thus an inner product space is a normed vector space). The inner product can be recovered from its norm using the polarization identity:

\begin{aligned} \text{Re }(\langle u, v \rangle) &= \frac{1}{4}(\lVert u+v\rVert^2 - \lVert u- v \rVert^2) \\ \text{Im }(\langle u, v \rangle) &= \frac{1}{4}(\lVert u-i v\rVert^2 - \lVert u+i v \rVert^2). \end{aligned}

The inner product norm further satisfies the famous Cauchy-Schwarz inequality:

$\lvert\langle u, v \rangle\rvert \leq\lVert u \rVert\lVert v \rVert .$
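Both the polarization identity and the Cauchy-Schwarz inequality can be checked numerically. The sketch below assumes the convention that the inner product is conjugate-linear in its first argument, which matches numpy's `vdot`:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.normal(size=4) + 1j * rng.normal(size=4)
v = rng.normal(size=4) + 1j * rng.normal(size=4)

inner = np.vdot(u, v)                       # conjugate-linear in first slot
norm = lambda x: np.sqrt(np.vdot(x, x).real)

# polarization identity
re = (norm(u + v) ** 2 - norm(u - v) ** 2) / 4
im = (norm(u - 1j * v) ** 2 - norm(u + 1j * v) ** 2) / 4
assert np.isclose(re, inner.real) and np.isclose(im, inner.imag)

# Cauchy-Schwarz inequality
assert abs(inner) <= norm(u) * norm(v)
print("polarization identity and Cauchy-Schwarz verified numerically")
```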

# Cayley's Theorem

Cayley's theorem asserts that a permutation group and an abstract group are essentially equivalent concepts. It is clear that a permutation group meets the definition of an abstract group. Cayley's theorem states that every abstract group is isomorphic to some permutation group.

### Proof

For a given $g \in G$, the bijection $G \to G$ defined by $h \mapsto gh$ is the left action by $g$. Similarly, the map $h\mapsto hg$ is the right action by $g$. The mapping from $g$ to the corresponding left action is an injective homomorphism into the group of permutations of $G$, and hence an isomorphism onto its image, which is a permutation group. (For right actions, one instead maps $g$ to the right action by $g^{-1}$ so that composition is preserved.)

# Galilean Relativity

Galilean Relativity refers to a theory of relativity consistent with Newton's laws.

It is named after Galileo's thought experiment involving a vessel traveling with a perfectly uniform and linear motion. An observer contained within the vessel, observing only phenomena also contained within the vessel, will have no means of determining the vessel's speed or direction of travel.

### Galilean Spacetime

A galilean space(time) is a four-dimensional affine space $A^4$. Points in this space are called "events". The set of displacements forms a four-dimensional, real vector space $V\simeq\mathbb R^4$.

There exists a rank-1 linear map $\tau : V\to\mathbb R$ mapping spatio-temporal displacements to time intervals. Two events $a$ and $b$ are simultaneous if $\tau(a-b)=0$.

The three-dimensional kernel $\text{ker }\tau$ (the space of displacements between simultaneous events) is euclidean (that is, equipped with an inner product $\langle\cdot,\cdot\rangle$). From this, the distance between simultaneous events $a$ and $b$ is defined as

$d(a,b) = \sqrt{\langle a-b, a-b \rangle},$

which is well defined because $a-b \in \text{ker }\tau$ whenever $a$ and $b$ are simultaneous.

### Galilean Transformations

An isomorphism between galilean spaces is a bijection that preserves galilean structure (affinity, euclidean distance between simultaneous events, and time intervals).

All galilean spaces are isomorphic to $\mathbb R \times \mathbb R^3$, where the $\mathbb R$ and $\mathbb R^3$ components are each equipped with the standard inner product. Isomorphisms of spacetime onto $\mathbb R\times\mathbb R^3$ are called inertial reference frames or galilean coordinates.

Automorphisms of $\mathbb R \times \mathbb R^3$ are called galilean transformations, and together they form the galilean group. The galilean group is generated by the following galilean transformations:

1. Rectilinear Motion: $(t, \mathbf{x}) \mapsto (t, \mathbf{x}+t\mathbf{v})$ for some $\mathbf{v}\in\mathbb R^3$.
2. Spatial Translations: $(t, \mathbf{x})\mapsto(t, \mathbf{x}+\mathbf{y})$ for some $\mathbf{y}\in\mathbb R^3$.
3. Temporal Translations: $(t, \mathbf x)\mapsto(t+s,\mathbf x)$ for some $s\in\mathbb R$.
4. Rotations and Reflections: $(t, \mathbf x) \mapsto (t, \mathbf{G x})$, where $\mathbf{G}\in O(3)$. That is, $\mathbf{G}$ is a real, invertible $3\times 3$ matrix such that $\mathbf G^T \mathbf G = \mathbf G \mathbf G^T = \mathbf{I}_3$.
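As a numerical sanity check, a composite of the generators above preserves simultaneity and the distance between simultaneous events. A minimal sketch (the specific parameter values are arbitrary):

```python
import numpy as np

theta = 0.7                      # arbitrary rotation angle about the z-axis
G = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])  # G in O(3)
vel = np.array([1.0, 2.0, 0.5])  # boost velocity
y = np.array([3.0, -1.0, 0.0])   # spatial translation
s = 4.0                          # temporal translation

def transform(t, x):
    """A composite galilean transformation: rotate, boost, translate."""
    return t + s, G @ x + t * vel + y

t0 = 0.3                         # two simultaneous events
x1, x2 = np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, -1.0])
T1, X1 = transform(t0, x1)
T2, X2 = transform(t0, x2)

assert np.isclose(T1, T2)        # simultaneity is preserved
# distance between the simultaneous events is preserved:
assert np.isclose(np.linalg.norm(X1 - X2), np.linalg.norm(x1 - x2))
print("time intervals and simultaneous distances preserved")
```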

# Group Representation

Let $G$ be a group and $V$ be a vector space over the field $K$. A representation with respect to these objects is a homomorphism $\pi$ from $G$ to $\text{GL}(V)$, the group of invertible linear transformations of $V$. That is, a representation is just a group action consisting of linear transformations. The study of group representations forms much of representation theory.

Usually, $V \simeq\mathbf C^n$ and $G$ is finite. In this case, $G$ (or, rather, $G/\text{ker }\pi$) may be identified with a finite set of matrices under some coordinate system.
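A standard small example, sketched below for illustration: the cyclic group $\mathbf Z_3$ represented on $\mathbf R^2$ by rotation matrices, with the homomorphism property checked numerically.

```python
import numpy as np

# pi(k) = rotation through 2*pi*k/3, a representation of Z_3 on R^2.
def pi(k):
    a = 2 * np.pi * (k % 3) / 3
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

# homomorphism property: pi((g + h) mod 3) = pi(g) pi(h)
for g in range(3):
    for h in range(3):
        assert np.allclose(pi((g + h) % 3), pi(g) @ pi(h))
print("pi is a representation of Z_3 on R^2")
```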

### Irreducible Representations

The subspace $W \subset V$ is said to be invariant with respect to $\pi$ if $\pi(g)w\in W$ for all $w\in W$ and $g\in G$.

The representation $\pi$ is said to be irreducible over $V$ if the only invariant subspaces are $V$ and the subspace consisting of just the zero element.

An important problem of representation theory is decomposing $V$ into irreducible components. That is, writing $V=V_1\oplus\ldots\oplus V_n$, where each $V_i$ is invariant and the restriction of $\pi$ to each $V_i$ is irreducible. If this is possible, then the representation is said to be fully reducible.

# Normed Vector Space

A normed vector space is a vector space $V$ over $K\in\{\mathbf R, \mathbf C\}$ together with a function $\lVert \cdot \rVert: V \to \mathbf R$, the norm, satisfying the following properties for all $u, v\in V$ and $\lambda\in K$:

• $\lVert v \rVert \geq 0$ with equality if and only if $v=0$.
• $\lVert \lambda v\rVert = \lvert \lambda \rvert \lVert v \rVert$
• $\lVert u+v \rVert \leq \lVert u\rVert +\lVert v \rVert$, also known as the triangle inequality.

A normed vector space is also a metric space under the metric $d(u, v) = \lVert u - v \rVert$.

A seminorm satisfies the properties of a norm, except that $\lVert v \rVert$ may be zero for nonzero $v$.

# Dual Number

The dual numbers can be thought of as an extension of the real numbers by an infinitesimal element $\epsilon$. A dual number may be written as $a+b\epsilon$, where $a$ and $b$ are real numbers. Addition is component-wise in $a$ and $b$, and multiplication obeys the following rule:

$(a+b\epsilon)(a'+b'\epsilon) = aa' + (ab' +a'b)\epsilon.$

In particular, $\epsilon^2 = 0$.

### Use in Automatic Differentiation

For a polynomial $P$, it can be readily shown that:

$P(a+b\epsilon) = P(a) + bP'(a)\epsilon.$

And for any analytic function $f:\mathbf R\to \mathbf R$, one can similarly extend $f$ to the dual numbers:

\begin{aligned} f(a+b\epsilon) &= \sum_{n=0}^{\infty}\frac{1}{n!} f^{(n)}(a) b^n \epsilon^n \\ &= f(a) + bf'(a)\epsilon. \end{aligned}

With this, it is possible to "automatically differentiate" a function $f:\mathbf R\to \mathbf R$ by calculating the "epsilon" component of $f(x+\epsilon)$. The practicality of this approach depends on how $f$ is written.
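The multiplication rule above can be implemented directly. The following minimal Python sketch (class and helper names are illustrative, not any package's API) differentiates a polynomial by reading off the epsilon component:

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """A dual number a + b*eps with eps**2 = 0."""
    a: float  # real part
    b: float  # epsilon (derivative) part

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # (a + b eps)(a' + b' eps) = aa' + (ab' + a'b) eps
        return Dual(self.a * other.a, self.a * other.b + other.a * self.b)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x + eps and read off the epsilon component."""
    return f(Dual(x, 1.0)).b

P = lambda x: 3 * x * x + 2 * x + 1     # P'(x) = 6x + 2
print(derivative(P, 5.0))               # 32.0
```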

The ForwardDiff.jl Julia package follows this exact approach, utilizing a multidimensional analogue of dual numbers to calculate gradients.

# Quotient Group

A quotient group $G/N$ for a given group $G$ and a "normal" subgroup $N$ of $G$ is a group that has a "coarser" or "more relaxed" algebraic structure relative to $G$. The notion of a quotient group is essential in a number of isomorphism theorems.

### Cosets

Let $(G, \cdot)$ be a group with subgroup $(H, \cdot)$.

A left coset of $(H, \cdot)$, denoted as $g\cdot H$ or $gH$ for some $g\in G$, is the set consisting of elements of the form $gh$ for $h\in H$. That is, $gH$ is the image of $H$ under the left action induced by $g$.

Similarly, a right coset of $(H, \cdot)$, denoted $Hg$, is the image of $H$ under the right action induced by $g$.

### Normal Subgroups

The subgroup $H$ is said to be normal if $gH = Hg$ for all $g\in G$. In this case, there is no distinction between "right" and "left" cosets.

The cosets of a normal subgroup $N$ themselves form a group, the quotient group $G/N$, under the well-defined operation $(gN)(hN) = (gh)N$.
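A minimal worked example: the cosets of $\{0, 3\}$ in $\mathbf Z_6$ under addition mod 6 (in an abelian group every subgroup is normal).

```python
# Cosets of the normal subgroup N = {0, 3} in Z_6 (addition mod 6).
G = set(range(6))
N = {0, 3}

cosets = {frozenset((g + n) % 6 for n in N) for g in G}
print(sorted(sorted(c) for c in cosets))   # [[0, 3], [1, 4], [2, 5]]

# |G/N| = |G| / |N|, consistent with Lagrange's theorem
assert len(cosets) == len(G) // len(N)
```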

### Lagrange's Theorem

Let $G$ be finite and $H$ a subgroup of $G$ (normality is not required for this result). Lagrange's theorem states that the number of (left) cosets of $H$ in $G$ is

$\lvert G/H \rvert = \lvert G\rvert / \lvert H \rvert .$

In particular, the order of a subgroup always divides the order of the group.