Linear Space

$$\newcommand{\1}{\textbf 1} \newcommand{\0}{\textbf 0} \newcommand{\x}{\boldsymbol x} \newcommand{\y}{\boldsymbol y} \newcommand{\z}{\boldsymbol z} \newcommand{\u}{\boldsymbol u} \newcommand{\v}{\boldsymbol v} \newcommand{\Norm}{\unlhd} \newcommand{\Span}{\text{span}} \newcommand{\C}{\mathcal C} \newcommand{\F}{\mathcal F} $$

Relearning linear algebra after studying abstract algebra~

This post covers content from Chapter 1 of Linear Algebra by Stephen H. Friedberg et al., together with some of my own understanding from group and ring theory.

You may want to check out my notes on groups and rings first.

1. Vector Space

Definition. Vector Space / Linear Space

A vector space or linear space $V$ over a field $F$ is an $F$-module whose scalar-multiplication function satisfies

$$1\x=\x $$

where $1$ is the multiplicative identity of the field $F$. To be specific, the following conditions must be satisfied to form a vector space.

(VS1) $\forall \x,\y\in V. \x+\y=\y+\x$

(VS2) $\forall \x,\y,\z\in V. \x+(\y+\z)=(\x+\y)+\z$

(VS3) $\exists \0\in V. \forall \x\in V. \x+\0=\x$

(VS4) $\forall \x\in V. \exists \y\in V. \x+\y=\0$

(VS5) $\forall \x\in V. 1\x=\x$, where $1$ is the multiplicative identity of $F$

(VS6) $\forall a,b\in F. \forall \x\in V. (ab)\x=a(b\x)$

(VS7) $\forall a,b\in F. \forall \x\in V. (a+b)\x=a\x+b\x$

(VS8) $\forall a\in F. \forall \x,\y\in V. a(\x+\y)=a\x+a\y$

(VS1) ~ (VS4) make $(V,+)$ an abelian group; (VS6) ~ (VS8) make $V$ an $F$-module, and (VS5) makes this module unital. Since $F$ is a field, such a unital $F$-module is exactly a vector space.

With $V$ and $F$ defined, elements of $V$ are called vectors and elements of $F$ are called scalars.

Definition. Column vector, matrix

Observe that $F^n$ is a vector space over the field $F$, where each element is regarded as a function from $\{1,\ldots,n\}$ to $F$, with addition and scalar multiplication defined as follows

$$\u+\v:=\alpha\mapsto\u(\alpha)+\v(\alpha)\\ a\u:=\alpha\mapsto a\u(\alpha) $$

In this way, every element (vector) of $F^n$ may be written as a column vector.
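As a concrete sanity check (a minimal sketch with illustrative names; `vec_add` and `scalar_mul` are not from the text), here are the pointwise operations for $F=\mathbb R$:

```python
# Minimal sketch of F^n for F = R: a vector is a tuple, i.e. a function
# {1, ..., n} -> R, and both operations are defined pointwise.

def vec_add(u, v):
    # (u + v)(k) := u(k) + v(k)
    return tuple(ui + vi for ui, vi in zip(u, v))

def scalar_mul(a, u):
    # (a u)(k) := a * u(k)
    return tuple(a * ui for ui in u)

u, v = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
assert vec_add(u, v) == (5.0, 7.0, 9.0)
assert scalar_mul(2.0, u) == (2.0, 4.0, 6.0)
```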

Theorem. Cancellation rule for vector addition

If $V$ is a vector space, then we have

$$\forall \x,\y,\z\in V. \x+\z=\y+\z \implies \x=\y $$

This follows because $(V,+)$ is a group.
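Explicitly, adding the inverse $-\z$ guaranteed by (VS4) to both sides of $\x+\z=\y+\z$:

$$\x=\x+\0=\x+(\z+(-\z))=(\x+\z)+(-\z)=(\y+\z)+(-\z)=\y+(\z+(-\z))=\y $$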

Definition. Subspace

A subset $W$ of a vector space $V$ over a field $F$ is called a subspace of $V$ if $W$ is itself a vector space over $F$ under the operations of addition and scalar multiplication defined on $V$.

Theorem.

Let $V$ be a vector space, $W$ be a subset of $V$. $W$ is a subspace of $V$ iff the following conditions are satisfied

  • $\0\in W$
  • $W$ is closed under addition
  • $W$ is closed under scalar multiplication

This is because the remaining axioms (such as distributivity) are inherited from $V$, so there is no need to verify them.
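As an illustration (a randomized spot-check, not a proof; the set $W$ and the helper names are my own), one can test the three conditions for the plane $W=\{(x,y,z)\in\mathbb R^3 \mid x+y+z=0\}$:

```python
import random

# Randomized spot-check (not a proof) of the three subspace conditions for
# W = {(x, y, z) in R^3 : x + y + z = 0}.

def in_W(v, tol=1e-9):
    return abs(sum(v)) < tol

def random_W_vector():
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    return (x, y, -x - y)  # third coordinate forces x + y + z = 0

assert in_W((0.0, 0.0, 0.0))  # the zero vector lies in W
for _ in range(1000):
    u, v, a = random_W_vector(), random_W_vector(), random.uniform(-5, 5)
    assert in_W(tuple(ui + vi for ui, vi in zip(u, v)))  # closed under addition
    assert in_W(tuple(a * ui for ui in u))               # closed under scalar mult.
```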

Theorem. Any intersection of subspaces of a linear space $V$ is still a subspace of $V$.

Pf. This can easily be shown using the preceding theorem.

Definition. Sum

Let $S_1,S_2\subset V$ be two non-empty subsets of a vector space $V$. Then the sum of $S_1$ and $S_2$ (denoted by $S_1+S_2$) is defined to be

$$S_1+S_2:=\{\x+\y\mid \x\in S_1,\y\in S_2\} $$

Definition. Direct sum

A vector space $V$ is called the direct sum of two subsets $W_1$ and $W_2$ (denoted by $V=W_1\oplus W_2$) if

  • $W_1$ and $W_2$ are two subspaces of $V$
  • $W_1\cap W_2=\{\0\}$
  • $V=W_1+W_2$

Theorem. Let $W_1$ and $W_2$ be two subspaces of $V$. Then $V=W_1\oplus W_2$ iff every vector $\v$ in $V$ can be uniquely written as

$$\v=\x_1+\x_2 $$

where $\x_i\in W_i$

Pf. This can be shown using the definition of direct sum.
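A classical illustration (my own example, not from the text): the space of $n\times n$ real matrices is the direct sum of the symmetric and the skew-symmetric matrices, with the unique decomposition $M=\frac{M+M^\top}{2}+\frac{M-M^\top}{2}$:

```python
import numpy as np

# Unique decomposition for M_n(R) = Sym ⊕ Skew:
# M = (M + M^T)/2 + (M - M^T)/2, and Sym ∩ Skew = {0}.

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))

S = (M + M.T) / 2  # symmetric part,      S^T =  S
K = (M - M.T) / 2  # skew-symmetric part, K^T = -K

assert np.allclose(S, S.T) and np.allclose(K, -K.T)
assert np.allclose(S + K, M)  # existence: V = W1 + W2
# uniqueness: a matrix that is both symmetric and skew-symmetric must be 0
```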

Definition. Coset

$W$ is a subspace of vector space $V$ over a field $F$. For any $\v\in V$, the set $\{\v\}+W$ is called the coset of $W$ containing $\v$. It is customary to denote this coset by $\v+W$ rather than $\{\v\}+W$.

Recall the concept of a left coset from abstract algebra: since $(V,+)$ is a group, $\v+W$ is, algebraically, a left coset of the subgroup $W$ in $V$.

Definition. Quotient space

Let $W$ be a subspace of a vector space $V$ over a field $F$. Define the collection $S$ to be all the left cosets of $W$, i.e.

$$S:=\{\v+W\mid \v\in V\} $$

Define addition and scalar multiplication as follows, $\forall a\in F$, $\forall \v_1,\v_2\in V$

$$(\v_1+W)+(\v_2+W):=(\v_1+\v_2)+W\\ a(\v+W):=a\v+W $$

It can be shown that these two functions are well-defined, i.e. the results do not depend on the choice of coset representatives. We then call $S$ the quotient space of $V$ modulo $W$ over $F$, denoted by

$$S=V/W $$

Any subgroup of $V$ is a normal subgroup because $V$ is abelian, so $W\Norm V$. We have learnt that, algebraically, $S=V/W$ is then exactly the quotient group of $V$ by $W$ under the addition defined above.
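For a concrete picture (my own example): take $V=\mathbb R^2$ and $W=\Span(\{(1,0)\})$, the $x$-axis. Each coset $\v+W$ is a horizontal line, determined entirely by the second coordinate of $\v$, so $V/W$ behaves exactly like $\mathbb R$.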

Definition. Linear combination

Let $V$ be a vector space over $F$ and $S$ be a nonempty subset of $V$. A vector $\v\in V$ is called a linear combination of vectors in $S$ if there exist a finite number $n$ of vectors $\{\u_k\}_{k=1}^n$ in $S$ and scalars $\{a_k\}_{k=1}^n$ in $F$ such that

$$\v=\sum_{k=1}^n a_k\u_k $$

In this case we also say that $\v$ is a linear combination of $\u_1,\u_2,\ldots,\u_n$ and call $a_1,a_2,\ldots,a_n$ the coefficients of the linear combination.

Then by definition, $\0$ is a linear combination of any nonempty subset of $V$ (take all coefficients to be $0$).

Definition. Span

Let $S$ be a nonempty subset of a vector space $V$. The span of $S$, denoted by $\Span(S)$, is the set consisting of all linear combinations of vectors in $S$. For convenience, we define $\Span(\emptyset)=\{\0\}$.

Theorem. Span is the smallest subspace

The span of any subset $S$ of a vector space $V$ is a subspace of $V$. Moreover, any subspace of $V$ containing $S$ as a subset must contain $\Span(S)$.

This is easily shown using the “three requirements” of a subspace.
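Computationally (a sketch; the matrices and names here are my own illustration), membership $\v\in\Span(S)$ for a finite $S\subset\mathbb R^n$ reduces to solvability of a linear system:

```python
import numpy as np

# Sketch: v in span(S) for finite S in R^n iff the system A a = v is solvable,
# where the columns of A are the vectors of S.

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # S = {(1,0,1), (0,1,1)} as columns
v = np.array([2.0, 3.0, 5.0])

a, *_ = np.linalg.lstsq(A, v, rcond=None)  # least-squares coefficients
print(np.allclose(A @ a, v), a)            # True [2. 3.]: v = 2*u1 + 3*u2
```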

Definition. Generate

A subset $S$ of a vector space $V$ generates (or spans) $V$ if $\Span(S)=V$. In this case, we also say the vectors of $S$ span (or generate) $V$.

Property. If $S_1,S_2\subset V$,

  1. $\Span(S_1\cup S_2)=\Span(S_1)+\Span(S_2)$
  2. $\Span(S_1\cap S_2)\subset \Span(S_1)\cap \Span(S_2)$ (the inclusion may be proper; see the example below)
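The inclusion in 2. can be proper: take $S_1=\{(1,0),(0,1)\}$ and $S_2=\{(1,1)\}$ in $\mathbb R^2$. Then $S_1\cap S_2=\emptyset$, so $\Span(S_1\cap S_2)=\{\0\}$, while $\Span(S_1)\cap\Span(S_2)=\Span(S_2)\neq\{\0\}$.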

2. Linear Independence

Definition. Linear dependence / independence

A subset $S$ of a vector space $V$ is called linearly dependent if there exist a finite number $n$ of distinct vectors $\{\v_k\}_{k=1}^n$ in $S$ and scalars $\{a_k\}_{k=1}^n$, not all zero, such that

$$\sum_{k=1}^n a_k\v_k=\0 $$

In this case we also say that the vectors of $S$ are linearly dependent. Otherwise we say $S$ is linearly independent.
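For finite sets in $\mathbb R^n$ there is a handy computational test (a sketch; the matrices and names are my own): $S$ is linearly independent iff the matrix whose columns are the vectors of $S$ has rank $|S|$:

```python
import numpy as np

# Sketch: a finite S in R^n is linearly independent iff the matrix whose
# columns are the vectors of S has rank |S|.

def is_independent(A):
    # columns of A are the vectors of S
    return np.linalg.matrix_rank(A) == A.shape[1]

dep   = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # 2nd col = 2 * 1st
indep = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])

print(is_independent(dep))    # False
print(is_independent(indep))  # True
```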

Theorem.

Let $S_1\subset S_2\subset V$ where $V$ is a linear space. If $S_1$ is linearly dependent, then $S_2$ is linearly dependent.

This is obvious.

Corollary.

Let $S_1\subset S_2\subset V$ where $V$ is a linear space. If $S_2$ is linearly independent, then $S_1$ is linearly independent.

Contrapositive of the former theorem.

Theorem.

Let $S$ be a linearly independent subset of a vector space $V$, and let $\v\in V\setminus S$. Then $S\cup \{\v\}$ is linearly dependent iff $\v\in\Span(S)$.
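Sketch: if $S\cup\{\v\}$ is linearly dependent, there is a relation $a\v+\sum_{k=1}^n a_k\u_k=\0$ with $\u_k\in S$ and not all coefficients zero; independence of $S$ forces $a\neq 0$, so $\v=-a^{-1}\sum_{k=1}^n a_k\u_k\in\Span(S)$. The converse direction is immediate.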

Definition. Maximal

Let $\F$ be a family of sets. A member $M$ of $\F$ is called maximal (w.r.t. set inclusion) if $M$ is contained in no member of $\F$ other than $M$ itself.

Definition. Chain

A collection $\C$ of sets is called a chain (or nest, or tower) if for each pair of sets $A,B$ in $\C$, either $A\subset B$ or $B\subset A$.

That is, each pair of sets in $\C$ is comparable.

Theorem. Maximal principle

Let $\F$ be a family of sets. If for each chain $\C\subset \F$ there exists a member of $\F$ that contains each member of $\C$, then $\F$ contains a maximal member.

The maximal principle is logically equivalent to the axiom of choice.

Definition. maximal linearly independent set

Let $S$ be a subset of a vector space $V$. A maximal linearly independent subset of $S$ is a subset $B\subset S$ satisfying

  • $B$ is linearly independent
  • The only linearly independent subset of $S$ that contains $B$ as a subset is $B$ itself.

Theorem. a maximal linearly independent subset is a basis

Let $V$ be a vector space and $S\subset V$ with $\Span(S)=V$. If $\beta$ is a maximal linearly independent subset of $S$, then $\beta$ is a basis for $V$.

Corollary. Every vector space has a basis

Axiom of choice secures this! Apply the maximal principle to the family of all linearly independent subsets of $V$: the union of a chain of linearly independent sets is linearly independent, so a maximal member exists, and by the previous theorem (with $S=V$) it is a basis.

3. Basis and Dimension

Definition. Basis

A basis $\beta$ for a vector space $V$ is a linearly independent subset of $V$ such that $\Span(\beta)=V$. If $\beta$ is a basis for $V$, we may also say that the vectors of $\beta$ form a basis for $V$.

Theorem. Unique decomposition theorem

Let $V$ be a vector space and $\beta=\{\v_k\}_{k=1}^n$ be a subset of $V$. Then $\beta$ is a basis of $V$ iff every $\v\in V$ can be uniquely expressed as a linear combination of vectors of $\beta$, i.e. there exist unique scalars $\{a_k\}_{k=1}^n$ s.t.

$$\v=\sum_{k=1}^n a_k\v_k $$
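Sketch: spanning gives the existence of such an expression; subtracting two expressions of the same $\v$ yields $\sum_{k=1}^n(a_k-b_k)\v_k=\0$, and linear independence forces $a_k=b_k$. Conversely, unique expressibility of $\0$ is precisely linear independence of $\beta$, and expressibility of every $\v$ is precisely spanning.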

Theorem. $|S|<\infty\implies \dim(\Span(S))<\infty$

If a vector space $V$ is generated by a finite set $S$, then some subset of $S$ is a basis for $V$, hence $V$ has a finite basis.
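The proof idea is a greedy selection, which can be sketched numerically (an illustration under my own naming, not the textbook's algorithm): scan $S$ and keep each vector that enlarges the rank.

```python
import numpy as np

# Greedy sketch of the proof idea: keep each vector of S that increases the
# rank; the kept vectors are independent and still span span(S).

def extract_basis(S):
    basis = []
    for v in S:
        if np.linalg.matrix_rank(np.column_stack(basis + [v])) == len(basis) + 1:
            basis.append(v)  # v is not in span(basis), so keep it
    return basis

S = [np.array([1.0, 0.0, 1.0]),
     np.array([2.0, 0.0, 2.0]),  # dependent: 2 * the first vector
     np.array([0.0, 1.0, 0.0])]
print(extract_basis(S))          # keeps the 1st and 3rd vectors
```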

Theorem. Replacement theorem

Let $V$ be a vector space that is generated by a set $G$ containing exactly $n$ vectors, and let $L$ be a linearly independent subset of $V$ containing exactly $m$ vectors. Then $m\leq n$, and there exists a subset $H$ of $G$ containing exactly $n-m$ vectors such that

$$\Span(L\cup H)=V $$

Corollary. Let $V$ be a vector space having a finite basis. Then every basis of $V$ contains the same number of vectors.

This can be proved by contradiction, using the replacement theorem.

Definition. Dimension

A vector space is called finite-dimensional if it has a basis consisting of a finite number of vectors. The unique number of vectors in each basis for $V$ is called the dimension of $V$ and is denoted by $\dim(V)$. A vector space that is not finite-dimensional is called infinite-dimensional.

Theorem. dimension of subspaces

Let $W$ be a subspace of a finite-dimensional vector space $V$. Then $W$ is finite-dimensional and

$$\dim(W)\leq \dim(V) $$

Moreover, $\dim(W)=\dim(V)\implies W=V$

Corollary. basis extension

If $W$ is a subspace of a finite-dimensional vector space $V$, then any basis for $W$ can be extended to a basis for $V$.

Property. If $V$ is a finite-dimensional vector space and $W_1$ and $W_2$ are two subspaces of $V$, then (as spot-checked below)

  • $\dim(W_1+W_2)=\dim(W_1)+\dim(W_2)-\dim(W_1\cap W_2)$
  • $\dim(V)=\dim(W_1)+\dim(W_2)$ if $V=W_1\oplus W_2$
  • $\dim(V)=\dim(W_1)+\dim(V/W_1)$
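A numeric spot-check of the first identity (my own example; the subspaces are chosen so the intersection is known by construction):

```python
import numpy as np

# Spot-check of dim(W1 + W2) = dim(W1) + dim(W2) - dim(W1 ∩ W2) in R^3,
# with W1 = span{e1, e2}, W2 = span{e2, e3}, so W1 ∩ W2 = span{e2}.

e1, e2, e3 = np.eye(3)
W1 = np.column_stack([e1, e2])
W2 = np.column_stack([e2, e3])

dim_W1  = np.linalg.matrix_rank(W1)                    # 2
dim_W2  = np.linalg.matrix_rank(W2)                    # 2
dim_sum = np.linalg.matrix_rank(np.hstack([W1, W2]))   # dim(W1 + W2) = 3

assert dim_sum == dim_W1 + dim_W2 - 1                  # 3 == 2 + 2 - 1
```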