# Subsection 6.1 Some bases are better than others

Recall that, towards the end of Lecture 4, we observed that if $A$ is an $n \times n$ matrix with $n$ linearly independent rows, then there exists a matrix $A^{-1}$ such that

\begin{equation*} A A^{-1} = A^{-1}A = I. \end{equation*}

We say that $A$ is non-singular or invertible. We then noted that multiplying by $A^{-1}$ (or, indeed, by $A$) can be interpreted as a change of basis operation.
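As a quick numerical sketch of both points (the matrix and values below are illustrative, not from the notes): a non-singular $A$ satisfies $A A^{-1} = A^{-1} A = I$, and $A^{-1} v$ gives the coordinates of $v$ in the basis formed by the columns of $A$.

```python
import numpy as np

# A non-singular 2x2 matrix (its rows are linearly independent).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

# A A^{-1} = A^{-1} A = I.
assert np.allclose(A @ A_inv, np.eye(2))
assert np.allclose(A_inv @ A, np.eye(2))

# Change-of-basis reading: A^{-1} v gives the coordinates of v
# in the basis formed by the columns of A.
v = np.array([3.0, 2.0])
coords = A_inv @ v
assert np.allclose(A @ coords, v)  # reconstruct v from its coordinates
```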

The transpose of the matrix $A \in \mathbb{R}^{m \times n}$ is $A^T \in \mathbb{R}^{n \times m}$, where

\begin{equation*} (A)_{i,j} = (A^T)_{j,i}\text{.} \end{equation*}

However, if $A \in \mathbb{C}^{m \times n}$ we need the related concept of the Hermitian transpose, $A^\star\text{,}$ for which

\begin{equation*} (A^\star)_{i,j} = \overline{(A)_{j,i}}\text{.} \end{equation*}

Here $\bar{z}$ denotes the complex conjugate of $z\text{.}$ It is easy to show that

\begin{equation*} (AB)^\star = B^\star A^\star. \end{equation*}
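A small numerical illustration (the matrices below are arbitrary examples, not from the notes): in NumPy the Hermitian transpose of `A` is `A.conj().T`, and the identity $(AB)^\star = B^\star A^\star$ can be checked directly.

```python
import numpy as np

A = np.array([[1 + 2j, 3j],
              [4 + 0j, 5 - 1j]])
B = np.array([[2 + 0j, 1 - 1j],
              [1j,     3 + 0j]])

# Hermitian transpose: conjugate, then transpose.
A_star = A.conj().T
assert np.allclose(A_star[0, 1], np.conj(A[1, 0]))  # (A*)_{i,j} = conj(A_{j,i})

# (AB)* = B* A*.
assert np.allclose((A @ B).conj().T, B.conj().T @ A.conj().T)
```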
##### Exercise 6.1

Show that if $A \in \mathbb{C}^{n \times n}$ is non-singular, then

\begin{equation*} (A^{-1})^\star = (A^\star)^{-1}. \end{equation*}
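A numerical sanity check of the claim (this is no substitute for the proof the exercise asks for; the random matrix is an arbitrary test case):

```python
import numpy as np

# A random complex matrix is non-singular with probability 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

lhs = np.linalg.inv(A).conj().T   # (A^{-1})*
rhs = np.linalg.inv(A.conj().T)   # (A*)^{-1}
assert np.allclose(lhs, rhs)
```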

# Subsection 6.3 Vector spaces and inner product spaces

##### Exercise 6.2

Find a definition of a vector space.

An inner product space (of vectors over a field $\mathbb{F}$) is a vector space, $V\text{,}$ equipped with a function $(\cdot, \cdot) : V \times V \to \mathbb{F}$ such that, if $x,y,z \in V$ and $a \in \mathbb{F}\text{,}$ then

\begin{equation*} \begin{split} (x,y) & =\overline{(y,x)}\\ (ax,y) & = a(x,y)\\ (x+y,z) & = (x,z)+(y,z)\\ (x,x) & \geq 0\\ (x,x) & = 0 \text{ iff } x=0. \end{split} \end{equation*}

There are many possible inner products, but the most important, for vectors with entries in $\mathbb{C}\text{,}$ is the usual dot product:

\begin{equation*} (u,v) := u^\star v = \sum_{i=1}^n \bar{u}_i v_i. \end{equation*}
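A quick sketch in NumPy (the vectors are arbitrary examples): `np.vdot` conjugates its first argument, matching the definition $(u,v) = u^\star v$ above, and $(x,x)$ comes out real and non-negative as the axioms require.

```python
import numpy as np

u = np.array([1 + 1j, 2 - 1j])
v = np.array([3j, 1 + 2j])

# np.vdot conjugates its first argument: (u, v) = sum conj(u_i) v_i.
manual = np.sum(np.conj(u) * v)
assert np.allclose(np.vdot(u, v), manual)

# (x, x) is real and non-negative, consistent with the axioms.
assert np.isclose(np.vdot(u, u).imag, 0.0)
assert np.vdot(u, u).real >= 0
```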

For real-valued vectors it can be understood in terms of the “angle” between the vectors in $\mathbb{R}^n\text{.}$ That is, if $\alpha$ is the angle between two vectors, $u$ and $v\text{,}$ then

\begin{equation*} \cos(\alpha) = \frac{(u,v)}{\sqrt{(u,u)} \sqrt{(v,v)}}. \end{equation*}

The most important/interesting situation is when $(u,v)=0\text{,}$ in which case we say that $u$ and $v$ are orthogonal.

# Subsection 6.4 Exercises

##### Exercise 6.3

A matrix $S \in \mathbb{C}^{n \times n}$ is “skew-Hermitian” if $S^\star = - S\text{.}$ Show that the eigenvalues of $S$ are purely imaginary.
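A numerical illustration of the exercise's claim (again, not a proof; the construction $S = B - B^\star$ is one standard way to build a skew-Hermitian matrix from an arbitrary $B$):

```python
import numpy as np

# Build a skew-Hermitian matrix: S = B - B* satisfies S* = -S.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S = B - B.conj().T
assert np.allclose(S.conj().T, -S)  # check skew-Hermitian

# Its eigenvalues should be purely imaginary (real parts ~ 0).
eigvals = np.linalg.eigvals(S)
assert np.allclose(eigvals.real, 0.0)
```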