# Linear Transformation Part II - Inverse Linear Transformation and Isomorphism

#### Definition:

A function $T$ from $V$ into $W$ is called invertible if there exists a function $S$ from $W$ into $V$ such that $TS$ is the identity on $W$ and $ST$ is the identity on $V$,
i.e. $TS=I_W$ and $ST=I_V$.

#### Definition:

Equivalently, $T$ is invertible if and only if
(i) $T$ is one-one, i.e. $T(\alpha)=T(\beta) \implies \alpha=\beta$
(ii) $T$ is onto, i.e. for any $\beta \in W$ there exists $\alpha \in V$ such that $T(\alpha)=\beta$

#### Notation:

If $T$ is invertible, then its inverse $S$ is unique, and we write $S=T^{-1}$.

### Theorem:

Let $V$ and $W$ be vector spaces over the field $F$ and let $T$ be an invertible linear transformation from $V$ into $W$. Then $T^{-1}$ is a linear transformation from $W$ into $V$.

### Proof:

Claim that $T^{-1}:W \rightarrow V$ is a linear transformation.
Let $\beta_1, \beta_2 \in W$ be any vectors and $a \in F$ be any scalar.
To show that $T^{-1}(a\beta_1+\beta_2)=aT^{-1}(\beta_1)+T^{-1}(\beta_2)$
Let $\alpha_i=T^{-1}(\beta_i)$,
i.e. $\alpha_i$ is the unique vector such that $T(\alpha_i)=\beta_i$,        $(i=1, 2)$
Since, T is linear,
$\therefore T(a\alpha_1+\alpha_2)=aT(\alpha_1)+T(\alpha_2)$
$=a\beta_1+\beta_2$
Since $T$ is one-one, $a\alpha_1+\alpha_2$ is the unique vector with $T(a\alpha_1+\alpha_2)=a\beta_1+\beta_2$,
$\therefore a\alpha_1+\alpha_2=T^{-1}(a\beta_1+\beta_2)$
$\therefore T^{-1}(a\beta_1+\beta_2)=aT^{-1}(\beta_1)+T^{-1}(\beta_2)$
Hence, $T^{-1}$ is a linear transformation.
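The argument above can be illustrated numerically. The sketch below uses a hypothetical invertible map $T(x, y)=(2x+y,\ x+y)$ on $\mathbb{R}^2$ (not from the notes; chosen only because its inverse $(u, v)\mapsto(u-v,\ 2v-u)$ is easy to write down) and checks that the inverse satisfies the linearity identity from the proof:

```python
# Sanity check (not part of the proof): for the invertible map
# T(x, y) = (2x + y, x + y) on R^2, the inverse is S(u, v) = (u - v, 2v - u).
# We verify S(a*b1 + b2) = a*S(b1) + S(b2), i.e. that the inverse is linear.

def T(v):
    x, y = v
    return (2 * x + y, x + y)

def T_inv(w):
    u, v = w
    return (u - v, 2 * v - u)

def scale_add(a, v, w):
    # Componentwise a*v + w.
    return (a * v[0] + w[0], a * v[1] + w[1])

a = 3.0
b1, b2 = (1.0, 2.0), (-4.0, 5.0)

lhs = T_inv(scale_add(a, b1, b2))          # T^{-1}(a*b1 + b2)
rhs = scale_add(a, T_inv(b1), T_inv(b2))   # a*T^{-1}(b1) + T^{-1}(b2)
assert lhs == rhs
```

The same check passes for any choice of $a$, $\beta_1$, $\beta_2$, as the theorem guarantees.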

#### Definition:

A linear transformation $T$ from $V$ into $W$ is said to be non-singular if $T(\alpha)=0 \implies \alpha=0$, $\forall \alpha \in V$,
i.e. the null space of $T$ is $\{0\}$. It follows that $T$ is one-one if and only if $T$ is non-singular.

### Theorem:

Let $T$ be a linear transformation from vector space $V$ into vector space $W$ over the field $F$. $T$ is non-singular iff $T$ carries each linearly independent subset of $V$ onto a linearly independent subset of $W$.

### Proof:

Suppose that $T$ is non-singular.
Let $S=\{\alpha_1, \alpha_2, ... , \alpha_n\}$ be a linearly independent subset of a vector space $V$.
Claim that $S'=\{T(\alpha_1), T(\alpha_2), ... , T(\alpha_n)\}$ is linearly independent.
Suppose that,
$c_1T(\alpha_1)+c_2T(\alpha_2)+ ... +c_nT(\alpha_n)=0$,           $c_1, c_2, ... , c_n \in F$
$\implies T(c_1\alpha_1+c_2\alpha_2+ ... +c_n\alpha_n)=0$
$\implies c_1\alpha_1+c_2\alpha_2+ ... +c_n\alpha_n=0$   ($\because T$ is  non-singular)
$\implies c_1=0, c_2=0, ... ,c_n=0$          ($\because S$ is linearly independent)
Thus, $c_1T(\alpha_1)+c_2T(\alpha_2)+ ... +c_nT(\alpha_n)=0$, $\implies c_1=0, c_2=0, ... ,c_n=0$
Therefore, $S'=\{T(\alpha_1), T(\alpha_2), ... , T(\alpha_n)\}$ is linearly independent.
Hence, $T$ carries each linearly independent subset of $V$ onto a linearly independent subset of $W$.
Conversely,
Assume that $T$ carries each linearly independent subset of $V$ onto a linearly independent subset of $W$.
Claim that $T$ is non-singular.
Let $\alpha \in V$ with $\alpha \ne 0$.
Then $\{\alpha\}\subset V$ is linearly independent.
$\implies \{T(\alpha)\}\subset W$ is linearly independent.
$\implies T(\alpha)\ne 0$
Thus, $\alpha \ne 0 \implies T(\alpha)\ne 0$
Hence, $T$ is non-singular.
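A small numeric instance of this theorem, as an illustrative sketch: in $\mathbb{R}^2$, two vectors are linearly independent exactly when the $2\times 2$ determinant they form is nonzero, so we can check that a non-singular map sends an independent pair to an independent pair. The map used below is $T(x_1, x_2)=(x_1+x_2,\ x_1)$, which is non-singular:

```python
# Illustration (not a proof): a non-singular linear map on R^2 carries a
# linearly independent pair to a linearly independent pair. Independence of
# two vectors in R^2 is tested via the 2x2 determinant.

def T(v):
    x1, x2 = v
    return (x1 + x2, x1)

def det2(u, v):
    # Determinant of the 2x2 matrix with rows u, v.
    return u[0] * v[1] - u[1] * v[0]

a1, a2 = (1.0, 0.0), (0.0, 1.0)   # linearly independent (det = 1)
assert det2(a1, a2) != 0
assert det2(T(a1), T(a2)) != 0    # images T(a1) = (1, 1), T(a2) = (1, 0) remain independent
```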

#### Example:

Let $F$ be the field of real numbers and let $T$ be the linear transformation on $F^2$ given by $T(x_1, x_2)=(x_1+x_2, x_1)$. Verify that $T$ is non-singular and find $T^{-1}$.

#### Solution:

$T(x_1, x_2)=0$
$\implies (x_1+x_2, x_1)=0$
$\implies x_1+x_2=0, x_1=0$
$\implies x_1=0, x_2=0$
$\implies (x_1, x_2)=0$
$\therefore T(x_1, x_2)=0 \implies (x_1, x_2)=0$
Hence, $T$ is non-singular.
Now,
$T(x_1, x_2)=(x_1+x_2, x_1)=(s_1, s_2)$
$\therefore x_1+x_2=s_1, x_1=s_2$
$\therefore x_1=s_2, x_2=s_1-s_2$
Thus every $(s_1, s_2) \in F^2$ has a preimage, so $T$ is also onto and $T^{-1}$ exists, with
$T^{-1}(s_1, s_2)=(x_1, x_2)$
$T^{-1}(s_1, s_2)=(s_2, s_1-s_2)$
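The formula for $T^{-1}$ can be checked directly: composing it with $T$ in either order should give the identity on $F^2$, exactly as in the first definition above.

```python
# Check of the inverse computed in the example: T(x1, x2) = (x1 + x2, x1)
# and T_inv(s1, s2) = (s2, s1 - s2) compose to the identity in both orders.

def T(v):
    x1, x2 = v
    return (x1 + x2, x1)

def T_inv(s):
    s1, s2 = s
    return (s2, s1 - s2)

for v in [(0.0, 0.0), (1.0, 2.0), (-3.0, 7.0)]:
    assert T_inv(T(v)) == v   # ST = identity on V
    assert T(T_inv(v)) == v   # TS = identity on W
```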

#### Definition:

Let $V$ and $W$ be vector spaces over the field $F$. A linear transformation $T$ from $V$ into $W$ is said to be an isomorphism if
(i) $T$ is one-one
(ii) $T$ is onto
If such an isomorphism $T$ exists, we say that $V$ is isomorphic to $W$.

### Theorem:

Let $V$ be an n-dimensional vector space over the field $F$. Then there is an isomorphism of $V$ onto $F^n$, where $F^n=\{(x_1, x_2, ... , x_n) \mid x_i\in F\}$ is a vector space over $F$.

### Proof:

Let $B=\{\alpha_1, \alpha_2, ..., \alpha_n\}$ be an ordered basis for $V$.
Then any $\alpha \in V$ can be expressed uniquely as $\alpha=a_1\alpha_1+a_2\alpha_2+ ... +a_n\alpha_n$, where $a_i \in F \ (1\le i \le n)$.
Define $\theta: V \rightarrow F^n$ by $\theta(\alpha)=(a_1, a_2, ... , a_n)$
Claim: $\theta$ is an isomorphism.
Let $\alpha, \beta \in V$ and $a \in F$
$\therefore \alpha=a_1\alpha_1+a_2\alpha_2+ ... +a_n\alpha_n$, $\beta=b_1\alpha_1+b_2\alpha_2+ ... +b_n\alpha_n, \ a_i, b_i \in F$
$\therefore \theta(\alpha)=(a_1, a_2, ... , a_n)$ and $\theta(\beta)=(b_1, b_2, ... , b_n)$
(i) Consider,
$\theta(a\alpha+\beta)$
$=\theta[a(a_1\alpha_1+a_2\alpha_2+ ... +a_n\alpha_n)+(b_1\alpha_1+b_2\alpha_2+ ... +b_n\alpha_n)]$
$=\theta[(aa_1+b_1)\alpha_1+(aa_2+b_2)\alpha_2+ ... +(aa_n+b_n)\alpha_n]$
$=(aa_1+b_1, aa_2+b_2, ... ,aa_n+b_n)$
$=a(a_1, a_2, ... , a_n)+(b_1, b_2, ... , b_n)$
$=a\theta(\alpha)+\theta(\beta)$
$\therefore \theta$ is a linear transformation.
(ii) Suppose that, $\theta(\alpha)=\theta(\beta)$
$\therefore (a_1, a_2, ... , a_n)=(b_1, b_2, ... , b_n)$
$\therefore a_1=b_1, a_2=b_2, ... , a_n=b_n$
$\therefore \alpha=\beta$.
Hence, $\theta$ is one-one.
(iii) Let $\beta \in F^n$.
$\therefore \beta=(b_1, b_2, ... , b_n)$
Take $b_1\alpha_1+b_2\alpha_2+ ... +b_n\alpha_n=\alpha \in V$
$\therefore \theta(\alpha)=\theta(b_1\alpha_1+b_2\alpha_2+ ... +b_n\alpha_n)$
$=(b_1, b_2, ... , b_n)$
$=\beta$
$\therefore \theta$ is onto.
Hence, $\theta$ is an isomorphism.
i.e. $V\cong F^n$.
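A concrete instance of the coordinate map $\theta$, as an illustrative sketch: take $V=\mathbb{R}^2$ with the (non-standard) ordered basis $B=\{(1,1), (1,-1)\}$, chosen here only for illustration. Solving $\alpha=a_1(1,1)+a_2(1,-1)$ gives the coordinates $a_1=(x+y)/2$, $a_2=(x-y)/2$, and we can verify that $\theta$ is linear:

```python
# Concrete instance of the theorem: V = R^2 with ordered basis
# B = {(1, 1), (1, -1)}. theta(alpha) = (a1, a2) are the coordinates of
# alpha relative to B: a1 = (x + y)/2, a2 = (x - y)/2.

def theta(alpha):
    x, y = alpha
    return ((x + y) / 2, (x - y) / 2)

def scale_add(a, u, v):
    # Componentwise a*u + v.
    return (a * u[0] + v[0], a * u[1] + v[1])

# Linearity: theta(a*alpha + beta) == a*theta(alpha) + theta(beta)
a = 2.0
alpha, beta = (3.0, 1.0), (0.0, 4.0)
assert theta(scale_add(a, alpha, beta)) == scale_add(a, theta(alpha), theta(beta))

# Each basis vector maps to the corresponding standard coordinate vector.
assert theta((1.0, 1.0)) == (1.0, 0.0)
assert theta((1.0, -1.0)) == (0.0, 1.0)
```

With any other choice of ordered basis the coordinate formulas change, but the same linearity check goes through, as the theorem asserts.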