**Definition 2.4.1** *Let* *{***x**1*,· · ·,***x***p}* *be vectors in* F*n*. *A linear combination is any expression of the form*

$$\sum_{i=1}^{p} c_i \mathbf{x}_i$$

*where the* *ci* *are scalars. The set of all linear combinations of these vectors is called* span (**x**1*,· · ·,***x***p*)*. If* *V* *⊆* F*n*, *then* *V* *is called a subspace if whenever* *α, β* *are scalars and* **u** *and* **v** *are vectors of* *V*, *it follows that* *α***u** + *β***v** *∈* *V*. *That is, it is "closed under the algebraic operations of vector addition and scalar multiplication". A linear combination of vectors is said to be trivial if all the scalars in the linear combination equal zero. A set of vectors is said to be linearly independent if the only linear combination of these vectors which equals the zero vector is the trivial linear combination. Thus* *{***x**1*,· · ·,***x***p}* *is called linearly independent if whenever*

$$\sum_{k=1}^{p} c_k \mathbf{x}_k = \mathbf{0}$$

*it follows that all the scalars* *ck* *equal zero. A set of vectors* *{***x**1*,· · ·,***x***p}* *is called linearly dependent if it is not linearly independent. Thus the set of vectors is linearly dependent if there exist scalars* *ci*, *i* = 1*,· · ·, p*, *not all zero, such that*

$$\sum_{k=1}^{p} c_k \mathbf{x}_k = \mathbf{0}.$$
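The defining condition can be tested numerically. The sketch below works over R^n with NumPy (the function name `is_linearly_independent` and the tolerance are illustrative, not from the text): a set of vectors is linearly independent exactly when the matrix having them as columns has full column rank.

```python
import numpy as np

def is_linearly_independent(vectors, tol=1e-10):
    # The only solution of sum_i c_i x_i = 0 is the trivial one
    # exactly when the matrix with the x_i as columns has full
    # column rank.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
print(is_linearly_independent([x1, x2]))           # True
print(is_linearly_independent([x1, x2, x1 + x2]))  # False: 1*x1 + 1*x2 - (x1 + x2) = 0
```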

**Proposition 2.4.2** *Let* *V* *⊆* F*n.* *Then* *V* *is a subspace if and only if it is a vector space*
*itself with respect to the same operations of scalar multiplication and vector addition.*

*2.4.* *SUBSPACES AND SPANS* 57

**Proof:** Suppose first that *V* is a subspace. All algebraic properties involving scalar multiplication and vector addition hold for *V* because they hold for F*n*. Is **0** *∈* *V*? Yes, because for any **v** *∈* *V*, 0**v** *∈* *V* and 0**v** = **0**. By assumption, for *α* a scalar and **v** *∈* *V*, *α***v** *∈* *V*. Therefore, *−***v** = (*−*1)**v** *∈* *V*. Thus *V* has the additive identity and additive inverses. By assumption, *V* is closed with respect to the two operations, so *V* is a vector space. Conversely, if *V* *⊆* F*n* is a vector space, then by definition, if *α, β* are scalars and **u**, **v** are vectors in *V*, it follows that *α***u** + *β***v** *∈* *V*, so *V* is a subspace.

Thus, from the above, subspaces of F*n* are just subsets of F*n* which are themselves vector spaces.

**Lemma 2.4.3** *A set of vectors{***x1***,· · ·* *,***x***p}* *is linearly independent if and only if none of*
*the vectors can be obtained as a linear combination of the others.*

**Proof:** Suppose first that *{***x**1*,· · ·,***x***p}* is linearly independent. If $\mathbf{x}_k = \sum_{j\neq k} c_j \mathbf{x}_j$, then

$$\mathbf{0} = 1\cdot\mathbf{x}_k + \sum_{j\neq k} (-c_j)\,\mathbf{x}_j,$$

a nontrivial linear combination, contrary to assumption. This shows that if the set is linearly independent, then none of the vectors is a linear combination of the others.

Now suppose no vector is a linear combination of the others. Is *{***x**1*,· · ·,***x***p}* linearly independent? If it is not, there exist scalars *ci*, not all zero, such that

$$\sum_{i=1}^{p} c_i \mathbf{x}_i = \mathbf{0}.$$

Say *ck* ≠ 0. Then you can solve for **x***k* as

$$\mathbf{x}_k = \sum_{j\neq k} \left(-\frac{c_j}{c_k}\right)\mathbf{x}_j,$$

contrary to assumption.
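The computation in this proof can be checked numerically. In the sketch below (illustrative data over R^3, not from the text), x3 is constructed as a combination of x1 and x2, and least squares recovers exactly the coefficients expressing it in terms of the others, as Lemma 2.4.3 predicts for a dependent set.

```python
import numpy as np

# x3 is built as 2*x1 - x2, so {x1, x2, x3} is linearly dependent
# and x3 is a linear combination of the others (Lemma 2.4.3).
x1 = np.array([1.0, 0.0, 1.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = 2 * x1 - x2

A = np.column_stack([x1, x2])               # solve A c = x3 for c
c, *_ = np.linalg.lstsq(A, x3, rcond=None)
print(np.round(c, 10))                      # coefficients [ 2. -1.]
```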

The following is called the exchange theorem.

**Theorem 2.4.4** *(Exchange Theorem) Let* *{***x**1*,· · ·,***x***r}* *be a linearly independent set of vectors such that each* **x***i* *is in* span (**y**1*,· · ·,***y***s*)*. Then* *r ≤ s.*

**Proof 1:** Suppose not. Then *r > s*. By assumption, there exist scalars *aji* such that

$$\mathbf{x}_i = \sum_{j=1}^{s} a_{ji}\,\mathbf{y}_j.$$

The matrix *A* whose *ji*th entry is *aji* has more columns than rows. Therefore, by Theorem 2.3.8 there exists a **nonzero** vector **b** *∈* F*r* such that *A***b** = **0**. Thus

$$0 = \sum_{i=1}^{r} a_{ji}b_i, \quad \text{each } j.$$

Then

$$\sum_{i=1}^{r} b_i\mathbf{x}_i = \sum_{i=1}^{r} b_i \sum_{j=1}^{s} a_{ji}\,\mathbf{y}_j = \sum_{j=1}^{s}\left(\sum_{i=1}^{r} a_{ji}b_i\right)\mathbf{y}_j = \mathbf{0},$$

a nontrivial linear combination of *{***x**1*,· · ·,***x***r}* equal to **0**, contradicting linear independence. Therefore, *r ≤ s*.

**Proof 2:** Define span *{***y**1*,· · ·,***y***s} ≡* *V*. Since **x**1 *∈* *V*, there exist scalars *c*1*,· · ·, cs* such that

$$\mathbf{x}_1 = \sum_{i=1}^{s} c_i \mathbf{y}_i. \tag{2.23}$$

Not all of these scalars can equal zero because if they did, it would follow that **x**1 = **0** and so *{***x**1*,· · ·,***x***r}* would not be linearly independent. Indeed, if **x**1 = **0**, then

$$1\cdot\mathbf{x}_1 + \sum_{i=2}^{r} 0\,\mathbf{x}_i = \mathbf{x}_1 = \mathbf{0}$$

and so there would exist a nontrivial linear combination of the vectors *{***x**1*,· · ·,***x***r}* which equals zero.

Say *ck* ≠ 0. Then solve (2.23) for **y***k* and obtain

$$\mathbf{y}_k \in \operatorname{span}\Big(\mathbf{x}_1, \overbrace{\mathbf{y}_1,\cdots,\mathbf{y}_{k-1},\mathbf{y}_{k+1},\cdots,\mathbf{y}_s}^{s-1 \text{ vectors here}}\Big).$$

Define *{***z**1*,· · ·,***z***s−*1*}* by

$$\{\mathbf{z}_1,\cdots,\mathbf{z}_{s-1}\} \equiv \{\mathbf{y}_1,\cdots,\mathbf{y}_{k-1},\mathbf{y}_{k+1},\cdots,\mathbf{y}_s\}.$$

Therefore, span *{***x**1*,***z**1*,· · ·,***z***s−*1*}* = *V* because if **v** *∈* *V*, there exist constants *c*1*,· · ·, cs* such that

$$\mathbf{v} = \sum_{i=1}^{s-1} c_i \mathbf{z}_i + c_s \mathbf{y}_k.$$

Now replace the **y***k* in the above with a linear combination of the vectors *{***x**1*,***z**1*,· · ·,***z***s−*1*}* to obtain **v** *∈* span *{***x**1*,***z**1*,· · ·,***z***s−*1*}.* The vector **y***k*, in the list *{***y**1*,· · ·,***y***s},* has now been replaced with the vector **x**1 and the resulting modified list of vectors has the same span as the original list of vectors, *{***y**1*,· · ·,***y***s}.*

Now suppose that *r > s*and that span*{***x1***,· · ·* *,***x***l,***z1***,· · ·,***z***p}* =*V* where the vectors,

**z**1*,· · ·,***z***p*are each taken from the set,*{***y**1*,· · ·,***y***s}* and*l*+*p*=*s.*This has now been done
for*l*= 1 above. Then since*r > s,*it follows that*l≤s < r*and so*l*+ 1*≤r.*Therefore,**x***l*+1
is a vector not in the list, *{***x**1*,· · ·* *,***x***l}* and since span*{***x**1*,· · ·,***x***l,***z**1*,· · ·* *,***z***p}* = *V,* there
exist scalars*ci* and*dj* such that

$$\mathbf{x}_{l+1} = \sum_{i=1}^{l} c_i \mathbf{x}_i + \sum_{j=1}^{p} d_j \mathbf{z}_j. \tag{2.24}$$

Now not all the *dj* can equal zero because if this were so, it would follow that *{***x**1*,· · ·,***x***r}* would be a linearly dependent set, since one of the vectors would equal a linear combination of the others. Therefore, (2.24) can be solved for one of the **z***j*, say **z***k*, in terms of **x***l*+1 and the other **z***j*, and just as in the above argument, that **z***k* can be replaced with **x***l*+1 to obtain

$$\operatorname{span}\Big(\mathbf{x}_1,\cdots,\mathbf{x}_l,\mathbf{x}_{l+1}, \overbrace{\mathbf{z}_1,\cdots,\mathbf{z}_{k-1},\mathbf{z}_{k+1},\cdots,\mathbf{z}_p}^{p-1 \text{ vectors here}}\Big) = V.$$
Continue this way, eventually obtaining

span*{***x**1*,· · ·* *,***x***s}*=*V.*

But then**x***r* *∈*span*{***x**1*,· · ·,***x***s}* contrary to the assumption that *{***x**1*,· · ·,***x***r}* is linearly
independent. Therefore,*r≤s*as claimed.


**Proof 3:** Suppose *r > s*. Let **z***k* denote a vector of *{***y**1*,· · ·,***y***s}.* Choose *j* as small as possible such that

span (**y**1*,· · ·,***y***s*) = span (**x**1*,· · ·,***x***m,***z**1*,· · ·,***z***j*)

where *m* + *j* = *s*. Such a *j* exists because taking *m* = 0, corresponding to no vectors of *{***x**1*,· · ·,***x***r},* and *j* = *s*, corresponding to all the **y***k*, results in the above equation holding. If *j >* 0 then *m < s < r*, so **x***m*+1 exists and
$$\mathbf{x}_{m+1} = \sum_{k=1}^{m} a_k \mathbf{x}_k + \sum_{i=1}^{j} b_i \mathbf{z}_i.$$

Not all the *bi* can equal 0, since otherwise **x***m*+1 would be a linear combination of *{***x**1*,· · ·,***x***m},* contradicting linear independence. So you can solve for one of the **z***i* in terms of **x***m*+1*,***x***m,· · ·,***x**1 and the other **z***k*. Therefore, there exists

*{***z**1*,· · ·* *,***z***j−*1*} ⊆ {***y**1*,· · ·,***y***s}*
such that

span (**y1***,· · ·,***y***s*) = span (**x1***,· · ·,***x***m*+1*,***z1***,· · ·,***z***j−*1)
contradicting the choice of*j*. Hence*j* = 0 and

span (**y1***,· · ·,***y***s*) = span (**x1***,· · ·,***x***s*)

It follows that

**x***s*+1*∈*span (**x1***,· · ·,***x***s*)

contrary to the assumption the**x***k* are linearly independent. Therefore,*r≤s*as claimed.
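The exchange theorem has a direct numerical consequence: any *r > s* vectors lying in the span of *s* vectors must be linearly dependent. A sketch with random data (the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
s = 3
Y = rng.standard_normal((5, s))        # columns y1,...,ys in R^5
C = rng.standard_normal((s, s + 1))    # random coefficients
X = Y @ C                              # s+1 vectors, each in span(y1,...,ys)

# If these s+1 columns were linearly independent, the exchange
# theorem would force s+1 <= s. So their rank is at most s.
print(np.linalg.matrix_rank(X) <= s)   # True
```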

**Definition 2.4.5** *A finite set of vectors* *{***x**1*,· · ·,***x***r}* *is a basis for* F*n* *if* span (**x**1*,· · ·,***x***r*) = F*n* *and* *{***x**1*,· · ·,***x***r}* *is linearly independent.*

**Corollary 2.4.6** *Let* *{***x**1*,· · ·,***x***r}* *and* *{***y**1*,· · ·,***y***s}* *be two bases*1 *of* F*n. Then* *r* = *s* = *n.*
**Proof:** From the exchange theorem, *r ≤ s* and *s ≤ r*, so *r* = *s*. Now note the vectors

$$\mathbf{e}_i = \overbrace{(0,\cdots,0,1,0,\cdots,0)}^{1 \text{ is in the } i\text{th slot}}$$

for *i* = 1*,*2*,· · ·, n* are a basis for F*n*. Applying the exchange theorem to each given basis together with this basis of *n* vectors shows *r* = *s* = *n*.
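The standard basis used in this proof is easy to write down concretely. A quick numerical check (over R^n with *n* = 4, chosen for illustration) that it is linearly independent and that every vector is a combination of the **e***i*:

```python
import numpy as np

n = 4
E = np.eye(n)                          # columns are e_1,...,e_n

# Full rank: the e_i are linearly independent.
print(np.linalg.matrix_rank(E))        # 4

# Every v in R^n is the linear combination sum_i v_i e_i.
v = np.array([3.0, -1.0, 2.0, 5.0])
print(np.allclose(E @ v, v))           # True
```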

**Lemma 2.4.7** *Let* *{***v**1*,· · ·,***v***r}* *be a set of vectors. Then* *V* *≡* span (**v**1*,· · ·,***v***r*) *is a subspace.*

**Proof:** Suppose *α, β* are two scalars and let $\sum_{k=1}^{r} c_k \mathbf{v}_k$ and $\sum_{k=1}^{r} d_k \mathbf{v}_k$ be two elements of *V*. What about

$$\alpha\sum_{k=1}^{r} c_k \mathbf{v}_k + \beta\sum_{k=1}^{r} d_k \mathbf{v}_k\,?$$

Is it also in *V*?

$$\alpha\sum_{k=1}^{r} c_k \mathbf{v}_k + \beta\sum_{k=1}^{r} d_k \mathbf{v}_k = \sum_{k=1}^{r} (\alpha c_k + \beta d_k)\,\mathbf{v}_k \in V,$$

so the answer is yes.

1_{This is the plural form of basis. We could say basiss but it would involve an inordinate amount of}

**Definition 2.4.8** *A finite set of vectors* *{***x**1*,· · ·,***x***r}* *is a basis for a subspace* *V* *of* F*n* *if* span (**x**1*,· · ·,***x***r*) = *V* *and* *{***x**1*,· · ·,***x***r}* *is linearly independent.*

**Corollary 2.4.9** *Let* *{***x**1*,· · ·,***x***r}* *and* *{***y**1*,· · ·,***y***s}* *be two bases for* *V. Then* *r* = *s.*

**Proof:** From the exchange theorem, *r ≤ s* and *s ≤ r*, so *r* = *s*.

**Definition 2.4.10** *Let* *V* *be a subspace of* F*n. Then* dim (*V*)*, read as the dimension of* *V,* *is the number of vectors in a basis.*

Of course you should wonder whether an arbitrary subspace even has a basis. In fact it does, as the next theorem shows. First, here is an interesting lemma.

**Lemma 2.4.11** *Suppose* **v** ∉ span (**u**1*,· · ·,***u***k*) *and* *{***u**1*,· · ·,***u***k}* *is linearly independent. Then* *{***u**1*,· · ·,***u***k,***v***}* *is also linearly independent.*

**Proof:** Suppose $\sum_{i=1}^{k} c_i \mathbf{u}_i + d\mathbf{v} = \mathbf{0}$. It is required to verify that each *ci* = 0 and that *d* = 0. But if *d* ≠ 0, then you can solve for **v** as a linear combination of the vectors *{***u**1*,· · ·,***u***k}*:

$$\mathbf{v} = -\sum_{i=1}^{k} \left(\frac{c_i}{d}\right)\mathbf{u}_i,$$

contrary to assumption. Therefore, *d* = 0. But then $\sum_{i=1}^{k} c_i \mathbf{u}_i = \mathbf{0}$ and the linear independence of *{***u**1*,· · ·,***u***k}* implies each *ci* = 0 also.

**Theorem 2.4.12** *Let* *V* *be a nonzero subspace of* F*n. Then* *V* *has a basis.*

**Proof:** Let **v**1 *∈* *V* where **v**1 ≠ **0**. If span *{***v**1*}* = *V*, stop; *{***v**1*}* is a basis for *V*. Otherwise, there exists **v**2 *∈* *V* which is not in span *{***v**1*}.* By Lemma 2.4.11, *{***v**1*,***v**2*}* is a linearly independent set of vectors. If span *{***v**1*,***v**2*}* = *V*, stop; *{***v**1*,***v**2*}* is a basis for *V*. If span *{***v**1*,***v**2*}* ≠ *V*, then there exists **v**3 ∉ span *{***v**1*,***v**2*}* and *{***v**1*,***v**2*,***v**3*}* is a larger linearly independent set of vectors. Continuing this way, the process must stop before *n* + 1 steps, because otherwise it would produce *n* + 1 linearly independent vectors, contrary to the exchange theorem.
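The proof of Theorem 2.4.12 is effectively an algorithm: keep adjoining a vector not yet in the span, which by Lemma 2.4.11 preserves linear independence. The sketch below (the function name `grow_basis` is illustrative) applies that process to a finite list of candidate vectors, using a rank test to decide whether a candidate is already in the current span.

```python
import numpy as np

def grow_basis(candidates, tol=1e-10):
    # Mirror the proof: keep a vector only if it is not already in
    # the span of the vectors kept so far (Lemma 2.4.11 then says
    # the kept set stays linearly independent).
    basis = []
    for v in candidates:
        trial = np.column_stack(basis + [v])
        if np.linalg.matrix_rank(trial, tol=tol) == len(basis) + 1:
            basis.append(v)
    return basis

vs = [np.array([1.0, 0.0, 0.0]),
      np.array([2.0, 0.0, 0.0]),   # already in the span: skipped
      np.array([0.0, 1.0, 0.0]),
      np.array([1.0, 1.0, 0.0])]   # already in the span: skipped
print(len(grow_basis(vs)))         # 2
```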

In words the following corollary states that any linearly independent set of vectors can be enlarged to form a basis.

**Corollary 2.4.13** *Let* *V* *be a subspace of* F*n* *and let* *{***v**1*,· · ·,***v***r}* *be a linearly independent set of vectors in* *V. Then either it is a basis for* *V,* *or there exist vectors* **v***r*+1*,· · ·,***v***s* *such that* *{***v**1*,· · ·,***v***r,***v***r*+1*,· · ·,***v***s}* *is a basis for* *V.*

**Proof:**This follows immediately from the proof of Theorem 2.4.12. You do exactly the
same argument except you start with*{***v1***,· · ·* *,***v***r}*rather than*{***v1***}*.
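For *V* = F*n* itself, the extension in Corollary 2.4.13 can be carried out concretely by offering the standard basis vectors one at a time and keeping those outside the current span (a sketch over R^n; the function name is illustrative):

```python
import numpy as np

def extend_to_basis(independent, n, tol=1e-10):
    # Corollary 2.4.13 for V = R^n: append each e_i that is not yet
    # in the span; Lemma 2.4.11 keeps the list independent, and the
    # result spans R^n since every e_i ends up in the span.
    basis = list(independent)
    for e in np.eye(n):
        trial = np.column_stack(basis + [e])
        if np.linalg.matrix_rank(trial, tol=tol) == len(basis) + 1:
            basis.append(e)
    return basis

start = [np.array([1.0, 1.0, 0.0])]
basis = extend_to_basis(start, n=3)
print(len(basis))                      # 3: the starting vector plus two e_i
```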

It is also true that any spanning set of vectors can be restricted to obtain a basis.
**Theorem 2.4.14** *Let* *V* *be a subspace of* F*n* *and suppose* span (**u**1*,· · ·,***u***p*) = *V* *where the* **u***i* *are nonzero vectors. Then there exist vectors* *{***v**1*,· · ·,***v***r}* *such that* *{***v**1*,· · ·,***v***r} ⊆ {***u**1*,· · ·,***u***p}* *and* *{***v**1*,· · ·,***v***r}* *is a basis for* *V.*

**Proof:** Let *r* be the smallest positive integer with the property that for some set *{***v**1*,· · ·,***v***r} ⊆ {***u**1*,· · ·,***u***p},*

span (**v**1*,· · ·,***v***r*) = *V.*

Then *r ≤ p* and it must be the case that *{***v**1*,· · ·,***v***r}* is linearly independent, because if it were not so, one of the vectors, say **v***k,* would be a linear combination of the others. But then you could delete this vector from *{***v**1*,· · ·,***v***r}* and the resulting list of *r −* 1 vectors would still span *V,* contrary to the definition of *r*.