Linear combinations of vectors

Chapters: Vector spaces, Examples of vector spaces, Vector subspace, Linear combinations of vectors, Linear span, Bases of a vector space, Dimension of a vector space, Transition matrix

A linear combination is a new vector constructed from a set of vectors using only vector addition and scalar multiplication.

What is linear combination

Consider some vector space V. We know that the space is closed under the operation of vector addition, so for two vectors x, y ∈ V, it is always true that x+y∈ V. If we add two vectors, we get a vector from the same space again. The same is true for scalar multiplication. If we have some k∈ ℝ, then also k · x ∈ V.

We can combine the two operations: take k, l ∈ ℝ; then the expression k · x + l · y again gives us a new vector, let's label it v, and this vector v will again lie in the space V. Why? k · x ∈ V, and likewise l · y ∈ V. We are adding two vectors of the space V, so by closure we must again get a vector of the space V.

So we can say that the vector v is a linear combination of the vectors x and y. In general: if we have the vectors x1, x2, …, xn and the real numbers k1, k2, …, kn, which we call coefficients, the linear combination of these vectors is the vector v, which we get

$$\mathbf{v}=k_1\cdot \mathbf{x}_1+k_2\cdot \mathbf{x}_2 + \ldots + k_n\cdot \mathbf{x}_n.$$

Example: consider the space ℝ² and from it two vectors: [2, 3] and [0, 8]. Now choose the coefficients k1 = 2, k2 = 5. Linearly combining these vectors with the coefficients k1, k2 gives the vector

$$2\cdot\left[2, 3\right]+5\cdot\left[0, 8\right]=\left[4, 6\right]+\left[0, 40\right]=\left[4, 46\right]$$

We see that we have obtained the vector [4, 46], which again belongs to the space ℝ². If we had chosen different coefficients, we would have obtained a different vector. For k1 = 0, k2 = −1 we have:

$$0\cdot\left[2, 3\right]-1\cdot\left[0, 8\right]=\left[0, 0\right]+\left[0, -8\right]=\left[0, -8\right]$$
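The two combinations above can be reproduced in a short Python sketch; the helper name `linear_combination` is illustrative, not standard library code:

```python
def linear_combination(coeffs, vectors):
    """Return k1*x1 + ... + kn*xn, computed component by component."""
    dim = len(vectors[0])
    return [sum(k * v[i] for k, v in zip(coeffs, vectors)) for i in range(dim)]

print(linear_combination([2, 5], [[2, 3], [0, 8]]))   # [4, 46]
print(linear_combination([0, -1], [[2, 3], [0, 8]]))  # [0, -8]
```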

Linearly dependent vectors

Linear combinations are related to linearly dependent and linearly independent vectors. Let us illustrate with an example: consider the vector space ℝ² and the set of vectors Q from this space:

$$Q=\left\{\left[1, 1\right], \left[2, 4\right], \left[3, 5\right], \left[10, 18\right]\right\}$$

We will now be interested in whether we can express any of these four vectors as a linear combination of the other three. Looking at the vectors, we see that the vector [3, 5] equals the simple sum of the first two:

$$\left[1, 1\right]+\left[2, 4\right]=\left[3, 5\right]$$

So the vector [3, 5] is a linear combination of the vectors [1, 1] and [2, 4]. Out of interest, let us remove this vector from the set Q. We get a new set Q1 of the form:

$$Q_1=\left\{\left[1, 1\right], \left[2, 4\right], \left[10, 18\right]\right\}$$

Can we find another vector there that is a linear combination of the others? Yes, the vector [10, 18] is a linear combination of the first two vectors, with k1 = 2, k2 = 4:

$$2\cdot\left[1, 1\right]+4\cdot\left[2, 4\right]=\left[2, 2\right]+\left[8, 16\right]=\left[10, 18\right]$$

Again, removing this vector, we get:

$$Q_2=\left\{\left[1, 1\right], \left[2, 4\right]\right\}$$

We ask again: can we get one of the vectors as a combination of the other? We cannot; no scalar multiple of the vector [1, 1] equals the vector [2, 4], since the first component would force the scalar to be 2 while the second would force it to be 4.
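This check can be sketched in Python; `is_multiple` is a hypothetical helper that reads off the candidate scalar k from the first non-zero component and then verifies it against all the others:

```python
def is_multiple(u, v):
    """True if v == k*u for some scalar k (u assumed non-zero)."""
    # determine the candidate k from the first non-zero component of u
    i = next(idx for idx, ui in enumerate(u) if ui != 0)
    k = v[i] / u[i]
    # the same k must work for every component
    return all(abs(k * ui - vi) < 1e-9 for ui, vi in zip(u, v))

print(is_multiple([1, 1], [2, 4]))  # False -- [2, 4] is not a multiple of [1, 1]
print(is_multiple([1, 2], [2, 4]))  # True  -- [2, 4] = 2 * [1, 2]
```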

We have obtained a set of vectors in which no vector can be expressed as a linear combination of the others. We call such a set a linearly independent set of vectors. If one vector can be expressed as a combination of the others, we refer to it as a linearly dependent vector; if it cannot, we speak of a linearly independent vector.

Zero vector

The zero vector 0 plays an important role in linear dependence. Let us return to the set of vectors Q1 in the space ℝ². The set Q1 looks like this:

$$Q_1=\left\{\left[1, 1\right], \left[2, 4\right], \left[10, 18\right]\right\}$$

and we already know that the vector [10, 18] is dependent, because 2 · [1, 1] + 4 · [2, 4] = [10, 18]. So what happens if we subtract the vector [10, 18] from the expression 2 · [1, 1] + 4 · [2, 4], i.e., add its (−1)-multiple? It looks like this:

$$2\cdot\left[1, 1\right]+4\cdot\left[2, 4\right]-\left[10, 18\right]$$

This expression would certainly be equal to the zero vector:

$$2\cdot\left[1, 1\right]+4\cdot\left[2, 4\right]-\left[10, 18\right]=\left[0, 0\right]$$
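We can confirm this component by component with a one-line check:

```python
# 2*[1,1] + 4*[2,4] - 1*[10,18], computed componentwise
v = [2*1 + 4*2 - 1*10, 2*1 + 4*4 - 1*18]
print(v)  # [0, 0] -- the zero vector
```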

From this we can deduce an interesting property. If the set of vectors x1, x2, …, xn is linearly dependent, then there are real coefficients a1, a2, …, an such that

$$a_1\cdot \mathbf{x}_1+a_2\cdot \mathbf{x}_2+\ldots+ a_n\cdot \mathbf{x}_n = \mathbf{0}$$

At least one coefficient ai must be non-zero: if we substituted zero for all the coefficients, the equation would always hold, and we would conclude that every set of vectors is linearly dependent, which is clearly nonsense. The reasoning behind the previous equation is simple. If, for example, the vector x1 is linearly dependent on the others, this means that

$$a_2\cdot \mathbf{x}_2+\ldots+ a_n\cdot \mathbf{x}_n=\mathbf{x}_1$$

Subtracting x1 from both sides of this equation gives

$$-1\cdot\mathbf{x}_1 + a_2\cdot \mathbf{x}_2+\ldots+ a_n\cdot \mathbf{x}_n=\mathbf{0}$$

How to tell if vectors are dependent

If we have the vectors x1, x2, …, xn and we want to find out if they are linearly (in)dependent, we just solve the above equation

$$a_1\cdot \mathbf{x}_1+a_2\cdot \mathbf{x}_2+\ldots+ a_n\cdot \mathbf{x}_n = \mathbf{0}$$

i.e., find the coefficients ai. If we are in the space ℝ³, we could have the following vectors: [1, 2, 3], [2, 1, 7], [1, −4, 5]. To find out whether they are dependent, we solve this equation:

$$a_1\cdot\left[1, 2, 3\right] + a_2\cdot\left[2, 1, 7\right] + a_3\cdot\left[1, -4, 5\right]=\left[0, 0, 0\right]$$

We expand this equation using the definitions of vector addition and scalar multiplication. It holds that a1 · [1, 2, 3] = [a1, 2a1, 3a1], and similarly for the others. We decompose as follows:

$$\left[a_1, 2a_1, 3a_1\right]+\left[2a_2, a_2, 7a_2\right]+\left[a_3, -4a_3, 5a_3\right]=\left[0, 0, 0\right]$$

Now we expand the addition. We always add matching coordinates: the first component of the first vector, a1, plus the first component of the second vector, 2a2, plus the first component of the third vector, a3, must equal the first component of the zero vector, i.e., zero; and similarly for the second and third components. We get a system of linear equations:

$$ \begin{array}{} a_1&+&2a_2&+&a_3&=&0\\ 2a_1&+&a_2&-&4a_3&=&0\\ 3a_1&+&7a_2&+&5a_3&=&0\\ \end{array} $$

We can write this system as an augmented matrix:

$$ \begin{pmatrix} 1&2&1&0\\ 2&1&-4&0\\ 3&7&5&0 \end{pmatrix} $$

Note that the columns of this matrix are exactly the original vectors: the first column contains the vector [1, 2, 3], the third column contains the vector [1, −4, 5], and so on. So you do not have to go through all these intermediate steps every time: just stack the vectors whose dependence you want to test into the columns of a matrix and append a zero column.
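As a sketch of this recipe in Python, here stated via the matrix rank rather than by elimination by hand (using NumPy, which is an assumption of this example, not part of the text):

```python
import numpy as np

# Stack the vectors as COLUMNS, as described above.
vectors = [[1, 2, 3], [2, 1, 7], [1, -4, 5]]
A = np.array(vectors).T  # each original vector becomes one column

# The vectors are linearly dependent exactly when the rank of this
# matrix is smaller than the number of vectors.
rank = np.linalg.matrix_rank(A)
print(rank, rank < len(vectors))  # 2 True -> dependent
```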

We can solve the system using Gaussian elimination. Let us bring the matrix to row echelon form. First, add −2 times the first row to the second row, and −3 times the first row to the third row:

$$ \begin{pmatrix} 1&2&1&0\\ 2&1&-4&0\\ 3&7&5&0 \end{pmatrix} \sim \begin{pmatrix} 1&2&1&0\\ 0&-3&-6&0\\ 0&1&2&0 \end{pmatrix} $$

Now we can add 3 times the third row to the second row:

$$ \begin{pmatrix} 1&2&1&0\\ 0&-3&-6&0\\ 0&1&2&0 \end{pmatrix} \sim \begin{pmatrix} 1&2&1&0\\ 0&0&0&0\\ 0&1&2&0 \end{pmatrix} $$

We do not even need to compute any further: we have produced a zero row, so the homogeneous system has fewer equations than unknowns, hence a non-zero solution, which means the vectors are linearly dependent. But let us finish the computation anyway. We choose the parameter a3 = t. Then from the equation

$$a_2+2a_3=0$$

which is the last row of the matrix; substituting a3 = t gives:

$$\begin{eqnarray} a_2+2t&=&0\\ a_2&=&-2t \end{eqnarray}$$

We now substitute these values into the first equation, i.e., the first row of the matrix:

$$\begin{eqnarray} a_1+2a_2+a_3&=&0\\ a_1-4t+t&=&0\\ a_1&=&3t \end{eqnarray}$$

Thus we have a1 = 3t, a2 = −2t, a3 = t. If we substitute, for example, t = 3, we get one particular solution: a1 = 9, a2 = −6, a3 = 3. We can plug these coefficients into the original equation
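The back-substitution above can be written directly as a tiny function (names are illustrative):

```python
# General solution of the system, parametrized by t (we set a3 = t).
def solution(t):
    a3 = t
    a2 = -2 * t        # from a2 + 2*a3 = 0
    a1 = -2 * a2 - a3  # from a1 + 2*a2 + a3 = 0, i.e. a1 = 3t
    return a1, a2, a3

print(solution(3))  # (9, -6, 3)
```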

$$a_1\cdot\left[1, 2, 3\right] + a_2\cdot\left[2, 1, 7\right] + a_3\cdot\left[1, -4, 5\right]=\left[0, 0, 0\right]$$

and we get:

$$9\cdot\left[1, 2, 3\right] + (-6)\cdot\left[2, 1, 7\right] + 3\cdot\left[1, -4, 5\right]=\left[0, 0, 0\right]$$

Simplifying the left-hand side:

$$\begin{eqnarray} 9\cdot\left[1, 2, 3\right] + (-6)\cdot\left[2, 1, 7\right] + 3\cdot\left[1, -4, 5\right]&=&\left[0, 0, 0\right]\\ \left[9, 18, 27\right] + \left[-12, -6, -42\right] + \left[3, -12, 15\right]&=&\left[0, 0, 0\right]\\ \left[9-12+3, 18-6-12, 27-42+15\right] &=& \left[0, 0, 0\right]\\ \left[0, 0, 0\right] &=& \left[0, 0, 0\right]\\ \end{eqnarray}$$

We see that if we choose, for example, the coefficients a1 = 9, a2 = −6, a3 = 3, the vectors [1, 2, 3], [2, 1, 7], [1, −4, 5] combine to the zero vector. If the system had only the trivial, all-zero solution, the vectors would be linearly independent.
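The final check can also be done mechanically:

```python
# Verify that 9*x1 - 6*x2 + 3*x3 is the zero vector in R^3.
coeffs = [9, -6, 3]
xs = [[1, 2, 3], [2, 1, 7], [1, -4, 5]]
result = [sum(a * x[i] for a, x in zip(coeffs, xs)) for i in range(3)]
print(result)  # [0, 0, 0]
```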
