A set of vectors $\{v_1, v_2, \ldots, v_n\}$ in a vector space is said to be linearly independent if no vector in the set can be written as a linear combination of the others. Think of each vector as a direction in space. If a set of vectors is linearly independent, it means that none of these directions can be reached by walking along the others. Each vector adds a new dimension or direction that is not covered by any combination of the other vectors.
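To make this concrete, here is a minimal numerical sketch (the specific vectors and the use of NumPy are illustrative, not part of the original post): a finite set of vectors is linearly independent exactly when the matrix having those vectors as columns has full column rank.

```python
import numpy as np

# Three vectors in R^3, stacked as the columns of a 3x3 matrix.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([1.0, 1.0, 1.0])
A = np.column_stack([u, v, w])

# Full column rank means no column is a linear combination of the others.
print(np.linalg.matrix_rank(A) == A.shape[1])  # True: {u, v, w} is independent
```

If the rank drops below the number of columns, say after replacing $w$ with $u + v$, that signals a linear dependence.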
Claim
If a set of vectors $\{u, v, w\}$ is linearly independent, then the set $\{u, v, w - \alpha v\}$ is also linearly independent, for any scalar $\alpha$.
The claim above is a direct consequence of the definition of linear independence. It illustrates that modifying one vector in a linearly independent set (in this case, subtracting $\alpha v$ from $w$) does not introduce linear dependence among the vectors: the modified vector $w - \alpha v$ still cannot be represented as a linear combination of the other vectors $u$ and $v$ in the new set. Let's dive into the proof.
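Before the proof, a quick numerical sanity check of the claim may help (the concrete vectors and the choice $\alpha = 2$ are arbitrary, picked only for illustration):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([1.0, 1.0, 1.0])
alpha = 2.0

original = np.column_stack([u, v, w])
modified = np.column_stack([u, v, w - alpha * v])  # subtract a multiple of v from w

# Both matrices have full column rank (3), so both sets are independent.
print(np.linalg.matrix_rank(original), np.linalg.matrix_rank(modified))  # 3 3
```

Of course, one example proves nothing in general; that is what the proof below is for.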
Proof
Step 1: Assume the Hypothesis
Suppose that the set of vectors $\{u, v, w\}$ is linearly independent.
Step 2: Deduce the Conclusion
The collection $\{u, v, w\}$ being linearly independent means that the only linear relation on the collection is the trivial one. That is, if we have scalars $a$, $b$, and $c$ such that

$$a u + b v + c w = 0,$$

then it must follow that $a = b = c = 0$, indicating any such linear relation must be trivial.
Step 3: Apply the Transformation
Now, consider the set $\{u, v, w - \alpha v\}$. We want to show that this set is also linearly independent. Suppose there exist scalars $a$, $b$, and $c$ such that

$$a u + b v + c (w - \alpha v) = 0.$$

Expanding the above equation, we get:

$$a u + b v + c w - c \alpha v = 0.$$

Rearranging, we find:

$$a u + (b - c \alpha) v + c w = 0.$$

Since $\{u, v, w\}$ is linearly independent, the only solution for this equation is when all the coefficients are zero:

$$a = 0, \qquad b - c \alpha = 0, \qquad c = 0.$$

This implies that $a = b = c = 0$ (from $c = 0$ and $b - c \alpha = 0$ it follows that $b = 0$). Therefore, the set $\{u, v, w - \alpha v\}$ is linearly independent.
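For readers who want to double-check the algebra in Step 3, this small SymPy sketch (illustrative, not part of the original argument) solves the coefficient equations from the rearranged relation and confirms that only the trivial solution survives:

```python
import sympy as sp

a, b, c, alpha = sp.symbols('a b c alpha')

# Coefficients of u, v, and w in a*u + (b - c*alpha)*v + c*w = 0.
# Independence of {u, v, w} forces each of them to vanish.
coefficients = [a, b - c * alpha, c]

print(sp.solve(coefficients, [a, b, c], dict=True))  # [{a: 0, b: 0, c: 0}]
```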
Conclusion
This proof demonstrates that linear independence is preserved even when one vector in a linearly independent set is modified by subtracting a scalar multiple of another vector from the same set.