The concept of linear dependence and independence of vectors. Basis of vectors. Affine coordinate system


A system of vectors a1, a2, ..., an is called linearly dependent if there are numbers α1, α2, ..., αn, among which at least one is different from zero, such that the equality α1a1 + α2a2 + ... + αnan = 0 holds.

If this equality is satisfied only in the case when all αi = 0, then the system of vectors is called linearly independent.

Theorem. A system of vectors is linearly dependent if and only if at least one of its vectors is a linear combination of the others.

Example 1. The polynomial … is a linear combination of the polynomials … . The polynomials … constitute a linearly independent system, since a linear combination of them equals the zero polynomial only when all of its coefficients are zero.

Example 2. The system of matrices …, …, … is linearly independent, since a linear combination of them is equal to the zero matrix only in the case when all of its coefficients are zero.

Example 3. Find out whether the system of vectors …, …, … is linearly dependent.

Solution.

Let us form a linear combination of these vectors and equate it to the zero vector: α1a1 + α2a2 + α3a3 = 0.

Equating corresponding coordinates of the equal vectors, we obtain a homogeneous system of linear equations in α1, α2, α3.

Finally, solving this system, we find that it has only the trivial solution, so the linear combination of these vectors equals zero only in the case when all coefficients are zero. Therefore, this system of vectors is linearly independent.
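This coordinate check is easy to automate. Below is a minimal Python sketch of the same test; the vectors a1, a2, a3 are stand-ins (the original data of Example 3 is not reproduced above), but the criterion is general: the homogeneous system has only the trivial solution exactly when the rank of the coordinate matrix equals the number of vectors.

import numpy as np

# Stand-in vectors (the ones from Example 3 are not shown above).
a1 = np.array([1.0, 2.0, 3.0])
a2 = np.array([0.0, 1.0, 4.0])
a3 = np.array([5.0, 6.0, 0.0])

A = np.column_stack([a1, a2, a3])   # coordinates of the vectors as columns
rank = np.linalg.matrix_rank(A)

# A*x = 0 has only the trivial solution iff rank == number of vectors.
print("independent" if rank == A.shape[1] else "dependent")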

Example 4. The vectors … are linearly independent. What will the following systems of vectors be:

a). … ;

b). … ?

Solution.

a). Let us form a linear combination of these vectors and equate it to zero; call this equality (*).

Using the properties of operations with vectors in a linear space, we rewrite the last equality as a linear combination of the original, linearly independent vectors equated to zero.

Since those vectors are linearly independent, their coefficients must all be equal to zero, i.e., we obtain a homogeneous system of equations for the coefficients of (*).

The resulting system of equations has a unique trivial solution .

Since equality (*) holds only when all the coefficients are zero, the system of vectors in a) is linearly independent;

b). Let us form the analogous equality for the second system; call it (**).

Applying similar reasoning, we obtain a homogeneous system of equations. Solving it by the Gauss method, we find that it has an infinite number of solutions. Thus, there is a non-zero set of coefficients for which equality (**) holds. Therefore, the system of vectors in b) is linearly dependent.
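The same conclusion can be reached without elimination by hand. If e1, e2, e3 are linearly independent, a system of their combinations is independent exactly when the matrix of combination coefficients is non-singular. A short Python sketch; the combinations u1 = e1 + e2, u2 = e2 + e3, u3 = e1 + e2 + e3 are illustrative assumptions, since the original systems of Example 4 are not reproduced above.

import numpy as np

# Columns: coefficients of e1, e2, e3 in u1, u2, u3 (assumed combinations).
M = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])

# Non-singular M -> the combinations are independent; singular -> dependent.
print("independent" if abs(np.linalg.det(M)) > 1e-12 else "dependent")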

Example 5. A system of vectors e1, ..., en is linearly independent, while the system of vectors e1, ..., en, b is linearly dependent. Prove that b is a linear combination of e1, ..., en.

Solution. Since the extended system is linearly dependent, there is a non-trivial relation

α1e1 + ... + αnen + βb = 0. (***)

In equality (***), β ≠ 0. Indeed, if β = 0, the system e1, ..., en would be linearly dependent.

From relation (***) we get βb = -α1e1 - ... - αnen, or b = (-α1/β)e1 + ... + (-αn/β)en. Let us denote γi = -αi/β.

We get b = γ1e1 + ... + γnen.

Problems for independent solution (in class)

1. A system containing a zero vector is linearly dependent.

2. A system consisting of a single vector a is linearly dependent if and only if a = 0.

3. A system consisting of two vectors is linearly dependent if and only if the vectors are proportional (that is, one of them is obtained from the other by multiplying by a number).

4. If you add a vector to a linearly dependent system, you get a linearly dependent system.

5. If a vector is removed from a linearly independent system, then the resulting system of vectors is linearly independent.

6. If a system S is linearly independent but becomes linearly dependent when a vector b is added, then b is linearly expressed through the vectors of S.

c). The system of matrices …, …, … in the space of second-order matrices.

10. Let a system of vectors a, b, c in a vector space be linearly independent. Prove the linear independence of the following systems of vectors:

a). a + b, b, c.

b). a + λb, b, c, where λ is an arbitrary number.

c). a + b, a + c, b + c.

11. Let a, b, c be three vectors in the plane from which a triangle can be formed. Will these vectors be linearly dependent?

12. Two vectors are given: a1 = (1, 2, 3, 4), a2 = (0, 0, 0, 1). Find two more four-dimensional vectors a3 and a4 such that the system a1, a2, a3, a4 is linearly independent.

Example 1. Find out whether the given system of vectors is linearly dependent or linearly independent:

a1 = {3, 5, 1, 4}, a2 = {-2, 1, -5, -7}, a3 = {-1, -2, 0, -1}.

Solution. We look for the general solution of the system of equations

a1x1 + a2x2 + a3x3 = Θ

by the Gauss method. To do this, we write this homogeneous system in coordinates:

3x1 - 2x2 - x3 = 0
5x1 + x2 - 2x3 = 0
x1 - 5x2 = 0
4x1 - 7x2 - x3 = 0

The matrix of the system:

[ 3  -2  -1 ]
[ 5   1  -2 ]
[ 1  -5   0 ]
[ 4  -7  -1 ]

The reduced system has the form x1 - 5x2 = 0, x3 - 13x2 = 0 (rA = 2, n = 3). The system is consistent and indeterminate. Its general solution (x2 is a free variable): x3 = 13x2; 3x1 - 2x2 - 13x2 = 0 => x1 = 5x2 => X0 = x2·(5, 1, 13). The existence of a non-zero particular solution, for example (5, 1, 13), indicates that the vectors a1, a2, a3 are linearly dependent.
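The general solution can be verified directly: taking x2 = 1 gives the particular solution (5, 1, 13), and the corresponding combination of the vectors must vanish. A quick check in Python:

import numpy as np

a1 = np.array([ 3,  5,  1,  4])
a2 = np.array([-2,  1, -5, -7])
a3 = np.array([-1, -2,  0, -1])

# Particular solution x = (5, 1, 13) of the homogeneous system above.
print(5*a1 + 1*a2 + 13*a3)   # -> [0 0 0 0], so a1, a2, a3 are dependent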

Example 2.

Find out whether a given system of vectors is linearly dependent or linearly independent:

1. a1 = {-20, -15, -4}, a2 = {-7, -2, -4}, a3 = {3, -1, -2}.

Solution. Consider the homogeneous system of equations a1x1 + a2x2 + a3x3 = Θ,

or in expanded form (by coordinates):

-20x1 - 7x2 + 3x3 = 0
-15x1 - 2x2 - x3 = 0
-4x1 - 4x2 - 2x3 = 0

The system is homogeneous. If it is non-degenerate, then it has a unique solution; for a homogeneous system this is the zero (trivial) solution, and in that case the system of vectors is independent. If the system is degenerate, then it has non-zero solutions and, therefore, the vectors are dependent.

We check the system for degeneracy by computing the determinant of its matrix (its columns are the coordinates of a1, a2, a3):

| -20  -7   3 |
| -15  -2  -1 | = -80 - 28 + 180 - 24 + 80 + 210 = 338 ≠ 0.
|  -4  -4  -2 |

The system is non-degenerate; thus, the vectors a1, a2, a3 are linearly independent.
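The determinant test is one line in Python; the value below agrees with the hand computation up to floating-point rounding:

import numpy as np

# Columns of A are the coordinates of a1, a2, a3.
A = np.array([[-20.0, -7.0,  3.0],
              [-15.0, -2.0, -1.0],
              [ -4.0, -4.0, -2.0]])

d = np.linalg.det(A)
print(d)                                        # ~338.0
print("independent" if abs(d) > 1e-9 else "dependent")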

Tasks. Find out whether a given system of vectors is linearly dependent or linearly independent:

1. a1 = {-4, 2, 8}, a2 = {14, -7, -28}.

2. a1 = {2, -1, 3, 5}, a2 = {6, -3, 3, 15}.

3. a1 = {-7, 5, 19}, a2 = {-5, 7, -7}, a3 = {-8, 7, 14}.

4. a1 = {1, 2, -2}, a2 = {0, -1, 4}, a3 = {2, -3, 3}.

5. a1 = {1, 8, -1}, a2 = {-2, 3, 3}, a3 = {4, -11, 9}.

6. a1 = {1, 2, 3}, a2 = {2, -1, 1}, a3 = {1, 3, 4}.

7. a1 = {0, 1, 1, 0}, a2 = {1, 1, 3, 1}, a3 = {1, 3, 5, 1}, a4 = {0, 1, 1, -2}.

8. a1 = {-1, 7, 1, -2}, a2 = {2, 3, 2, 1}, a3 = {4, 4, 4, -3}, a4 = {1, 6, -11, 1}.

9. Prove that a system of vectors will be linearly dependent if it contains:

a) two equal vectors;

b) two proportional vectors.

In this article we will cover:

  • what are collinear vectors;
  • what are the conditions for collinearity of vectors;
  • what properties of collinear vectors exist;
  • what is the linear dependence of collinear vectors.
Definition 1

Collinear vectors are vectors that are parallel to one line or lie on one line.


Conditions for collinearity of vectors

Two vectors are collinear if any of the following conditions are true:

  • condition 1 . Vectors a and b are collinear if there is a number λ such that a = λ b;
  • condition 2 . Vectors a and b are collinear if their coordinate ratios are equal:

a = (a1; a2), b = (b1; b2) ⇒ a ∥ b ⇔ a1/b1 = a2/b2

  • condition 3 . Vectors a and b are collinear provided that their vector product equals the zero vector:

a ∥ b ⇔ [a, b] = 0

Note 1

Condition 2 is not applicable if one of the vectors' coordinates is zero.

Note 2

Condition 3 applies only to vectors given in three-dimensional space.

Examples of problems to study the collinearity of vectors

Example 1

We examine the vectors a = (1; 3) and b = (2; 1) for collinearity.

How to solve?

In this case it is necessary to use the second collinearity condition. For the given vectors it takes the form:

1/2 = 3/1.

The equality is false, so we conclude that the vectors a and b are not collinear.

Answer: the vectors a and b are not collinear.

Example 2

For what value of m are the vectors a = (1; 2) and b = (-1; m) collinear?

How to solve?

Using the second collinearity condition, the vectors are collinear if their coordinates are proportional:

1/(-1) = 2/m.

From this it follows that m = -2.

Answer: m = -2.
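Both examples can be checked with the cross-product form of the collinearity condition, which, unlike condition 2, needs no division and therefore also works when a coordinate is zero (cf. Note 1). A small Python sketch:

def collinear_2d(a, b, eps=1e-12):
    # In the plane, a and b are collinear iff a1*b2 - a2*b1 = 0.
    return abs(a[0]*b[1] - a[1]*b[0]) < eps

print(collinear_2d((1, 3), (2, 1)))      # False: Example 1, not collinear
print(collinear_2d((1, 2), (-1, -2)))    # True: Example 2 with m = -2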

Criteria for linear dependence and linear independence of vector systems

Theorem

A system of vectors in a vector space is linearly dependent if and only if one of the vectors of the system can be expressed in terms of the remaining vectors of this system.

Proof

Let the system e1, e2, ..., en be linearly dependent. Let us write a linear combination of this system equal to the zero vector:

a1e1 + a2e2 + ... + anen = 0

in which at least one of the combination coefficients is not equal to zero.

Let ak ≠ 0 for some k ∈ {1, 2, ..., n}.

We divide both sides of the equality by a non-zero coefficient:

(ak⁻¹a1)e1 + ... + (ak⁻¹ak)ek + ... + (ak⁻¹an)en = 0

Let's denote:

βm = ak⁻¹am, where m ∈ {1, 2, ..., k - 1, k + 1, ..., n}

In this case:

β1e1 + ... + βk-1ek-1 + ek + βk+1ek+1 + ... + βnen = 0

or ek = (-β1)e1 + ... + (-βk-1)ek-1 + (-βk+1)ek+1 + ... + (-βn)en

It follows that one of the vectors of the system is expressed through all the other vectors of the system, which is what was required to be proven (Q.E.D.).

Sufficiency

Let one of the vectors be linearly expressed through all other vectors of the system:

ek = γ1e1 + ... + γk-1ek-1 + γk+1ek+1 + ... + γnen

We move the vector e k to the right side of this equality:

0 = γ1e1 + ... + γk-1ek-1 - ek + γk+1ek+1 + ... + γnen

Since the coefficient of the vector ek equals -1 ≠ 0, we obtain a non-trivial representation of zero by the system of vectors e1, e2, ..., en, and this in turn means that the system of vectors is linearly dependent, which is what was required to be proven (Q.E.D.).

Corollary:

  • A system of vectors is linearly independent when none of its vectors can be expressed in terms of all other vectors of the system.
  • A system of vectors that contains a zero vector or two equal vectors is linearly dependent.

Properties of linearly dependent vectors

  1. For 2- and 3-dimensional vectors, the following condition is met: two linearly dependent vectors are collinear. Two collinear vectors are linearly dependent.
  2. For 3-dimensional vectors, the following condition is satisfied: three linearly dependent vectors are coplanar. (3 coplanar vectors are linearly dependent).
  3. For n-dimensional vectors, the following condition is satisfied: n + 1 vectors are always linearly dependent.

Examples of solving problems involving linear dependence or linear independence of vectors

Example 3

Let's check the vectors a = (3, 4, 5), b = (-3, 0, 5), c = (4, 4, 4), d = (3, 4, 0) for linear independence.

Solution. The vectors are linearly dependent, since the dimension of the vectors (3) is less than the number of vectors (4).
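Property 3 above already settles the question, but it is easy to confirm numerically: four rows of length three can have rank at most three.

import numpy as np

# Rows are the coordinates of a, b, c, d.
A = np.array([[ 3, 4, 5],
              [-3, 0, 5],
              [ 4, 4, 4],
              [ 3, 4, 0]], dtype=float)

print(np.linalg.matrix_rank(A))   # 3 < 4 vectors -> linearly dependent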

Example 4

Let's check the vectors a = (1, 1, 1), b = (1, 2, 0), c = (0, -1, 1) for linear independence.

Solution. We find the values of the coefficients at which the linear combination equals the zero vector:

x1a + x2b + x3c = 0

We write this vector equation in coordinate form:

x1 + x2 = 0
x1 + 2x2 - x3 = 0
x1 + x3 = 0

We solve this system using the Gauss method:

[ 1   1    0 | 0 ]
[ 1   2   -1 | 0 ]
[ 1   0    1 | 0 ] ~

From the 2nd line we subtract the 1st, from the 3rd - the 1st:

~
[ 1     1     0     | 0   ]
[ 1-1   2-1   -1-0  | 0-0 ]
[ 1-1   0-1   1-0   | 0-0 ]
~
[ 1    1    0 | 0 ]
[ 0    1   -1 | 0 ]
[ 0   -1    1 | 0 ] ~

From the 1st line we subtract the 2nd, to the 3rd we add the 2nd:

~
[ 1-0   1-1   0-(-1) | 0-0 ]
[ 0     1     -1     | 0   ]
[ 0+0   -1+1  1+(-1) | 0+0 ]
~
[ 1   0    1 | 0 ]
[ 0   1   -1 | 0 ]
[ 0   0    0 | 0 ]

The elimination leaves a zero row, so the system has infinitely many solutions. This means that there is a non-zero set of numbers x1, x2, x3 for which the linear combination of a, b, c equals the zero vector. Therefore, the vectors a, b, c are linearly dependent.
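The elimination result is easy to confirm numerically: the rank of the coordinate matrix is 2, and the free variable value x3 = -1 gives the non-trivial solution (1, -1, -1), i.e. a - b - c = 0.

import numpy as np

a = np.array([1, 1, 1])
b = np.array([1, 2, 0])
c = np.array([0, -1, 1])

A = np.column_stack([a, b, c])
print(np.linalg.matrix_rank(A))   # 2 < 3 -> dependent
print(a - b - c)                  # [0 0 0]: a non-trivial vanishing combination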


The linear operations on vectors introduced above make it possible to form various expressions for vector quantities and to transform them using the properties established for these operations.

Given a set of vectors a1, ..., an, one can form an expression of the form

α1a1 + ... + αnan,

where α1, ..., αn are arbitrary real numbers. This expression is called a linear combination of the vectors a1, ..., an. The numbers αi, i = 1, ..., n, are the coefficients of the linear combination. A set of vectors is also called a system of vectors.

In connection with the introduced concept of a linear combination of vectors, the problem arises of describing the set of vectors that can be written as a linear combination of a given system of vectors a1, ..., an. In addition, there are natural questions about the conditions under which a vector has a representation as a linear combination, and about the uniqueness of such a representation.

Definition 2.1. Vectors a1, ..., an are called linearly dependent if there exists a set of coefficients α1, ..., αn such that

α1a1 + ... + αnan = 0 (2.2)

and at least one of these coefficients is non-zero. If the specified set of coefficients does not exist, then the vectors are called linearly independent.

If α1 = ... = αn = 0, then, obviously, α1a1 + ... + αnan = 0. With this in mind, we can say: the vectors a1, ..., an are linearly independent if from equality (2.2) it follows that all the coefficients α1, ..., αn are equal to zero.

The following theorem explains why the new concept is described by the term "dependence" (or "independence") and provides a simple criterion of linear dependence.

Theorem 2.1. For the vectors a1, ..., an, n > 1, to be linearly dependent, it is necessary and sufficient that one of them be a linear combination of the others.

◄ Necessity. Suppose the vectors a1, ..., an are linearly dependent. According to Definition 2.1 of linear dependence, equality (2.2) contains at least one non-zero coefficient on the left, say α1. Leaving the first term on the left side of the equality, we move the rest to the right side, changing their signs as usual. Dividing the resulting equality by α1, we get

a1 = -(α2/α1)a2 - ... - (αn/α1)an,

i.e., a representation of the vector a1 as a linear combination of the remaining vectors a2, ..., an.

Sufficiency. Let, for example, the first vector a1 be representable as a linear combination of the remaining vectors: a1 = β2a2 + ... + βnan. Transferring all terms from the right side to the left, we obtain a1 - β2a2 - ... - βnan = 0, i.e., a linear combination of the vectors a1, ..., an with coefficients α1 = 1, α2 = -β2, ..., αn = -βn that equals the zero vector. In this linear combination not all coefficients are zero. According to Definition 2.1, the vectors a1, ..., an are linearly dependent.

The definition and the criterion of linear dependence are formulated so as to imply the presence of two or more vectors. However, one can also speak of linear dependence of a single vector. To cover this case, instead of "the vectors are linearly dependent" one should say "the system of vectors is linearly dependent." It is easy to see that the phrase "a system of one vector is linearly dependent" means that this single vector is the zero vector (in the linear combination there is only one coefficient, and it must not equal zero).

The concept of linear dependence has a simple geometric interpretation. The following three statements clarify this interpretation.

Theorem 2.2. Two vectors are linearly dependent if and only if they are collinear.

◄ If vectors a and b are linearly dependent, then one of them, say a, is expressed through the other, i.e., a = λb for some real number λ. According to Definition 1.7 of the product of a vector by a number, the vectors a and b are collinear.

Now let the vectors a and b be collinear. If both are zero, then they are obviously linearly dependent, since any linear combination of them equals the zero vector. So let one of these vectors be non-zero, say the vector b. Denote by λ the ratio of the vectors' lengths: λ = |a|/|b|. Collinear vectors can be co-directed or oppositely directed; in the latter case we change the sign of λ. Then, checking Definition 1.7, we see that a = λb. According to Theorem 2.1, the vectors a and b are linearly dependent.

Remark 2.1. In the case of two vectors, taking into account the criterion of linear dependence, the proven theorem can be reformulated as follows: two vectors are collinear if and only if one of them is represented as the product of the other by a number. This is a convenient criterion for the collinearity of two vectors.

Theorem 2.3. Three vectors are linearly dependent if and only if they are coplanar.

◄ If three vectors a, b, c are linearly dependent, then, according to Theorem 2.1, one of them, say a, is a linear combination of the others: a = βb + γc. Let us place the origins of the vectors b and c at a point A. Then the vectors βb, γc have a common origin at A and, by the parallelogram rule, their sum is the vector a, i.e., a vector with origin A whose end is the vertex of the parallelogram built on the component vectors. Thus, all the vectors lie in one plane, i.e., they are coplanar.

Let the vectors a, b, c be coplanar. If one of these vectors is zero, then it is obviously a linear combination of the others: it suffices to take all coefficients of the linear combination equal to zero. Therefore we may assume that all three vectors are non-zero. Let us place the origins of these vectors at a common point O, and let their ends be the points A, B, C respectively (Fig. 2.1). Through the point C we draw lines parallel to the lines passing through the pairs of points O, A and O, B. Denoting the points of intersection by A′ and B′, we obtain a parallelogram OA′CB′; consequently, OC = OA′ + OB′. The vector OA′ and the non-zero vector a = OA are collinear, so the first of them can be obtained by multiplying the second by a real number α: OA′ = αOA. Similarly, OB′ = βOB, β ∈ R. As a result we obtain OC = αOA + βOB, i.e., the vector c is a linear combination of the vectors a and b. According to Theorem 2.1, the vectors a, b, c are linearly dependent.

Theorem 2.4. Any four vectors are linearly dependent.

◄ We carry out the proof by the same scheme as in Theorem 2.3. Consider arbitrary four vectors a, b, c, d. If one of the four is the zero vector, or among them there are two collinear vectors, or three of the four are coplanar, then these four vectors are linearly dependent. For example, if the vectors a and b are collinear, we can form their linear combination αa + βb = 0 with non-zero coefficients and then add the remaining two vectors to this combination, taking zeros as their coefficients. We obtain a linear combination of the four vectors, equal to 0, in which there are non-zero coefficients.

Thus, we may assume that among the chosen four vectors none is zero, no two are collinear, and no three are coplanar. Choose the point O as their common origin; then the ends of the vectors a, b, c, d are some points A, B, C, D (Fig. 2.2). Through the point D we draw three planes parallel to the planes OBC, OCA, OAB, and let A′, B′, C′ be the points of intersection of these planes with the lines OA, OB, OC respectively. We obtain a parallelepiped whose edges issuing from the vertex O lie along the vectors a, b, c. Since the quadrilateral OC″DC′ is a parallelogram (where C″ is the vertex of the base parallelogram OA′C″B′), we have OD = OC″ + OC′. In turn, the segment OC″ is the diagonal of the parallelogram OA′C″B′, so OC″ = OA′ + OB′, and hence OD = OA′ + OB′ + OC′.

It remains to note that the pairs of vectors OA ≠ 0 and OA′, OB ≠ 0 and OB′, OC ≠ 0 and OC′ are collinear, and therefore one can choose coefficients α, β, γ so that OA′ = αOA, OB′ = βOB and OC′ = γOC. We finally get OD = αOA + βOB + γOC. Consequently, the vector OD is expressed through the other three vectors, and all four vectors, by Theorem 2.1, are linearly dependent.
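In coordinates, the conclusion of Theorem 2.4 is constructive: if a, b, c are not coplanar, the matrix with columns a, b, c is invertible, and the coefficients of the expansion of any fourth vector d are obtained by solving a 3×3 linear system. A Python sketch with illustrative vectors (chosen here for the example, not taken from the text):

import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])
c = np.array([1.0, 1.0, 1.0])
d = np.array([2.0, 3.0, 4.0])

M = np.column_stack([a, b, c])            # invertible: a, b, c are not coplanar
alpha, beta, gamma = np.linalg.solve(M, d)
print(alpha, beta, gamma)                 # -1.0 -1.0 4.0
print(alpha*a + beta*b + gamma*c - d)     # ~[0 0 0]: d = αa + βb + γc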

Task 1. Find out whether the system of vectors is linearly independent. The system of vectors is specified by the matrix of the system, whose columns consist of the coordinates of the vectors.


Solution. Let a linear combination of the vectors equal zero. Writing this equality in coordinates, we obtain a homogeneous system of equations. Such a system of equations is called triangular: it has only one solution, the trivial one. Therefore, the vectors are linearly independent.

Task 2. Find out whether the system of vectors is linearly independent.


Solution. The vectors are linearly independent (see Task 1). Let us prove that the remaining vector is a linear combination of them. The coefficients of the vector's expansion are determined from a system of equations.

This system, being triangular, has a unique solution.

Therefore, the system of vectors is linearly dependent.

Comment. Matrices of the type in Task 1 are called triangular, and those in Task 2, step-triangular. The question of the linear dependence of a system of vectors is easily resolved if the matrix composed of the coordinates of these vectors is step-triangular. If the matrix does not have this special form, then it can be reduced to step-triangular form by elementary row transformations, which preserve the linear relationships between the columns.

Elementary row transformations (ERT) of a matrix are the following operations on the matrix:

1) permutation of rows;

2) multiplication of a row by a non-zero number;

3) adding to a row another row multiplied by an arbitrary number.
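These three operations are exactly what a mechanical reduction to step-triangular (echelon) form uses. A minimal Python sketch of such a reduction, a plain textbook algorithm without the pivoting refinements of production libraries:

import numpy as np

def row_echelon(A, eps=1e-12):
    # Reduce A to echelon form using only the three ERT operations.
    A = A.astype(float)
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(A[i, c]) > eps), None)
        if pivot is None:
            continue                      # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]     # 1) permute rows
        A[r] = A[r] / A[r, c]             # 2) multiply a row by a non-zero number
        for i in range(r + 1, rows):
            A[i] = A[i] - A[i, c] * A[r]  # 3) add a multiple of another row
        r += 1
        if r == rows:
            break
    return A

print(row_echelon(np.array([[1, 1, 0], [1, 2, -1], [1, 0, 1]])))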

Task 3. Find a maximal linearly independent subsystem and calculate the rank of the system of vectors


Solution. Let us reduce the matrix of the system to step-triangular form using ERT. To explain the procedure, we denote the row of the matrix being transformed with number i by the symbol … . The column after the arrow indicates the operations on the rows of the matrix being transformed that must be performed to obtain the rows of the new matrix.



Obviously, the first two columns of the resulting matrix are linearly independent, the third column is their linear combination, and the fourth does not depend on the first two. The vectors corresponding to the first, second and fourth columns are called basis vectors. They form a maximal linearly independent subsystem of the system, and the rank of the system is three.
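In practice the basis columns can be read off from the reduced row-echelon form: the pivot columns are exactly the positions of a maximal linearly independent subsystem. A Python sketch with sympy; the matrix here is an illustrative stand-in in which, as in Task 3, the third column is the sum of the first two:

import sympy as sp

A = sp.Matrix([[1, 0, 1, 2],
               [0, 1, 1, 0],
               [1, 1, 2, 3]])

_, pivots = A.rref()    # indices of the pivot columns
print(pivots)           # (0, 1, 3): columns 1, 2 and 4 form a basis
print(A.rank())         # 3: the rank of the system of column vectors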



Basis, coordinates

Task 4. Find a basis, and the coordinates of vectors in this basis, for the set of geometric vectors whose coordinates satisfy the condition … .

Solution. The set is a plane passing through the origin. An arbitrary basis of the plane consists of two non-collinear vectors. The coordinates of vectors in the chosen basis are determined by solving the corresponding system of linear equations.

There is another way to solve this problem: finding the basis directly from the coordinates.

The coordinates of the ambient space are not coordinates on the plane, since they are related by the relation …, that is, they are not independent. The independent variables … (they are called free) uniquely determine a vector on the plane, and therefore they can be chosen as its coordinates. Then the basis consists of the vectors lying in the plane that correspond to the sets of free variables … and …, that is, … .

Task 5. Find a basis, and the coordinates of vectors in this basis, for the set of all vectors in space whose odd-numbered coordinates are equal to each other.

Solution. As in the previous task, we choose coordinates on the set.

Because …, the free variables uniquely determine a vector from the set and are therefore coordinates on it. The corresponding basis consists of the vectors … .

Task 6. Find a basis, and the coordinates of vectors in this basis, for the set of all matrices of the form …, where … are arbitrary numbers.

Solution. Each matrix from the set is uniquely representable in the form

… .

This relation is the expansion of the vector with respect to the basis …, with coordinates … .

Task 7. Find the dimension and basis of the linear hull of a system of vectors


Solution. Using ERT, we transform the matrix composed of the coordinates of the system's vectors to step-triangular form.





The columns … of the last matrix are linearly independent, while the remaining columns are linearly expressed through them. Therefore, the vectors … form a basis of the linear hull, and its dimension equals the number of these basis vectors.

Comment. The basis of the linear hull is not chosen uniquely. For example, the vectors … also form a basis.
