Linear Dependence And Independence Of Vectors Engineering Essay


Matrices and determinants were discovered and developed in the eighteenth and nineteenth centuries. Initially, their development dealt with the transformation of geometric objects and the solution of systems of linear equations. Historically, the early emphasis was on the determinant, not the matrix. In modern treatments of linear algebra, matrices are considered first.

The origins of mathematical matrices lie with the study of systems of simultaneous linear equations. An important Chinese text from between 300 BC and AD 200, the Nine Chapters on the Mathematical Art, gives the first known example of the use of matrix methods to solve simultaneous equations. In the treatise's seventh chapter, "Too much and not enough," the concept of a determinant first appears, nearly two millennia before its supposed invention by the Japanese mathematician Seki in 1683 or his German contemporary Gottfried Leibniz (who is also credited with the invention of differential calculus, independently of but contemporaneously with Isaac Newton).

Since their first appearance in ancient China, matrices have remained important mathematical tools. Today, they are used not simply for solving systems of simultaneous linear equations, but also for describing the quantum mechanics of atomic structure, designing computer game graphics, analyzing relationships, and even plotting complicated dance steps!  The elevation of the matrix from mere tool to important mathematical theory owes a lot to the work of female mathematician Olga Taussky Todd (1906-1995), who began by using matrices to analyze vibrations on airplanes during World War II and became the torchbearer for matrix theory.

Matrix

A matrix is a rectangular array of numbers arranged in rows and columns. The array of numbers below is an example of a matrix.

[ 21  62  33  93 ]
[ 44  95  66  13 ]
[ 77  38  79  33 ]

The number of rows and columns that a matrix has is called its dimension or its order. By convention, rows are listed first; and columns, second. Thus, we would say that the dimension (or order) of the above matrix is 3 x 4, meaning that it has 3 rows and 4 columns. Numbers that appear in the rows and columns of a matrix are called elements of the matrix.

In the above matrix, the element in the first column of the first row is 21; the element in the second column of the first row is 62; and so on.
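To make the idea of order and elements concrete, here is a minimal sketch in Python, assuming the NumPy library is available (an assumption of this illustration, not something the essay relies on):

    import numpy as np

    # The 3 x 4 matrix shown above: 3 rows, 4 columns.
    m = np.array([[21, 62, 33, 93],
                  [44, 95, 66, 13],
                  [77, 38, 79, 33]])

    print(m.shape)   # (3, 4) -- the dimension (order) of the matrix
    print(m[0, 0])   # 21 -- element in the first row, first column
    print(m[0, 1])   # 62 -- element in the first row, second column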

TYPES OF MATRIX

Row matrix

A matrix having a single row is called a row matrix, e.g. [1 3 5 7].

Column Matrix:

A matrix having a single column is called a column matrix, e.g. the 3 x 1 matrix

[ 1 ]
[ 3 ]
[ 5 ]

Square Matrix

A square matrix is an n x n matrix; that is, a matrix with the same number of rows as columns. In this section, we describe several special kinds of square matrix.

Symmetric matrix. If the transpose of a matrix is equal to itself, that matrix is said to be symmetric. Two examples of symmetric matrices appear below.

A = A' =
[ 1  2 ]
[ 2  3 ]

B = B' =
[ 5  6  7 ]
[ 6  3  2 ]
[ 7  2  1 ]

Note that each of these matrices satisfies the defining requirement of a symmetric matrix: A = A' and B = B'.
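This check is easy to carry out numerically; a minimal sketch in Python, assuming NumPy:

    import numpy as np

    A = np.array([[1, 2],
                  [2, 3]])
    B = np.array([[5, 6, 7],
                  [6, 3, 2],
                  [7, 2, 1]])

    # A matrix is symmetric when it equals its own transpose.
    print(np.array_equal(A, A.T))   # True
    print(np.array_equal(B, B.T))   # True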

Diagonal matrix. A diagonal matrix is a special kind of symmetric matrix. It is a symmetric matrix with zeros in the off-diagonal elements. Two diagonal matrices are shown below.

A =
[ 1  0 ]
[ 0  3 ]

B =
[ 5  0  0 ]
[ 0  3  0 ]
[ 0  0  1 ]

Note that the diagonal of a matrix refers to the elements that run from the upper left corner to the lower right corner.
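Diagonal matrices like these can be built directly from their diagonal entries; a small sketch in Python, assuming NumPy:

    import numpy as np

    # np.diag builds a matrix with the given values on the diagonal
    # and zeros in all off-diagonal positions.
    A = np.diag([1, 3])
    B = np.diag([5, 3, 1])

    print(A)
    print(B)
    print(np.array_equal(B, B.T))   # True -- a diagonal matrix is symmetric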

Scalar matrix. A scalar matrix is a special kind of diagonal matrix. It is a diagonal matrix with equal-valued elements along the diagonal. Two examples of a scalar matrix appear below.

A =
[ 3  0 ]
[ 0  3 ]

B =
[ 5  0  0 ]
[ 0  5  0 ]
[ 0  0  5 ]

These square matrices play a prominent role in the application of matrix algebra to real-world problems. For example, a scalar matrix called the identity matrix is critical to the solution of simultaneous linear equations. (We cover the identity matrix below.)

Unit Matrix or Identity Matrix:

 

A diagonal matrix of order n which has unity for all its diagonal elements is called a unit matrix of order n and is denoted by In.

Thus a square matrix A = [aij]n x n is a unit matrix if aij = 1 when i = j and aij = 0 when i ≠ j.

For example, the unit matrix of order 3 is

[ 1  0  0 ]
[ 0  1  0 ]
[ 0  0  1 ]
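A minimal sketch in Python, assuming NumPy, builds the unit matrix and confirms that it leaves other matrices unchanged under multiplication:

    import numpy as np

    # np.eye(n) returns the n x n unit (identity) matrix I_n.
    I3 = np.eye(3)
    print(I3)

    # Multiplying by the identity leaves any conformable matrix unchanged.
    A = np.array([[2.0, 7.0],
                  [1.0, 4.0]])
    print(np.array_equal(np.eye(2) @ A, A))   # True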

 

Triangular Matrix:

 

A square matrix in which all the elements below the diagonal are zero is called an upper triangular matrix, and a square matrix in which all the elements above the diagonal are zero is called a lower triangular matrix.

Given a square matrix A = [aij]n x n:

for an upper triangular matrix, aij = 0 when i > j,

and for a lower triangular matrix, aij = 0 when i < j.
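The two triangular forms can be extracted from any square matrix; a minimal sketch in Python, assuming NumPy:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])

    # np.triu zeroes everything below the diagonal (upper triangular);
    # np.tril zeroes everything above the diagonal (lower triangular).
    print(np.triu(A))   # entries with i > j are zero
    print(np.tril(A))   # entries with i < j are zero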

Linear Dependence and Linear Independence of Functions

Two functions, f and g, are linearly dependent if one is a scalar multiple of the other; otherwise they are linearly independent.

DEFINITION. Let f1, f2, f3, ..., fn be functions defined on an interval I. The functions are linearly dependent if there exist n real numbers c1, c2, ..., cn, not all zero, such that

c1f1(x) + c2f2(x) + c3f3(x) + · · · + cnfn(x) ≡ 0;

that is,

c1f1(x) + c2f2(x) + c3f3(x) + · · · + cnfn(x) = 0 for all x ∈ I.

Otherwise the functions are linearly independent.

Equivalently, the functions f1, f2, f3, ..., fn are linearly independent if

c1f1(x) + c2f2(x) + c3f3(x) + · · · + cnfn(x) ≡ 0

only when c1 = c2 = · · · = cn = 0.
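As a rough numerical check (a sketch assuming Python with NumPy, not a substitute for the definition above), one can sample the functions at several points and examine the rank of the resulting matrix: full column rank shows that no nontrivial dependence relation is possible, while a rank deficiency suggests dependence.

    import numpy as np

    def sample_matrix(funcs, xs):
        """Each column holds one function sampled at the points xs."""
        return np.column_stack([f(xs) for f in funcs])

    xs = np.linspace(0.0, 1.0, 50)

    # f2(x) = 2*f1(x), so {f1, f2} is linearly dependent.
    dep = sample_matrix([lambda x: x, lambda x: 2 * x], xs)
    print(np.linalg.matrix_rank(dep))   # 1, fewer than 2 columns -> dependent

    # {x, x**2} is linearly independent on [0, 1].
    ind = sample_matrix([lambda x: x, lambda x: x ** 2], xs)
    print(np.linalg.matrix_rank(ind))   # 2 -> independent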

Geometric meaning

A geographic example may help to clarify the concept of linear independence. A person describing the location of a certain place might say, "It is 5 miles north and 6 miles east of here." This is sufficient information to describe the location, because the geographic coordinate system may be considered a 2-dimensional vector space (ignoring altitude). The person might add, "The place is 7.81 miles northeast of here." Although this last statement is true, it is not necessary.

In this example the "5 miles north" vector and the "6 miles east" vector are linearly independent. That is to say, the north vector cannot be described in terms of the east vector, and vice versa. The third "7.81 miles northeast" vector is a linear combination of the other two vectors, and it makes the set of vectors linearly dependent, that is, one of the three vectors is unnecessary.

Also note that if altitude is not ignored, it becomes necessary to add a third vector to the linearly independent set. In general, n linearly independent vectors are required to describe any location in n-dimensional space.

Definition

A finite subset of n vectors, v1, v2, ..., vn, from the vector space V, is linearly dependent if there exists a set of n scalars, a1, a2, ..., an, not all zero, such that

a1v1 + a2v2 + · · · + anvn = 0.

Note that the zero on the right is the zero vector, not the number zero.

If such scalars do not exist, then the vectors are said to be linearly independent.

Alternatively, linear independence can be directly defined as follows: a set of vectors is linearly independent if and only if the only representation of the zero vector as a linear combination of its elements is the trivial one, i.e. whenever a1, a2, ..., an are scalars such that

a1v1 + a2v2 + · · · + anvn = 0,

then ai = 0 for i = 1, 2, ..., n.

A set of vectors is then said to be linearly dependent if it is not linearly independent.

More generally, let V be a vector space over a field K, and let {vi | i ∈ I} be a family of elements of V. The family is linearly dependent over K if there exists a family {aj | j ∈ J} of elements of K, not all zero, such that

∑j∈J ajvj = 0,

where the index set J is a nonempty, finite subset of I.

A set X of elements of V is linearly independent if the corresponding family {x}x∈X is linearly independent.

Equivalently, a family is dependent if a member is in the linear span of the rest of the family, i.e., a member is a linear combination of the rest of the family.

A set of vectors that is linearly independent and spans some vector space forms a basis for that vector space. For example, the vector space of all polynomials in x over the reals has as a basis the (infinite) set {1, x, x^2, ...}.

Linear Combination of Vectors

One vector is dependent on other vectors if it is a linear combination of them.

If one vector is equal to the sum of scalar multiples of other vectors, it is said to be a linear combination of the other vectors.

For example, suppose a = 2b + 3c, as shown below.

[ 11 ]        [ 1 ]        [ 3 ]      [ 2*1 + 3*3 ]
[ 16 ]  =  2  [ 2 ]  +  3  [ 4 ]  =   [ 2*2 + 3*4 ]
   a             b            c

Note that 2b is a scalar multiple of b and 3c is a scalar multiple of c. Thus, a is a linear combination of b and c.
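The same arithmetic can be reproduced in a couple of lines of Python, assuming NumPy:

    import numpy as np

    b = np.array([1, 2])
    c = np.array([3, 4])

    # a is a linear combination of b and c with scalars 2 and 3.
    a = 2 * b + 3 * c
    print(a)   # [11 16], matching the worked example above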

Condition for Linear Dependence and Independence

A set of vectors is linearly independent if no vector in the set is

(a) a scalar multiple of another vector in the set or

(b) a linear combination of other vectors in the set;

conversely, a set of vectors is linearly dependent if any vector in the set is

(a) a scalar multiple of another vector in the set or

(b) a linear combination of other vectors in the set.

Consider the row vectors below.

a = [ 1  2  3 ]        d = [ 2  4  6 ]

b = [ 4  5  6 ]        e = [ 0  1  0 ]

c = [ 5  7  9 ]        f = [ 0  0  1 ]

Note the following:

Vectors a and b are linearly independent, because neither vector is a scalar multiple of the other.

Vectors a and d are linearly dependent, because d is a scalar multiple of a; i.e., d = 2a.

Vector c is a linear combination of vectors a and b, because c = a + b. Therefore, the set of vectors a, b, and c is linearly dependent.

Vectors d, e, and f are linearly independent, since no vector in the set can be derived as a scalar multiple or a linear combination of any other vectors in the set.
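These observations can be checked numerically by stacking the vectors and computing the rank; a minimal sketch in Python, assuming NumPy (the rank concept itself is discussed later in the essay):

    import numpy as np

    a = np.array([1, 2, 3])
    b = np.array([4, 5, 6])
    c = np.array([5, 7, 9])
    d = np.array([2, 4, 6])
    e = np.array([0, 1, 0])
    f = np.array([0, 0, 1])

    # Stack the vectors as rows; the rank counts how many of them
    # are linearly independent.
    print(np.linalg.matrix_rank(np.vstack([a, b])))      # 2 -> independent
    print(np.linalg.matrix_rank(np.vstack([a, d])))      # 1 -> dependent (d = 2a)
    print(np.linalg.matrix_rank(np.vstack([a, b, c])))   # 2 -> dependent (c = a + b)
    print(np.linalg.matrix_rank(np.vstack([d, e, f])))   # 3 -> independent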

Testing for Linear Dependence of Vectors

There are many situations in which we might wish to know whether a set of vectors is linearly dependent, that is, whether one of the vectors is some combination of the others.

Two vectors u and v are linearly independent if the only numbers x and y satisfying xu + yv = 0 are x = y = 0. If we let A be the matrix whose columns are u and v, then xu + yv = 0 is equivalent to the homogeneous matrix equation A [x y]' = 0.

If u and v are linearly independent, then the only solution to this system of equations is the trivial solution, x = y = 0. For homogeneous systems this happens precisely when the determinant of the coefficient matrix is non-zero. We have now found a test for determining whether a given set of vectors is linearly independent: a set of n vectors of length n is linearly independent if the matrix with these vectors as columns has a non-zero determinant. The set is of course dependent if the determinant is zero.
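The determinant test translates directly into code; a minimal sketch in Python, assuming NumPy:

    import numpy as np

    def independent(vectors):
        """n vectors of length n are linearly independent exactly when the
        matrix with those vectors as columns has a non-zero determinant."""
        A = np.column_stack(vectors)
        return not np.isclose(np.linalg.det(A), 0.0)

    print(independent([np.array([1.0, 1.0]), np.array([-3.0, 2.0])]))   # True
    print(independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))    # False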

Example

The vectors u = <2,-1,1>, v = <3,-4,-2>, and w = <5,-10,-8> are dependent since the determinant

| 2    3    5  |
| -1  -4  -10  |
| 1   -2   -8  |

is zero. To find the relation between u, v, and w we look for constants x, y, and z such that

xu + yv + zw = 0.

This is a homogeneous system of equations. Using Gaussian elimination, we see that the coefficient matrix

[ 2    3    5  ]
[ -1  -4  -10  ]
[ 1   -2   -8  ]

in row-reduced form is

[ 2   3    5  ]
[ 0  -5  -15  ]
[ 0   0    0  ]

Thus, y = -3z and 2x = -3y - 5z = -3(-3z) - 5z = 4z, which implies 0 = xu + yv + zw = 2zu - 3zv + zw, or equivalently w = -2u + 3v. A quick arithmetic check verifies that the vector w is indeed equal to -2u + 3v.
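Both conclusions of this example can be verified numerically; a short sketch in Python, assuming NumPy:

    import numpy as np

    u = np.array([2.0, -1.0, 1.0])
    v = np.array([3.0, -4.0, -2.0])
    w = np.array([5.0, -10.0, -8.0])

    # Columns u, v, w: a zero determinant signals linear dependence.
    A = np.column_stack([u, v, w])
    print(np.isclose(np.linalg.det(A), 0.0))   # True

    # Confirm the relation derived above: w = -2u + 3v.
    print(np.allclose(w, -2 * u + 3 * v))      # True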

Example with proof

The vectors (1, 1) and (−3, 2) in R2 are linearly independent.

Proof:

Let λ1 and λ2 be two real numbers such that

λ1 (1, 1) + λ2 (−3, 2) = (0, 0).

Taking each coordinate alone, this means

λ1 − 3λ2 = 0 and λ1 + 2λ2 = 0.

Solving for λ1 and λ2, we find that λ1 = 0 and λ2 = 0.

Alternative method using determinants

An alternative method uses the fact that n vectors in Rn are linearly dependent if and only if the determinant of the matrix formed by taking the vectors as its columns is zero.

In this case, the matrix formed by the vectors is

A =
[ 1  -3 ]
[ 1   2 ]

We may write a linear combination of the columns as

AΛ =
[ 1  -3 ] [ λ1 ]
[ 1   2 ] [ λ2 ]

We are interested in whether AΛ = 0 for some nonzero vector Λ. This depends on the determinant of A, which is

det A = (1)(2) − (−3)(1) = 5.

Since the determinant is non-zero, the vectors (1, 1) and (−3, 2) are linearly independent.
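The same determinant can be evaluated numerically; a one-line check in Python, assuming NumPy:

    import numpy as np

    # Columns are the vectors (1, 1) and (-3, 2).
    A = np.array([[1.0, -3.0],
                  [1.0,  2.0]])

    print(np.linalg.det(A))   # approximately 5.0, non-zero -> independent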

Otherwise, suppose we have m vectors of n coordinates, with m < n. Then A is an n x m matrix and Λ is a column vector with m entries, and we are again interested in AΛ = 0. As we saw previously, this is equivalent to a list of n equations. Consider the first m rows of A, giving the first m equations; any solution of the full list of equations must also satisfy the reduced list. In fact, if 〈i1,...,im〉 is any list of m rows, then the equations corresponding to those rows must be satisfied as well.

Furthermore, the reverse is true. That is, we can test whether the m vectors are linearly dependent by testing whether the determinant of the m x m matrix formed from each such list of m rows is zero, for all possible lists of m rows. (In the case m = n, this requires only one determinant, as above. If m > n, then it is a theorem that the vectors must be linearly dependent.) This fact is valuable for theory; in practical calculations more efficient methods are available.

Example II

Let V = Rn and consider the following elements in V:

e1 = (1, 0, 0, ..., 0), e2 = (0, 1, 0, ..., 0), ..., en = (0, 0, 0, ..., 1).

Then e1, e2, ..., en are linearly independent.
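These standard basis vectors are the columns of the n x n identity matrix, whose rank is n; a quick sketch in Python, assuming NumPy:

    import numpy as np

    n = 4
    # The columns of the n x n identity matrix are e1, ..., en.
    E = np.eye(n)
    print(np.linalg.matrix_rank(E))   # 4 -- all n standard basis vectors are independent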

Rank

The rank of a matrix measures its linear independence: it equals the dimension of the space spanned by its rows (equivalently, by its columns). It is defined for any square or rectangular matrix such that

rank (A) = number of linearly independent rows of A

= number of linearly independent columns of A.

The number of linearly independent rows of a matrix is always equal to the number of linearly independent columns. For an n x p matrix A, the maximum possible rank is the smaller of n and p; when that maximum is attained, A is said to be of full rank. When the rank is smaller than both n and p, the matrix is of deficient rank. Note that a rectangular matrix can never have rank equal to both n and p; only a square matrix can. When a matrix is the product of two other matrices, C = AB, its rank cannot be greater than the smaller rank of the two matrices:

Rank C ≤ min (rank A, rank B).

If A is a square n x n matrix and is nonsingular (|A| ≠ 0), then

A has an inverse

r(A) = n, or A is of full rank

the rows of A are linearly independent

the columns of A are linearly independent

Rank in terms of column vectors:

The rank of a matrix A equals the maximum number of linearly independent column vectors of A. Hence A and its transpose A' have the same rank.

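These rank properties are easy to observe numerically; a minimal sketch in Python, assuming NumPy:

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [5.0, 7.0, 9.0]])   # third row = first row + second row

    print(np.linalg.matrix_rank(A))     # 2 -- deficient rank
    print(np.linalg.matrix_rank(A.T))   # 2 -- a matrix and its transpose share a rank

    B = np.random.rand(3, 3)
    C = A @ B
    # rank(C) cannot exceed the smaller of rank(A) and rank(B).
    print(np.linalg.matrix_rank(C) <= min(np.linalg.matrix_rank(A),
                                          np.linalg.matrix_rank(B)))   # True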
