
Linear Algebra Final Review

Chapter 1
1. Every matrix is row equivalent to a unique matrix in echelon form. False
because a matrix can be row reduced in many different ways, producing
different echelon forms; only the reduced row echelon form (RREF) is unique.
2. Any system of n linear equations in n variables has at most n solutions. False
because the echelon form of the coefficient matrix can have a row of zeros,
indicating a free variable; hence a consistent system can have infinitely many
solutions.
3. If a system of linear equations has two different solutions, it must have
infinitely many solutions. True because if the system has more than one
solution, there must be a free variable; hence there are infinitely many
solutions.

4. If a system of linear equations has no free variables, then it has a unique
solution. False; the absence of free variables does not guarantee that a
solution exists at all. The system may be inconsistent, in which case it has
no solution.

5. If an augmented matrix [A b] is transformed into [C d] by elementary row
operations, then the equations Ax=b and Cx=d have exactly the same
solution set. True because elementary row operations preserve the solution
set: [C d] is row equivalent to [A b], so the solution that may be difficult to
read off from Ax=b is nonetheless the same as the one obtained by fully row
reducing to Cx=d.

6. If a system Ax=b has more than one solution, then so does the system Ax=0.
True because if the inhomogeneous equation Ax=b has more than one solution,
the system is consistent and has a free variable, so the homogeneous equation
Ax=0 has infinitely many solutions. A quick numerical check follows.
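A small sketch of this fact in numpy (the particular A, b, and solutions are my
own choice for illustration, not from the notes):

import numpy as np

# Ax = b has (at least) two different solutions x and y
A = np.array([[1., 1.],
              [2., 2.]])
b = np.array([3., 6.])
x = np.array([1., 2.])
y = np.array([2., 1.])

print(np.allclose(A @ x, b), np.allclose(A @ y, b))  # True True
print(A @ (x - y))  # [0. 0.]: x - y is a nontrivial solution of Az = 0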

7. If A is an mxn matrix and the equation Ax=b is consistent for some b, then
the columns of A span R^m. False; the system must be consistent for every b
in R^m for the columns of A to span R^m.

8. If an augmented matrix [A b] can be transformed by elementary row
operations into reduced echelon form, then the equation Ax=b is
consistent. False; any matrix can be row reduced into reduced echelon
form, but that does not make the system consistent. The reduced form
may contain a row that is all zeros on the coefficient side with a nonzero
entry on the augmented side, i.e., a row [0 ... 0 | c] with c nonzero,
which makes the system inconsistent.

9. If matrices A and B are row equivalent, they have the same reduced
row echelon form. True; the RREF of a matrix is unique, so row
equivalent matrices row reduce to the same final state. The only
difference between A and B is their initial form.

10. The equation Ax=0 has the trivial solution if and only if there are no
free variables. False; every homogeneous equation has the trivial solution
x=0, regardless of whether there are free variables.

11. If A is an mxn matrix and the equation Ax=b is consistent for EVERY b
in R^m, then A has m pivot columns. True; if Ax=b is consistent for all b
in R^m, then A has a pivot position in every one of its m rows, and since
each pivot lies in a different column, A has m pivot columns.

12. If an mxn matrix A has a pivot position in every row, then the equation
Ax=b has a unique solution for each b in R^m. False; a pivot in every row
guarantees that a solution exists for each b, but uniqueness requires a
pivot position in every column (no free variables).

13. If an nxn square matrix A has n pivot positions, then the reduced
echelon form of A is the nxn identity matrix. True; an nxn matrix with
n pivot positions has a pivot in every row and every column, so the
RREF of A is the identity matrix I_n.

14. If 3x3 matrices A and B each have three pivot positions, then A can be
transformed into B by elementary row operations. True; a 3x3 matrix
with three pivot positions has RREF equal to I_3, so A and B are each row
equivalent to I_3 and hence to each other, since row operations are
reversible.

15. If A is an mxn matrix, if the equation Ax=b has at least two different
solutions, and if the equation Ax=c is consistent, then the equation
Ax=c has many solutions. True; if Ax=b has at least two different
solutions, then there is a free variable, so Ax=0 has infinitely many
solutions. Since Ax=c is consistent, its solution set is a translate of the
solution set of Ax=0, so Ax=c also has infinitely many solutions.

16. If A and B are row equivalent mxn matrices and if the columns of A
span R^m, then so do the columns of B. True; row equivalent matrices
have pivots in exactly the same positions, so if A has a pivot in every
row (which is what spanning R^m requires), then so does B, and the
columns of B span R^m as well.
17. If none of the vectors in the set S = {v1, v2, v3} in R^3 is a multiple of one
of the other vectors, then S is linearly independent. False; that test only
works for a set of two vectors. With three vectors, one of them may still
be a linear combination of the other two, for example v3 = v1 + v2, as the
quick check below shows.
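A numerical sketch of this counterexample (the specific vectors are my own
choice):

import numpy as np

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
v3 = v1 + v2  # not a multiple of v1 or of v2

M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M))  # 2 < 3, so {v1, v2, v3} is linearly dependent
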
18. If {u,v,w} is linearly independent, then u, v, and w are not in R^2. True; if
u, v, and w were in R^2, the set would contain more vectors (three) than
entries in each vector (two), so it would be linearly dependent.

19. In some cases, it is possible for four vectors to span R^5. False; to span
R^5, a matrix needs a pivot position in every one of its 5 rows, which
requires at least 5 columns. Four vectors can never span R^5.
20. If u and v are in R^m, then u is in Span{u,v}. True, because u is the linear
combination 1*u + 0*v.
21. If u, v, and w are nonzero vectors in R^2, then w is a linear combination of u
and v. False; if u and v are multiples of each other, then Span{u,v} is a line,
and w need not lie on that line.

22. If w is a linear combination of u and v in R^n, then u is a linear combination of
v and w. False; if the weight on u in that combination is zero, u cannot be
recovered from v and w. For example, let u and v be linearly independent and
let w = 2v. Then w is a linear combination of u and v, but u is not in
Span{v,w} = Span{v}.

23. Suppose v1, v2, v3 are in R^5, v2 is not a multiple of v1, and v3 is not a linear
combination of v1 or v2. Then {v1, v2, v3} is linearly independent. False;
take v1 to be the zero vector, v2 any nonzero vector, and v3 a vector outside
Span{v1,v2}. The hypotheses hold, yet the set contains the zero vector and is
therefore linearly dependent.
24. A linear transformation is a function. True by definition: a linear
transformation assigns to each input vector exactly one output vector, and a
rule with one output for every input is precisely a function.
25. If A is a 6x5 matrix, the linear transformation x |-> Ax cannot map R^5 onto R^6.
True; to map R^5 onto R^6, A would need a pivot position in every one of its 6
rows, but a 6x5 matrix has only 5 columns and hence at most 5 pivots.
26. If A is an mxn matrix with m pivot columns, then the linear transformation x
|-> Ax is a one-to-one mapping. False; one-to-one requires a pivot in every
column, and since the matrix is mxn, m may be less than n, leaving free
variables.

Chapter 2
1. If A and B are mxn, then both AB^T and A^TB are defined. True; AB^T is
(mxn)(nxm) and A^TB is (nxm)(mxn). In each product, the number of columns
of the left factor equals the number of rows of the right factor, so both are
defined. A quick shape check follows.
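A small numpy sketch of the dimension count (the sizes are my own choice):

import numpy as np

m, n = 3, 5
A = np.ones((m, n))
B = np.ones((m, n))

print((A @ B.T).shape)  # (3, 3): (m x n)(n x m) is defined
print((A.T @ B).shape)  # (5, 5): (n x m)(m x n) is defined
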
2. If AB=C and C has 2 columns, then A has 2 columns. False; B must have 2
columns, while A can have any number of columns. The constraint is that the
number of columns of A must equal the number of rows of B.
3. The transpose of an elementary matrix is an elementary matrix. True; the
transpose of an elementary matrix is an elementary matrix of the same type.
4. If A is a 3x3 matrix with three pivot positions, there exist elementary
matrices E1, ..., Ep such that Ep...E1A = I. True; if A is 3x3 with three pivot
positions, then A is row equivalent to I_3, and each row operation in the
reduction corresponds to left multiplication by an elementary matrix.

5. If AB = I, then A is invertible. False; this conclusion requires knowing that A
is a square matrix. (For square A, AB = I does imply A is invertible, by the
Invertible Matrix Theorem.)
6. If A and B are square and invertible, then (AB)^-1 = A^-1B^-1. False;
(AB)^-1 = B^-1A^-1 (the order reverses), as the check below illustrates.
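A quick numpy verification of the reversed order (random matrices of this kind
are invertible with probability 1; the seed is arbitrary):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
inv = np.linalg.inv

print(np.allclose(inv(A @ B), inv(B) @ inv(A)))  # True: order reverses
print(np.allclose(inv(A @ B), inv(A) @ inv(B)))  # False in general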

Chapter 3

1. If A is a 2x2 matrix with a zero determinant, then one column of A is a
multiple of the other. True; a zero determinant means the columns are
linearly dependent, and for a set of two vectors that means one is a multiple
of the other.

2. If two rows of a 3x3 matrix A are the same, then detA = 0. True; subtracting
one of the equal rows from the other (a row replacement, which does not
change the determinant) produces a row of zeros, and cofactor expansion
along that row gives detA = 0.
3. If A is a 3x3 matrix, then det(5A) = 5detA. False; det(5A) = 5^3 detA =
125 detA, since each of the three rows is scaled by 5. The check below
confirms this.
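A numpy sketch (the random 3x3 matrix is arbitrary):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(5 * A), 5**3 * np.linalg.det(A)))  # True
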
4. If A and B are nxn matrices, with detA = 2 and detB = 3, then det(A+B) = 5.
False; the determinant is multiplicative, det(AB) = detA * detB, but it is not
additive: det(A+B) generally has no relation to detA + detB.
5. If A is nxn and detA = 2, then det(A^3) = 6. False; det(A^3) = (detA)^3 =
2^3 = 8.
6. If B is produced by interchanging two rows of A, then detB = detA. False;
detB = -detA.
7. If B is produced by multiplying row 3 of A by 5, then detB = 5detA. True, by
the effect of row scaling on the determinant.

8. If B is formed by adding to one row of A a linear combination of the other
rows, then detB = detA. True, by the effect of row replacement operations on
the determinant.

9. det(A^T) = -detA. False; det(A^T) = detA.


10. det(-A) = -det(A). True only when n is odd: forming -A multiplies each of
the n rows by -1, so det(-A) = (-1)^n detA. For a 3x3 matrix this gives -detA,
but for even n, det(-A) = detA. The check below tries both cases.
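A numpy sketch of both parities (random matrices, arbitrary seed):

import numpy as np

rng = np.random.default_rng(2)
for n in (2, 3):
    A = rng.standard_normal((n, n))
    # det(-A) = (-1)**n * det(A): the sign flips only for odd n
    print(n, np.isclose(np.linalg.det(-A), (-1)**n * np.linalg.det(A)))
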
11. det(A^T A) >= 0. True; det(A^T A) = det(A^T) detA = (detA)^2, and the
square of a real number is always nonnegative. A quick check follows.
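A numpy sketch (arbitrary random matrix):

import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
d = np.linalg.det(A.T @ A)

print(np.isclose(d, np.linalg.det(A)**2), d >= 0)  # True True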

12. If u and v are in R^2 and det[u v] = 10, then the area of the triangle in the
plane with vertices at 0, u, and v is 10. False; |det[u v]| = 10 is the area of
the parallelogram determined by u and v, and the triangle has half that
area, 10/2 = 5. See the sketch below.
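A numpy check with a concrete choice of u and v (my own example):

import numpy as np

u = np.array([5., 0.])
v = np.array([0., 2.])  # det[u v] = 5*2 - 0*0 = 10

parallelogram = abs(np.linalg.det(np.column_stack([u, v])))
triangle = parallelogram / 2
print(parallelogram, triangle)  # 10.0 5.0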


13. If A^3 = 0, then detA = 0. True; 0 = det(A^3) = (detA)^3, and the only real
number whose cube is 0 is 0 itself.
14. If A is invertible, then det(A^-1) = detA. False; det(A^-1) = 1/detA.
15. If A is invertible, then (detA)(det(A^-1)) = 1. True; (detA)(det(A^-1)) =
det(AA^-1) = detI = 1.

Chapter 4
There exists a vector space V, and a set S = {v1, ..., vp}.
1. The set of all linear combinations of v1, ..., vp is a vector space. True because
the set is Span{v1, ..., vp} and every subspace is itself a vector space.
2. If {v1, ..., vp-1} spans V, then S spans V. True; any linear combination of
v1, ..., vp-1 is also a linear combination of v1, ..., vp, with vp given the
weight zero.

3. If {v1, ..., vp-1} is linearly independent, then so is S. False; the extra vector
vp may be a linear combination of v1, ..., vp-1 (for example, vp = v1), making
S linearly dependent.
4. If S is linearly independent, then S is a basis for V. False; S may fail to span
V. For example, the first two vectors of the standard basis for R^3 form a
linearly independent set, but they are not a basis for R^3 because they do
not span it.
5. If Span S = V, then some subset of S is a basis for V. True by the Spanning
Set Theorem.

6. If dim V = p and Span S = V, then S cannot be linearly dependent. True;
S spans V and has exactly p elements, where dim V = p, so by the Basis
Theorem S is a basis for V; hence S must be linearly independent.
7. A plane in R^3 is a two-dimensional subspace. False, because the plane must
pass through the origin to be a subspace.
8. The nonpivot columns of a matrix are always linearly dependent. False;
each nonpivot column is a linear combination of the pivot columns, but the
set of nonpivot columns need not be linearly dependent. A single nonzero
nonpivot column, for instance, is by itself a linearly independent set.
9. Row operations on a matrix A can change the linear dependence relations
among the rows of A. True; row operations can change the dependence
relations among the rows, though they preserve those among the columns.
10. Row operations on a matrix can change the null space. False; the null space
is the solution set of the equation Ax=0, which is not affected by row
operations.

11. The rank of a matrix equals the number of nonzero rows. False; the rank
equals the number of nonzero rows of an echelon form of the matrix, not of
the matrix itself. Rank is the dimension of the column space (which equals
the dimension of the row space).

12. If an mxn matrix A is row equivalent to an echelon matrix U and if U has k
nonzero rows, then the dimension of the solution space of Ax=0 is m-k.
False; the dimension of the solution space is n-k: rankA = k, so
dim NulA = n-k by the Rank Theorem. The sketch below illustrates this.
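A numpy illustration on a concrete 3x4 matrix (my own example):

import numpy as np

A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],   # a multiple of row 1
              [0., 1., 1., 0.]])

m, n = A.shape
k = np.linalg.matrix_rank(A)  # number of nonzero rows in an echelon form of A
print(k, n - k)               # rank 2, so dim Nul A = 4 - 2 = 2 (not m - k = 1)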

13. If B is obtained from a matrix A by several elementary row operations, then
rankB = rankA. True; row operations preserve the row space, so
Row B = Row A and rankB = rankA. (The column spaces may differ, but their
dimensions are equal.)

14. The nonzero rows of a matrix A form a basis for Row A. False; the nonzero
rows of A span Row A, but they may not be linearly independent, so they do
not necessarily form a basis.

15. If matrices A and B have the same reduced echelon form, then Row A = Row
B. True; the nonzero rows of the common RREF E form a basis for the row
space of every matrix that is row equivalent to E, so A and B have the same
row space.
16. If H is a subspace of R^3, then there is a 3x3 matrix A such that H = ColA.
True; if H is the zero subspace, let A be the 3x3 zero matrix. If dimH = 1, let
{v} be a basis for H and set A = [v v v]. If dimH = 2, let {u,v} be a basis for H
and set A = [u v v]. If dimH = 3, let {u,v,w} be a basis for H and set A = [u v w].
17. If A is mxn and rankA = m, then the linear transformation x |-> Ax is one to
one. False; rankA = m means a pivot in every row, which makes the mapping
onto R^m, not one-to-one. One-to-one requires a pivot in every column
(rankA = n), and if m < n there are free variables.

18. If A is mxn and the linear transformation x |-> Ax is onto, then rankA = m.
True; if x |-> Ax is onto, then ColA = R^m and rankA = m.


19. A change of coordinates matrix is always invertible. True; a change of
coordinates matrix is square and its columns are the coordinate vectors of a
basis, which are linearly independent; hence the matrix is invertible.

20. If B = {b1, ..., bn} and C = {c1, ..., cn} are bases for a vector space V, then the
jth column of the change of coordinates matrix P(C<-B) is the coordinate
vector [cj]B. False; the jth column is the coordinate vector of bj relative to C,
written [bj]C.
Chapter 5

1. If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue for
A^-1. True; if A is invertible and Ax = 1*x for some nonzero x, left-multiply
by A^-1 to obtain x = A^-1 x, i.e., A^-1 x = 1*x. Since x is nonzero, this shows
1 is an eigenvalue of A^-1.

2. If A is row equivalent to the identity matrix I, then A is diagonalizable. False;
if A is row equivalent to the identity matrix, then A is invertible, and an
invertible matrix need not be diagonalizable.
3. If A contains a row or column of zeros, then 0 is an eigenvalue of A. True; if A
contains a row or column of zeros, then A is not row equivalent to the identity
matrix and thus is not invertible. By the IMT, 0 is an eigenvalue of A.
4. Each eigenvalue of A is also an eigenvalue of A^2. False; if we consider a
diagonal matrix D whose eigenvalues (diagonal entries) are 1 and 3, then D^2
is a diagonal matrix whose eigenvalues (diagonal entries) are 1 and 9.
Squaring the matrix squares the eigenvalues.
5. Each eigenvector of A is also an eigenvector of A^2. True; suppose a nonzero
vector x satisfies Ax = lambda*x. Then A^2 x = A(Ax) = A(lambda*x) =
lambda*(Ax) = lambda^2 * x. This shows that x is also an eigenvector of A^2;
see the numerical check after item 7 below.
6. Each eigenvector of an invertible matrix A is also an eigenvector of A^-1.
True; if Ax = lambda*x with x nonzero, then lambda is nonzero and
A^-1 x = lambda^-1 x, which shows that x is also an eigenvector of A^-1.
7. Eigenvalues must be nonzero scalars. False; zero is an eigenvalue of every
singular square matrix.
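A numpy sketch of item 5 (the matrix is my own example):

import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])
vals, vecs = np.linalg.eig(A)
x = vecs[:, 1]  # an eigenvector of A with eigenvalue vals[1]

# A^2 x equals (lambda^2) x, so x is also an eigenvector of A^2
print(np.allclose(A @ A @ x, vals[1]**2 * x))  # True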

8. Eigenvectors must be nonzero vectors. True, by definition, eigenvectors must


be nonzero.
9. Two eigenvectors corresponding to the same eigenvalue are always linearly
dependent. False; an eigenspace can have dimension greater than one. For
example, e1 and e2 are both eigenvectors of the 2x2 identity matrix for the
eigenvalue 1, and they are linearly independent.

10. Similar matrices always have exactly the same eigenvalues. True; similar
matrices have the same characteristic polynomial and hence the same
eigenvalues, with the same multiplicities.
11. Similar matrices always have exactly the same eigenvectors. False; if A is a
diagonalizable 3x3 matrix, then A is similar to a diagonal matrix D whose
eigenvectors are the columns of I_3, but the eigenvectors of A are in general
entirely different.
12. The sum of two eigenvectors of a matrix A is also an eigenvector of A. False;
if a1 and a2 are eigenvectors for different eigenvalues lambda1 and lambda2,
then A(a1+a2) = lambda1*a1 + lambda2*a2, which is not a multiple of a1+a2.
The test case below shows this.
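A concrete numpy test case (my own choice of matrix):

import numpy as np

D = np.diag([1., 2.])
a1 = np.array([1., 0.])  # eigenvector for eigenvalue 1
a2 = np.array([0., 1.])  # eigenvector for eigenvalue 2
s = a1 + a2

# D s = (1, 2), which is not a scalar multiple of s = (1, 1)
print(D @ s, s)
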
13. The eigenvalues of an upper triangular matrix A are exactly the nonzero
entries on the diagonal of A. False; the eigenvalues of an upper triangular
matrix are all of its diagonal entries, and a diagonal entry may be zero.

14. The matrices A and A^T have the same eigenvalues, counting
multiplicities. True; they have the same characteristic polynomial, since
det(A^T - lambda*I) = det((A - lambda*I)^T) = det(A - lambda*I).

15. If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not
diagonalizable. False; let A be the 5x5 identity matrix. It has only one
distinct eigenvalue, 1, yet it is already diagonal and hence diagonalizable.

16. There exists a 2x2 matrix that has no eigenvectors in R^2. True; for example,
let A be the matrix that rotates vectors through pi/2 radians about the
origin. Then Ax is never a real multiple of x when x is nonzero, so A has no
real eigenvectors; its eigenvalues are the complex numbers i and -i, as the
check below shows.
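A numpy check on the rotation matrix:

import numpy as np

# Rotation through pi/2 radians about the origin
A = np.array([[0., -1.],
              [1.,  0.]])
vals, _ = np.linalg.eig(A)
print(vals)  # [0.+1.j 0.-1.j]: purely imaginary, so no real eigenvectors
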
17. If A is diagonalizable, then the columns of A are linearly independent. False;
if A is a diagonal matrix with a 0 on the diagonal, then A is diagonalizable
(it is already diagonal), but its columns are linearly dependent since
detA = 0.
18. A nonzero vector cannot correspond to two different eigenvalues of A. True;
suppose Ax = lambda1*x and Ax = lambda2*x for the same nonzero x.
Subtracting gives (lambda1 - lambda2)x = 0, and since x is nonzero,
lambda1 = lambda2.
19. A square matrix A is invertible if and only if there is a coordinate system in
which the transformation x |-> Ax is represented by a diagonal matrix. False;
let A be a singular matrix that is diagonalizable. Then the transformation
x |-> Ax is represented by a diagonal matrix relative to a coordinate system
determined by eigenvectors of A, yet A is not invertible.

20. If each vector ej in the standard basis for R^n is an eigenvector of A, then A
is a diagonal matrix. True; if Aej = dj*ej for each j, then the jth column of A
is dj*ej, so every off-diagonal entry of A is zero.
21. If A is similar to a diagonalizable matrix B, then A is also diagonalizable.
True; if B = PDP^-1 where D is a diagonal matrix, and A = QBQ^-1, then
A = Q(PDP^-1)Q^-1 = (QP)D(QP)^-1, which shows that A is diagonalizable.
22. If A and B are invertible nxn matrices, then AB is similar to BA. True; since B
is invertible, AB is similar to B(AB)B^-1, which equals BA. The check below
compares their eigenvalues.
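Similar matrices share eigenvalues, so a quick numpy comparison (arbitrary
random matrices) is a sanity check:

import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

ev_ab = np.sort_complex(np.linalg.eigvals(A @ B))
ev_ba = np.sort_complex(np.linalg.eigvals(B @ A))
print(np.allclose(ev_ab, ev_ba))  # True: AB and BA have the same eigenvalues
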
23. An nxn matrix with n linearly independent eigenvectors is invertible. False,
having n linearly independent eigenvectors makes an nxn matrix
diagonalizable but not necessarily invertible. One of the eigenvalues of the
matrix could be zero.

24. If A is an nxn diagonalizable matrix, then each vector in R^n can be written
as a linear combination of eigenvectors of A. True; if A is diagonalizable,
then by the Diagonalization Theorem A has n linearly independent
eigenvectors v1, ..., vn in R^n. By the Basis Theorem, {v1, ..., vn} is a basis
for R^n, so each vector in R^n can be written as a linear combination of
v1, ..., vn.
