hkus10
Everything posted by hkus10
-
Let P in Mnn be a nonsingular matrix and let L: Mnn -> Mnn be given by L(A) = P^-1 A P for all A in Mnn. Prove that L is an invertible linear operator. I have no clue how to start this question. What do I need to prove here, and why?
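For intuition, here is a small numerical check (just a sketch, assuming numpy is available; P and A are random examples) that the map B -> P B P^-1 undoes L on one example. Showing that this map really is L^-1, and that L is linear, is the actual exercise:

[code]
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((3, 3)) + 3 * np.eye(3)   # almost surely nonsingular
A = rng.random((3, 3))
P_inv = np.linalg.inv(P)

L_A = P_inv @ A @ P          # L(A) = P^-1 A P
undone = P @ L_A @ P_inv     # candidate inverse map applied to L(A)
print(np.allclose(undone, A))   # True
[/code]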
-
a) This is what I get: let n = lambda. Since n is an eigenvalue of L, Lx = nx. Since the transformation is invertible, (L^-1)Lx = (L^-1)(nx) ==> Ix = n(L^-1)x, where I is the identity operator. At this point, I want to divide both sides by n. However, how can I be sure n is not equal to zero? Thanks
-
What does the sentence want me to write down? The key issue is that I do not understand the question.
-
Let L: V -> V be an invertible linear operator and let lambda be an eigenvalue of L with associated eigenvector x. a) Show that 1/lambda is an eigenvalue of L^-1 with associated eigenvector x. For this question, the only things I know are that L is onto and one-to-one. How do I prove this? b) State the analogous statement for matrices. What does "state the analogous statement" mean? Thanks
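A small numerical illustration of the claim in part a) (a sketch, assuming numpy; the matrix A is just an example): the eigenvalues of the inverse of a matrix are the reciprocals of the eigenvalues of the matrix:

[code]
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])
print(np.linalg.eigvals(A))                 # eigenvalues 2 and 3
print(np.linalg.eigvals(np.linalg.inv(A)))  # eigenvalues 0.5 and 1/3, the reciprocals
[/code]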
-
1) (x - 10)[(x + 4)(x + 1) - 24] - 3[(-11x - 11) + 24] + 8[-21 + 3x]. What I get is (x - 10)(x^2 + 5x - 20) + 57x - 207. The reason I do not combine them is that I think it is much more difficult to deal with the x^3 term. What should I do here?
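If sympy is available, a quick symbolic check of what the fully expanded and factored cubic looks like (just a sketch to compare against hand work):

[code]
import sympy as sp

x = sp.symbols('x')
expr = (x - 10)*((x + 4)*(x + 1) - 24) - 3*((-11*x - 11) + 24) + 8*(-21 + 3*x)
print(sp.expand(expr))   # x**3 - 5*x**2 - 13*x - 7
print(sp.factor(expr))   # (x - 7)*(x + 1)**2
[/code]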
-
1) Let L: R3 -> R3 be defined by
L([1 0 0]) = [1 2 3],
L([0 1 0]) = [0 1 1],
L([0 0 1]) = [1 1 0].
How do I prove that L is invertible? I have the idea of one-to-one and onto, but I do not know how to apply them to this proof. (A quick numerical check is sketched at the end of this post.)

2) Find a linear transformation L: R2 -> R3 such that {[1 -1 2], [3 1 -1]} is a basis for range(L). How can I approach this problem?

3) Let S = {v1, ..., vn} be an ordered basis for a vector space V. Let L: V -> R^n be given by L(v) = [v]_S. Prove that L is an isomorphism (prove that the linear transformation is one-to-one and onto). What I know so far is that {[v1]_S, [v2]_S, ..., [vn]_S} is an ordered basis for R^n and that v can be written in a unique way as v = a1v1 + ... + anvn. How can I go from there?

4) If L: V -> W is a linear transformation of a vector space V into a vector space W and dim V = dim W, prove that if L is one-to-one, then it is onto. I get to the point that dim(W) = dim(range(L)). What I am trying to get to is W = range(L); then, by definition, L is onto. My question is: how can I prove that dim(W) = dim(range(L)) implies W = range(L), and by which theorem or definition? If that is not the right route, how should I approach this question?

5) Let L: R^n -> R^m be a linear transformation defined by L(x) = Ax for x in R^n. Then L is onto if and only if rank A = m. In this question, is the nullity of A equal to the nullity of L?

Thanks
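For question 1, the standard matrix of L has the three given images as its columns; a quick determinant check (a sketch, assuming numpy) shows that matrix is nonsingular:

[code]
import numpy as np

# columns are the images of the standard basis vectors under L
M = np.array([[1, 0, 1],
              [2, 1, 1],
              [3, 1, 0]], dtype=float)
print(np.linalg.det(M))   # about -2, nonzero
[/code]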
-
Let L: P2 -> P3 be the linear transformation defined by L(p(t)) = t^2 p'(t). (a) Find a basis for, and the dimension of, ker(L). (b) Find a basis for, and the dimension of, range(L). The hint I was given is to begin by finding an explicit formula for L by determining L(at^2 + bt + c). Does this hint mean to let p(t) = at^2 + bt + c? I then find that t^2 p'(t) = 2at^3 + bt^2. Is the basis for ker(L) {t, 1} and the basis for range(L) {t^3, t^2}? Thanks
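A quick symbolic check (a sketch, assuming sympy) of the formula for L(at^2 + bt + c) and of which coefficients are forced to zero when L(p) = 0:

[code]
import sympy as sp

a, b, c, t = sp.symbols('a b c t')
p = a*t**2 + b*t + c
Lp = sp.expand(t**2 * sp.diff(p, t))
print(Lp)                                                  # 2*a*t**3 + b*t**2
# L(p) = 0 forces the t**3 and t**2 coefficients to vanish (c is unconstrained)
print(sp.solve([Lp.coeff(t, 3), Lp.coeff(t, 2)], [a, b]))  # {a: 0, b: 0}
[/code]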
-
Let S = {w1, w2, ..., wn} be a set of n vectors in R^n and let A be the n x n matrix whose columns are the elements of S. Prove that for all b in R^n, Ax = b is consistent if and only if b belongs to span(S).

My approach is to prove both directions using the contrapositive. First, I prove that if Ax = b is consistent, then b belongs to span(S). Assume that Ax = b is inconsistent. Let x be [x1 x2 ... xn] (this is a column vector, i.e. x1, x2, ..., xn are stacked vertically; I cannot typeset it that way here). This means the last row of every vector in S is zero and the last entry of b is nonzero. Then b cannot be written as a linear combination of the vectors in S, since 0x1 + 0x2 + ... + 0xn = 0. Therefore, b does not belong to span(S).

For the other direction: assume b does not belong to span(S). Then b cannot be written as a linear combination of the vectors in S. If the last entry of b is nonzero, then the last row of every vector in S must be zero, so that b cannot be written as a linear combination of the vectors in S. By the matrix-vector product written in terms of columns, [v1 v2 ... vn]x is not equal to b. Thus, Ax = b is inconsistent.

My question is that this proof seems quite reasonable to me; however, am I really proving the statement? If not, how should I approach it instead? Thanks
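For what it may be worth, a small numerical illustration (a sketch, assuming numpy; the specific vectors are made up for the example) of the identity that usually drives this proof, namely that Ax is exactly the linear combination x1*w1 + ... + xn*wn of the columns of A:

[code]
import numpy as np

# columns of A are the vectors in S (example vectors, chosen arbitrarily)
w1, w2, w3 = np.array([1., 0., 2.]), np.array([0., 1., 1.]), np.array([3., 1., 0.])
A = np.column_stack([w1, w2, w3])
x = np.array([2., -1., 4.])

# A @ x equals x1*w1 + x2*w2 + x3*w3
print(np.allclose(A @ x, x[0]*w1 + x[1]*w2 + x[2]*w3))   # True
[/code]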
-
Let L: P1 -> P1 be a linear transformation for which we know that L(t + 1) = 2t + 3 and L(t - 1) = 3t - 2. a) Find L(6t - 4). I just want to check the way to calculate this. Is L(6t - 4) equal to 6*3t - 4*2 = 18t - 8? If not, how do I calculate it?
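A small sympy sketch (assuming sympy is available) of the standard approach: write 6t - 4 in terms of t + 1 and t - 1, then apply linearity:

[code]
import sympy as sp

a, b, t = sp.symbols('a b t')
# write 6t - 4 as a*(t + 1) + b*(t - 1): match coefficients of t and the constant
expr = sp.expand(a*(t + 1) + b*(t - 1) - (6*t - 4))
sol = sp.solve([expr.coeff(t, 1), expr.coeff(t, 0)], [a, b])
print(sol)                                              # {a: 1, b: 5}
# by linearity, L(6t - 4) = a*L(t + 1) + b*L(t - 1)
print(sp.expand(sol[a]*(2*t + 3) + sol[b]*(3*t - 2)))   # 17*t - 7
[/code]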
-
1) Let S be an ordered basis for an n-dimensional vector space V. Show that if {w1, w2, ..., wk} is a linearly independent set of vectors in V, then {[w1]_S, [w2]_S, ..., [wk]_S} is a linearly independent set of vectors in R^n. What I have so far: w1 = a1v1 + a2v2 + ... + anvn, so [w1]_S = [a1 a2 ... an], and the same for w2 and [w2]_S up to wk and [wk]_S. My question is how to go from there.

2) Let S and T be two ordered bases of an n-dimensional vector space V. Prove that the transition matrix from T-coordinates to S-coordinates is unique. That is, if A, B in Mnn both satisfy A[v]_T = [v]_S and B[v]_T = [v]_S for all v in V, then A = B. My approach for this question: let S = {v1, v2, ..., vn} and T = {w1, w2, ..., wn}, with
Av = a1v1 + a2v2 + ... + anvn,
v = b1w1 + b2w2 + ... + bnwn,
Aa1v1 + Aa2v2 + ... + Aanvn = a1(Av1) + a2(Av2) + ... + an(Avn) = b1w1 + b2w2 + ... + bn(wn).
Am I going in the right direction? If not, how should I approach it? If yes, how should I move on from here?
-
1) How do I show that if W is a subspace of a finite-dimensional vector space V, then W is finite-dimensional and dim W <= dim V? 2) How do I show that if W is a subspace of a finite-dimensional vector space V and dim W = dim V, then W = V? 3) How do I prove that the subspaces of R^3 are {0}, R^3 itself, and any line or plane passing through the origin? How should I approach these three questions? Thanks
-
Let V be the vector space of all real-valued continuous functions, so t, e^t, and sin(t) are in V. Are t, e^t, and sin(t) linearly independent in V? My answer is yes. However, how can I prove it? That is, what exactly do I have to show, or can I just cite the definition of linear independence?
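One common trick, sketched numerically here (assuming numpy; the sample points t = 0, 1, 2 are an arbitrary choice): if c1*t + c2*e^t + c3*sin(t) = 0 for every t, then it holds in particular at t = 0, 1, 2, and a nonsingular matrix of sampled values forces c1 = c2 = c3 = 0:

[code]
import numpy as np

ts = np.array([0.0, 1.0, 2.0])
# each row is (t, e^t, sin t) evaluated at one sample point
M = np.column_stack([ts, np.exp(ts), np.sin(ts)])
print(np.linalg.det(M))   # roughly 0.77, nonzero
[/code]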
-
Ap + A(su) + A(tv) = b
Ap + s(Au) + t(Av) = b
Ap + s(0) + t(0) = b
Ap = b
Is this correct?
-
Suppose that the solution set to a linear system Ax = b is a plane in R^n with vector equation x = p + su + tv, s, t ∈ R. Prove that p is a solution to the nonhomogeneous system Ax = b, and that u and v are both solutions to the homogeneous system Ax = 0. (Hint: try choices of s and t.) Should I start from A(p + su + tv) = b? If yes, what should I do from here? If no, where should I start?
-
1) Let u and v be nonzero vectors in a vector space V. Show that u and v are linearly dependent if and only if there is a scalar k such that v = ku. Equivalently, u and v are linearly independent if and only if neither vector is a multiple of the other. 2) Let S = {v1, v2, ..., vk} be a set of vectors in a vector space V. Prove that S is linearly dependent if and only if one of the vectors in S is a linear combination of all the other vectors in S. For these two questions, I know I have to prove them in both directions because of the "if and only if". However, how do I approach them, and which theorems or definitions should I use? 3) Let S = {v1, v2, ..., vk} be a set of vectors in a vector space V, and let W be a subspace of V containing S. Show that W contains span S. For question 3, does "W be a subspace of V containing S" mean that W contains S? If yes, what is the reason to show it?
-
Let T be the set of all matrices of the form AB - BA, where A and B are n x n matrices. Show that span T is not Mnn. 1) Does "span T is not Mnn" mean that Mnn does not span T? Thanks
-
Determine whether the given 2x2 matrix A belongs to span{A1, A2, A3}, where A1 = [1 -1 0 3], A2 = [1 1 0 2], A3 = [2 2 -1 1], and A = [5 1 -1 9] (each is a 2x2 matrix with its entries listed in order). Since A1, A2, and A3 are not n x 1 matrices, I cannot put this into reduced echelon form, can I? So what can I do to solve this problem? Thanks
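One way to see what is going on numerically (a sketch, assuming numpy, and assuming the four numbers list each 2x2 matrix entry by entry): flatten every matrix to a length-4 vector, and span membership becomes an ordinary linear system in the coefficients:

[code]
import numpy as np

a1 = np.array([1., -1., 0., 3.])
a2 = np.array([1., 1., 0., 2.])
a3 = np.array([2., 2., -1., 1.])
a  = np.array([5., 1., -1., 9.])

# solve c1*a1 + c2*a2 + c3*a3 = a in the least-squares sense
M = np.column_stack([a1, a2, a3])
c, residual, rank, _ = np.linalg.lstsq(M, a, rcond=None)
print(c)         # [2. 1. 1.]
print(residual)  # a (near) zero residual means the system is consistent
[/code]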
-
1) Let x0 be a fixed vector in a vector space V. Show that the set W consisting of all scalar multiples cx0 of x0 is a subspace of V. What techniques should I use to prove this? 2a) Show that a line l0 through the origin of R^n is a subspace of R^n. 2b) Show that a line l in R^n not passing through the origin is not a subspace of R^n. What techniques and direction should I use to solve these problems? Thanks
-
Is this the answer? aW_1 + bW_2 = [math] \begin{pmatrix}a & b & a+b\\ a & 0 & 0\end{pmatrix} [/math] where a, b can be any real number.
-
What I get is [math]\begin{pmatrix}a & b & 2c\\ a & 0 & 0\end{pmatrix}[/math] which is not [math]\begin{pmatrix}a & b & c\\ a & 0 & 0\end{pmatrix}[/math]
-
1.) The set W of all 2x3 matrices of the form [math]\begin{pmatrix}a & b & c\\ a & 0 & 0\end{pmatrix}[/math], where c = a + b, is a subspace of M23. Show that every vector in W is a linear combination of [math]W_1 = \begin{pmatrix}1 & 0 & 1\\ 1 & 0 & 0\end{pmatrix}[/math] and [math]W_2 = \begin{pmatrix}0 & 1 & 1\\ 0 & 0 & 0\end{pmatrix}[/math]. Do I have to combine both W1 and W2 into one equation?
-
1) Find a vector equation for the plane in R3 with scalar equation 2x − 3y + z = 5. First, I found three points on the plane and then used one of them as a fixed point to get two vectors lying in the plane from the other two points. Then I tried to check whether these two vectors are consistent with the normal vector of the plane by taking their cross product; however, the cross product of my two vectors does not give me the right normal vector. Is my approach wrong? If yes, what should I do?
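For comparison, here is that same approach carried out numerically (a sketch, assuming numpy; the three points are one convenient choice on 2x − 3y + z = 5). The cross product of the two in-plane vectors comes out parallel to (2, −3, 1):

[code]
import numpy as np

# three points on 2x - 3y + z = 5
p1 = np.array([2.5, 0.0, 0.0])
p2 = np.array([0.0, -5.0/3.0, 0.0])
p3 = np.array([0.0, 0.0, 5.0])

u, v = p2 - p1, p3 - p1   # two direction vectors in the plane
n = np.cross(u, v)        # should be parallel to the normal (2, -3, 1)
print(n / n[2])           # [ 2. -3.  1.]
[/code]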
-
1) If det(AB) = 0, is det(A) or det(B) = 0? Give reasons for your answer. Q1) First, can't both det(A) and det(B) be 0? If they can, is the statement false? In any case, how can I prove this in general? I only know how to find an example showing it is true, which cannot cover all the possibilities. (See the sketch after this post.) 2) Show that if A is singular and Ax = b, with b not equal to 0, has one solution, then it has infinitely many. Q2) How should I approach this question? 3) Let A^2 = A. Prove that either A is singular or det(A) = 1. Q3) How can I approach this question?
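The property that usually matters for 1) and 3) is that determinants are multiplicative, det(AB) = det(A)·det(B). A tiny numerical illustration (a sketch, assuming numpy; A and B are random examples):

[code]
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.random((3, 3)), rng.random((3, 3))
# determinants are multiplicative: det(AB) = det(A) * det(B)
print(np.linalg.det(A @ B))
print(np.linalg.det(A) * np.linalg.det(B))   # same value up to rounding
[/code]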
-
1) Find an equation relating a, b, and c so that the linear system
2x + 2y + 3z = a
3x - y + 5z = b
x - 3y + 2z = c
is consistent for any values of a, b, and c that satisfy that equation. What is the method for solving this problem? (A sketch of the elimination with symbolic a, b, c is below.)
2) In the following linear system, determine all values of a for which the resulting system has a) no solution; b) a unique solution; c) infinitely many solutions:
x + y - z = 2
x + 2y + z = 3
x + y + (a^2 - 5)z = a
For these two questions: do I put the augmented matrix into reduced echelon form first? If yes, how do I do that with the variables a, b, and c in it? If not, what is the right approach? Thanks
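For question 1, here is one way the elimination can be carried out with a symbolic right-hand side (a sketch, assuming sympy). It only shows the elimination step: use the third row to clear x from the first two rows, then compare them:

[code]
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[2,  2, 3, a],
               [3, -1, 5, b],
               [1, -3, 2, c]])

r1 = M.row(0) - 2*M.row(2)   # [0, 8, -1, a - 2*c]
r2 = M.row(1) - 3*M.row(2)   # [0, 8, -1, b - 3*c]
# the left-hand sides now match, so consistency forces the right-hand sides to match
print(sp.simplify((r1 - r2)[3]))   # a - b + c, i.e. we need a - b + c = 0
[/code]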
-
1) Let A be an n x n matrix. Prove that if Ax = 0 for all n x 1 matrices x, then A = O. Can you show me the steps for solving this problem? Please!
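For intuition only, a tiny numerical illustration (a sketch, assuming numpy; A is an arbitrary example) of what A times a standard basis vector looks like, which is the usual choice of "special" x in this kind of proof:

[code]
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
e1, e2 = np.array([1., 0.]), np.array([0., 1.])
# multiplying by a standard basis vector picks out a column of A
print(A @ e1)   # [1. 3.]  (first column)
print(A @ e2)   # [2. 4.]  (second column)
[/code]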