jebz123 Posted March 4, 2010 If v1 = u1 + u2 + u3, v2 = u1 + a*u2, and v3 = u2 + b*u3, where u1, u2, and u3 are given linearly independent vectors, find the condition that must be satisfied by a and b in order to ensure that v1, v2, and v3 are linearly independent.
Cap'n Refsmmat Posted March 4, 2010 Okay. How would you start this? We're not going to just give you the answer. How would you start, and where do you trip up?
shif Posted April 1, 2010 u1, u2 and u3 form a basis of the linear space they span (they are linearly independent and span it by definition). Write v1, v2 and v3 as the rows of a matrix with respect to the basis {u1, u2, u3} and you get

[ 1  1  1 ]
[ 1  a  0 ]
[ 0  1  b ]

They will be linearly dependent if and only if Gaussian elimination produces a zero row, which happens if and only if b(a-1) + 1 = 0 (if I am not mistaken). So v1, v2 and v3 are linearly independent exactly when b(a-1) + 1 ≠ 0.
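For anyone who wants to double-check shif's condition, here is a quick symbolic verification, a sketch in Python with SymPy (not part of the original thread; the variable names are just for illustration). It builds the coordinate matrix and confirms that its determinant is b(a-1) + 1, so the vectors are dependent exactly when that expression vanishes:

import sympy as sp

a, b = sp.symbols('a b')

# Rows are v1, v2, v3 written in the basis {u1, u2, u3}.
M = sp.Matrix([
    [1, 1, 1],   # v1 = u1 + u2 + u3
    [1, a, 0],   # v2 = u1 + a*u2
    [0, 1, b],   # v3 = u2 + b*u3
])

# v1, v2, v3 are linearly dependent iff det(M) = 0.
det = sp.expand(M.det())
print(det)                  # prints: a*b - b + 1
print(sp.factor(det - 1))   # prints: b*(a - 1), so det = b*(a - 1) + 1

This agrees with the Gaussian-elimination argument: a zero row appears under row reduction exactly when the determinant is zero.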