Sarahisme Posted March 19, 2006

Hey everyone, I am having a bit of trouble with this question (it concerns f(t) = t^2 · |t| and g(t) = t^3 on -1 < t < 1). How do I show that they are linearly dependent? I tried using the Wronskian, but that doesn't work here, does it? (Just because the Wronskian is 0 on the interval doesn't necessarily mean that the two functions are dependent, right?)

This is what I have done so far:

(a) For 0 < t < 1, |t| = t, so f(t) = t^2 · t = t^3 = g(t); therefore f(t) and g(t) are linearly dependent on 0 < t < 1.

For -1 < t < 0, |t| = -t, so f(t) = t^2 · (-t) = -t^3 = -g(t); therefore f(t) and g(t) are linearly dependent on -1 < t < 0.

On -1 < t < 1, f(t) is a different multiple of g(t) on 0 < t < 1 than on -1 < t < 0, so f(t) and g(t) are linearly independent on -1 < t < 1.

(b) I get W(f,g) = t^6 / |t|, which is obviously not zero for all -1 < t < 1?? :S

Does that make sense to you guys? Because I am not so sure it makes sense to me! Any help would be lovely.

-Sarah
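As a quick numerical illustration of the part (a) argument: the ratio f(t)/g(t) is +1 for positive t and -1 for negative t, so no single constant c satisfies f(t) = c · g(t) on all of -1 < t < 1. A minimal sketch in Python, assuming f(t) = t^2 · |t| and g(t) = t^3 as in the post:

    # f and g are the functions from the thread: f(t) = t^2*|t|, g(t) = t^3.
    def f(t):
        return t**2 * abs(t)

    def g(t):
        return t**3

    # The ratio f/g is -1.0 at the negative sample points and +1.0 at the
    # positive ones, so f is not one fixed constant multiple of g on (-1, 1).
    for t in (-0.5, -0.1, 0.1, 0.5):
        print(t, f(t) / g(t))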
Tom Mattson Posted March 21, 2006

Quote: "How do I show that they are linearly dependent?"

You've already done it.

Quote: "I tried using the Wronskian, but that doesn't work here, does it? (Just because the Wronskian is 0 on the interval doesn't necessarily mean that the two functions are dependent, right?)"

That's right.

Quote: "(a) For 0 < t < 1, |t| = t, so f(t) = t^2 · t = t^3 = g(t) ... For -1 < t < 0, |t| = -t, so f(t) = t^2 · (-t) = -t^3 = -g(t) ..."

That's just what I would have done.

Quote: "(b) I get W(f,g) = t^6 / |t|, which is obviously not zero for all -1 < t < 1??"

How did you get that for [imath]W(f,g)[/imath]? The Wronskian clearly vanishes on [imath]-1<t<0[/imath] and on [imath]0<t<1[/imath], since on each of those intervals [imath]f[/imath] is a constant multiple of [imath]g[/imath]. It vanishes at [imath]t=0[/imath] as well: the difference quotient [imath]f(t)/t = t|t|[/imath] tends to 0, so [imath]f[/imath] is differentiable there with [imath]f'(0)=0[/imath], and together with [imath]f(0)=g(0)=g'(0)=0[/imath] this gives [imath]W(f,g)(0)=0[/imath]. So the problem statement is right that [imath]W(f,g)\equiv 0[/imath] everywhere in [imath][-1,1][/imath], even though [imath]f[/imath] and [imath]g[/imath] are linearly independent there. That is exactly the point of the exercise: a vanishing Wronskian does not imply linear dependence.
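For part (b), a small symbolic check is possible as well. Here is a sketch using SymPy, with f written piecewise so the derivative is unambiguous on each side of 0; the expected output is 0, matching the claim that the Wronskian vanishes throughout the interval:

    import sympy as sp

    t = sp.symbols('t', real=True)

    # f(t) = t^2*|t| written piecewise, and g(t) = t^3.
    f = sp.Piecewise((-t**3, t < 0), (t**3, True))
    g = t**3

    # Wronskian W(f, g) = f*g' - f'*g, folded into a single piecewise
    # expression and simplified; both pieces cancel to 0.
    W = sp.simplify(sp.piecewise_fold(f * sp.diff(g, t) - sp.diff(f, t) * g))
    print(W)  # expected: 0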