Primarygun Posted August 10, 2006 For the sequence a_n = a_{n-1} + a_{n-2}, we first assume that a_n has the general form b^n. We then substitute it into the recurrence and get a quadratic equation. It would be fine with me if the final solution for a_n were of that form. But normally, if there exist two different roots, the final solution is actually of the form C b^n + D f^n, where C and D are arbitrary constants. Why do we assume the "incorrect" form yet get the correct answer? Any comments, please!
matt grime Posted August 11, 2006 We do not assume that the solution is b^n at all. We assume that the solution is of the form c_1*b_1^n + c_2*b_2^n: the equation is linear. Then, from this knowledge, we know that b must satisfy a quadratic equation based on the information given. All you're doing is taking sloppy shorthand as absolute truth.
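The argument above can be checked numerically. The sketch below uses the hypothetical initial conditions a_0 = 0 and a_1 = 1 (the Fibonacci case, chosen purely for illustration): the roots b_1, b_2 come from the characteristic quadratic b^2 = b + 1, the constants C, D are fitted to the initial conditions, and the closed form is compared against the recurrence computed directly.

```python
import math

# Roots of the characteristic equation b^2 = b + 1, obtained by
# substituting a_n = b^n into a_n = a_{n-1} + a_{n-2} and dividing by b^(n-2).
b1 = (1 + math.sqrt(5)) / 2
b2 = (1 - math.sqrt(5)) / 2

# Fit C + D = a_0 and C*b1 + D*b2 = a_1 for a_0 = 0, a_1 = 1
# (these initial conditions are an assumption for the demo):
C = 1 / math.sqrt(5)
D = -C

def closed_form(n):
    """General solution C*b1^n + D*b2^n of the linear recurrence."""
    return C * b1**n + D * b2**n

# Compute the sequence directly from the recurrence for comparison.
a = [0, 1]
for n in range(2, 20):
    a.append(a[n - 1] + a[n - 2])

# The closed form agrees with the recurrence at every index (up to
# floating-point rounding).
for n in range(20):
    assert abs(closed_form(n) - a[n]) < 1e-9
```

Note that neither b_1^n nor b_2^n alone matches the initial conditions here; only the linear combination does, which is exactly matt grime's point.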
Primarygun Posted August 11, 2006 Author Thanks for your help first. Here I have cut a short passage from a lecture at a local university. I highlighted the part which shows that the proof assumes the "incorrect" form. How can that assumption lead to the more complicated (general) solution?
matt grime Posted August 11, 2006 It says *may* have solutions of the form, not *does* have. In any case, it is badly written, but that is not uncommon in textbooks (for instance, it states that the solution in the degree-one case is a^n x_0, and then immediately asserts that x_n = a^n is a solution; it is not, unless x_0 = 1).
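The degree-one slip is easy to verify. A minimal check, with arbitrary illustrative values a = 3 and x_0 = 5 (my choice, not from the lecture notes): iterating x_{n+1} = a * x_n gives x_n = a^n * x_0, which equals a^n only when x_0 = 1.

```python
a, x0 = 3, 5  # arbitrary values for illustration

x = x0
for n in range(1, 8):
    x = a * x                 # one step of x_{n+1} = a * x_n
    assert x == a**n * x0     # the true solution is a^n * x_0 ...
    assert x != a**n          # ... not a^n, since x_0 != 1 here
```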
Primarygun Posted August 12, 2006 Author OK. Thanks for telling me that the proof is poorly written. I am going to look for a better one. Do you know where I can find one?