Hey guys,
I'm a bit new to this forum and this is my first post (I used to just read the threads that came up here). I hope it's not inappropriate to ask something right away, because that is exactly what I'm about to do. I'm studying applied mathematics, and at the moment I'm working on differential equations. I've run into something I couldn't find an answer to in my book or anywhere else on the internet. In my textbook (Martin Braun, Differential Equations and Their Applications), I stumbled onto the following question:
Show that the solution y(t) of the given initial-value problem exists on the specified interval:
y' = y^2 + cos(t^2), y(0) = 0; on the interval 0 <= t <= 1/2.
The existence theorem on this subject tells me that I need a rectangle [t_0, t_0 + a] x [y_0 - b, y_0 + b] to be able to apply it. And that's exactly my problem here: I can't construct a proper rectangle, because no bound |y(t)| <= b is specified.
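For reference, here is the theorem as I understand it from Braun (so correct me if I'm misquoting it): if f(t, y) is continuous on the rectangle

$$
R:\quad t_0 \le t \le t_0 + a, \qquad |y - y_0| \le b,
$$

and M is the maximum of |f(t, y)| on R, then the solution is guaranteed to exist on t_0 <= t <= t_0 + \alpha, where

$$
\alpha = \min\!\left(a, \frac{b}{M}\right).
$$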
So my question is: how do I apply the existence theorem to an initial-value problem when the specified interval has boundaries for t, but not for y? (If I just use my own brain, I'd say take |y| <= \infty, but I can't justify that.)
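To show what I mean, here is how far I got by just picking some b > 0 myself and bounding f on the resulting rectangle (I might be making a mistake here):

$$
|f(t, y)| = |y^2 + \cos(t^2)| \le b^2 + 1 \quad \text{on } R = [0, a] \times [-b, b],
\qquad\text{so}\qquad
\alpha \ge \min\!\left(a, \frac{b}{b^2 + 1}\right).
$$

But I don't see how to choose b so that this interval actually reaches 1/2, or whether picking b myself is even allowed.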
Could anyone point me in the right direction or give me a helpful answer? That would be great!
x tymo