
Posted

Sure.

 

You should know by now about square roots, and how [math]\sqrt{4} = 2[/math] because [math]2^2 = 4[/math].

 

The trouble occurs when you try to do [math]\sqrt{-4}[/math]. There's nothing that, when squared, will give you negative four. Try it. Take any negative number and multiply it by itself. You'll get a positive number -- but -4 is negative.

 

That's where imaginary numbers come in. It's sometimes useful to take the square root of a negative number, even though there is no actual square root. So we make up a number called [math]i[/math]. To define it the most simply (there are better ways to define it), [math]i = \sqrt{-1}[/math]. Of course, [math]i[/math] doesn't actually exist, so it's imaginary. We just use it because it's helpful, not because it represents something real.

 

So then what's the square root of -4? It's 2i. When you're faced with an expression like this:

[math]\sqrt{-16}[/math]

you can take the - sign out as an i:

[math]i \times \sqrt{16}[/math]

which becomes

[math]4i[/math]

when you take the square root of 16.
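If you want to play with this, Python's standard cmath module handles complex square roots; a minimal sketch (note Python writes the imaginary unit as j rather than i):

```python
import cmath

# Square roots of negative numbers come out imaginary:
z = cmath.sqrt(-16)
print(z)        # Python prints the imaginary unit as "j"

# Check it against the definition: squaring should give back -16.
print(z ** 2)
```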

 

Complex numbers can also be things like [math]4i + 3[/math] or anything with that goofy letter i in it. Just remember that i represents [math]\sqrt{-1}[/math].

Posted
sorry i wrote it twice...

 

Thanks for that it seems so much easier to understand now...

 

Am i allowed to ask more questions?

 

Please do! Even I'm learning from the responses you receive. :D

 

You might, however, now that you have some of the basics, research a little on your own, then present your question like,

 

"I think that blah blah blah, but I'm not sure about this specific thing. Does blah happen this way or does blah happen that way? Thanks for your help."

 

 

Well, you get the point. ;) Enjoy.

Posted

If imaginary numbers don't exist, does that mean I just imagined I saw them in the Schrodinger equation >:D

 

Some people think math isn't complex enough unless you have imaginary numbers too :eek:

 

Sorry, I couldn't help myself.

Posted

polynomials...

 

I have no idea what they are used for and what the point of them is..

please help me, it is appreciated every time someone helps...

Posted

A polynomial is just a fancy name for an equation like [math]h = 3 + 4t - 4.9t^2[/math], which has three ("poly" means many) parts: 3, 4t, and [math]-4.9t^2[/math]. That, by the way, is the equation that could tell you the height of a ball thrown upwards at 4 meters per second from a height of three meters. Polynomials can be used for just about anything; that's just one example.
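Here's a minimal Python sketch of that ball's height polynomial (the gravity term uses half the gravitational acceleration, [math]\frac{1}{2} \cdot 9.8 = 4.9[/math]):

```python
# Height (in meters) of a ball thrown upward at 4 m/s from a height of 3 m.
# The t**2 coefficient is -(1/2)*9.8, half the gravitational acceleration.
def height(t):
    return 3 + 4 * t - 4.9 * t ** 2

print(height(0.0))   # starting height: 3.0
print(height(0.5))   # a bit later, near the top of the arc
```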

Posted

Many things. A polynomial is just a type of equation, so you'd be better off asking what an equation might be used for.

 

Physics, for example, is all about equations. You can predict how fast something will be moving at a certain time, how high it will be, what direction it will be going, etc. and a lot of that uses polynomials such as [math]x = x_0 + v_0t + \frac{1}{2}at^2[/math] and other fun equations.
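As a quick sketch, that kinematics equation is just a polynomial in t that you can evaluate directly (illustrated with the thrown-ball numbers from above):

```python
# x = x0 + v0*t + (1/2)*a*t**2, a polynomial in t.
def position(x0, v0, a, t):
    return x0 + v0 * t + 0.5 * a * t ** 2

# Ball thrown up at 4 m/s from 3 m, with gravity a = -9.8 m/s^2:
print(position(3.0, 4.0, -9.8, 1.0))   # height after one second
```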

Posted

An equation is a broader concept, not as limited in scope. It could be a=b, or 1+1=2, or 2*3=6; these are all equations. A polynomial, however, is much more narrowly defined, with a very specific set of rules on how it's composed... like x = 2y+4 or the examples above.

 

When I read your question, it sounded like "what's the difference between a helicopter and a motorized method of transportation?" You see? A motorized method of transportation could be anything -- a car, a boat, a motorcycle, a plane -- whereas a helicopter is a very specific type.

Posted

yes. polynomials are a subset of equations. just like a poodle is a subset of 'dogs'.

 

though i happen to think that polynomials are infinitely less daft looking than poodles.

Posted

Small remark: Strictly speaking, saying that a polynomial is an equation is wrong. "x²-3x" is a polynomial (in x) but certainly not an equation (because there is no "=" sign in there). I think you should substitute "polynomial" with "polynomial equation" or "polynomial function" in many sentences above.

Posted

So, here's something that's puzzled me off and on for a while. Suppose [math]1x+2x^2+3x^3[/math] is a polynomial of degree 3.

 

What is [math]0x+1x^2+2x^3 = x^2+2x^3[/math]? Is this a polynomial of degree 3?

 

Then how about [math]0x^2+1x^3= x^3[/math]? What would we call that? Is it still technically a polynomial of degree 3?

 

Or would one better argue backwards: any expression can be written as a polynomial of some degree? If so (and I'm not suggesting it is so) isn't it rather the case that some polynomial of some degree is a generalization of any expression?

Posted
So, here's something that's puzzled me off and on for a while. Suppose [math]1x+2x^2+3x^3[/math] is a polynomial of degree 3.

Ok.

 

What is [math]0x+1x^2+2x^3 = x^2+2x^3[/math]? Is this a polynomial of degree 3?

Yes, highest exponent (with a non-zero prefactor) counts.

 

Then how about [math]0x^2+1x^3= x^3[/math]? What would we call that? Is it still technically a polynomial of degree 3?

Yes, highest exponent (with a non-zero prefactor) counts.

 

Or would one better argue backwards: any expression can be written as a polynomial of some degree?

Not necessarily, e.g.: sin(x), exp(x), log(x), [math]\sqrt{x}[/math], [math]\Theta (x) = \left\{ \begin{array}{rcl} 0&:&x<0 \\ 1&:&x\geq 0 \end{array}\right.[/math].

 

If so (and I'm not suggesting it is so) isn't it rather the case that some polynomial of some degree is a generalization of any expression?

I don't think so, but some scientists are rather unscrupulous about approximating functions with polynomials ("sin(x) = x" for small values of x is a common one).

Posted
Not necessarily, e.g.: sin(x), exp(x), log(x), [math]\sqrt{x}[/math], [math]\Theta (x) = \left\{ \begin{array}{rcl} 0&:&x<0 \\ 1&:&x\geq 0 \end{array}\right.[/math].
Thanks for that. But the first two on your list can be expressed as a Taylor series (I'm not too sure about the others). These (the Taylors, that is) are polynomials, surely?
Posted
Thanks for that. But the first two on your list can be expressed as a Taylor series (I'm not too sure about the others). These (the Taylors, that is) are polynomials, surely?

 

A Taylor series is a polynomial, but it is also an approximation, not the actual function.

Posted
A Taylor series is a polynomial, but it is also an approximation, not the actual function.
Lost me there! f is a function on, say, X, which evaluated at some x in X yields the expression f(x). f(x) is not a function, it's an evaluation of x in the codomain of the function f

 

sin(x) is the evaluation of the function sine at some x in X and so on.

 

So what do you mean by ".....not the actual function"?

Posted

It's an approximation. The full series is an infinitely long sum; if you were to sum to infinity it would give the same answer as sin(x), but normally people only take the first few terms of the expansion. In fact, it's quite common with sin(x) to take only the first term, which is just "x"; that is a perfectly valid Taylor expansion of sin(x), but it is only valid for small angles. This is so common it is referred to as the small-angle (or paraxial) approximation...
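A small Python sketch of this, comparing partial sums of the Taylor series for sin(x) against the real thing:

```python
import math

def sin_taylor(x, n_terms):
    # Partial sum of sin(x) = x - x**3/3! + x**5/5! - ...
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

x = 0.1                      # a "small angle"
print(sin_taylor(x, 1))      # first term only: just x
print(math.sin(x))           # very close to 0.1 for small angles
print(sin_taylor(1.0, 10))   # more terms: agrees with math.sin(1.0) closely
```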

Posted
So what makes the difference between a polynomial and a regular equation?
Regarding what Atheist said, let's pretend you asked the difference between a polynomial and any other expression. The answer being that a polynomial is a sum of non-negative integer powers* of x, each multiplied by a coefficient**. So [math]ax^3+bx^2+cx^1+dx^0[/math] is a polynomial but [math](ax^3 + bx^{3.6})/cx^1[/math] isn't.

 

* This includes [math]x^1[/math], which is usually written as x, and [math]x^0[/math], which is always 1 so rarely written at all.

** Atheist seems to call coefficients prefactors, as far as I can tell we're talking about the same thing.
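In code terms, a polynomial is fully described by its list of coefficients; here's a minimal sketch, storing the coefficient of x**k at index k:

```python
def poly_eval(coeffs, x):
    # coeffs[k] is the coefficient of x**k; only non-negative integer
    # powers appear, which is exactly what makes this a polynomial.
    return sum(c * x ** k for k, c in enumerate(coeffs))

# d*x**0 + c*x**1 + b*x**2 + a*x**3 with (d, c, b, a) = (1, 2, 3, 4):
print(poly_eval([1, 2, 3, 4], 2))   # 1 + 4 + 12 + 32 = 49
```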

Posted

Missed most of this conversation now, but I thought I'd make a few comments. Generally, when talking about polynomials, we are referring to elements of the space

 

[math]\mathbb{R}[X] = \left\{ \left. \sum_{k=0}^n a_k X^k \ \right| n \in \mathbb{N}, \ a_k \in \mathbb{R}\right\}[/math]

 

Notice that this is a far more general definition of a polynomial: here, X could be a real number, or it could be a function, matrix, etc. So in a sense, the summation notation is incorrect - elements of this space are just finite sequences of real numbers. Indeed, to vaguely refer back to a point that the tree made above, if one assumes [imath]X[/imath] is some non-zero real number [imath]1/x[/imath], this ring gives you finite series of the form

 

[math]\sum_{k=0}^n \frac{a_k}{x^k} \in \mathbb{R}[x^{-1}][/math]

 

(In fact, if you define the space of formal power series [imath]\mathbb{R}[[x]][/imath] by removing the restriction on [imath]n[/imath] being finite and ignore the fact that these series do not always converge, you can prove some cool things in terms of asymptotic relations and Laplace transforms).

 

Defining the degree of a polynomial is then rather trivial. You can define the function [imath]\text{deg} : \mathbb{R}[X] \to \mathbb{N}[/imath] by [imath]\text{deg}(\sum_{k=0}^n a_k X^k) = n[/imath] (taking [imath]a_n \neq 0[/imath]).
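That degree function is easy to sketch over coefficient lists, using the rule from earlier in the thread that the highest power with a nonzero coefficient counts:

```python
def degree(coeffs):
    # coeffs[k] plays the role of a_k in the sum above.
    nonzero = [k for k, a in enumerate(coeffs) if a != 0]
    return max(nonzero) if nonzero else 0   # one common convention for deg(0)

print(degree([0, 1, 2, 3]))   # 1x + 2x^2 + 3x^3       -> 3
print(degree([0, 0, 1, 2]))   # 0x + 1x^2 + 2x^3       -> 3
print(degree([0, 0, 0, 1]))   # 0x^2 + 1x^3            -> 3
```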

 

In regards to Taylor/Maclaurin series, Klaynos is quite correct in saying that a smooth function cannot be written as a polynomial (which are deemed to be of finite order). However, you can write them as the limiting sequence of polynomials. Essentially, Taylor's theorem says that for any n+1 times differentiable function [imath]f : [a,x] \to \mathbb{R}[/imath], the following expansion is possible:

 

[math]f(x) = \sum_{k=0}^n \frac{f^{(k)}(a)}{k!}(x-a)^k + R_n(x)[/math]

 

where the 'remainder' [imath]R_n(x) = O((x-a)^{n+1})[/imath]. For smooth (infinitely differentiable) functions, one can prove that [imath]R_n(x) \to 0[/imath] as [imath]n \to \infty[/imath].
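A numerical sketch of the remainder shrinking, using exp expanded around a = 0:

```python
import math

def exp_taylor(x, n):
    # Taylor polynomial of order n for exp around 0.
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x = 1.0
for n in (2, 5, 10):
    # |R_n(x)| = |exp(x) - Taylor polynomial of order n|; it shrinks with n.
    print(n, abs(exp_taylor(x, n) - math.exp(x)))
```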

Posted
Thanks for that. But the first two on your list can be expressed as a Taylor series (I'm not too sure about the others). These (the Taylors, that is) are polynomials, surely?

 

No, a Taylor series is NOT a polynomial; it is an infinite series. A polynomial, by definition, has a "highest power" (its degree) and is a finite sum. If you "chop off" a Taylor series at, say, the nth power, you get a "Taylor polynomial", which is the approximation Klaynos was talking about.

 

 

For an analytic function, its Taylor series is exactly equal to it, not an approximation. There are, however, "smooth" (infinitely differentiable) functions that are NOT analytic and not equal to their Taylor series. One such is [math]f(x) = e^{-1/x^2}[/math] if x is not 0, and f(0) = 0. All derivatives exist and are all 0 at x = 0, so the Taylor series for it around x = 0 is identically 0. That series, of course, converges for all x but converges to f only at x = 0.
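That example is easy to poke at numerically; a quick sketch:

```python
import math

def f(x):
    # Smooth everywhere, but NOT analytic at 0: every derivative at 0 is 0,
    # so its Taylor series around 0 is identically zero.
    return math.exp(-1.0 / x ** 2) if x != 0 else 0.0

print(f(0))     # 0.0 -- agrees with the (all-zero) Taylor series
print(f(0.5))   # positive, so the Taylor series gets it wrong away from 0
```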
