
Posted

Well, the original point I was trying to make was how frustrating it can be for individuals to accept something which is counterintuitive.

 

Intuitively, I think it's very hard for people to accept that .999... = 1 because we cannot intuitively represent an infinitely repeating decimal to ourselves. We know .9 != 1, .99 != 1, .999 != 1, etc. so it seems like no matter how many times you do that you will never reach 1. And clearly, the entire concept of infinity is counterintuitive, because we spend our entire lives interacting with the finite.

 

Reading through that thread, that's all I see. People try to represent infinity to themselves as a sort of number because that's the intuitive approach, and rather than being infinity, it becomes a very large quantity, to the point that you can have "an infinite number of zeros with a 1 on the end" or some other such nonsense which can be added to .999... to equal 1.

 

I'm just amazed at how frustrating counterintuitive concepts can be for people. All that frustration would be abated if they simply accepted the repeated proofs and their inability to create counterproofs.

 

But really, I think what it comes down to is the symbols interfering with the interpretation...


Posted
Okay how about if it were written like this?

 

[math]1.\ddot{0}-0.\ddot{9}=1^{-\infty}[/math]

Well that's all well and dandy (not right), but guess what [math]1^{-\infty}[/math] is equal to?

 

Does it really matter?
Yes, maths sort of breaks down when 1 doesn't equal 1.
For human uses....
This is a maths board.
no matter how precise we needed to make a calculation

In pure maths it's fairly common to express numbers in such a fashion that there is no need for rounding, or simply not to round them at all. But more to the point, any calculation whose answer is [math]0.\ddot{9}[/math] will be displayed as 1 on any calculator. Try it: type 3 times ( 1 over 3 ) into your calculator and see what happens.
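The calculator experiment can be sketched in a couple of lines of Python (an illustration added here, not part of the original post; the float behaviour assumes IEEE-754 doubles, which is what Python uses):

```python
from fractions import Fraction

# Floating point: 1/3 rounds to the nearest double, and multiplying
# by 3 happens to round the product back to exactly 1.0.
print(3 * (1 / 3))  # 1.0

# Exact rational arithmetic agrees, with no rounding at all.
print(Fraction(1, 3) * 3)  # 1
```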
Posted
The moment one declares that they are an expert, one must begin to question.

I did not declare myself an expert; I am not an expert in mathematics, no matter what any silly badge says.

 

 

 

Two hours ago, if someone had told me that 0.9999...=1.000... I would have told them they were nuts. Now, while I may not fully see how .9999....=1, I do see how 1.0000.... - 0.9999.... = 0.0000.... Still, the best I can grasp is that the difference between 1.0000.... and 0.9999..... is infinitesimally small, but it is not 0.

 

If you accept that the difference between these two strings is 'infinitesimally small', then, if we are treating them as representations of real numbers, that means *by definition* that they are equal: the only real infinitesimal is 0.

 

Ok, you've just proved that they are equal real numbers even though you don't realize or accept this, because you do not know what the real numbers are (and this is not an insult: I do not know the chemical formula for acetone; however, I do know what methane is (hopefully), and if I were to say to you that I've found this chemical CH_4 and refused to acknowledge it is methane even after everyone pointed out the correct formula and so on, you'd be right to think ill of me for my obstinate refusal to learn a definition. These are just labels in maths too).

Posted
Well, the original point I was trying to make was how frustrating it can be for individuals to accept something which is counterintuitive.

Well I think this thread is proving your point.

 

Intuitively, I think it's very hard for people to accept that .999... = 1 because we cannot intuitively represent an infinitely repeating decimal to ourselves. We know .9 != 1, .99 != 1, .999 != 1, etc. so it seems like no matter how many times you do that you will never reach 1. And clearly, the entire concept of infinity is counterintuitive, because we spend our entire lives interacting with the finite.

This is the problem I'm having a hard time overcoming. My poor finite mind is really tying itself in knots trying to grasp this.

 

I'm just amazed at how frustrating counterintuitive concepts can be for people. All that frustration would be abated if they simply accepted the repeated proofs and their inability to create counterproofs.

Sorry, but this runs against my inclination not to accept something as fact without questioning. I also think that we are better off having mind-numbing discussions trying to prove to someone like me that 0.99999.... = 1.000.... than having people blindly accept what "experts" tell them without question.

Posted

Let's put it another way: you are free to play around with infinite strings of decimal digits and declare that 0.999... and 1 are strictly different elements of this space of strings of decimal digits. That's fine, and no one would ever say that 0.999.... and 1 are the same string.

 

But these things, whatever they are, cannot then be a model for the real numbers (in any natural sense).

 

Equally, I am perfectly free to play around with the symbols a/b for a and b integers, and do things like a/b + c/d = (a+c)/(b+d), or even a/b + 1 = b/a, but then what I've done is not give a model of the rational numbers with + meaning its usual thing.

 

It is just a definition you're now arguing against. You can philosophize about that but it ain't really a mathematical problem now.

Posted
I did not declare myself an expert; I am not an expert in mathematics, no matter what any silly badge says.

My point was simply that we must always question everything. ;)

 

If you accept that the difference between these two strings is 'infinitesimally small', then, if we are treating them as representations of real numbers, that means *by definition* that they are equal: the only real infinitesimal is 0.

I'm going to go back to my finite mind, but to me an infinitesimally small number is approaching but never quite reaching zero. Simply put I'm stuck on Bascule's example that no matter how far one carries it out, when one stops one always ends up with a difference of 1^-n.

 

Ok, you've just proved that they are equal real numbers even though you don't realize or accept this, because you do not know what the real numbers are (and this is not an insult,...

I'm not going to take it as an insult. I mean after all my math skills were molded in the U.S. education system. ;)

 

My purpose of participating in these forums is to learn and grow, I can't do either without stretching my mind and this thread is really stretching my mind.

Posted

You are thinking far too restrictively. In particular you are free to explore, in maths, as many avenues as you wish, as long as you are consistent. Take one avenue, just for the sake of argument, where we wish to use a system with no infinitesimals (whatever an infinitesimal may be - do you not find the notion of something smaller than anything nonzero but larger than zero a little troubling? because that is what you are demanding) and where we wish to use strings of decimal digits to do this; then we have to declare 1=0.999....

 

If you want to do otherwise you're talking about a different system. But here's the nub: you can't talk about your 'new' system and claim it is the same as everyone else's, because it is not. Mathematics is about universally agreed conventions, and one of them is that the real numbers behave as we want, and 0.999...=1 in decimal representations of these things.

 

That is a stone solid fact that follows directly from the definitions.

 

At best you can argue that those definitions aren't what we ought to be using, but that does not change the validity of the argument.

 

You are just arguing about how to use the symbols, not the symbols themselves, and in this argument you're wrong because you can't argue with a definition.

 

You understand different bases, right? Like binary: in binary, 1+1=10. If you say this is wrong then you're arguing with definitions, and the 0.999... thing is just the same: these are just representations of numbers, not actually the numbers themselves.

Posted

Okay, I looked up the definition of "real number". The easiest to understand definition was "a real number is any number which can be represented as a non-terminating decimal." This is from http://www.filosofia.net/materiales/rec/glosaen.htm

 

So great, a real number is any number, best I can tell. I looked up "non-real number" and couldn't find a definition.

 

I then took another look at the blog entry first referenced, with all the proofs, and broke a few cogs in the gear box.

 

For instance:

 1/3 = 0.3333....
+2/3 = 0.6666....
=================
 3/3 = 0.9999....

 

I see it and I agree with it, but my mind can't make it make sense. My sense of reality feels like it has been shattered.

Posted

That is not the formal definition of the real numbers.

 

The real numbers are the unique complete totally ordered field. Complicated, huh? Right, here's what they really are:

 

Start with the rational numbers. Now, consider *all* possible sequences of rational numbers x_n that are Cauchy. The reals are their completion.

 

OK, not helping huh?

 

Right, take the rationals and then append every conceivable limit of sequences of rational numbers: if x is the limit of x_n and y is the limit of y_n, then we declare x=y if x_n - y_n converges to zero.

 

(I am being deliberately vague).

 

'Proposition': when done properly, the result is a decent object that contains the rational numbers, and all bounded increasing sequences converge to a *unique* limit (which is the whole point of the construction - to make some space where these things are true).

 

Now, it took several centuries, if not millennia, to decide that this was what we wanted to do.

 

In any case, in this space 0.999.... and 1 are representations of the same object.

Posted

0.999... = 0.9+0.09+0.009+0.0009+... = [math] \sum_{n=1}^{\infty} \frac{9}{10^n}[/math],

which is an infinite geometric series.

 

The sum of an infinite geometric series (with |k| < 1) is

S=a_0/(1-k), where a_0 is the first term of the series, and k=a_n/a_(n-1) is the common ratio.

 

Here: k=0.09/0.9=0.1 and a_0=0.9

 

=> S=0.9/(1-0.1)=0.9/0.9=1

 

 

(Q.E.D.)

 

 

Those who disagree should say where in my calculations I am wrong. :)
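For anyone who would rather check the partial sums than take the formula on faith, here is a small sketch using exact rational arithmetic (the helper name `partial_sum` is illustrative, not from the thread):

```python
from fractions import Fraction

def partial_sum(n):
    """Sum of the first n terms of 9/10 + 9/100 + 9/1000 + ..."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# Each partial sum falls short of 1 by exactly 10^-n, so the
# series converges to S = 0.9 / (1 - 0.1) = 1.
for n in (1, 5, 20):
    assert 1 - partial_sum(n) == Fraction(1, 10**n)
```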

Posted

You aren't wrong, but you did miss a summation sign out in the opening line:

 

[math] 0.9\ldots = 0.9+0.09+\cdots = \sum_{n=1}^{\infty} \frac{9}{10^n}[/math]

Posted

Math is abstract; therefore, anything can be as abstract as any person would like it to be.

 

If Andy Warhol were a mathematician, then he would be able to point out the truth of his math to someone who was expecting more. Andy would say, "What do you expect? This is what it is."

 

The creator gives the clothes made of math to the tailor.

The tailor may keep the clothes the same, for the creator had an original design that was perfect.

Some people may have altered the clothes, while others without the pattern decided to make their own.

And when you say their pattern is wrong, you may be right.

Something or someone, if not somebody, knows the true pattern.

Posted
You aren't wrong, but you did miss a summation sign out in the opening line:

 

[math] 0.9\ldots = 0.9+0.09+\cdots = \sum_{n=1}^{\infty} \frac{9}{10^n}[/math]

 

 

Yes, you're right... I forgot that. :embarass:

 

(It's fixed now. :) Thanks for noticing!)

Posted

Based on the proof I posted in post #33, I see how 1/3=0.3333..., so 3/3=0.9999...., and thus 1=0.9999.... Really, 0.3333.... is just a decimal representation of the fraction 1/3, and 0.9999.... is just the sum of three 0.3333....s; since the sum of three 1/3s is 1, 0.9999.... must equal one. In a way I guess it is just a convenient convention to clean up a messy little quirk.

 

It makes sense to me now, but man what a mind trip. Now if someone decides to use something like this in real life rather than rounding to significant digits I'm going to smack them. ;)

 

See, people helped explain it in a way that someone who hasn't touched a math book since college could grasp the concept and thus accept the logic. Now if we could just do the same for certain biological concepts. ;)

Posted
I also think that we are better off having mind-numbing discussions trying to prove to someone like me that 0.99999.... = 1.000.... than having people blindly accept what "experts" tell them without question.

 

You make it sound like I accept that .999... = 1 because it's the word of experts. That isn't the case. I accept it because I have seen probably a good hundred proofs that it's the case, all of which were simple and valid, whereas I have never seen a proof that .999... != 1 which I deemed valid.

 

I mean seriously, what more do you need than 1/3 = .3333..., 2/3 = .66666..., 3/3 = .99999... = 1?

Posted
You make it sound like I accept that .999... = 1 because it's the word of experts.

No, I was only speaking for myself: I tend to question EVERYTHING. I'm not as bad as my brother, but I rarely take anything at face value. My brother was so bad as a kid that if you told him a pan was hot he'd touch it to see for himself.

 

While I don't advocate touching a hot pan to verify that a pan is hot, I do advocate questioning something that one doesn't understand. As I did here, in a roundabout way.

 

As you can see by my post above, in part because of this thread, I was able to work through the logic and see how 0.999....=1. It really messed with my mind, but I do now understand. I like understanding why something is the way it is. It helps reassure me that I'm not accepting something on blind faith.

 

This whole thread actually has been really cool and I learned something really weird. Who knew that math could stir up so much debate or have such interesting quirks?

Posted

I can see why the mods always lock these threads. It seems to me that there are two groups of people (obviously) and the two groups are talking in different terms.

1) The self-titled experts (no offense; I haven't seen your degrees) claim, half the time, that their opponents just don't get it, but don't explain why. The other half of the time they give a proof or two; while I firmly believe that .9repeat does indeed equal 1, I have never seen a proof that did not appear to me to have a logical flaw. The classic 9/9 is the best example - 9 over 9 does not equal .9 repeat, just because x/9 = .x repeat. That's an assumption borne out by experiments, but if you want to know how many barrels of apples (where each holds only 9, exactly) you can fill with nine apples, you end up with 9. That is what math is about, fundamentally - explaining the physical, real world (even though, on some levels, the "real" world seems rather unreal). Or so I would argue.

2) People who tend to be less proficient in math but who sense, as another poster suggested, that others are trying to pull the wool over their eyes.

 

Personally, I like that the mods lock these threads. I think it's great to debate it, but invariably there are people (from BOTH camps) who get frustrated and (a) start to claim that they are experts (please ... give facts that you can back up and you won't need to claim that you are an expert; even if you are, bragging about it won't get you any respect) who should just be trusted or (b) start to hurl insults at the "experts" because they disagree (often without being able to articulate their reasoning well) with the established mathematical fact.

 

For the record (since I'd rather have the wrath of the non-experts), I do believe that .999... = 1. All proofs (compelling or not) aside, we're debating over something so small we can't even decide if it exists.

Posted
I can see why the mods always lock these threads. It seems to me that there are two groups of people (obviously) and the two groups are talking in different terms.

1) The self-titled experts (no offense; I haven't seen your degrees) claim, half the time, that their opponents just don't get it, but don't explain why.

 

No one here is a self proclaimed expert. All that should matter is that what I say is correct (and you can go away and check it is correct on your own - yet again another beauty of maths: it does not need expensive equipment to perform an experiment to verify a proof, merely logic and access to the internet to look up the definitions like Cauchy convergent sequence and completion).

 

The other half of the time they give a proof or two; while I firmly believe that .9repeat does indeed equal 1, I have never seen a proof that did not appear to me to have a logical flaw.

 

 

Yes, you have, but you don't realize it: any proof that uses the fact that the reals are the completion of the rationals will almost certainly be correct. I believe I've even posted the formal proof here in some form; it is, after all, equivalent to the assertion that, in the real numbers, 1/n converges to zero.

 

 

The classic 9/9 is the best example - 9 over 9 does not equal .9 repeat, just because x/9 = .x repeat. That's an assumption borne out by experiments, but if you want to know how many barrels of apples (where each holds only 9, exactly) you can fill with nine apples, you end up with 9. That is what math is about, fundamentally - explaining the physical, real world (even though, on some levels, the "real" world seems rather unreal). Or so I would argue.

 

I think you mean: let x=0.99...; then 10x-9=x, hence x=1. I agree, that is not a correct proof, but it is a justification of the definition. It says that if we want to treat infinitely long strings of decimals as representatives of real numbers in the obvious way, then we have to declare 0.99... to be equal to 1.
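That justification can be made concrete with finite truncations: for x_n = 0.99...9 with n nines, the manipulation "10x - 9 = x" is off by exactly 9·10^-n, and the error vanishes only in the limit. A sketch in Python (the helper `x` is illustrative, not from the thread):

```python
from fractions import Fraction

def x(n):
    """The truncation 0.99...9 with n nines, as an exact fraction."""
    return 1 - Fraction(1, 10**n)

# For every finite truncation, 10*x - 9 misses x by exactly 9 * 10^-n;
# only in the limit does 10x - 9 = x hold, forcing x = 1.
for n in (1, 3, 10):
    assert (10 * x(n) - 9) - x(n) == -Fraction(9, 10**n)
```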

 

 

2) People who tend to be less proficient in math but who sense, as another poster suggested, that others are trying to pull the wool over their head.

 

That implicitly states that we *are* pulling the wool over your eyes. We are not.

 

Personally, I like that the mods lock these threads. I think it's great to debate it, but invariably there are people (from BOTH camps) who get frustrated and (a) start to claim that they are experts (please ... give facts that you can back up and you won't need to claim that you are an expert; even if you are, bragging about it won't get you any respect)

 

 

Again, who here has proclaimed themselves an expert (and bragged about it)?

 

who should just be trusted

 

Who here has appealed to proof by authority?

 

For the record (since I'd rather have the wrath of the non-experts), I do believe that .999... = 1. All proofs (compelling or not) aside, we're debating over something so small we can't even decide if it exists.

 

Again, who's debating? We're not; you might be, but we're not.

 

The problem is that too many people (all those in the disbeliever camp and many in the believer camp) do not understand the definitions of the things they are playing around with. The conclusion that you can write the same thing in two different ways is forced on us by the definitions, just like 1/2=2/4. This is not a subjective thing to debate at all.

 

It is certainly frustrating that people refuse to look up the definitions and suggested further reading. Mathematics is a language, in some sense, and these 'debates' are a little like someone attempting to argue in Mandarin whilst refusing to speak anything other than English.

Posted
Yes, you have, but you don't realize it: any proof that uses the fact that the reals are the completion of the rationals will almost certainly be correct. I believe I've even posted the formal proof here in some form; it is, after all, equivalent to the assertion that, in the real numbers, 1/n converges to zero.

For me this is what helped me get the idea that 0.9999.... = 1. On the face of it this isn't logical, but by convention 1/3 = 0.3333...., 0.3333.... must be carried to completion to equal 1/3, and 3 times 1/3 equals 3/3, which equals 1; so 3 times 0.3333.... equals 0.9999...., which by that convention must equal 1. It's a freaking mind trip, but it is hard to argue against:

 1/3 = 0.3333....
+2/3 = 0.6666....
-----------------
 3/3 = 0.9999....

Since 3/3=1 then 0.9999.... must also equal 1. I see no way to argue against this.
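The fraction argument can also be checked with exact rational arithmetic, which sidesteps any worry about rounding (a Python sketch; the ten-digit truncation is just an illustrative choice):

```python
from fractions import Fraction

# Exact arithmetic: 1/3 + 2/3 is exactly 1, no rounding involved.
assert Fraction(1, 3) + Fraction(2, 3) == 1

# A finite truncation 0.3333333333 (ten digits) differs from 1/3
# by exactly 1/(3 * 10^10); the gap shrinks to 0 as digits are added.
truncated = Fraction(3333333333, 10**10)
assert Fraction(1, 3) - truncated == Fraction(1, 3 * 10**10)
```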

 

I'm no math whiz and I stated up front that I didn't believe that 0.9999.... equaled 1, but I can't argue against the equation above or point to a flaw in its logic. It is a very simple equation and there are no crazy x=y tricks to confuse me. The result seems very counterintuitive to me, but everything adds up, so it must be true.

 

Somebody show me where the logic flaw or math error is in the example I gave.

 

Excuse me while I go wander off to fix the cogs I broke in the ol' brain trying to grasp this stuff.

Posted

OH dear. It is somewhat depressing that people think that is proof. It is not. It is at best a justification for a definition. Did none of you stop to consider if the operations you performed are actually valid on such symbols? No, you didn't, you just accepted they were, and two wrongs do not make a right.

 

Still, if no one is prepared to consider the *definitions* of these things, I fail to see what you can do.....

Posted
OH dear. It is somewhat depressing that people think that is proof. It is not. It is at best a justification for a definition. Did none of you stop to consider if the operations you performed are actually valid on such symbols? No, you didn't, you just accepted they were, and two wrongs do not make a right.

 

Still, if no one is prepared to consider the *definitions* of these things, I fail to see what you can do.....

 

I was talking about your proofs, matt Grime. I went over some older stuff from calc. from my fall semester, and I realized what you said was true.

Posted
OH dear. It is somewhat depressing that people think that is proof. It is not. It is at best a justification for a definition. Did none of you stop to consider if the operations you performed are actually valid on such symbols? No, you didn't, you just accepted they were, and two wrongs do not make a right. Still, if no one is prepared to consider the *definitions* of these things, I fail to see what you can do.....

To the best of my ability I did try to find definitions (which turned out to be too generic), and I have tried to understand them, but I am not a mathematician.

 

I didn't claim my example was a proof; I don't remember how to write a proof. All I know is that if I were to add 0.6666 to 0.3333 I would get 0.9999. I also know that no matter how far out I carry a calculation, 6×10^-n plus 3×10^-n would equal 9×10^-n. Thus I have no basis to argue against 0.3333.... plus 0.6666.... equaling 0.9999....

 

I also know that if I take a pie and split it into three even pieces and give all three pieces to the same person they have the whole pie thus I know that 1/3 plus 2/3 equals 3/3.

 

If I try to do long division on 1/3, I end up getting 0.3333 with a remainder of 1; no matter how far out I carry the division, the remainder doesn't change. So I know that 1/3=0.3333...., since apparently the .... is the notation for a continuously repeating digit. Using the same method I know that 2/3 equals 0.6666....
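That long-division observation (the remainder is 1 at every step, so the digit 3 repeats forever) can be sketched as a short routine (illustrative Python, not from the thread):

```python
def decimal_digits(numerator, denominator, n):
    """Long division: the first n decimal digits of numerator/denominator."""
    digits = []
    remainder = numerator % denominator
    for _ in range(n):
        remainder *= 10
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits

# 1/3: every step gives digit 3 and leaves remainder 1, forever.
assert decimal_digits(1, 3, 8) == [3] * 8
# 2/3 repeats the digit 6 the same way.
assert decimal_digits(2, 3, 8) == [6] * 8
```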

 

1) I know that 1/3 + 2/3 = 3/3.

2) I know that 1/3 = 0.3333....

3) I know that 2/3 = 0.6666....

4) I know that 0.3333.... + 0.6666.... = 0.9999....

 

So by virtue of this, 3/3 must equal 0.9999.... and I know 3/3 equals 1 so 1 must equal 0.9999.... or put more simply:

 1/3 = 0.3333....
+2/3 = 0.6666....
-----------------
 3/3 = 0.9999.... = 1

 

Maybe it isn't a proper proof using fancy signs and definitions, but sometimes it's easier and clearer to skip the fancy stuff and keep things as simple as possible for someone like myself who was not a fan of math in college.

 

Hey, in the end I was able to come up with a way to understand the claim that 0.9999.... equals 1, and in spite of my inclination to say that people who thought 1=0.999.... were smoking crack, I have to admit they were right.

Posted
Maybe it isn't a proper proof using fancy signs and definitions, but sometimes it's easier and clearer to skip the fancy stuff and keep things as simple as possible for someone like myself who was not a fan of math in college.

 

No, you can't skip out on the definitions. If you do you will literally have no idea what you are talking about- the symbols have no inherent meaning, only the meaning mathematicians assign to them. In my experience, the majority of the people who have problems with 0.99...=1 won't even be able to tell you what the symbol "0.99..." means and yet they feel qualified to make statements about it. This gets annoying fast to anyone who has actually taken the time to understand what the real numbers are (something you don't do in high school, or even your typical intro calculus class), and is why you'll see short replies suggesting they should go and do some much needed reading. If they aren't willing to dive into the details, they will never be qualified to have an opinion worth more than dirt.

 

You linked to a source for a woefully inadequate definition of the real numbers. I would suggest you avoid using what appears to be a glossary on a philosophy website for mathematical definitions. If you want to learn about the reals, go crack open some textbooks. Any intro to real analysis book will probably do (or something like Spivak's Calculus); look for something that gives a construction of the reals from the more familiar rationals (though there's probably plenty of stuff online). This is admittedly lazy on my part, but it's a much better use of my time if you go away, read, and come back with questions than me building up a detailed theory from scratch when it already exists in many places.
