
AdvRoboticsE529

Everything posted by AdvRoboticsE529

  1. CharonY and John Cuthber, the tone and provocativeness are quite clear; it is as if you do not want questions to be asked. I am certain you will attempt to penalise or belittle me for this comment, so go ahead. This forum is not for me.
  2. I never supported the notion.
  3. Yes, I talked about some of the definitions before; they fail to address the true relationship, so they are artificial. Exactly, you should not say that 0 = 1: the claims of probability theory should not be asserted as fact when sufficient evidence or true pure mathematical proofs are unavailable, and that is exactly the case here. I am not making the claim; statistics is where the claims are made. Also, the "nature" of statistics, probability included, may not validate it as a method. The scientific method fails to confirm things when there is a lack of technology; one example is the constant modification of the atomic model. Lack of technology results in uncertainty: in the long term it amends itself, but in the short term it can be uncertain.
  4. You defend statistics by contrasting it with the failures of science, but the scientific example is not necessarily the best standard for confirmation. And you took the argument from authority by claiming I lack knowledge and am ignorant in the field. Other comments aside, I never made any extraordinary claim; it is statistics that claims deduction / induction (specifically) from uncertainty, yet lacks proofs that hold true without context or definitions. They are only as true as the definitions and theories that conform to them. I do not need to provide evidence; it is time for statistics and statisticians to live up to their own word. My study in economics consistently shows uncertainty, in that statistics fails to address true relationships or to determine the true variables. Again, I will concede that I do not know the application of probability in quantum mechanics; however, I will not concede that statistics is as true as pure maths or the branches of maths whose proofs do not require context to be true and which address the true relationship. Application in reality does not validate the method; we are going in circles. Further, its application in reality does not always hold true, and in certain cases is extremely inaccurate even when it appears correct on paper. The best example is probably the paper "On Default Correlation: A Copula Function Approach," whose application is accurate enough in certain areas, yet quite inaccurate in others, and it has been claimed that the application of that statistical method contributed to a market crash. Before the crash, the method was validated by statisticians within the context and definitions of statistics. It is that complexity which used to make me think statistics is a field to be respected: I used to think that even if the initial definitions and unproven mathematical theories (to repeat, only proven within their own context, and untrue when the context is removed) were ambiguous, at least they resulted in a field of great deduction from induction. I don't think so any more. I don't think we'll be able to discuss this, because you are quite confident in your view, as am I, and I don't want to go in circles.
  5. Water in the body, and its purpose, is neither sufficient nor specific enough to prevent combustion; the claim that water is a barrier to combustion is unrealistic.
  6. Well, I mean I would have loved statistics to have the extent of proofs found in pure maths; such claims require better proofs. The whole thing that started my distrust of statistics is the lack of proofs, and proofs that are only true given an invented context, where the proof no longer applies once the context is removed. Like I said, I don't get this problem when integrating the area of a circle, to give an example: no context is needed. Also, I wasn't able to get my point across about the ambiguous application of methods and invented definitions. Probability theory and the like, unfortunately, do not see the rigorous testing that other concepts in maths do, which I would love to see; the application of probability to gambling games can be true under the rules of probability theory, yet will not hold when applied. In this case I'm not talking about the application itself but the concept of probability itself. As repeated, I am told it is applicable in quantum physics. I also had many problems with definitions, which I don't think we'll be able to discuss... Anyway, the point of this thread is the validity of statistics. Whether statistics is applicable or not (which in most cases it is, and it can be quite useful) does not mean it is validated, and statistics has on occasion failed even when applied by statisticians and agreed upon amongst statisticians; I gave an example in economics previously, involving work that relies extensively on correlation.
  7. I don't think I am able to get my point across; good talk anyway.
  8. Interesting thought, although the income / wealth of specific households is quite important. Statistical analysis will definitely be quite important in this case; hopefully, in future, better computing technology will assist in mass electronic identification and result in more precise calculations. This uncertainty bugs me.
  9. That is not an attack; stop bothering me. @studiot Also, you need to stop making arguments from authority.
  10. The question is loaded: to find the average age of every citizen of the USA, you would apply the method of the average, of course. However, what is your purpose? What relationship do you intend to look at? These are artificial definitions and limits to which the calculations conform. Quantifying the errors of statistics with statistics does not seem practical.
  11. The application of the mean in statistics is not the only definition I had problems with; it is as if my posts weren't read at all. Variance and skewness are very much artificial. I never talked much about skewness; quartile-based skewness focuses specifically on the differences between the quartiles set at 25%, 50% and 75% (see the quartile-skewness sketch after this list), again specifically engineered to suit the ideal world of whoever created such methods. This is very much impure.
  12. Determining how likely it is that the sample mean is the mean of the entire set of data, using statistics itself, seems to be going in circles. I only talked about averages; however, many other definitions, and their application, still seem artificial. @studiot You're trying to make it personal again.
  13. No, I don't understand quantum mechanics, and I have asked about the application of probability in quantum mechanics; as repeated, I am told that there are confirmations from experiments.
  14. For example, the average coordinate, given several coordinates in geometry, can be expressed with a function; you look at the true relationship. Or take the average of different vectors in mechanics, which gives the true resultant vector. The respective fields and their applications of the average hold true. In statistics, at its most basic, the mean is the average of a set of data in which the known and unknown variables affecting every single value are different, not constant, and hence not equal; yet the average is still applied, and hence only serves to minimise the difference of the mean value from every single value within the set of data (see the mean sketch after this list). In pure maths or mechanics, the coordinates or vectors are such that a function can give you a prediction of any coordinate or vector, so they are, shall we say, the same thing. In statistics, especially when applied to reality, the values within the set of data are very much different, with unknown variables affecting them. You can try to test a function in economics with statistics; it usually will not work and is quite inaccurate. This is the ambiguity I find: when you calculate the average of a set of data in statistics, it mostly does not give an accurate picture of the set of data.
  15. I'm not claiming that sums and divisions aren't maths; I'm claiming that their false application does not truly address the relationships. Averages are of course common, especially in geometry and many fields of pure maths; however, their application in the field of statistics is quite artificial. There is no difference in the calculation, only in its application and purpose. You don't seem to understand: for the different relationships you search for, you apply different methods, and the methodology of the average in statistics only serves to minimise the difference of the final value from every single value. I'll say it again: it is quite artificial.
  16. Well, it is subjective whether it is maths at all. When I do maths, I always try to find the relationships, however difficult the value may be; I don't try to create definitions and methods specifically to fit the range or type of values I want. The application is also ambiguous: the mean is not used as in pure maths, such as when looking at the average coordinate given several coordinates for geometry purposes. In statistics, the notion of the mean is applied to a given set of data in an attempt to minimise its difference from every single value, in contrast to determining the true average coordinate in geometry, which is linked to all given coordinates and can be represented with a function.
  17. Sorry, typed a bit quickly. The application of such methods in order to analyse the properties of the set of data is ambiguous.
  18. You already entered development after applying probability theory, with its uncertainty and impure proofs from artificial definitions that do not look at the true relationship but are engineered specifically from previous, simpler definitions. If it were pure maths this would not hold at all, because the relationship between the properties of the set of data is not looked at sufficiently. This is akin to decision maths, where there are algorithms that work yet can only be proven given an artificial context made purposely to prove them, only to be replaced with a better algorithm until it begins to delve more into pure maths, such as with the Taylor series.
  19. Exactly. "Half the time you get heads and half the time you get tails" can only be proven when the context matches, in this case probability theory (see the binomial sketch after this list). It doesn't just "turn out"; it is exactly the case that they prefer the value for variance to be seemingly logical, hence positive. The binomial distribution is not subjective as invented; the distribution of data, however, is.
  20. Distribution is defined in such a way that the set of data ends up possessing specific properties and values; if you disagree, you might as well show me the relationship of distribution (extremely subjective) and prove it. To make it less subjective, I had the idea of finding the steepest gradient of any given function, and of course knowing the difference in gradient over an infinite set of limiting values, which shows the true relationship, in contrast to statistics (see the gradient sketch after this list); not that I'm anywhere close, and if I am, I will not publish it here. @studiot Please don't make it personal. There are mechanical systems which are currently not conclusive; that doesn't mean they cannot be conclusive, and that has always been my point. Certainty is 100% accuracy. Giving three warfarin tablets daily is not certainty, as the patient you are treating may be quite different from the average (statistical analysis); personalised treatment will prevail over statistical analysis.
  21. Perhaps the classic definitions such as the "averages" should be interpreted; however, this does not change the fact that definitions such as variance or those of probability theory are far from reality. This isn't an f = m*a type of definition, or arc length l = r*theta; the definitions are purposely engineered to result in a number which fits *their version* of reality, and they do not look at the real relationship, which I keep repeating. If you see and link the variables in relations that distinguish the variation of one set of data from others, yet obtain different or undesirable numbers such as complex numbers, then by changing the definition to avoid those results you are already making it artificial, to fit their own ideal world. There are countless respective fields where I do not have methods to replace statistics; that does not mean it shouldn't be improved, nor does it validate the method. Anyway, I intend to work in the medical field to eliminate uncertainty, hence statistics. In probability theory, an outcome is only as likely as its supposed "concentration" (so basically a fraction or percentage, to be represented with decimals, again more artificial constructs) as part of a set which is assumed to be random when it is likely that it is simply not understood properly. If statistics were 100% valid, akin to many other branches of maths, then its application to unbiased gambling should hold true; yet it depends on the gambling, and the relative differences are usually distinguished by different unknown variables that affect the known variables. I heard that probability is quite applicable in quantum physics; either it is not understood thoroughly due to complexity, or perhaps it is true that probability holds to be 100% valid, as I am told by confirmation from experiments. @John Cuthber Yes, based on an underlying distribution that is specifically engineered to result in the desired numbers and properties of the set of data, not the value given by the true relationships themselves.
  22. The relative volume of water in the average human body is likely not sufficient, nor is there a sufficient circulation system; most water is also found in vital organs, such as in the ileum of the small intestine, and due to that separation it does not prevent combustion of the skin. Further, there are more factors involved than just water, as mentioned with the wick effect. Mammals do burn (as in the case of Thich Quang Duc, who set himself on fire); we don't know the cause of such cases, so we should not assume why they should or should not burn.
  23. @Bignose No, that is not what I'm saying; you can substitute statistics, yet the mentality impedes the encouragement of certainty. All other aggressive comments aside: statistics is not certain, and for most concepts there are no proofs. In the definition of variance, as I have said before, a component is the summation of the squared differences of the 'x' values from the mean; depending on the person I have spoken with, the reasons given are different. The most common answers are that it inflates the values, that it ensures a positive value (hence eliminating any possible appearance of complex numbers), or, subjectively for some people, that it is more mathematically "nice" (see the variance sketch after this list). The formulas of correlation are based on the definition of variance and are not proven, only derived from uncertain definitions; standard deviation is another example derived from the definition of variance. One of the reasons I asked before for a method to determine the steepest gradient of any given function is that I want to see what I can do with correlation in statistics, to make correlation more mathematically pure and not just based on the ambiguous definition of variance. Then you have the different analytic statistical methods, such as the classical median, mean or mode; these are classics which, as I have made clear, are also ambiguous and uncertain. Depending on the set of data, the value can be skewed and biased, and they are not completely applicable. You also have methods for skewness and the distribution of data, which are essentially more definitions that fail to seek out true relationships as in pure maths, only altering various parts of the definitions to obtain the numbers they want. Finally, in the case of probability, probability can only be proven in the context of probability theory; that is, in any other given context it is false, unlike proofs found in other branches of maths. If you determine the probability that a huge comet will crash into Earth and destroy large portions of the life on Earth, you can apply probability and presume to understand the picture; yet you could instead search for the variables, calculate the size of comet necessary to pass through Jupiter's gravitational field, and so on, implement this in computer algorithms, and arrive at 100% certainty. This is the thing I keep repeating. Also, I study electronics; in modern electronics there is no uncertainty. That's why at the beginning I asked about the application of probability in quantum physics, because in the real world, including economics (which I also study), probability is usually based on luck; it does not give a picture of anything, as probability is derived from limited variables with unknown relationships that are not accounted for. "And, there is no way to be 100% sure what is infecting you will kill you-" More presumptuous statements from the people who defend statistics; clearly you're not interested in discussion but in decisive statements.
  24. I have no opinion on approximation theory; the purpose of replacing statistics should be to eliminate uncertainty, hence estimations will not be preferred regardless of their relations. You use too many metaphors and are very unspecific, which makes it difficult to discuss with you, but I try. You also shouldn't be so presumptuous in saying that many problems cannot be solved with "explicit" equations; this is the mentality which should be discouraged, and which is encouraged by statistics. Just because certain problems seem difficult currently does not mean they are unsolvable; they should be worked on towards certainty, in contrast to living in uncertainty. I previously asked my teacher how you could determine the steepest gradient of any given function; she said it is not possible (I searched online for a method, which is beyond my current skill yet is possible). Other questions I have asked include the summation of root numbers, which is premature yet does not mean it is impossible to be precise in the formulation of formulas. You are not unlike the many authorities who are confident that what they know is the truth; relativity shows that time is relative, in contrast to the previous authorities who refused such concepts. Just as the proof of Fermat's Last Theorem seemed unsolvable with earlier methods and only became possible with new ones, I would happily assume that all problems are solvable and can be shortened into a simple equation / formula, after the rigorous proofs that make a very interesting journey. Anyway, this was a thread originally meant to gauge general opinion on the validity of statistics; it is not really going where it should.
  25. My papers never caught fire on radiators... this is relative, and water does matter, as the moisture level can be important; it is always relative. Drier climates, and especially drier forests, are more susceptible to widespread fire than wetter jungles, for example. Also, heat does not guarantee combustion...
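A minimal sketch of the quartile-based skewness referred to in post 11; this is Bowley's measure, one of several conventional skewness definitions, with Q_1, Q_2, Q_3 the 25%, 50% and 75% quartiles:

\[
\text{Skew}_{B} = \frac{Q_3 + Q_1 - 2Q_2}{Q_3 - Q_1}
\]

It is zero when the quartiles are symmetric about the median and always lies between -1 and 1. Moment-based skewness, built from cubed deviations from the mean, is a separate definition and does not rely on quartiles at all.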
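A minimal sketch of the least-squares property behind the claim in posts 14-16 that the mean only "minimises the difference" from every value, assuming squared differences are the measure of distance (the standard reading):

\[
\frac{d}{dc}\sum_{i=1}^{n}(x_i - c)^2 = -2\sum_{i=1}^{n}(x_i - c) = 0
\quad\Longrightarrow\quad
c = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}
\]

So the arithmetic mean is exactly the single value that minimises the total squared deviation from the data; whether that makes it "artificial" or simply fit for purpose is the point under dispute in the thread.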
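A minimal sketch of the binomial distribution mentioned in post 19, for the coin-toss example: with n independent tosses and probability p of heads on each toss, the probability of exactly k heads is

\[
P(X = k) = \binom{n}{k} p^{k} (1-p)^{n-k},
\qquad
\mathbb{E}[X] = np, \quad \operatorname{Var}(X) = np(1-p).
\]

For a fair coin p = 1/2, so "half the time heads" describes the expected proportion np/n = 1/2, not a guarantee about any finite run of tosses.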
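A minimal sketch of the "steepest gradient of any given function" question raised in posts 20 and 24, assuming a twice-differentiable single-variable f with continuous derivative on a closed interval: the steepest slope is the largest value of |f'(x)|, and the candidate points are the endpoints together with the points where the derivative of f' vanishes:

\[
\max_{x \in [a,b]} \lvert f'(x) \rvert,
\qquad
\text{candidates: } x = a,\; x = b,\; \text{and } x \text{ such that } f''(x) = 0.
\]

For example, f(x) = sin x on [0, 2\pi] has f'(x) = cos x and f''(x) = -sin x, so f''(x) = 0 at x = 0, \pi, 2\pi, where |f'(x)| = 1, the steepest gradient.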
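A minimal sketch of the definitions discussed in post 23, as they are conventionally stated (population form; the sample form divides by n - 1 instead of n):

\[
\operatorname{Var}(x) = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2,
\qquad
\sigma = \sqrt{\operatorname{Var}(x)},
\qquad
r_{xy} = \frac{\sum_i (x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_i (x_i-\bar{x})^2}\,\sqrt{\sum_i (y_i-\bar{y})^2}}
\]

Squaring the deviations is what guarantees a non-negative variance and keeps the standard deviation real, which is the "ensures a positive value" reason mentioned in the post; the Pearson correlation r_{xy} is then built directly from these same squared-deviation terms.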