Just the Facts Posted January 6, 2011 The LHC Project. I completely understand what they are attempting to find. It will result in massive FAIL. (It already has.) All the time and billions spent on building something that will never prove anything worth knowing. I am certain that within the next ten years the whole thing will be disassembled and sold for scrap. Then they will use the place for some expensive, fancy condominiums. In the meantime... somebody's making a ton of money on it.
timo Posted January 6, 2011 It's certainly debatable whether a lot of money should be spent on fundamental research with little or no obvious application. And the LHC is no doubt very expensive compared to the average university's backyard experiment. In my experience, it's not someone making a lot of money there, but a lot of people making some. What exactly do you think the LHC is trying to find?
insane_alien Posted January 6, 2011 Actually, the LHC is spitting out huge amounts of useful data right now, as in it's allowing us to test theories to new levels of accuracy.
timo Posted January 6, 2011 Actually, the LHC is spitting out huge amounts of useful data right now. As someone whose computing resources also serve as a Tier-2 node for the LHC, I agree with "huge amounts" more than I do with "useful".
Janus Posted January 6, 2011 The LHC Project. I completely understand what they are attempting to find. It will result in massive FAIL. (It already has.) All the time and billions spent on building something that will never prove anything worth knowing. I am certain that within the next ten years the whole thing will be disassembled and sold for scrap. Then they will use the place for some expensive, fancy condominiums. In the meantime... somebody's making a ton of money on it. Once again you display a profound misconception of how science works. Even if the LHC did not find even one predicted particle, that in itself would be information worth knowing. An experiment that does not produce the expected result can push back the boundaries of knowledge just as much as, if not more than, one that does.
insane_alien Posted January 6, 2011 As someone whose computing resources also serve as a Tier-2 node for the LHC, I agree with "huge amounts" more than I do with "useful". Well, it will be useful once processed into nice little graphs.
Just the Facts (Author) Posted January 6, 2011 Once again you display a profound misconception of how science works. Even if the LHC did not find even one predicted particle, that in itself would be information worth knowing. An experiment that does not produce the expected result can push back the boundaries of knowledge just as much as, if not more than, one that does. I have a very realistic understanding of how science works. That's why I know the LHC is a waste of time and money for the little dribble of information they might learn from that massive waste. Just who really benefits from this knowledge they claim will change the world of science? Not you or me. Nothing tangible will come from it. Follow the money. It is a vain pursuit of recognition, awards and money by a small group of individuals with way too much influence. Just like Einstein was. As someone whose computing resources also serve as a Tier-2 node for the LHC, I agree with "huge amounts" more than I do with "useful". KUDOS to you. Perfect.
John Cuthber Posted January 8, 2011 Since practically everything that Just the Facts writes in this forum is unsupported by any experimental evidence, it is no surprise to me to see that he doesn't want anyone spending money on experiments that will generate real data. He just doesn't seem to understand that you can't advance science without experiments. He seems to think that the way to "prove" a theory is to write bits of it in CAPITAL letters and call all the people whose work shows him to be wrong nasty names.
Sisyphus Posted January 8, 2011 I completely understand what they are attempting to find. Why don't you summarize your understanding of it, so we can better understand what it is you are objecting to?
Bignose Posted January 9, 2011 All, we need to be pleased with this post. JTF has finally made a testable, objective prediction! I am certain that within the next ten years the whole thing will be disassembled and sold for scrap.
rigney Posted January 9, 2011 All, we need to be pleased with this post. JTF has finally made a testable, objective prediction! Even if, after two or three years of analysis, the Higgs boson is still a mystery, this LHC program isn't a failure. Much new knowledge will come from such research. JTFs are constantly needed, if only to spur the best minds in the world to focus on bigger and better things. I actually wish him luck!
chinmayrshah Posted January 9, 2011 I just cannot digest the LHC project. They are working on theories which are highly influenced by the Big Bang. The Big Bang itself is an incomplete "if...then" story. How can science work on such basic, false or defective assumptions? It's like saying "Assume 1+1=9, where the operators have their usual meaning". Please comment on whether I am wrong or not! From a new member, Chinmay Shah (ChinzFactory Scientifique)
Cap'n Refsmmat Posted January 9, 2011 I think you have it reversed; the theories the LHC intends to test have implications for how the Big Bang may have occurred, rather than the other way around. The evidence the LHC provides will determine whether the assumptions are false or defective.
dragonstar57 Posted January 12, 2011 Won't the LHC be useless for now, until they get that cloud computing network I've heard about recently up and running? Data is useless until you can analyze it, right?
Cap'n Refsmmat Posted January 12, 2011 They got that network up and running a year or two ago.
dragonstar57 Posted January 12, 2011 Oh. So does anyone know how well the network is working?
Cap'n Refsmmat Posted January 12, 2011 http://lcg.web.cern.ch/LCG/public/default.htm
dragonstar57 Posted January 13, 2011 A little off topic, but why don't they use that system (or one like it) for climate forecasting?
Cap'n Refsmmat Posted January 13, 2011 Why would that help over a regular supercomputer? It's not like we have gigabytes of weather data being generated every second. Although... Formerly named AWS Convergence Technologies and operators of the Weather Bug Web application, Earth Networks said today it will invest $25 million over five years to equip about 100 locations worldwide with sensors to measure the concentration of greenhouse gases in the atmosphere, including carbon dioxide and methane. http://news.cnet.com/8301-11128_3-20028265-54.html
timo Posted January 13, 2011 What exactly do you mean by "a system like it"? A lot of CPU power? Climate science does use a lot of CPU power. I do not know how exactly it compares to LHC requirements, but if it's significantly less, that's likely either because of a lack of funding or because CPU power is not the limiting factor in climate science. Particle physics at colliders is much more accurate than climate physics (at least I think so), and indeed more accurate than almost any other branch of physics (that much I do know for at least some other fields). A distributed system? I don't know about climate research, but collider data is remarkably well suited for trivial parallelization. What you have is collisions of protons, which are independent of one another, and a huge number of them. So in principle you can send each of them to a separate computer, analyze it there, and only send the result back to a central facility. That's not possible in systems where the events in one part of the system influence other parts; decentralized processing there might require a huge amount of data transfer for communication between the different nodes.
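To make the "trivial parallelization" point concrete, here is a minimal sketch in Python. The event structure and the per-event analysis are entirely made up for illustration (this is not real LHC software or any real data format); the point is only that each collision event can be analyzed in isolation and just the small per-event result sent back to a central place.

```python
# Minimal sketch of trivially parallel per-event analysis.
# Toy events and a toy analysis stand in for real collider data.
from multiprocessing import Pool
import random

def analyse_event(event):
    """Analyse a single event; no other event is needed."""
    # Toy "analysis": sum the transverse momenta of the event's tracks.
    return sum(track["pt"] for track in event["tracks"])

def make_toy_events(n_events):
    """Generate fake events standing in for recorded collision data."""
    return [
        {"tracks": [{"pt": random.uniform(0.1, 50.0)}
                    for _ in range(random.randint(1, 20))]}
        for _ in range(n_events)
    ]

if __name__ == "__main__":
    events = make_toy_events(10_000)
    # Each event goes to whichever worker process is free; only the small
    # per-event results come back, ready to be histogrammed centrally.
    with Pool() as pool:
        results = pool.map(analyse_event, events)
    print(f"processed {len(results)} events, "
          f"mean track-pt sum = {sum(results) / len(results):.1f}")
```

The same pattern scales from a pool of worker processes on one machine to jobs scattered across grid sites, precisely because no event needs to know anything about any other.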
dragonstar57 Posted January 13, 2011 Why would that help over a regular supercomputer? It's not like we have gigabytes of weather data being generated every second. Although... http://news.cnet.com/8301-11128_3-20028265-54.html Because then you could have real-time temperature (from thermal satellite scans) and CO2 data from everywhere on the planet, recording the data every 10 minutes. That would generate a lot of data in a short time. Would a normal supercomputer be able to handle THAT much data? But this is a little off topic; perhaps a new thread is required.
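For a rough sense of scale, here is a back-of-envelope sketch; every number in it is an assumption chosen for illustration, not a figure from any real network, and satellite imagery would of course add far more than ground stations alone.

```python
# Back-of-envelope estimate of daily data volume from a sensor network.
# All numbers below are assumptions for illustration only.
N_STATIONS = 100_000        # assumed number of ground stations worldwide
BYTES_PER_READING = 1_000   # assumed record size: temperature, CO2, metadata
READINGS_PER_DAY = 24 * 6   # one reading every 10 minutes

daily_bytes = N_STATIONS * BYTES_PER_READING * READINGS_PER_DAY
print(f"~{daily_bytes / 1e9:.1f} GB per day")  # ~14.4 GB per day
```

Even with generous assumptions that comes out around 15 GB per day, which is in line with Cap'n Refsmmat's point that raw data volume would not be the reason to reach for an LHC-style grid.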