Everything posted by bascule
-
If you're going to apply Occam's Razor, why assume an unparsimonious non-deterministic mechanism operates in addition to the deterministic ones which govern all other known physical behavior? I personally consider indeterminism unparsimonious, but the prevailing attitude among those who have their head in the actual math all day seems to favor the existence of actual non-deterministic systems.
-
That's one way of looking at it
-
And that really is the point: the system is deterministic, but the state at each timestep is dependent upon the totality of the data at the previous timestep. Since the totality of data in the system is needed to compute each successive timestep, it's inherently non-local. Rule 30 provides an example of a "non-local hidden variable theory" (if that term is really apt) for explaining the statistical randomness of a 2D discrete time/space universe.
-
It has nothing to do with Folding@Home, but it has everything to do with distributed computing in general. With reversible computers, the cost of computation does not increase with the number of computations being performed: an idle computer will consume no more power than one under heavy load. At that point idle cycles will effectively be wasted, and using them will become increasingly important. There's another distributed computing project I should mention; I'll start another thread.
-
Well, some physicists (here) are under the impression that present evidence supports non-determinism.

The C rand() function doesn't generate statistically random numbers. PRNGs generate as close to an even distribution as possible, but there are many tradeoffs involved, specifically in terms of computation time and memory use. Knuth covers this exhaustively in chapter 3 of volume 2 of The Art of Computer Programming. There are many cases where a faster algorithm which generates a less even distribution is okay, such as in stochastic data structures like skip lists. Knuth goes through dozens of algorithms and their potential tradeoffs. The ones you see behind the C rand() function generally optimize for speed rather than for a more even distribution. (Much to the chagrin of people who complain about how the shuffle feature of their music player always plays the same songs.)

The real evidence that rand() doesn't generate a statistically even distribution is that it isn't cryptographically strong. If the numerical distributions it produced were as statistically random as those of Rule 30, then you could use rand() as the basis of a stream cipher by seeding it with a shared private key, then combining its output with the plaintext through some obfuscation step (XOR is a simple but bad example). Rule 30, when employed as the basis of a stream cipher, leaves no statistical signature on the output data: http://www.cs.indiana.edu/~dgerman/2005midwestNKSconference/dgelbm.pdf I'd like you to find me any other PRNG which can be employed for the purpose of strong cryptography.

The point of this thread isn't to speculate. It's to point out that there's no actual evidence that quantum behavior is non-deterministic. It's been mathematically proven that a deterministic algorithm can generate statistically random numbers.
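To make the stream cipher construction concrete, here's a toy Ruby sketch of the XOR scheme described above. The seed and sample string are placeholders, and it's deliberately insecure: Ruby's PRNG is a Mersenne Twister, which passes statistical tests but whose internal state can be recovered from enough output, which is exactly the "not cryptographically strong" problem.

  # Toy stream cipher: XOR each plaintext byte with a keystream byte
  # drawn from a PRNG seeded with a shared private key. Illustration
  # only -- a predictable PRNG makes this trivially breakable.
  def xor_stream(data, key)
    prng = Random.new(key)                # key acts as the shared seed
    data.bytes.map { |b| b ^ prng.rand(256) }.pack('C*')
  end

  ciphertext = xor_stream("attack at dawn", 42)
  puts xor_stream(ciphertext, 42)         # XOR is self-inverse => "attack at dawn"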
-
I've heard both physicists and non-physicists claim that quantum behavior is non-deterministic. As far as I can tell, there are two basic ways this can be argued. First: we have no deterministic explanation for certain quantum properties, therefore they're non-deterministic. This is an argument from incredulity, and therefore fallacious. That leaves us with: certain quantum properties appear statistically random, therefore we conclude they're non-deterministic.

However, there is a fundamental assumption here which is wrong: that only a non-deterministic process can produce statistically random data. There is a mathematical counterexample to this: the Rule 30 cellular automaton. Rule 30 is a 2-dimensional (one dimension of space, one dimension of time) discrete process where the value of a cell and its two neighbors are used to compute the new state of the cell during the next iteration. The transition table looks like this (the outputs, read as binary, spell out 30, hence the name):

  111 -> 0, 110 -> 0, 101 -> 0, 100 -> 1, 011 -> 1, 010 -> 1, 001 -> 1, 000 -> 0

Rule 30 generates statistically random output, random to the point that it can be used for high security cryptography. If a deterministic process can generate statistically random output, then one cannot claim that because a set of data is statistically random, it was generated by a non-deterministic process. So, as far as I'm concerned, non-determinism is nothing more than an unsupported assumption, and determinism is equally likely.
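Here's a minimal Ruby sketch of Rule 30 for anyone who wants to watch the randomness develop. The width, step count, and single-seed initial condition are arbitrary choices for illustration; the famously random-looking part is the center column.

  # Rule 30: each cell's next state is determined by itself and its two
  # neighbors, per the transition table above.
  RULE30 = {
    [1,1,1] => 0, [1,1,0] => 0, [1,0,1] => 0, [1,0,0] => 1,
    [0,1,1] => 1, [0,1,0] => 1, [0,0,1] => 1, [0,0,0] => 0
  }

  width = 63
  row = Array.new(width, 0)
  row[width / 2] = 1                      # single live cell in the middle

  16.times do
    puts row.map { |c| c == 1 ? '#' : ' ' }.join
    row = (0...width).map do |i|
      left  = i.zero?        ? 0 : row[i - 1]   # dead cells past the edges
      right = i == width - 1 ? 0 : row[i + 1]
      RULE30[[left, row[i], right]]
    end
  end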
-
I think you're confusing "functional" with "procedural". Object oriented and functional concepts can be used side by side in the same language. There are many languages built around a functional OO paradigm, such as O'Caml, Python, and Ruby. Functional and object oriented programming actually go quite well together. In functional languages where objects are the intrinsic primitive, functions just become first class objects.
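A quick Ruby sketch of what "functions as first-class objects" buys you:

  # A function stored in a variable, passed as an argument, and
  # inspected like any other object.
  square = lambda { |x| x * x }
  puts square.call(4)                  # => 16
  p [1, 2, 3].map(&square)             # => [1, 4, 9]
  puts square.class                    # => Proc -- it really is an object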
-
Japan is the only place I've been which is truly outside of the United States. I've been to border towns, but those don't really count. Japan was totally awesome. Two years of studying the language sure helped. At one point I got drunk at karaoke and lost my camera, and was able to go to a police station and recover it, all without speaking a word of English.

I also had an interesting trip home once, where I almost missed a necessary train transfer except for the fact that I managed to hear the announcer say "tomaranai de" (without stopping) and figured out I was on a kyuuko (express train) which would not be stopping at the station I needed to get off at. Then I managed to board the right bus. When I'd taken the bus with one of my host family's friends, she pointed at a sign listing the 3 available routes and let me know "These kanji!" marked the proper route. I had absolutely no idea how to pronounce them, but I could remember "the ones on the left side of the sign" and pick them out again on the bus. After an apprehensive bus ride I got too nervous and decided to get off at a stop with a police box that could perhaps direct me back home. The bus driver was a little nervous to let me off after I told him I didn't know where I was or where I was going, but after a little bit of finagling I paid and got off, only to realize I was at the proper stop. Phew, what a relief.

It's a crazy country to travel in, especially if you go to more exotic destinations. We ran into a group of Australians who spoke no Japanese and spent most of their time lost. I'm sure my experience would've been different if I didn't speak the language, at least to a certain degree. Oh, and I got to ride on the world's fastest elevators and see the world's largest Buddha! And I drove up past the timberline on Mt. Fuji. Rock!
-
I'll second Python. Hell, I'll second anything with an interactive environment. If you can get into an environment and start using it as a basic calculator, it goes a long way towards getting you comfortable, and that's half the battle. Any language where you can't do anything except from a file you've written to disk seems rather unfriendly to me.
-
I pick Colossus: The Forbin Project
-
After recent events, if I were to pick a Republican candidate, it'd be Giuliani. While I applaud McCain both for eschewing CPAC (which Giuliani did attend) and for lambasting Ann Coulter for using the word "faggot" at the event, it's pretty obvious what he's trying to do, and it seems to be failing on both sides. Liberals who saw him as a respectable conservative maverick willing to buck some of the retardedness which had begun permeating the Republican party watched him play more and more to his conservative base, supporting both Bush and the war, and eventually capitulating to a degree to the administration on the use of torture. Conservatives at the same time saw him as being too liberal, and eschewing CPAC was certainly a slap in the face to them. Michael Savage recently called McCain a hippie. I think Obama has the play-to-moderates formula down: rather than doing things which will appeal to one side or the other (and at the same time piss the opposite side off), don't do anything offensive and hope everyone likes you! So anyway, I watched much of the CPAC coverage. From that it's pretty clear Mitt Romney was the candidate favored by hardline conservatives, and Giuliani managed to piss them off by being "too liberal". And yes, I agree, Giuliani is liberal enough to earn my respect at least. So, to kick off my newfound support for him, here's a quote from Le Tigre's My, My Metrocard:
-
Many CPUs, particularly the Core, support dynamically adjusting the clock speed. CPUs consume energy in the form of irreversible operations (most notably trying to put electricity through a reverse-biased semiconductor and having it "blocked", dissipated as heat), so if you clock your CPU down you will reduce the number of these operations and thus the amount of energy your CPU consumes; clocking it up does the opposite and increases the amount of heat it puts out. If you're looking for a good way to save energy, throttling your CPU down is a great way to do it. This is a standard energy saving feature of many operating systems, since when few programs are executing the CPU does not need to run at the excessive speeds allowed by today's technology.

There's hope on the horizon, though. Irreversible operations are not a necessary component of a CPU architecture except in the case of error correction, yet in modern CPU architectures the fundamental logic of their operation is comprised largely of irreversible operations (i.e. bleeding electricity as heat). If CPUs could be constructed using only physically reversible processes, then only a minimal amount of energy (whatever the system naturally "leaks" and whatever is required for error correction) would be required for them to perform computation. This idea is known as reversible computing. As long as we're using reverse-biased semiconductors as one of the tools for representing 0s and 1s (generally the zeroes), our CPUs are going to waste a lot of energy.
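The usual reference point for this argument is Landauer's principle: erasing one bit of information must dissipate at least k_B T ln 2 of energy (k_B being Boltzmann's constant). At room temperature, call it 300 K, that works out to roughly:

  E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\ \mathrm{J/K})\,(300\ \mathrm{K})\,(0.693) \approx 2.9\times10^{-21}\ \mathrm{J\ per\ bit\ erased}

Reversible operations don't erase information, so they aren't subject to that floor, which is the whole appeal.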
-
Folding@Home is kind of pointless. They're trying to do the same thing as the world's most powerful supercomputer, BlueGene. BlueGene is doing an atom-by-atom simulation of protein folding pathways and kinetics, which, as I understand it, depends largely on proper modeling of Gibbs free energy.
-
From this blog: http://tickletux.wordpress.com/2007/01/24/using-fizzbuzz-to-find-developers-who-grok-coding So, how's your FizzBuzz-fu? You can upload your solution here and have it ranked against others: http://golf.shinh.org/p.rb?FizzBuzz Can you beat me? I did it in 71 bytes of Ruby
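For anyone who hasn't run into the problem before, here's a plain, deliberately ungolfed Ruby version; the sport is in squeezing this down under 100 bytes:

  # FizzBuzz: multiples of 3 print "Fizz", multiples of 5 print "Buzz",
  # multiples of both print "FizzBuzz", everything else prints itself.
  (1..100).each do |i|
    if    i % 15 == 0 then puts 'FizzBuzz'
    elsif i % 3  == 0 then puts 'Fizz'
    elsif i % 5  == 0 then puts 'Buzz'
    else  puts i
    end
  end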
-
Cuz liberals are smarter than conservatives, duh
-
I spend 95% of my time in OS X now (I say as I post this from an XP machine). Vista attempts to bring OS X niceties to the Windows world, but does it with immense bloat. Or I can just run XP in Parallels on my Mac and run Windows applications side-by-side with my OS X ones, and even dock them. It's awesome! Who needs Windows, really?
-
Vista DRM applies more to software (namely itself) than media
-
I think it's better to start off with a higher level language and then work your way down. The problem with Java is that the core of what you're trying to do (such as in the example I gave) is lost in overly verbose and often unreadable syntax; hash[key] += 1 is clearly readable and explicit. This sort of mentality is deeply rooted in Paul Graham's "blub paradox": a programmer familiar with higher level languages can always step down to Blub, but it's much more difficult for the Blub programmer to step up, because they're still thinking in Blub. There's an adage to the effect of "You can write Fortran in any language" which embodies that idea quite nicely. I guess above all else, I'd avoid a language with static typing to start out with. Static typing is great for building complex, enterprise-caliber software using large teams who communicate primarily in the form of API documentation, but for someone who's just jumping into programming, a dynamically typed language will be much friendlier and more intuitive.
-
I program Ruby professionally and love it as a language. It's definitely seen a massive surge in popularity following the release of Rails, which was certainly its first real killer app, and also rather frustrating to the Pythonistas who had devoted so much time to writing web frameworks, none of which managed to generate the buzz and attention that Rails did. That's not to say that Python hasn't been successful as a web language: it's the basis of many Google applications and also what YouTube is written in. But Rails has managed to capture a lot more attention, namely through agile development practices like convention over configuration and test-driven/behavior-driven development.

That said, both Ruby and Rails have their fair share of problems. There are some serious issues with the way the Ruby VM is implemented, which they're presently addressing by moving from an AST-based interpreter to a stack-based VM with a JIT compiler. Rails has grown increasingly bloated and seems to be suffering from a dictatorial developer core that isn't open to new ideas. There are fundamental problems with the way its dispatcher is architected that make it very bad at things like serving files or chunks of data. Furthermore, they've been very anti-component; Rails comes off a lot like WebObjects minus the component architecture.

All that said, I'm going to our local Ruby Users Group tonight to learn about a new Ruby web framework called WAX.
-
Sure, we've (probably) all heard of BASIC, C/C++, Java, Perl, PHP, and Fortran (this is a science forum after all!), and to a lesser extent I'm sure many have heard of Lisp, Python, Smalltalk, and possibly Ruby (yay!). So, this thread isn't about those languages. What languages have you been dabbling in which have yet to gain mass appeal? Here's my list:

Erlang - Originally designed to solve the problems of fault tolerant, highly interconnected telephone switching systems, it draws upon a bizarre lineage of obscure languages targeted at parallel systems, like the Transputer's Occam. Erlang operates using lightweight processes which all run concurrently and communicate through asynchronous message passing. While I doubt Erlang will be the Next Big Language, whoever eventually comes up with a language designed to target the upcoming generation of massively parallel multicore CPUs will likely borrow heavily from it. Bizarre syntax and a general divergence from common language concepts will prevent Erlang itself from gaining mass adoption.

D - It's everything C++ and Objective C weren't. Imagine a fast, compiled language with C-like syntax, a sensible OO model, and garbage collection. Gone is the arcane syntax of both C++ and Objective C; D focuses on clarity. D is quickly gaining libraries and provides easy interoperability with C, but it will more likely serve as a source of ideas for upcoming changes to C++ than as a replacement.

Lua - What can only be described as a "multiparadigm" language, Lua provides an abstract framework in which higher level concepts like object orientation can be implemented, much like Lisp or CAML. However, unlike Lisp and CAML, Lua is an imperative procedural language and may therefore be a bit more familiar to your average programmer. Lua is one of the few languages which uses a register-based virtual machine, often regarded as superior to the stack-based virtual machines of Java and the .NET Common Language Runtime. The only other register-based VM I know of offhand is Perl 6's Parrot, which at this point is essentially vaporware (and the Perl 6 developers have considered dropping Parrot entirely).

Factor - A stack-based concatenative language, much like a combination of Lisp and Forth. Factor also borrows ideas from Sun Microsystems' Self language, particularly a rich and powerful console to develop from. While your average language probably requires around 10,000 lines of Yacc to parse, Factor manages to implement its core runtime in far less code, thanks in part to being mostly self-hosting. For those of you who thought stack-based languages were dead or only useful for OpenBoot/OpenFirmware debugging, guess again!

I've been meaning to look at Haskell since it's apparently all the rage, but I haven't had the time to wrap my brain around it, and it looks very... weird.
-
Obama: Because everyone else sucks. I could see myself voting for Clark (if he runs again), but other than that, not really. Obama's got one big thing going for him: save for FAUX News character assassination (and "accidental" character assassination on CNN), he's quite electable. He's such a political novice he doesn't have much of a voting record. He didn't hold national office during the votes for the PATRIOT Act or the Iraq War, so you can't hold those against him. He doesn't feel like a politician. He feels like that nice guy down the street you say hi to every morning on your way to work. I really like him, for the above reasons. In terms of what he'd actually do as President, that's a giant question mark for me. That's a bit scary. I feel kind of like that Family Guy episode where I'm being offered various things which feel kind of meh, then someone hands me THE MYSTERY BOX. Clark is Clark, but Obama could be anything. He could even be Clark! I'll take the mystery box.
-
I wouldn't recommend Java. For one thing, the primitive types aren't first-class objects. They're weird. There are first-class objects you can use to represent the primitive types, but they're immutable. This is all a throwback to C++, but it gets in the way all the time. For example, in order to store a number in a data structure, you first have to wrap it in an object, since it isn't a first-class object and therefore can't be treated like other objects. Imagine you want a data structure which stores mutable integers. You're kind of SOL with the classes available to you in the class hierarchy. You're basically stuck building your own mutable wrappers for integers. The fun continues: there's no operator overloading! So what in a sensible language would be:

  hash[key] += 1

becomes the following in Java (and you have to implement MutableWrapper yourself!):

  v = (MutableWrapper)hash.get(key);
  v.set(v.value() + 1);
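For contrast, here's the complete Ruby version. Hash.new(0) gives missing keys a default of zero, so the one-liner just works (the sample data is made up for illustration):

  # Counting occurrences in Ruby: no wrapper classes required.
  hash = Hash.new(0)                   # missing keys default to 0
  %w[a b a c a b].each { |key| hash[key] += 1 }
  p hash                               # => {"a"=>3, "b"=>2, "c"=>1}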
-
Life began over 3 billion years ago.
-
I think structures which are functionally equivalent to the mammalian neocortex have evolved in the pallium of birds, and because of this, birds may be "conscious" on levels similar to some mammals. It's certainly happened, as far as I'm concerned, so the question becomes: in which animals are such structures present? I think the overwhelming majority of animals lack them... particularly insects, and in terms of vertebrates: fish. It's my understanding that the majority of fish think in terms of simple stimulus/response mechanisms (namely Innate Releasing Mechanisms triggering Fixed Action Patterns). This has been a remarkably successful strategy for most fish, so there was never any selection pressure to integrate information at higher levels, certainly not at the cost of a larger, more energy-hungry brain.
-
http://www.fixedearth.com/Size_and_Structure%20Part%20IV.htm Everything science has taught you about cosmology is wrong