bascule

Senior Members
  • Posts: 8390
  • Joined
  • Last visited

Everything posted by bascule

  1. Yep, the hard part is figuring out how to get information back into the brain, encoding it in a way the brain can understand. But this is definitely going to happen, and when it does we're not too far away from the technological singularity.
  2. The biggest way that CDs and records will sound different comes from the digital master. Digital masters of music recorded before the age of digital recording/production are always made from tape (as opposed to vinyl). This is unfortunate because tape degrades with time (whereas vinyl degrades with use). Furthermore, many of these songs are mastered from copies of the original master, or copies of copies. Sometimes they'll be mastered from a copy that has been sitting around for 10 years, made from another copy that was already 10 years old. There are all sorts of fancy production techniques to try to regain the fidelity of the original master, but what you ultimately get is a distorted copy of the original master run through a bunch of filters to try to make it sound good. If you have a copy of the vinyl that was cut around the same time the song was recorded and hasn't been played excessively, chances are it will be much truer to the original master than the digital master you get on a CD, since vinyl doesn't degrade with time the way tape does.

     In terms of the format itself: "golden ears" listening testers can fairly reliably discern 16-bit from 24-bit sampling. Bit depth mainly affects the dynamic range, but it also affects how closely the stepped output of a digital-to-analog converter tracks the shape of the original waveform (post-processing reconstruction filters help smooth out the blockiness that results from discrete sampling). With records, sampling isn't an issue, because they store audio continuously rather than as discrete samples the way CDs do; however, the dynamic range of records is about half that of CDs.

     44.1kHz is the sampling rate. Because it takes at least two samples to represent a waveform, the highest frequency a CD can store is 22050Hz (a quick arithmetic check of this and the dynamic-range figures follows this post). However, using only two samples to represent a waveform sounds quite awful. The Red Book (the mastering standard for CDs) specifies that all audio be run through a 20kHz lowpass filter first, so the maximum frequency a properly mastered CD can carry is 20kHz, versus ~48kHz for a record. Frequencies above 20kHz cannot be heard, but that doesn't mean they are imperceptible; many of these frequencies resonate with various parts of the human body, so they are still tangible, and the "listening experience" as a whole is affected.

     CDs have a dynamic range of about 90dB, whereas records have a dynamic range of about 45dB. This is great for orchestral music, which may span a large dynamic range from pianissimo to fortissimo. However, most producers want to make the quieter parts of modern rock/pop music sound louder so they're easier for people to hear, a technique known as "compression." Typical rock music is compressed down to at LEAST a 45dB range (i.e. people LIKE the compressed dynamics of a record), if not lower. Many modern producers have a tendency to overcompress music because it sounds trendy.

     So really it all comes down to personal preference. Many people actually like the clicks and pops that vinyl produces. Some of us, however, try to keep our records clean.
     However, when you're talking about original recordings made today, in the age of digital music production, vinyl is more of a gimmick and will ALWAYS be lower fidelity than what you get on CD (which is, after all, a perfect copy of what's in the engineer's computer); it mostly sells to DJs who need to beatmatch records and adjust their tempo (although CD players exist that let you do that too).
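A quick back-of-the-envelope check of the CD figures in the post above, as a minimal Python sketch. It computes the Nyquist limit from the 44.1kHz sampling rate and the theoretical dynamic range per bit depth via 20·log10(2^bits), about 6.02 dB per bit (which lands near the ~90 dB figure quoted above for 16-bit audio):

```python
# Sanity-checking the CD numbers: the Nyquist limit of a 44.1kHz sampling
# rate, and the theoretical dynamic range of 16- and 24-bit samples.
import math

sample_rate = 44_100                        # Red Book CD sampling rate, Hz
nyquist = sample_rate / 2                   # two samples per cycle minimum
print(f"Nyquist limit: {nyquist:.0f} Hz")   # -> 22050 Hz

for bits in (16, 24):
    dynamic_range_db = 20 * math.log10(2 ** bits)   # ~6.02 dB per bit
    print(f"{bits}-bit dynamic range: {dynamic_range_db:.1f} dB")
# -> ~96.3 dB for 16-bit (same ballpark as the ~90 dB quoted above),
#    ~144.5 dB for 24-bit
```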
  3. The Von Neumann Universal Constructor comes to mind...
  4. You've presented bullshit. Yes, sorry for "spinning" your "facts" as bullshit. You're claiming that big oil is holding back cold fusion. Certainly sounds like the bullshit of a conspiracy theorist to me...
  5. Certainly not the conspiracy theorist...
  6. Zionism is "a policy for establishing and developing a national homeland for Jews in Palestine," therefore "A World without Zionism" would entail... wiping Israel off the map...
  7. I'm a big fan of Tesla. I've read multiple biographies and investigated his life quite extensively. Maybe the fact that I already know the truth has rendered me immune to your bullshit... Tesla suffered from the fact that he had very lofty goals which weren't ever immediately practical, or possible, but that didn't stop him from dreaming... just don't confuse his dreams with reality, because for Tesla himself the border between the two was quite fuzzy. Tesla had many, many wholly impractical goals and spent other people's fortunes trying to make them a reality... but rarely succeeded.
  8. Yes, and the article you quote isn't at the URL you gave... THREE TIMES! Furthermore, Tesla claimed to have invented all sorts of things he never did, including a "Death Ray," a weather control machine, machines designed for the wireless transmission of electrical power, etc. Or rather, he thought he was "close" when he, in fact, was quite a ways off the mark. The British government tried to build his death ray. They couldn't get it to work. Tesla was just way too much excitement and way too little math...
  9. Will the Semantic Web ever be successful? For those who don't know, the Semantic Web, pioneered by Tim Berners-Lee himself, is an attempt to move the web from a loosely structured collection of documents to machine-understandable knowledge structures called ontologies. To put it simply, the Semantic Web tries to add meaning to the link structure of the web. It's like what the Xanadu project was trying to do way back in 1960, although they were waaay ahead of their time, and some worry the Semantic Web might be too.

     So far, no one has managed to make a "killer app" for the Semantic Web's underlying technologies, but one may be in the pipe in the form of Semantic MediaWiki, a Semantic Web-enabled version of the software which powers Wikipedia.org. No longer will the information in Wikipedia be locked up in a way where it can't be used by other applications (except by linking back to the Wikipedia web site); it will be directly accessible to all, in a way where it can be cohesively integrated into other sites. The idea is that there's all sorts of functionality spread all over the web which, using the Semantic Web and other "Web 2.0" technologies, you can begin to integrate into more cohesive packages. In this way the web can begin to feel more like an instantly searchable universal archive of all human knowledge, as structures that mirror our own understanding of the knowledge they describe begin to map connections between related fields.

     When you consider how interconnected and multidisciplinary scientific advancement has to be (e.g. biologists need to know how old their fossils are, so they turn to geologists to date the rock layers they were found in, who in turn use techniques created by nuclear physicists), the Semantic Web could become a great tool for revealing connections across multiple domains of knowledge which we previously couldn't have seen existed, because precise descriptions of the interrelationships of ideas within a field are typically inaccessible to those outside it. Searches for similar knowledge structures could automatically reveal such interrelationships in disparate fields, letting computers surface similar ideas stumbled upon independently by two different researchers in two different fields, so we don't have to depend on chance encounters to do that job for us (e.g. geneticists discovered a ~70,000-year-old bottleneck in the human genome around the same time geologists discovered evidence of a ~70,000-year-old supervolcano eruption at Toba, in Sumatra). A small sketch of this kind of machine-readable cross-referencing appears after this post.

     Can you imagine being able to produce a correlated timeline of everything we ever knew of happening, all the way back to the dawn of the universe? Think of how many mysteries that could solve, by having instant access to all contemporaneous events...

     Anyway, it seems like stupid crap like social bookmarking has garnered a lot of interest and in the process usurped a niche the Semantic Web was aiming to fill. I just wish the Semantic Web would get more attention, and hopefully it will when wikipedia.org itself starts running Semantic MediaWiki... (Ed: There's a Semantic MediaWiki running you can check out if you want: http://wiki.ontoworld.org/index.php/Main_Page)
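As a concrete illustration of what "machine-understandable knowledge structures" buy you, here's a minimal sketch in Python using the rdflib library. The ex: vocabulary and both "events" are made-up stand-ins for illustration, not actual Semantic MediaWiki output; the point is that a generic query can surface the kind of cross-field date correlation described above:

```python
# A toy RDF graph: two "events" from different fields expressed as triples
# rather than prose, plus a SPARQL query that finds coinciding dates.
# The ex: namespace and all its terms are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/science/")  # hypothetical vocabulary

g = Graph()
g.bind("ex", EX)

g.add((EX.TobaEruption, RDF.type, EX.GeologicalEvent))
g.add((EX.TobaEruption, EX.yearsAgo, Literal(70000)))
g.add((EX.HumanGeneticBottleneck, RDF.type, EX.GeneticEvent))
g.add((EX.HumanGeneticBottleneck, EX.yearsAgo, Literal(70000)))

# Find pairs of distinct resources whose dates coincide -- the sort of
# cross-disciplinary connection the post says we currently find by chance.
query = """
    SELECT ?a ?b WHERE {
        ?a ex:yearsAgo ?when .
        ?b ex:yearsAgo ?when .
        FILTER (?a != ?b)
    }
"""
for row in g.query(query, initNs={"ex": EX}):
    print(f"{row.a} coincides with {row.b}")
```

(The query prints each pair in both orders; a real application would deduplicate, but the mechanism is the same.)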
  10. The problem is there are all sorts of different kinds of branes that behave in different ways. If you really want to understand string theory, learn the math behind it.
  11. Tesla did build an electromechanical machine that you could attach to a structural column of a building to vibrate it at its resonance frequency. He did this to the building his laboratory was housed in, much to the chagrin of the local police. However, this is obviously far from an "earthquake machine"
  12. Your sense of humor seems to be somewhat lacking
  13. Over here we call it "liberating"
  14. http://www.theaustralian.news.com.au/common/story_page/0,5744,17084566%255E2703,00.html One more similarity... both scare me
  15. Tesla figured out that if you vibrate something at its resonance frequency you get complete constructive interference, so you can keep putting more and more energy into it and all of it will be stored up as increasingly stronger vibrations. The process by which he did this, though, was mechanical... he vibrated his laboratory at its resonance frequency by building a machine to shake it (which we can undoubtedly guess was electromechanical in nature). A toy simulation of this resonant build-up follows below.
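For what it's worth, the physics here is easy to demonstrate numerically. Below is a minimal Python sketch with hypothetical numbers (a 5 Hz oscillator, no damping, unit drive force): driving exactly at the natural frequency means every push arrives in phase, so the peak displacement grows steadily instead of settling:

```python
# Undamped oscillator driven exactly at its natural frequency: every push
# arrives in phase, so energy accumulates and the amplitude keeps growing.
# The 5 Hz frequency and unit drive are arbitrary illustrative values.
import math

omega0 = 2 * math.pi * 5.0   # natural frequency (5 Hz, assumed)
drive = 1.0                  # drive force per unit mass (assumed)
dt = 1e-4                    # integration timestep, s

x = v = 0.0                  # start at rest
peak = 0.0
steps = int(10.0 / dt)       # simulate 10 seconds
for step in range(1, steps + 1):
    t = step * dt
    a = -omega0**2 * x + drive * math.cos(omega0 * t)  # resonant forcing
    v += a * dt              # semi-implicit Euler (stable for oscillators)
    x += v * dt
    peak = max(peak, abs(x))
    if step % int(2.0 / dt) == 0:
        print(f"after {t:4.1f} s, peak displacement so far: {peak:.4f}")
# The printed peaks grow roughly linearly in time, matching the analytic
# envelope drive*t/(2*omega0) for an undamped, resonantly driven oscillator.
```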
  16. Yes. And my favorite Darwin quote on the matter: But basically, we'll either become gods or destroy ourselves in the process...
  17. http://redwing.hutman.net/~mreed/warriorshtm/necromancer.htm
  18. If I may venture a guess here, I'd say that human memory (and thought as well, although the underlying implementation is no doubt rather different) utilizes underlying data structures which information theorists would refer to as ontologies.
  19. Humans are apes with larger and more powerful abstract association centers which allow higher level thought and processing of abstract concepts imparted by others.
  20. Dr. Roger Pielke, head of the American Association of State Climatologists, debunks the Butterfly Effect: http://climatesci.atmos.colostate.edu/?p=68 http://climatesci.atmos.colostate.edu/?p=70
  21. It's not negative mass you need, it's imaginary mass
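Sketching out the standard special-relativity reasoning behind that remark (nothing specific to this thread): the relativistic energy formula only yields a real energy for a faster-than-light particle if its rest mass is imaginary, not negative.

```latex
% Relativistic energy of a particle with rest mass m:
E = \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}
% For v > c the root becomes imaginary:
\sqrt{1 - v^{2}/c^{2}} = i \sqrt{v^{2}/c^{2} - 1}
% Writing m = i\mu with \mu real keeps E real:
E = \frac{i \mu c^{2}}{i \sqrt{v^{2}/c^{2} - 1}}
  = \frac{\mu c^{2}}{\sqrt{v^{2}/c^{2} - 1}} \in \mathbb{R}
% Hence a faster-than-light "tachyon" requires an imaginary rest mass.
```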
  22. *cough*BULLSHIT*cough*
  23. Any of you claiming that it's impossible for us to design a consciousness better than our own have some rather ill-founded ideas about how consciousness actually operates. I suggest you pick up a copy of Daniel Dennett's book Consciousness Explained, in which he details an "Empirical Theory of Mind" based upon countless scientific research experiments.

      We already have a genetic blueprint for consciousness sitting inside of computers: our own genome. When we produce computer models capable of growing lifeforms from a digital copy of their genes, we will be able to produce a model of a human being by "growing" one inside of a computer. Once we have this, we can make the most extensively detailed analysis of the operation of the human brain ever accomplished, because we'll be able to produce complete snapshots of the brain in action. From that we can reduce consciousness to a mathematical model of its operation.

      Once we have this, we can look at fundamental design problems and bottlenecks. We have intelligence on our side; natural selection did not (sorry IDiots). There are certainly major problems with the way our consciousness operates, and many of these things are second nature to computers (i.e. math is hard for us, memorizing things is hard), so when we have a computer running a mathematical model of consciousness itself, augmenting its design to accommodate the niceties of modern computers should be a relatively easy task, once the mathematical model (or even just the small portion necessary to interface with it) has been understood.

      As it stands, even without the luxury of growing a human brain inside of a computer, we have mathematical models of how certain parts of the brain operate. A mathematical model of the hippocampus, the center of short-term memory, has already been constructed.
  24. You don't need genetic evidence. There is plenty of statistical evidence to support the assertion that blacks are, in general, better athletes. http://archive.salon.com/books/feature/2000/01/28/taboo/print.html The Salon article further argues that there are at least observable phenotypical differences (duh), from which we can infer that there are genetic differences (although the research to confirm this has not yet been performed):