Green Xenon Posted March 25, 2011

Hi: What is the maximum physically possible clock rate [measured in Hz] of a 1-bit-per-cycle, single-core, purely serial processor? Thanks a bunch, Green Xenon
insane_alien Posted March 25, 2011

Do you mean the maximum switching rate of a transistor? In that case, about 100 GHz. A CPU built from such transistors would be slower.
Schrödinger's hat Posted March 25, 2011

Uhm, it depends on what you mean exactly. I've heard of transistors that will switch in the terahertz range, but I believe they have many problems that make them unsuitable for the types of chips most general-purpose computers use. I suppose one could take a very simple processor design from the 1980s, build it out of such transistors, cool it with liquid nitrogen (or helium, if nothing undergoes some kind of state change), and then ramp the clock rate up until it stopped working. I would guess that with a lot of money behind such a project one could get something switching at hundreds of GHz. It probably wouldn't be worthwhile, though; I doubt such a processor would be very useful.
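[Editor's note: a rough back-of-the-envelope sketch of the point above. The clock rate of a whole processor is limited by its critical path (the deepest chain of gates a signal must cross in one cycle), not by a single transistor's switching speed. The 1 ps gate delay and 20-gate logic depth below are illustrative assumptions, not measured figures.]

```python
def max_clock_hz(gate_delay_s: float, logic_depth: int) -> float:
    """Clock period must cover the critical path: depth * per-gate delay."""
    return 1.0 / (gate_delay_s * logic_depth)

# Assume a 1 ps gate delay (a ~THz-class device) and a shallow
# 20-gate-deep pipeline stage:
f = max_clock_hz(1e-12, 20)
print(f"{f / 1e9:.0f} GHz")  # 50 GHz -- far below the raw device speed
```

This is why a CPU built from 100 GHz transistors clocks well below 100 GHz.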
Green Xenon Posted March 25, 2011 (Author)

Would an optical processor be able to run safely at a significantly higher frequency than an electronic processor? Let's say the optical CPU uses 400 nm-wavelength lasers in place of electronic signals. What is the maximum clock rate this theoretical CPU could sustain without suffering any physical damage?
Xittenn Posted March 26, 2011

Why 400 nm, and what kind of answer are you looking for? This sounds more like a physics question! I say this mainly because you are looking at technology that a) hasn't been released even in primitive form, or invented, and b) you are asking for the physical boundary. Even the photonic processors now being developed are, for the most part, simply an integration of a light-transmission data conduit. Various systems currently modulate light at 40-100 GHz, which you've probably already come across. Why serial? What is it for? Analogue? IBM's graphene 100 GHz transistor technology is still out there, somewhere ...
Green Xenon Posted March 26, 2011 (Author)

Quoting Xittenn's post above:

1. 400 nm is the shortest wavelength that is safe for humans to handle at perceptible intensities; any shorter and the risk of cancer increases.
2. Why choose the shortest safe wavelength? Shorter wavelengths mean higher frequency, which means greater bandwidth.
3. I'm asking about photonics because I was hoping for THz clock rates or even higher. Electric signals at such high frequencies require massive cooling systems -- otherwise they may start a fire.
4. Serial, because it is more efficient. Parallel systems tend to suffer from mismatched slew rates. HDDs, for example, used to be parallel (IDE); now they are serial (SATA). Overall, parallel devices tend to be bulkier and more power-hungry than their serial counterparts.
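[Editor's note: for concreteness, the "higher frequency" claim in point 2 can be checked directly. The carrier frequency of light follows from f = c / λ; for 400 nm this is roughly 749 THz.]

```python
c = 299_792_458.0      # speed of light in vacuum, m/s
wavelength = 400e-9    # 400 nm, violet edge of the visible spectrum

f = c / wavelength     # photon (carrier) frequency in Hz
print(f"{f / 1e12:.0f} THz")  # 749 THz
```

Note that this is the frequency of the light itself, not a clock rate any logic could run at.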
Cap'n Refsmmat Posted March 26, 2011

Parallel data-transfer systems aren't directly analogous to parallel processors. Parallel processing enables higher efficiency, since a single-core processor fast enough to match a dual-core processor would be so fast as to melt itself with heat. Pipelining is another common efficiency-boosting trick in processors.
Green Xenon Posted March 26, 2011 (Author)

Quoting Cap'n Refsmmat: "Parallel processing enables higher efficiency, since a single-core processor with sufficient speed to match a dual-core processor would be so fast as to melt itself with heat."

True, but that is only the case with electric signals. At the same frequency, an optical signal will generate less heat than an electric signal. So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety. Lasers are better than LEDs in terms of signal clarity, because laser light is coherent while LED light isn't.
Xittenn Posted March 26, 2011

This picture was one of IBM's favorites of 2010. Fairly new and underway!
insane_alien Posted March 26, 2011 Share Posted March 26, 2011 (edited) So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety. Lasers are better than LEDs in terms of signal-clarity, because laser light is coherent, while LED light isn't. no its not. if nearby humans are being exposed to signaling light from the CPU when it is running then you have much bigger problems than any harmful effects of a minute quantity of UV light. really, even if you had a cpu that used gamma rays, the exposure from the cpu will be minimal and will be swamped by the natural back ground radiation. if, whenever you turned on your processor, it started shooting lightning everywhere, you'd be worried right? and it'd be broken beyond repair. well, same thing with an optical CPU, if you get a laser show its broken beyond all repair and will never work. also, the frequency of the light does not equal the processing speed of an optical cpu. Edited March 26, 2011 by insane_alien Link to comment Share on other sites More sharing options...
John Cuthber Posted March 26, 2011

"So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety."

400 nm isn't a sensible cut-off. If you were to use 180 nm, then any light that did escape from the processor would be absorbed by oxygen in the air, so it would never reach any people. Of course, this would still only happen if the chip failed big style. A better solution would be to choose the wavelength on the basis of whatever you can get to work. Then put the chip in a box.