

Posted

Hi:

 

What is the maximum physically possible clock rate [measured in Hz] of a 1-bit-per-cycle, single-core, purely serial processor?

 

 

Thanks a bunch,

 

Green Xenon

Posted

Uhm, it depends on what you mean exactly. I've heard of transistors that will switch in the terahertz range, but I believe they have many problems that make them unsuitable for the types of chips most general-purpose computers use. I suppose one could take a very simple processor design from the 1980s, build it out of such transistors, cool it with liquid nitrogen (or helium, if the chip doesn't undergo some kind of state change at that temperature), and ramp the clock rate up until it stopped working. I would guess that if one put a lot of money into such a project, one could get something switching at hundreds of GHz. It probably wouldn't be very worthwhile, though; I doubt such a processor would be very useful.
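To put rough numbers on that, here's a back-of-envelope sketch in Python (the clock rates and the assumption that on-chip signals travel at about half the speed of light are illustrative guesses, not measured values):

# Rough sketch: how far a signal can travel in one clock period.
# The propagation factor below is an assumed ballpark for on-chip wiring.
C = 3.0e8           # speed of light in vacuum, m/s
PROP_FACTOR = 0.5   # assumed fraction of c for on-chip signal propagation

for clock_hz in (3e9, 100e9, 1e12):   # 3 GHz, 100 GHz, 1 THz
    period_s = 1.0 / clock_hz
    distance_mm = C * PROP_FACTOR * period_s * 1e3
    print(f"{clock_hz / 1e9:6.0f} GHz -> {distance_mm:.2f} mm per clock period")

Under those assumptions, at 1 THz a signal covers only about 0.15 mm per cycle, far less than the size of a typical die, which is one reason raw transistor switching speed isn't the only limit.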

Posted

Would an optical processor be able to run safely at a significantly higher frequency than an electronic processor?

 

Let's say the optical CPU uses 400-nm-wavelength lasers in place of electrical signals. What is the maximum clock rate this theoretical CPU could sustain without suffering any physical damage?
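(For context, a quick sketch of what 400 nm means as a frequency, in Python, using nothing beyond f = c / λ; the wavelength is the one quoted above:)

# Convert the 400 nm wavelength mentioned above into an optical carrier frequency.
C = 3.0e8               # speed of light in vacuum, m/s
wavelength_m = 400e-9   # 400 nm

carrier_hz = C / wavelength_m
print(f"400 nm corresponds to roughly {carrier_hz / 1e12:.0f} THz")

That roughly 750 THz figure is the frequency of the light itself, not the clock rate of any logic driven by it.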

Posted

Why 400 nm, and what kind of answer are you looking for? This sounds more like a physics question! I say this mainly because (a) you are asking about technology that hasn't been released even in a primitive form, or in some cases even invented, and (b) you are looking for the physical boundary. Even the photonic processors that are being developed are, for the most part, simply an integration of a light-based data conduit. Light is currently being modulated in various systems at 40-100 GHz, which you've probably already come across.

 

Why serial? What is it for? Analogue? IBM's 100 GHz graphene transistor technology is still out there, somewhere...

Posted

"Why 400 nm, and what kind of answer are you looking for? This sounds more like a physics question! I say this mainly because (a) you are asking about technology that hasn't been released even in a primitive form, or in some cases even invented, and (b) you are looking for the physical boundary. Even the photonic processors that are being developed are, for the most part, simply an integration of a light-based data conduit. Light is currently being modulated in various systems at 40-100 GHz, which you've probably already come across.

Why serial? What is it for? Analogue? IBM's 100 GHz graphene transistor technology is still out there, somewhere..."

 

 

1. 400 nm is the shortest wavelength that is safe for human handling at perceptible intensities. Shorter than this, and the risk of cancer increases.

 

2. Why choose the shortest safe wavelength? Shorter wavelengths equate to higher frequencies, which equate to greater bandwidth capability.

 

3. I'm asking about photonics because I was hoping for THz clock rates or even higher. Electrical signals at such high frequencies require massive cooling systems -- otherwise they may start a fire.

 

4. Serial, because it is more efficient. Parallel systems tend to suffer from timing skew between their lines (a rough sketch of this follows below). HDDs, for example, used to use parallel IDE; now they use serial SATA. Moreover, parallel devices tend to be bulkier and more power-hungry than their serial counterparts.
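Here's the rough skew sketch referred to in point 4 (Python; the trace-length mismatch and propagation speed are assumptions made up for illustration):

# Rough sketch: how a fixed timing skew compares with the bit period
# as a parallel bus is clocked faster. All numbers are illustrative.
PROP_SPEED = 1.5e8       # assumed signal speed on a PCB trace, m/s (~0.5 c)
LENGTH_MISMATCH = 5e-3   # assumed 5 mm length difference between two bus traces

skew_s = LENGTH_MISMATCH / PROP_SPEED   # arrival-time difference between bits

for rate_hz in (33e6, 1e9):   # a tens-of-MHz parallel bus vs a gigabit-class link
    period_s = 1.0 / rate_hz
    print(f"{rate_hz / 1e6:6.0f} MHz: bit period {period_s * 1e9:7.2f} ns, "
          f"skew {skew_s * 1e9:.3f} ns ({100 * skew_s / period_s:.1f}% of a period)")

The same physical mismatch eats a growing fraction of each bit period as the rate climbs, which is part of why high-speed links moved to serial lanes that carry their own timing.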

Posted

Parallel data-transfer systems aren't directly analogous to parallel processors. Parallel processing enables higher efficiency, since a single-core processor with enough speed to match a dual-core processor would run so fast that it would melt itself from the heat.

 

Pipelining is also a common efficiency-boosting trick in processors.
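A toy illustration of the pipelining point (Python; the four-stage split and the instruction count are made up for the example):

# Toy comparison: cycles needed to run N instructions with and without a pipeline.
STAGES = 4            # assumed number of pipeline stages (e.g. fetch/decode/execute/write-back)
N_INSTRUCTIONS = 100

unpipelined = N_INSTRUCTIONS * STAGES        # each instruction runs start to finish alone
pipelined = STAGES + (N_INSTRUCTIONS - 1)    # fill the pipe once, then finish one per cycle

print(f"unpipelined: {unpipelined} cycles")
print(f"pipelined:   {pipelined} cycles (~{unpipelined / pipelined:.1f}x throughput, ignoring hazards)")

In practice hazards and stalls cut into that gain, but the overlap of stages is where the speed-up comes from.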

Posted

"Parallel processing enables higher efficiency, since a single-core processor with enough speed to match a dual-core processor would run so fast that it would melt itself from the heat."

 

True, but that is only the case with electrical signals. At the same frequency, an optical signal will generate less heat than an electrical signal.

 

So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety. Lasers are better than LEDs in terms of signal clarity, because laser light is coherent, while LED light isn't.

Posted (edited)

"So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety. Lasers are better than LEDs in terms of signal clarity, because laser light is coherent, while LED light isn't."

 

No, it's not.

 

If nearby humans are being exposed to signaling light from the CPU while it is running, then you have much bigger problems than any harmful effects of a minute quantity of UV light.

 

Really, even if you had a CPU that used gamma rays, the exposure from the CPU would be minimal and would be swamped by natural background radiation.

 

If, whenever you turned on your processor, it started shooting lightning everywhere, you'd be worried, right? And it'd be broken beyond repair.

 

Well, it's the same thing with an optical CPU: if you get a laser show, it's broken beyond all repair and will never work.

 

Also, the frequency of the light does not equal the processing speed of an optical CPU.
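A rough sketch of that last point (Python; the 100 GHz modulation rate is just the figure quoted earlier in the thread, not a property of any real device):

# Rough sketch: the optical carrier frequency versus an assumed modulation/clock rate.
C = 3.0e8
carrier_hz = C / 400e-9   # ~750 THz carrier for 400 nm light
clock_hz = 100e9          # assumed 100 GHz modulation rate, as quoted earlier

print(f"carrier: {carrier_hz / 1e12:.0f} THz, clock: {clock_hz / 1e9:.0f} GHz")
print(f"about {carrier_hz / clock_hz:,.0f} optical cycles pass in one clock period")

The light oscillates thousands of times per clock tick; the clock rate is set by how fast the light can be switched and detected, not by the carrier frequency itself.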

Edited by insane_alien
Posted

"So when all-optical CPUs are the norm, 400 nm is the sweet spot between high speed and human safety. "

 

400 nm isn't a sensible cut-off.

If you were to use 180 nm then, if any did escape from the processor, it would be absorbed by oxygen in the air - so it would never reach any people. Of course, this would still only happen if the chip failed big style.

A better solution would be to choose the wavelength on the basis of whatever you can get to work. Then put the chip in a box.
