Posted

If an instruction takes 1,000 lines of code to write using the normal 2 states (1s and 0s), how many lines would the same instruction take on a future computer able to use 8 states (various increments of "on/off")?

 

Also, how much would processing speed increase?

 

Thirdly, would storage capacity (hard drives, disks) be increased any due to the extra states?

Posted

I think it's pretty difficult to answer your questions without knowing more about what type of technology we could use to do this.

 

HDDs are inherently on or off (due to the magnetic domain switching used), so we'd need whole new technologies.

 

How would non-binary logic gates work? Can you explain how a logic function would apply to two partially-on states?

Posted
I think it's pretty difficult to answer your questions without knowing more about what type of technology we could use to do this.

 

HDDs are inherently on or off (due to the magnetic domain switching used), so we'd need whole new technologies.

 

How would non-binary logic gates work? Can you explain how a logic function would apply to two partially-on states?

For technology, let's say a photon sensor, one which uses a higher base number than 2 for computing; let's say 8. The number of photons hitting it simultaneously determines the instruction. It works differently from a binary instruction, which needs one gate per on/off instance. Let's just say the photon sensor is the gate itself.

 

Now, instead of 0 and 1 being the limit of variables from which to build a computer language, we can stretch the new limit to however many photons the object can absorb simultaneously. To make answering my questions easier, I started with 8.

 

I'm sure one can imagine the possibilities. If your computer suddenly understood 8,000 instructions at once rather than 2, it might vastly shrink the amount of data you need to instruct the computer to do something.

 

For example, a program that needs 1.5 million lines of code with binary might instead only need 375 lines of code with this new process.

 

I'd like to verify how feasible my thought experiment is.
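To put rough numbers on the thought experiment, here is a minimal Python sketch (my own illustration, not any real architecture) comparing how many base-2 versus base-8 symbols it takes to write the same values:

```python
# Hypothetical illustration: count how many symbols a value needs
# when written in base 2 versus base 8.

def digits_needed(value: int, base: int) -> int:
    """Number of base-`base` symbols needed to represent `value`."""
    if value == 0:
        return 1
    count = 0
    while value > 0:
        value //= base
        count += 1
    return count

for n in (7, 255, 100_000):
    b2 = digits_needed(n, 2)
    b8 = digits_needed(n, 8)
    print(f"{n}: {b2} binary symbols vs {b8} octal symbols")
```

Since each 8-state symbol carries log2(8) = 3 bits, the symbol count settles at about a third of the binary count.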

Posted

You'd need to come up with some other logic type for using base 8 as opposed to base 2, else answering the questions is very difficult.

Posted

You're asking me to describe a method that doesn't exist. It'd be like me asking how fast a spaceship would be traveling if it reached Jupiter in eight days from here, and your response was to ask me how that's possible first.

 

I think my question is legitimately answerable.

 

Please give it a shot. Forget the photon example. I don't even want to know the exact numbers. Just tell me, from your experience of how quickly a program's size can grow (just to accomplish simple tasks), can this be alleviated if the computer were able to use a higher base number than 2?

 

And if so, will it possibly have benefits for overall speed as well?

 

Thank you.

Posted
What if there was a way to use quantum spin or something in the place of 1 and 0?

 

You'd still have two states.


Merged post follows:

Consecutive posts merged
You're asking me to describe a method that doesn't exist. It'd be like me asking how fast a spaceship would be traveling if it reached Jupiter in eight days from here, and your response was to ask me how that's possible first.

 

I think my question is legitimately answerable.

 

Please give it a shot. Forget the photon example. I don't even want to know the exact numbers. Just tell me, from your experience of how quickly a program's size can grow (just to accomplish simple tasks), can this be alleviated if the computer were able to use a higher base number than 2?

 

And if so, will it possibly have benefits for overall speed as well?

 

Thank you.

 

I'm not sure it is easily answerable without knowing something about the logic.

Posted

Does "logic" have a technical meaning in computing? If that's what you're referring to, I know nothing about it.

 

Imagine you're stuck using only two letters of the alphabet to communicate the English language, plus all verbal and mathematical languages in existence.

 

You would use ABABAAAAABBABBAAABBAABABABAAAAABABABAABBABBAABAB for example to say "please". Not very efficient.
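For what it's worth, a tiny Python sketch of that two-letter encoding (my own illustration: it uses 8-bit ASCII with A for 0 and B for 1, so the exact string differs from the example above):

```python
# Hypothetical two-letter alphabet: write text using only A (for 0)
# and B (for 1), one 8-bit ASCII code point per character.

def to_ab(text: str) -> str:
    return "".join(
        format(ord(ch), "08b").replace("0", "A").replace("1", "B")
        for ch in text
    )

print(to_ab("please"))  # 48 symbols for a 6-letter word
```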

 

You'd say there were "AB" kinds of people who understood binary or not, rather than "2".

 

All I'm saying is that if the computer understood 36 states instead of 2, it would be able to write the digits 0-9 and the letters A-Z as they naturally appear in our language. Thus computer programming becomes a vastly easier task.

 

The only way for this to occur is for the computer to understand more than on/off in a practical manner. One such way is by a photon receptor sensitive enough to distinguish between one, two, or more photons hitting it.

 

Now if this receptor can absorb 100 photons, then we can directly code in the alphabets of most languages, plus numbers and various math symbols. I think the current system is highly inefficient, is all.

Posted

I like the idea in theory. The problem is in how small you can make a structure that will successfully switch between that many different states. The quantum computer Transdecimal referred to is already being thought about (maybe designed, I'm not sure of the current state of the science) and has had a lot of published work, so as soon as you get to an order of magnitude larger than the quantum level, you've lost any advantage over using binary code.

Posted

Couldn't an 8-state logic system be modeled effectively by a 2-state logic system, since 8 is a power of 2? See for example:

 

http://en.wikipedia.org/wiki/Octal#In_computers

 

Octal is sometimes used in computing instead of hexadecimal, perhaps most often in modern times in conjunction with file permissions under Unix systems (see chmod). It has the advantage of not requiring any extra symbols as digits (the hexadecimal system is base-16 and therefore needs six additional symbols beyond 0–9). It is also used for digital displays.

 

At the time when octal originally became widely used in computing, systems such as the IBM mainframes employed 24-bit (or 36-bit) words. Octal was an ideal abbreviation of binary for these machines because eight (or twelve) digits could concisely display an entire machine word (each octal digit covering three binary digits). It also cut costs by allowing Nixie tubes, seven-segment displays, and calculators to be used for the operator consoles; where binary displays were too complex to use, decimal displays needed complex hardware to convert radixes, and hexadecimal displays needed to display letters.

 

All modern computing platforms, however, use 16-, 32-, or 64-bit words, with eight bits making up a byte. On such systems three octal digits would be required, with the most significant octal digit inelegantly representing only two binary digits (and in a series the same octal digit would represent one binary digit from the next byte). Hence hexadecimal is more commonly used in programming languages today, since a hexadecimal digit covers four binary digits and all modern computing platforms have machine words that are evenly divisible by four. Some platforms with a power-of-two word size still have instruction subwords that are more easily understood if displayed in octal; this includes the PDP-11. The modern-day ubiquitous x86 architecture belongs to this category as well, but octal is almost never used on this platform.
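That three-bits-per-octal-digit correspondence is easy to see in code; a quick Python sketch (my own illustration):

```python
# Each octal digit maps to exactly three binary digits, which is why
# a binary machine can model an 8-state system with no waste.

n = 0o755                  # Unix-style permission bits, written in octal
bits = format(n, "09b")    # '111101101'
groups = [bits[i:i + 3] for i in range(0, len(bits), 3)]
print(groups)                       # ['111', '101', '101']
print([int(g, 2) for g in groups])  # [7, 5, 5] -> back to the octal digits
```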

 

Also, there were and are analog computers

 

http://en.wikipedia.org/wiki/Analog_computer

 

which have an infinite number of states (as an example of a non-binary computer). I would suggest you read these wiki articles as a starting point to answer your questions in more detail.

Posted

I've no idea if I'm even in the ballpark here with a correct answer, but I'll toss it out there for the sake of conversation. Let me go back to the OP:

 

If an instruction takes 1,000 lines of code to write using the normal 2 states (1s and 0s), how many lines would the same instruction take on a future computer able to use 8 states (various increments of "on/off")?

 

Also, how much would processing speed increase?

 

Thirdly, would storage capacity (hard drives, disks) be increased any due to the extra states?

 

Expanding on Sherlock's post above, the original question sounds akin to the old question of whether or not it's better to increase word size. For example, to stick with 16-bit words, or increase to 32-bit words. The advantage is that you can do more with each word. If we ignore current technological limitations (as requested), I believe the advantage is essentially unbounded, i.e. it is correct to state that increasing data/instruction bandwidth increases computer speed in direct proportion, right up to the full speed of the processor. (And therein lies the rub, of course -- the processor's already at 100% utilization, making the real issue one of bottlenecks and Von Neumann limitations. But that's irrelevant for our current discussion.)

 

The OP, however, asks a slightly different question -- what if we change the basis for the data itself? Instead of gates storing 1s and 0s, we use "gates" that store 8 states of information. Put in the context of a Von Neumann architecture, this could mean (in theory) shoving 8x as much data through per cycle.

 

Again setting aside issues of technological capability (since the OP uses undefined "future computers"), the answer would seem to be the same -- more data, faster processing. This would seem to be a relatively straightforward path to an answer of "eight times" to the first question. The second question would seem to be answered with a simple "no" -- data is data, and we haven't changed the concentration of information here.
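One way to put a number on "more data per cycle": a k-state cell carries log2(k) bits of information, so an 8-state cell holds exactly 3 bits. A quick Python check (my own sketch):

```python
import math

# Information per cell grows with the logarithm of the state count,
# so the density gain over binary is log2(k), not k itself.
for states in (2, 3, 8, 36, 100):
    print(f"{states} states -> {math.log2(states):.2f} bits per cell")
```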

Posted (edited)
If an instruction takes 1,000 lines of code to write using the normal 2 states (1s and 0s), how many lines would the same instruction take on a future computer able to use 8 states (various increments of "on/off")?

 

Also, how much would processing speed increase?

 

Thirdly, would storage capacity (hard drives, disks) be increased any due to the extra states?

 

Assuming you're using the same programming language, you'd need the same amount of code. Why? You're changing the way data is stored on a chip, not the programming language. The advantage of having 8 states would be to increase the amount of data that can be stored on a chip of the same size.

 

Example (correct me if I'm wrong):

Let's say you want to store the number 8. In binary code, you'd need 4 "storage spaces" (e.g. 4 surface area units) to store 1,0,0,0, because that's the number 8 in binary. When there are 8 states, you'd need 1 storage space to store the number 8. However, suppose you need to store two pieces of code, "8" and "8" (not the number eighty-eight). You'd need 4 storage spaces for the first "8" and another 4 for the other when data is stored in binary states, a total of 8 storage spaces. When there are 8 states, you'd need only 1 storage space for the first 8 and another 1 to store the other 8, significantly reducing the number of storage spaces needed. However, when writing the code on a piece of paper, you'd write 8,8 in either the binary or the 8-state system. They'll just be stored differently.
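A small Python sketch of that counting argument (my own illustration; one caveat on the example above: if each cell holds the states 0-7, the value 8 is "10" in octal and actually needs two cells, though the density advantage still holds):

```python
import math

# Hypothetical cell count: how many k-state cells does it take to
# store a given non-negative integer?

def cells_needed(value: int, states: int) -> int:
    if value == 0:
        return 1
    return math.floor(math.log(value, states)) + 1

for value in (8, 88, 1_000_000):
    print(f"{value}: {cells_needed(value, 2)} binary cells, "
          f"{cells_needed(value, 8)} eight-state cells")
```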

 

Processing speed will definitely be increased for the 8-state system (as opposed to the binary system) if the chips for either system can be read or written (i.e. switched to different states) at the same speed, but that's only an "if." The increased speed comes from the fact that there would be fewer "storage spaces" (e.g. chip surface area) to process.

Edited by mrburns2012
Posted
Logic is how computers compute.

 

http://en.wikipedia.org/wiki/Logic_gate

 

And there is ternary logic:

 

http://en.wikipedia.org/wiki/Ternary_logic

 

Three values for logic expressions can have a number of uses. One of the most obvious is comparing two integers. In a three-valued logic system this can be done with a single operator, which can tell you if the two numbers are equal, if the first is greater than the other, or if the first is less than the other. Programmers often rely on the concept of a "null" value as well, so in a three valued system it's easy to represent true, false, and null.
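A rough Python sketch of that single comparison operator (my own illustration, using -1, 0, and +1 as the three values):

```python
# Hypothetical three-valued comparison: one operation distinguishes
# less-than, equal, and greater-than, where binary predicates need two.

def compare(a: int, b: int) -> int:
    """Return -1 if a < b, 0 if a == b, +1 if a > b."""
    return (a > b) - (a < b)

print(compare(3, 7), compare(5, 5), compare(9, 2))  # -1 0 1
```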

 

However, three-valued logic is considerably more complex. Couple that with the fact that processors, RAM, hard drives, CDs/DVDs, etc. are already based around binary, and there's very little reason to change.

Posted
What if there was a way to use quantum spin or something in the place of 1 and 0?

Wow, you hit the nail on the head.

 

Fresh Spin On Logic

Instead of an electron being there or not there in the gate of a transistor—basically two pieces of information—think about an electron being able to hold a million pieces of information...

Also, thanks SH3RL0CK for the links. Interesting reads. Perhaps someday a mesh of analog/digital will be the ideal thing.

Posted
What about things with Spin-1/2?

 

Take an electron, which you could use as the basis for your logic system.

 

It has two spin states:

 

+1/2

 

-1/2

 

If you want to monitor different types of particle, measuring the spin becomes difficult.

 

For electrons you can do it quite easily using spintronics and spin valves... My master's project was concerned with some of the technologies involved in this... interesting stuff...

Posted
But can you have no electrons?

 

Yes. But then you have the issue of needing complicated electronics to realise this. The nice thing about quantum computers is that you don't need any traditional electronics at the transistor level; you can make the particles interact like transistors do...


Merged post follows:

Consecutive posts merged
In addition to ternary logic, check out qubits:

http://en.wikipedia.org/wiki/Qubit

their value can be 0 and 1 at the same time

 

That is in fact what using electron spin would be.

Posted

Did anyone read this?

 

http://spectrum.ieee.org/jan07/4819

 

Even more interesting would be a microprocessor using spin. In principle, a device that encoded information using the orientation of electrons could handle data thousands of times as fast as the present-day processors that rely only on charge. “Instead of an electron being there or not there in the gate of a transistor—basically two pieces of information—think about an electron being able to hold a million pieces of information,” says David Awschalom, a physicist at the University of California at Santa Barbara who specializes in the development of magnetic semiconductors. In addition to being much faster, spintronics processors could be much smaller than present-day processors.

I believe it's addressing the same issue of going beyond 1s and 0s, but only from the storage end: using an electron's orientation to create far more possible states than we use now.

 

The technology is still flawed, but one day it might achieve its full potential, and the same can be done to every use of 1s and 0s throughout the computer.

Posted

Both of the previous posts are talking about quantum computers...

 

The first I would say is a little misleading as each qubit would still be a 1 and/or 0, and the system would still be fundamentally binary.
