gib65 Posted December 16, 2008 Isn't it true that computers don't read anything but machine code? I'm having an argument with someone about that. He thinks computers don't read machine code, except perhaps in the BIOS. I've got a degree in computer science, and I'm pretty sure I remember learning that the only information that goes through the CPU is instructions written in machine code. I'm right... right?
YT2095 Posted December 16, 2008 Yup, but more specifically binary. The op codes for the CPU are represented in hex for our benefit, really. You may tell him that a guy who has designed and built his own computer from scratch (chip level) told you this.
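A minimal sketch of that point in C, assuming an x86 CPU for concreteness (0x90 is the single-byte NOP opcode there): one and the same instruction byte, shown as the hex we write and as the raw bits the CPU actually latches.

#include <stdio.h>

int main(void)
{
    /* 0x90 is the single-byte encoding of the x86 NOP instruction.
       The CPU only ever sees the bits; hex is a convenience for humans. */
    unsigned char opcode = 0x90;

    printf("hex:    %02X\n", opcode);             /* prints: 90 */
    printf("binary: ");
    for (int bit = 7; bit >= 0; bit--)
        putchar(((opcode >> bit) & 1) ? '1' : '0');
    putchar('\n');                                /* prints: 10010000 */
    return 0;
}

Any C99 compiler will run this; the two output lines are the same eight bits in two costumes.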
gib65 (Author) Posted December 16, 2008 Thanks YT, I thought for a second there I was going crazy. He claimed he knows for a fact that I'm wrong, because he works with logic designs for AI systems which are then implemented in computers.
gcol Posted December 16, 2008 And in the earliest days, only octal. I remember when hexadecimal was the new kid on the block. It made counting to 10 easier, and alphanumerics too. An early example of dumbing-down! Try writing a long-division decimal routine in octal, testing for sign, overflow, end-carry, etc. Loads of fun. That's what we had to do in those old days. I couldn't do it now.
John Cuthber Posted December 16, 2008 This thread looks like it's going to turn into the Dilbert cartoon that finishes up with someone saying "Ones AND zeros? We had to make do with just zeros".
YT2095 Posted December 16, 2008 LOL, I remember being forced to learn octal because I had no BCD 7-segment driver chips. I could have bought some, but that meant leaving the house (I was quite anti-social back then). I printed up a conversion chart on my Commodore PET and then memorised it. I kinda miss those days.
gcol Posted December 16, 2008

This thread looks like it's going to turn into the Dilbert cartoon that finishes up with someone saying "Ones AND zeros? We had to make do with just zeros".

AND we didn't have chips, just trannies (germanium, of course) and OA81 diodes. (I could mention valves, but that would spoil the game.)
John Cuthber Posted December 16, 2008 I once built a (really crap) robot using relays for all the control logic.
YT2095 Posted December 16, 2008 The AC128s rock; I made my first radio with those in '73. I've made valve radios too, and also all the logic gates in valves, just for a giggle. I've also made my own valves before now too!
gcol Posted December 17, 2008 I will trump your valves with my electro-mechanical comptometers, and I still have another winning card in my hand. Unless someone has the Ace of Abacus, or Queen of Quipu, in which case I capitulate immediately.
John Cuthber Posted December 17, 2008 I still keep a slide rule on my desk at work, but it's only to frighten the uninitiated. On the other hand (no pun intended) I have been known to count on my fingers. Is this the first "digital" computer? (OK, that pun was intentional.)
YT2095 Posted December 17, 2008 I have a 1953 all-metal Pickett slide rule in its original leather case here too, although I haven't the remotest idea of how to use it.
timo Posted December 17, 2008

Isn't it true that computers don't read anything but machine code? I'm having an argument with someone about that. He thinks computers don't read machine code, except perhaps in the BIOS. I've got a degree in computer science, and I'm pretty sure I remember learning that the only information that goes through the CPU is instructions written in machine code. I'm right... right?

My computer read HTML just a few seconds ago - that is, if I count the web browser and internet connection as being part of the computer (and I do - they are very important parts). So why is the CPU the computer? I dunno why you think so, but I will use this definition in the following: if "the computer" is the CPU, then on one of the more fundamental levels (not really the most fundamental, but the most fundamental that is reasonable to discuss), the only information that goes through the CPU is a time-series of voltages - a completely useless statement. These time-series have a 1-to-1 mapping onto larger scales, up to assembly code. Above assembly code, the mapping is not 1-to-1; the same C++ program might generate different assembly code on different compilers, for instance. So since, by that reasoning (I am only a stupid physicist, not a smart computer scientist, so I might be wrong), the highest-level set of expressions that maps 1-to-1 onto the intended function of the CPU is assembly, I would choose that as my answer.

However: I do disagree with equating "a computer reads" with "a CPU takes as input" - or with the two statements obviously being the same. Formally, the question also has to define whether "only reads X" is true if a computer reads Y and Y is equivalent to X, particularly for the CPU part.
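To make the "not 1-to-1 above assembly" point concrete, here is a hypothetical sketch in C; the x86-64 instruction sequences and byte encodings in the comment are illustrative possibilities, not the output of any particular compiler.

#include <stdio.h>

/* One C statement, many legitimate translations. A compiler might emit
   any of these x86-64 instruction sequences (among others) for the
   multiply below:

       add eax, eax           ; bytes: 01 C0
       shl eax, 1             ; bytes: D1 E0
       lea eax, [rax + rax]   ; bytes: 8D 04 00

   Each individual assembly line, by contrast, fixes one byte sequence. */
int double_it(int x)
{
    return x * 2;
}

int main(void)
{
    printf("%d\n", double_it(21)); /* prints: 42 */
    return 0;
}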
YT2095 Posted December 17, 2008 "Your computer parsed a series of 1s and 0s in an HTML format through tables that broke it down into machine code" would be more accurate. Assembly language is just MC translated into mnemonics.
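A sketch of that mnemonic-to-bytes correspondence, assuming x86-64 for illustration: the bytes are what the CPU executes, and the mnemonics are just human-readable names for them.

#include <stdio.h>

/* x86-64 machine code for "return the number 42", byte by byte,
   with the mnemonic each group of bytes stands for: */
unsigned char code[] = {
    0xB8, 0x2A, 0x00, 0x00, 0x00,   /* mov eax, 42 */
    0xC3                            /* ret         */
};

int main(void)
{
    /* Print the raw bytes the way a hex dump would show them. */
    for (size_t i = 0; i < sizeof code; i++)
        printf("%02X ", code[i]);
    putchar('\n');                  /* prints: B8 2A 00 00 00 C3 */
    return 0;
}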
timo Posted December 17, 2008

"Your computer parsed a series of 1s and 0s in an HTML format through tables that broke it down into machine code" would be more accurate. Assembly language is just MC translated into mnemonics.

Well, if you consider "my computer" to be, say, the whole part of the machine that is inside my room, then physically it read some electromagnetic signals from its surroundings. Ignoring noise, analog-digital conversion, checksums and other technicalities, then logically it read a series of zeroes and ones, yes. However - and I am not sure to what extent this rather fundamental point in my last post became clear - it also read some ASCII characters. Bits and bytes and ASCII characters and hex numbers are (stuff like "you need 8 bits to form a byte" aside) just different ways to encode the same information. Just like decimal and binary and hexadecimal are simply different ways to express a number; none of the systems is more fundamental than the others. So of "it read bits" or "it read bytes" or "it read ASCII" or "it read hex code", none of the statements is more correct or fundamental than the others. With "highest level" in my previous post I meant "most human-readable", which is not a particularly outstanding property, but a very practical one, among a set of equivalent formulations (MC is just assembly translated into numbers).
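A small C illustration of that equivalence (my own sketch, nothing beyond standard C): one and the same byte in memory, displayed as an ASCII character, a decimal number, a hex number, and raw bits.

#include <stdio.h>

int main(void)
{
    unsigned char c = 'A';   /* one byte in memory */

    printf("ASCII:   %c\n", c);     /* A        */
    printf("decimal: %d\n", c);     /* 65       */
    printf("hex:     %02X\n", c);   /* 41       */
    printf("binary:  ");
    for (int bit = 7; bit >= 0; bit--)
        putchar(((c >> bit) & 1) ? '1' : '0');
    putchar('\n');                  /* 01000001 */
    return 0;
}

Four representations, one byte; none of them is more "what the computer read" than the others.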
bascule Posted December 19, 2008

Isn't it true that computers don't read anything but machine code?

CPUs understand only the instruction set architectures (i.e. machine code) they implement. CPUs are also able to interact with and transform data which isn't code. So, hard to say, given that sentence...
Ibeamer Posted December 31, 2008 This is TRUE only for digital computers. Analog computers (which include slide rules) use a voltage or a distance (as on a slide rule) to represent numeric values; e.g., pi could be represented by 3.14159 volts. In electronic analog computers, the program is the wiring!
morganparkar Posted February 23, 2009

Isn't it true that computers don't read anything but machine code? I'm having an argument with someone about that. He thinks computers don't read machine code, except perhaps in the BIOS. I've got a degree in computer science, and I'm pretty sure I remember learning that the only information that goes through the CPU is instructions written in machine code. I'm right... right?

Hi, yeah, you are absolutely right.
Xittenn Posted February 25, 2009

The AC128s rock; I made my first radio with those in '73. I've made valve radios too, and also all the logic gates in valves, just for a giggle. I've also made my own valves before now too!

I want to make my own tubes for a tube amp... So we all just think in voltage differentials...? I say the answer is "true", but I don't like it very much. The whole idea leads to really hard-to-read threads that give me a headache. I really hope to make brain chips one day! They can do the thinking for me...
Mach1ne Posted February 26, 2009

Yup, but more specifically binary. The op codes for the CPU are represented in hex for our benefit, really. You may tell him that a guy who has designed and built his own computer from scratch (chip level) told you this.

Just out of curiosity: when you compile using gcc -S MyProgram.c (I think it's -S) and view the assembly code, the op codes there seem to be in regular English, not hex. What's the difference?
YT2095 Posted February 26, 2009 Assembly language is yet another stage further up the chain towards a high-level language. You may go from ASM to hex, then down to raw binary; after that it's soldering-iron level. Bear in mind, though, I don't know any high-level languages other than BASIC, so I can't really say much when it comes to C++ or the plethora of other new langs out there.
Mach1ne Posted February 26, 2009 Oh I see. I didn't know there was a step between assembly and binary.
Xittenn Posted February 27, 2009 (edited) Look up mnemonics... wait, I'll do it... ah, it says to look up assembly language... I always thought of the hex coding as assembler and the mnemonics as the English translation... JMP FE. I like C++... mmm, polymorphism and recursion! Edited February 27, 2009 by buttacup
bascule Posted February 27, 2009

Just out of curiosity: when you compile using gcc -S MyProgram.c (I think it's -S) and view the assembly code, the op codes there seem to be in regular English, not hex. What's the difference?

Assembly is considered a second-generation programming language (or 2GL). This means it's one level of abstraction above machine language, which is considered a first-generation programming language (or 1GL). Languages at the level of C (whose syntax is conventionally described by context-free grammars) are considered third-generation programming languages (or 3GLs) because they exist at a level of abstraction above assembly language.
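To tie the three generations together, a hypothetical sketch of one tiny computation at each level; x86-64 is assumed, the assembly line is one plausible compiler output (not the only possible one), and succ is just an illustrative name.

#include <stdio.h>

/* The same trivial computation at each "generation":

     3GL (C):         return x + 1;
     2GL (assembly):  lea eax, [rdi + 1]   (one plausible compiler output)
     1GL (machine):   8D 47 01             (the bytes encoding that lea)  */
int succ(int x)
{
    return x + 1;
}

int main(void)
{
    printf("%d\n", succ(41)); /* prints: 42 */
    return 0;
}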