Sorthon Posted April 16, 2011 Posted April 16, 2011 So I took on the task of writing a paper about the history behind modern computer science and computing. My question to you all is: what kind of fun facts or knowledge can you share that I might add to my paper? I really want to have some cool info, because I'm trying to make the paper appeal to those who don't really know much about computer science. If at all possible, can you share what sources you're getting your information from? If it's just your prior knowledge, then I will simply cite you, in which case some kind of name that I could use for the citation would be helpful.
michel123456 Posted April 16, 2011 Posted April 16, 2011 How far in the past do you intend to go? Antiquity, or 20th Century?
Sorthon Posted April 16, 2011 Author Posted April 16, 2011 Well, I have already talked about the abacus and Stonehenge, so really as far back as I can. I am trying to stay away from the very recent stuff, frankly because I don't find it fun; most recent stuff isn't really new ideas, it's just new ways of using old ideas. I am trying to focus more on the roots rather than the leaves.
michel123456 Posted April 16, 2011 Posted April 16, 2011 (edited)
http://en.wikipedia.org/wiki/Antikythera_mechanism
http://en.wikipedia.org/wiki/Automaton
http://en.wikipedia.org/wiki/Pascal's_calculator
http://en.wikipedia.org/wiki/Piano_roll (good picture here)
http://www.computerhistory.org/babbage/
http://en.wikipedia.org/wiki/Abacus
http://www.homelessdroids.com/post/2776926334/this-was-a-very-rare-find-indeed-an
http://en.wikipedia.org/wiki/Mechanical_calculator
http://en.wikipedia.org/wiki/Comptometer
http://en.wikipedia.org/wiki/Arithmometer
Edited April 16, 2011 by michel123456
TonyMcC Posted April 16, 2011 Posted April 16, 2011 (edited) You might like to mention that the origins of programming are quite obscure. Items like punched cards and punched tape were controlling quite a lot of devices long before electronic computers (fairground organs and weaving looms, for example). I would think you would give mechanical computers some mention, especially Babbage's difference engines. I am sure you would want to include analogue computers, which can incorporate mechanical and/or electronic components and had quite a following in the early days of computer development. Picking out key words from the above and googling them will get you plenty of information. For something more light-hearted, I have been told that the following is true (but cannot guarantee it!). A computer engineer was called several times to a computer terminal which kept entering spaces at random. He could find no fault with the terminal. He decided to watch closely as the terminal was used and discovered that the operator was a rather buxom, short-sighted woman who was inadvertently pressing the space bar with her ample bosom as she leaned forward for a closer look! Edited April 16, 2011 by TonyMcC
Sorthon Posted April 16, 2011 Author Posted April 16, 2011 (edited) I appreciate the ideas. What I have talked about so far is the abacus, Stonehenge, and how "computers" were originally people (usually women) who did calculations for a living. I have mentioned Alan Turing but have not actually talked about him yet; I will get there. I also talk a good bit about John Backus and how his development of FORTRAN revolutionized programming, while also explaining what a programming language is in layman's terms and discussing what caused him to want to make a high-level language in the first place. Things on the list still to be included are Alan Turing, difference engines, and Bletchley Park, to provide an example of how computer science isn't only used for computer programs to play games and work from home. At the moment I'm adding information about Bletchley Park, and I have the book Codebreakers, but I am having a hard time finding out how exactly they did what they did so that I can relate it to computer science. I know they used a large machine to figure out the code and whatnot. I have the wiki page open, but I need to find it in the book, sigh... The main issue I'm fighting with is that I cannot use Wikipedia as a source. I have already planned to talk about the difference engine, and will also touch on the Turing machine. Edited April 16, 2011 by Sorthon
Xittenn Posted April 16, 2011 Posted April 16, 2011
- I saw no mention of Ada Lovelace being the world's first programmer
- the first general-purpose electronic digital computer was ENIAC
- the first single-chip microprocessor was the Intel 4004
- ARPANET
I'm sure there is a lot to say about the introduction of computers and robots into surgery and medicine ....
Sorthon Posted April 16, 2011 Author Posted April 16, 2011 (edited) I actually have Lovelace on my list of subjects to add. Like I said before, the issue I'm having is a lack of usable sources. Edited April 16, 2011 by Sorthon
Xittenn Posted April 17, 2011 Posted April 17, 2011 You know what one of my favorite things to do with a computer is? I like to search for books, as every catalog in North America is online. A lot of the magazines are archived online as well. Quoting online sources is usually not best practice unless you are citing papers, many of which you will find if you look hard enough. :/ (Wiki usually cites its sources; use it as a starting point.)
michel123456 Posted April 17, 2011 Posted April 17, 2011 (edited) Look here at the Computer History Museum. IMO the interesting part of computer history is in the inter-relations of theoretical developments (software) and instruments (hardware), and also in the development of Human Interface Devices.

All that began with existing devices from older technology, transformed in order to build something new. Not so long ago, we were standing in front of a typewriter and a TV screen combined with a kind of tape recorder. Although the transmission of voice through wires had been achieved a long time before (the telephone), the new developments were based on the transmission of simple digits, as the telegraph did, which was an older technology. The keyboard I am using at this very moment is a remnant of the old typewriter, and my desktop is still a TV screen. My scanner is another development (maybe simplification) of the fax machine, which is itself a development of the telex. The real developments are hidden inside my PC tower, but are also related to the transistor, from radio transmitter devices. On the other hand, new original developments occurred, like the mouse, in contrast with the steering wheels or aircraft control yokes used in video games.

And the main use of all this is still the transmission of sounds and images and the storage of information, something we did in past times with other means, like paper (remember also music paper). Note that printing machines are a remnant of this 3000-year-old technology. IMHO we are only beginning to see inventions created especially for the new purpose. A time may come when all printers become obsolete (the e-book is here); personally I don't print photographs any more, and a lot of e-mail disclaimers ask you not to print in order to protect the environment. As a result, scanners and home bookshelves will disappear too. I suppose even the screen display will go to the museum. We have a lot to see yet.

edit: I made a great omission in not speaking about the calculation skills of computers. Here is one of the first comments about the Babbage invention: "Again, who can foresee the consequences of such an invention? In truth, how many precious observations remain practically barren for the progress of the sciences, because there are not powers sufficient for computing the results! And what discouragement does the perspective of a long and arid computation cast into the mind of a man of genius, who demands time exclusively for meditation, and who beholds it snatched from him by the material routine of operations! Yet it is by the laborious route of analysis that he must reach truth; but he cannot pursue this unless guided by numbers; for without numbers it is not given us to raise the veil which envelopes the mysteries of nature. Thus the idea of constructing an apparatus capable of aiding human weakness in such researches, is a conception which, being realized, would mark a glorious epoch in the history of the sciences." (from here) Edited April 17, 2011 by michel123456
Marat Posted April 17, 2011 Posted April 17, 2011 For the early history behind the development of computers you might want to look at the calculating machine invented by G. W. Leibniz in the 1670s, which he used as his entrée into the world of science. The history of computers is usually dated from the calculating machines of Charles Babbage, however.
Total Science Posted April 19, 2011 Posted April 19, 2011 Sharkey, N., The Programmable Robot of Ancient Greece, New Scientist, Issue 2611, Jul 2007
Marat Posted April 20, 2011 Posted April 20, 2011 As with all searching for precedents of more recent discoveries in the history of science, there is a trade-off between citing some earlier historical forerunner of the modern achievement, and citing one that is so primitively prototypic that it hardly seems to count, like say the abacus being the forerunner of modern computers. The fact that some Ancient puppet theaters could be programmed and reprogrammed to operate for a while on their own through water power and could be variably adjusted for different effects does seem to realize an essential element of the idea behind computers. Socrates speaks of walking humanoid statues which operated by some internal spring mechanism, whose exact nature is now unknown, and there are records of the bleeding figure of the dead Julius Caesar in wax rising from its casket, apparently spontaneously, during Marc Antony's oration over Caesar's body. For a wealth of information on early mechanical devices see Derek Price, 'Science Since Babylon.'
keelanz Posted April 20, 2011 Posted April 20, 2011 (edited) I'd write a bit on the binary numbering system and Fibonacci, explaining how, if Indian mathematicians had had a better knowledge of physics and slightly modified their numbering systems, the modern computer could have been made thousands of years ago. I'd also write a bit about Boolean algebra and logic manipulation, as that is the real birth of modern computing. The only external link you need: http://www.nap.edu/openbook.php?record_id=4948 Edited April 20, 2011 by keelanz
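To make the Boolean point above concrete, here is a minimal sketch (Python, purely illustrative, not a reconstruction of any particular historical design) of binary addition built from nothing but Boolean operations, the kind of logic manipulation George Boole formalized and on which every modern processor's arithmetic rests.

```python
# Minimal sketch: binary addition built purely from Boolean operations (AND, XOR, OR).
# Illustrative only; not modeled on any particular historical machine.

def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry, using only Boolean operations."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_binary(x, y, width=8):
    """Ripple-carry addition of two small non-negative integers, bit by bit."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

if __name__ == "__main__":
    print(add_binary(22, 19))  # 41, the same answer ordinary arithmetic gives
```

The same ripple-carry idea scales to any word width; the only ingredients are AND, XOR, and OR.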
khaled Posted April 20, 2011 Posted April 20, 2011 History of the Computer .. I can only think of some names, De Morgan .. Al-Khawarizmi .. Turing .. Church .. Godel .. Post ...
ewmon Posted May 6, 2011 Posted May 6, 2011 (edited)
Programming Jacquard looms
The formerly ubiquitous slide rule
Ancient African roots of Boolean algebra (specifically at 12:30 to 14:00 minutes)
Edited May 6, 2011 by ewmon
DevilSolution Posted May 10, 2011 Posted May 10, 2011 Turing and Enigma may be an interesting topic to cover http://en.wikipedia.org/wiki/Alan_Turing http://en.wikipedia.org/wiki/Enigma_machine
ewmon Posted May 10, 2011 Posted May 10, 2011 Two mechanical analog calculators: the planimeter and the south pointing chariot.
nec209 Posted June 6, 2011 Posted June 6, 2011 You may want to talk about Moore's law coming to an end? Moore's law: roughly every 18 months to two years the number of transistors that can fit on an IC doubles, as the transistors get smaller and smaller, and for a long time processors roughly doubled in speed as well. But around 2004 processor clock speeds hit a brick wall of about 3.0 GHz, beyond which chips overheated; the solution to the problem was the dual core! Not sure how long we can keep putting money into dual core and keep up with Moore's law.
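To put rough numbers on that doubling, here is a small back-of-the-envelope sketch (Python, for illustration only). The 2,300-transistor Intel 4004 of 1971 is just a convenient, well-known starting point, and the 18-month doubling period is the rough figure quoted above, not an exact law.

```python
# Back-of-the-envelope Moore's-law projection, starting from the Intel 4004 (1971),
# which had roughly 2,300 transistors. The 18-month doubling period is the rough
# figure quoted above; the historical trend was closer to a doubling every two years.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 1.5   # ~18 months

def projected_transistors(year, doubling_period=DOUBLING_PERIOD_YEARS):
    """Transistor count projected forward from 1971 at a fixed doubling rate."""
    doublings = (year - START_YEAR) / doubling_period
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,.0f}")

# At one doubling per 18 months the projection grows by a factor of roughly 100 per
# decade; with a two-year period the 2011 figure lands near two billion transistors,
# much closer to the chips that actually shipped.
```

The clock-speed wall around 2004 was a power-density problem rather than a transistor-count problem, which is exactly why the extra transistors went into additional cores instead of higher clocks.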
Guest Dexter AJ Posted June 20, 2011 Posted June 20, 2011 Go simple. This will be helpful for you, as you will have a proper understanding of what you are talking about and explaining. The history of computers includes the computer itself, the internet, hardware, and viruses and worms. Here are a few articles to make things easy for you. http://www.business-science-articles.com/science/articles/computer/535-internet http://www.business-science-articles.com/science/articles/computer/488-computer-virus http://www.business-science-articles.com/science/articles/computer/137-computers
Michel Desouza Posted June 25, 2011 Posted June 25, 2011 The first computers were people! That is, electronic computers (and the earlier mechanical computers) were given this name because they performed the work that had previously been assigned to people. "Computer" was originally a job title: it was used to describe those human beings (predominantly women) whose job it was to perform the repetitive calculations required to compute such things as navigational tables, tide charts, and planetary positions for astronomical almanacs. Imagine you had a job where hour after hour, day after day, you were to do nothing but compute multiplications. Boredom would quickly set in, leading to carelessness, leading to mistakes. And even on your best days you wouldn't be producing answers very fast. Therefore, inventors have been searching for hundreds of years for a way to mechanize (that is, find a mechanism that can perform) this task.