
Posted

COMPUTER Definition & Meaning | Dictionary.com

noun

  1. a programmable electronic device designed to accept data, perform prescribed mathematical and logical operations at high speed, and display the results of these operations. Mainframes, desktop and laptop computers, tablets, and smartphones are some of the different types of computers. Compare analog
     
When we use our computer, say a Microsoft Windows 10 desktop PC, to type or read a message like this one at this site, play an arcade game, play with airplane-simulator software, work on an Excel spreadsheet, read an email, or type a letter in Word, is data being accepted, are prescribed mathematical and logical operations being performed at high speed, and are the results of such operations being displayed? How is this done, in a nutshell?
     
Dictionaries used to define COMPUTER in terms of an "electronic machine".
     

 

Posted
2 minutes ago, JohnDBarrow said:

COMPUTER Definition & Meaning | Dictionary.com

noun

  1. a programmable electronic device designed to accept data, perform prescribed mathematical and logical operations at high speed, and display the results of these operations. Mainframes, desktop and laptop computers, tablets, and smartphones are some of the different types of computers. Compare analog
     
When we use our computer, say a Microsoft Windows 10 desktop PC, to type or read a message like this one at this site, play an arcade game, play with airplane-simulator software, work on an Excel spreadsheet, read an email, or type a letter in Word, is data being accepted, are prescribed mathematical and logical operations being performed at high speed, and are the results of such operations being displayed? How is this done, in a nutshell?
     
Dictionaries used to define COMPUTER in terms of an "electronic machine".
     

 

Not all languages are defined by a noun; that would be very confusing...

Posted
1 hour ago, JohnDBarrow said:

When we use our computer, say a Microsoft Windows 10 desktop PC, to type or read a message like this one at this site, play an arcade game, play with airplane-simulator software, work on an Excel spreadsheet, read an email, or type a letter in Word, is data being accepted, are prescribed mathematical and logical operations being performed at high speed, and are the results of such operations being displayed? How is this done, in a nutshell?

You want a five-year course in computer engineering put into a nutshell on a discussion forum?

 

From your question paragraph, you seem to be leaning towards hardware.

 

Are you aware of the distinction between hardware and software, or of the layer model of computers?

Posted (edited)
4 hours ago, JohnDBarrow said:

When we use our computer, say a Microsoft Windows 10 desktop PC, to type or read a message like this one at this site

The processor in your computer 'reads' the incoming serial data stream that your Wi-Fi adaptor captures, and it performs some mathematical operations on it to make it understandable as 64-bit long words of instructions or data, before placing it in a location in main memory.
The display processor then takes over, reads these instructions or data, and manipulates the data (according to the instructions) to transform it into intensity and color information in its own memory, called the frame buffer; it then outputs this sequentially to each pixel on the screen to form an image which you can read on your monitor.
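
A minimal Python sketch of that frame-buffer idea (the sizes and values here are made up for illustration, not taken from real hardware):

# A toy frame buffer: a grid of intensity values that the display
# hardware scans out row by row.
WIDTH, HEIGHT = 8, 4
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]   # all pixels dark

framebuffer[1][3] = 255   # the 'display processor' writes one bright pixel

# The output stage reads the buffer sequentially, one row at a time,
# just as a real controller refreshes the screen.
for row in framebuffer:
    print(" ".join(f"{px:3d}" for px in row))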

Any more detail than this and you'll need to take some electronics courses, and learn about transistors and gates, various types of memory (volatile and 'permanent' types), ALUs, their registers and how they manipulate binary-coded bytes/words/long words, and how modern bit-mapped graphics displays are handled by massively parallel simple processors for raster operations as well as mapping, rendering, shading and even ray tracing.

Edited by MigL
Posted (edited)

The computer has read-only memory (ROM/BIOS) and regular read-write memory (RAM). The read-only memory contains the booting instructions, which read data from the boot sector of the boot drive and then pass control to the bootloader, which loads the operating system.

https://en.wikipedia.org/wiki/Booting

https://en.wikipedia.org/wiki/Bootloader

Processors have instructions: hundreds or thousands of them, performing various basic tasks such as logical operations, arithmetic operations, binary operations, memory operations, floating-point operations, and special-purpose operations. They are performed in sequence; this is called a program. A program can be stored in ROM or RAM. Programs are loaded from external storage (HDD, SSD, flash drive, etc.) into RAM and then executed by the OS. A modern processor can execute billions of instructions per second.
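
A toy Python sketch of that fetch-and-execute sequence (the instruction names here are invented, not a real instruction set):

# A made-up, four-instruction 'processor': it fetches each instruction
# in sequence and executes it, which is all a program really is.
program = [
    ("LOAD", 2),   # put 2 in the accumulator
    ("ADD",  3),   # accumulator += 3
    ("MUL",  4),   # accumulator *= 4
    ("HALT", 0),   # stop
]
acc = 0
for op, arg in program:          # the fetch/decode/execute cycle
    if op == "LOAD":
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "MUL":
        acc *= arg
    elif op == "HALT":
        break
print(acc)   # prints 20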

Programmers write programs in their favorite language, one they know or like, or one that is required for a specific task (not all languages are good enough for every task). High-level languages are compiled into the CPU's final instruction set, i.e. "machine code" (e.g. C/C++), compiled to bytecode for a virtual machine (Java/C#), or run by an interpreter (e.g. Python, Perl, PHP, JavaScript, Bash). Interpreters are slow (roughly 1000x slower than C/C++), while virtual machine code (Java/C#) is "just" about 4x slower than C/C++.
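
Python itself is an example of the virtual machine approach: the interpreter first compiles source code to bytecode, which its VM then executes. You can see those instructions with the standard dis module:

import dis

def add(a, b):
    return a + b

# Prints the bytecode the Python VM executes for add(), e.g. LOAD_FAST,
# BINARY_OP (BINARY_ADD on older Python versions), RETURN_VALUE.
dis.dis(add)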

 

You can try out some languages without installing anything in an online debugger accessible through a web browser.

Here you have Python (just press the "Run" button):

https://www.onlinegdb.com/online_python_debugger
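
For example, paste these two lines into the debugger and press Run:

print("Hello from Python!")
print(2 + 3 * 4)   # arithmetic follows operator precedence: prints 14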

Read the Python tutorial and start coding in the online debugger https://www.google.com/search?q=python+tutorial

W3Schools has easy-to-understand tutorials for newbies: https://www.w3schools.com/python/default.asp

 

Edited by Sensei
Posted

I understand a computer must have a great deal of hardware and software to make it "do something productive". Is the dictionary definition of COMPUTER concise and accurate enough for laymen to understand?

Posted

Depends.
I don't work in the computer industry, nor have I ever taken any computer courses.
I did take a half course in Electronics for Physics over 40 years ago, and did some programming, mostly FORTRAN, BASIC, Pascal, Forth, C, and some Z80 assembly, but I don't anymore, so I could be considered a 'layman'.
And I understand that definition.

Or did you think the definition was going to tell you everything you need to know about computers?

Posted
9 hours ago, JohnDBarrow said:

I understand a computer must have a great deal of hardware and software to make it "do something productive". Is the dictionary definition of COMPUTER concise and accurate enough for laymen to understand?

I really think it would be a good idea to try to focus your question.

Take a look at this website and see if this is what you are really asking about.

Then ask a few more detailed questions.

https://www.cloudflare.com/en-gb/learning/ddos/glossary/open-systems-interconnection-model-osi/

 

Please note the statement

Layer 7: The application layer.

 

Quote

This is the only layer that interacts directly with data from the user

 

 

I am also going to say +1 for encouragement as this is a much more reasonable thread than some of your earlier ones and much more in tune with ScienceForums.

We can actually help you with this one rather than argue.

Posted

A computer, as the name implies, computes. Other electronic devices also compute; for example, changing the volume of a sound is an analog multiplication. In a computer, the main difference is that what is done is controlled by a program that can be changed on demand.
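
In the digital domain the same idea is literal multiplication. A minimal Python sketch (the sample values are made up):

# Changing volume digitally: multiply every audio sample by a gain factor.
samples = [0.10, -0.25, 0.50, -0.75]      # hypothetical audio samples
gain = 0.5                                 # halve the volume
quieter = [s * gain for s in samples]
print(quieter)                             # [0.05, -0.125, 0.25, -0.375]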

Posted

From the early 17th century up until the end of WW2, the term 'computer' meant a human being with a talent for doing complex mathematical calculations.

https://en.wikipedia.org/wiki/Computer_(occupation)

The earliest use of such 'computers' was for compiling astronomical almanacs and tables of planetary positions, and the return periods of comets. They were also extensively used to generate critical tables of logarithms and trigonometric functions accurate to 7 decimal places, which was especially important when slide rules were only accurate to 2 significant figures.

From the earlier part of the 20th century onwards, ‘computers’ were increasingly used for statistical work such as calculating actuarial Life Tables, and engineering studies of various types. During WW1 and WW2 large numbers of ‘computers’ - many of them women - were recruited by the military establishment to produce ballistic artillery tables, as well as surveying, map-making and navigational aids for the armed forces.

During WW2, the Manhattan Project in particular used many female computers in teams to help calculate the properties of nuclear chain reactions and criticality geometries in the race to develop the first viable atom bombs by 1945.

The popular film Hidden Figures (2016) depicts the use of female 'computers' (many of them black) by NASA and the early post-war US space program to transcribe and convert test-flight data into standard engineering units, and to perform orbital calculations to predict re-entry and splash-down points.

Interestingly, it was considered menial work at that time.

Posted

Ok, as I type this line right now, how is my Windows 10 machine actually performing prescribed mathematical and logical operations at high speed?

 

I depress an R on my keyboard and one appears on the monitor. I highlight this line and left-click on the Bold tool above with my mouse, and nothing happens.

Posted
28 minutes ago, JohnDBarrow said:

Ok, as I type this line right now, how is my Windows 10 machine actually performing prescribed mathematical and logical operations at high speed?

 

I depress an R on my keyboard and one appears on the monitor. I highlight this line and left-click on the Bold tool above with my mouse, and nothing happens.

If you don't wish to cooperate, I can't help you further.

Sorry.

Posted
16 minutes ago, JohnDBarrow said:

Ok, as I type this line right now, how is my Windows 10 machine actually performing prescribed mathematical and logical operations at high speed?

The keyboard is laid out as a matrix of intersecting lines.
Hitting the 'R' key closes a contact between the row and column lines, and a value is sent, in hexadecimal format, to an 8-bit processor in the keyboard, which then serializes the ASCII code for 'R' and sends it through a USB channel to the computer's USB decoder, which converts it back to parallel binary data so it can be read by the CPU.
No processing is to be done on this data, so it is placed in the common area of main memory (RAM), where it can also be accessed by the display processor.
The display processor changes the binary data into a bit map and places it in the display buffer, adding any attributes that may be required for the specific Windows display environment (color/intensity). The display buffer is then serialized again and output to the monitor along an HDMI channel, to be 'drawn' on the LEDs/transistors of the screen matrix, one row at a time, until the whole screen is 'painted' at a refresh rate of 120 Hz and the letter 'R' is displayed in that window.
The operating system, Win10 in this case, is responsible for setting aside the part of main memory that corresponds to the display buffer, and the particular subset that is the contents of that window.
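
A simplified Python sketch of the first step, the matrix scan (the layout and key map here are invented for illustration):

# The controller knows which character sits at each row/column crossing.
KEY_MAP = {(1, 2): "R"}            # invented layout: row 1, column 2 -> 'R'

def scan_matrix(closed_contacts):
    """Report the ASCII code for every key whose contact is closed."""
    for (row, col), char in KEY_MAP.items():
        if (row, col) in closed_contacts:
            code = ord(char)       # numeric code to serialize over USB
            print(f"key '{char}' -> 0x{code:02X} sent to the computer")

scan_matrix({(1, 2)})              # simulate pressing the 'R' key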

Do you really think you understand it better now?
Unless you learn the basics, you don't have a chance of understanding.

Posted (edited)
1 hour ago, JohnDBarrow said:

Ok, as I type this line right now, how is my Windows 10 machine actually performing prescribed mathematical and logical operations  at high speed?

1: What is your own answer to the question? That could indicate your level of understanding of the answers provided so far, allowing members to provide some clarifications.

2: As an exercise for you: what assumptions has @MigL made about your hardware configuration?

Spoiler

Examples:

As far as I can tell, @MigL made a reasonable assumption and answered in the context of a stationary computer (not a laptop) that has a wired keyboard attached (indicated by mentioning USB but not a wireless transmitter and receiver).

The computer is not a vintage model (HDMI rather than older VGA or similar).

 

Edited by Ghideon
missing part of a sentence
Posted (edited)
3 hours ago, JohnDBarrow said:

Ok, as I type this line right now, how is my Windows 10 machine actually performing prescribed mathematical and logical operations  at high speed?

Modern operating systems are complex and sophisticated entities. What the user sees is barely the tip of the iceberg...

 

 

3 hours ago, JohnDBarrow said:

I depress an R on my keyboard and one appears on the monitor. I highlight this line and left-click on the Bold tool above with my mouse, and nothing happens.

Modern operating systems are multithreaded, which means that multiple applications can run in parallel. If an extensive operation is running in the background, the processor and graphics card are busy performing those tasks rather than what you are asking them to do; hence the delays and user frustration. Use Ctrl-Alt-Delete to open the Task/Process Manager and identify what the computer is busy with and where it is spending all its power. In the case of a web browser or other web application, each keystroke may send data to/from the web server, and poor internet transmission speed can be a bottleneck and a source of slowdown. Yeah, I know, it's annoying...
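
A minimal Python sketch of that idea, using the standard threading module (the 'work' here is invented): a heavy background job and a keystroke handler making progress concurrently.

import threading
import time

def busy_background():
    # A CPU-heavy job hogging the machine, like the background task above.
    total = 0
    for i in range(10_000_000):
        total += i

def handle_keystroke():
    time.sleep(0.1)                # pretend to process the user's input
    print("R appears on the screen")

worker = threading.Thread(target=busy_background)
worker.start()                     # the background job runs...
handle_keystroke()                 # ...while the keystroke is handled
worker.join()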

In extreme cases, it could be a symptom of a hacker breaking into your computer, e.g. the hacker runs code that mirrors your desktop, which requires extensive screen capture and sending it over the network.

 

  

2 hours ago, MigL said:

Hitting the 'R' key closes a contact between the row and column lines, and a value is sent, in hexadecimal format, to an 8-bit processor in the keyboard, which then serializes the ASCII code for 'R' and sends it through a USB channel to the computer's USB decoder, which converts it back to parallel binary data so it can be read by the CPU.

Hexadecimal or decimal formats are only for human convenience (in documentation and source code). For a computer, it is always binary format.
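
A quick Python illustration: the same value printed in the different human notations; the machine itself stores only the bits.

code = ord("R")                      # the character code for 'R'
print(code, hex(code), bin(code))    # 82 0x52 0b1010010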

The keyboard sends scan codes (PS/2 keyboard) or HID scan codes (USB keyboard).

https://en.wikipedia.org/wiki/Scancode

https://en.wikipedia.org/wiki/USB_human_interface_device_class

The scan codes are converted by the operating system into virtual keyboard codes.

https://learn.microsoft.com/en-us/windows/win32/inputdev/virtual-key-codes

This happens at the application level when the developer uses the MapVirtualKey or MapVirtualKeyEx function:

https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-mapvirtualkeya

Finally, the virtual key codes are converted to ASCII (the ToAscii() function) or, more commonly these days, Unicode (the ToUnicode() function):

https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-tounicode

https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-toascii
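
As a sketch of one link in that chain, here is how a Python script on Windows might call MapVirtualKeyW through ctypes to translate the virtual-key code for 'R' into its hardware scan code (Windows only; the printed value depends on your keyboard layout):

import ctypes

user32 = ctypes.windll.user32   # Win32 user32.dll; Windows only

VK_R = 0x52                     # virtual-key code for 'R' (same as ord('R'))
MAPVK_VK_TO_VSC = 0             # mode: virtual key -> hardware scan code

scan_code = user32.MapVirtualKeyW(VK_R, MAPVK_VK_TO_VSC)
print(f"virtual key 0x{VK_R:02X} -> scan code 0x{scan_code:02X}")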

 

2 hours ago, MigL said:

The operating system, Win10 in this case, is responsible for setting aside the part of main memory that corresponds to the display buffer, and the particular subset that is the contents of that window.

The proper name is framebuffer https://en.wikipedia.org/wiki/Framebuffer

 

Edited by Sensei
Posted
2 hours ago, Ghideon said:

what assumptions has @MigL made about your hardware configuration?

I made no assumptions.
He stated 

4 hours ago, JohnDBarrow said:

Ok, as I type this line right now, how is my Windows 10 machine ... I depress an R on my keyboard ... appears on the monitor ...click with my mouse

 

1 hour ago, Sensei said:

Keyboard sends scan codes (PS/2 keyboard) or HID scan codes (USB keyboard).

I haven't taken a keyboard apart since they became 'disposable'.
Long ago the key matrix was decoded by a microcontroller (usually an i8748) and its internal ROM would output ASCII codes.
Until the early 90s I cut my computing teeth on Sinclair machines, Big Board CP/M, Atari STs and Amigas (I still tinker with ST and Amiga hardware because non-surface-mount components are easier to see/work on).
Built my first PCs in '94 when the Win95 hoopla started.
Three Pentium 166 machines with 32MB RAM, 2GB mechanical drives, and a 1st generation ATI 3D 'decelerator'.
One for me, one for my brother's 5-year-old son, and one for my sister's 6-year-old son and 4-year-old daughter (who now won't go near anything that isn't Apple/Mac).

Posted (edited)
On 8/15/2024 at 10:53 PM, MigL said:

I made no assumptions.
He stated 

You are correct.
(In my attempt to post something helpful to the OP, I missed and/or misinterpreted the provided context.)

Edited by Ghideon
grammar
