A basic understanding of the terminology of the discipline.
(terms and definitions)
An understanding of the physical hardware.
(the physical stuff, what it is and what it does)
An understanding of how to use some of the applications.
Some History
Look up the definition for computer.
Analog vs Digital
What is an analog clock?
What is a digital clock?
What is the difference?
Which is more accurate? Trick question: accuracy depends on the clock's mechanism, not its display.
Which can be read more precisely?
analog: representing data by a physical quantity.
digital: representing data as a number.
Think of a flood gauge, or rain gauge.
Think of a ruler, 14 and 3/16 + a little bit.
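To make the analog/digital distinction concrete, here is a small Python sketch (my illustration, not from the book). The analog value is a continuous quantity, like the ruler reading above; digitizing it means picking the nearest representable number, so the "little bit" past 3/16 is lost:

def digitize(analog_value, step=1/16):
    # Round a continuous value to the nearest multiple of `step`,
    # the way a digital readout snaps to its smallest unit.
    return round(analog_value / step) * step

analog_reading = 14 + 3/16 + 0.004    # 14 and 3/16, plus a little bit
digital_reading = digitize(analog_reading)

print(analog_reading)    # 14.1915  (the physical quantity)
print(digital_reading)   # 14.1875  (the nearest number: 14 and 3/16)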
The first computing devices were mechanical, not electronic.
Fingers, stones, bones, ...
Abacus
These probably don't fit into the book's definition of a computer.
computer: an electronic device that accepts data and processes
that data to produce information. It operates under the control
of a program stored internally in the computer's memory.
Why do the above devices fail to meet this definition?
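One hint at an answer, plus a sketch of the "stored program" idea (my own toy example, not from the book): an abacus stores data in its bead positions, but the program, the sequence of steps, lives in the operator's head. In a stored-program computer, both live in the same memory:

# A toy stored-program machine. Memory holds BOTH the
# instructions (addresses 0-3) and the data (addresses 5-7).
memory = [
    ("LOAD",  5),   # 0: copy the value at address 5 into the accumulator
    ("ADD",   6),   # 1: add the value at address 6
    ("STORE", 7),   # 2: write the accumulator to address 7
    ("HALT",  0),   # 3: stop
    0,              # 4: unused
    2,              # 5: data
    3,              # 6: data
    0,              # 7: the result will land here
]

acc = 0   # accumulator
pc = 0    # program counter
while True:
    op, addr = memory[pc]
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])   # 5 -- the machine computed 2 + 3 under its stored program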
In your book is a picture of Charles Babbage's Analytical Engine
A mechanical, gear-driven computer
Almost built in the late 1800's
Could store both data and program
Failed because the manufacturing technology of the day was not equal to the theory.
Modern electronic computers
Were first built in the late 1940's
Have been through a number of "generations"
0th generation - electro-mechanical
1st generation - vacuum tubes
2nd generation - transistors
3rd generation - microchip
Computers once were classified by size
Mainframes: $500,000 and up. Used for corporate data processing.
Minicomputers: $100,000 - $500,000. Used for division data processing.