A History of Computer Hardware
- Early Computation Hardware
- Fingers
- Stonehenge?
- Abacus, sand tables, ...
- Positions of beads/stones store data
- Changing positions yields computations
- Pascal
- Pascaline (1642)
- A gear-driven machine to add and subtract whole numbers
- This design carried into the 20th century
- Could not be programmed, and had to be operated like an abacus
- Leibniz
- A similar gear-driven machine, but it handled all four arithmetic operations on integers
- Could not be programmed, and had to be operated like an abacus
- Jacquard's Loom
- 1801
- Wove a pattern in cloth as designated on cards
- The cards held the pattern or "program"
- Holes were punched in the cards to store the program
- We will see that this is an important idea
- Babbage
- 1822 Difference Engine
- Tabulated values of polynomials by the method of finite differences (a short sketch follows this list)
- 1833 Analytical Engine (Design Only)
- Was never built, and could not have been with the manufacturing of the time
- Had many features of a modern computer
- Memory
- ALU - Arithmetic Logic Unit
- I/O devices
- Could be programmed
- (A working Difference Engine was finally built from Babbage's plans much later, in the late 20th/early 21st century)
- Each computation took hundreds/thousands of turns of a crank
- Speed was definitely limited.
- Used punched cards for input
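
As a hedged illustration of how the Difference Engine worked, the sketch below tabulates a polynomial by the method of finite differences: once the first few values are known, every later value needs only repeated addition, which is exactly what a crank-driven gear train can do. The polynomial and the number of values are made-up examples, not Babbage's own.

```python
# Tabulate p(x) = 2x^2 + 3x + 5 at x = 0, 1, 2, ... using only additions,
# the way the Difference Engine did (the polynomial is an arbitrary example).

def difference_table(values):
    """Seed column of a finite-difference table from the first few values."""
    diffs = [values[:]]
    while len(diffs[-1]) > 1:
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    return [col[0] for col in diffs]   # p(0), 1st difference, 2nd difference, ...

def tabulate(seed, n):
    """Produce n polynomial values using additions only."""
    state = seed[:]                    # [value, 1st diff, 2nd diff, ...]
    out = []
    for _ in range(n):
        out.append(state[0])
        # each entry absorbs the difference below it: pure addition
        for i in range(len(state) - 1):
            state[i] += state[i + 1]
    return out

p = lambda x: 2 * x * x + 3 * x + 5
seed = difference_table([p(x) for x in range(3)])   # degree 2 -> 3 seed values
print(tabulate(seed, 8))            # [5, 10, 19, 32, 49, 70, 95, 124]
print([p(x) for x in range(8)])     # same values, computed directly
```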
- Hollerith
- Tabulating machine to do the 1890 census
- Formed a company that became IBM
- Used punched cards to store data.
- Turing
- 1936, Turing Machine
- Abstract machine, not hardware
- Basis for modeling computing today
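
As a hedged illustration of the model, the sketch below simulates a toy Turing machine: a tape of symbols, a read/write head, and a transition table. The particular machine (it walks right, flips every bit, and halts on a blank) is an invented example for illustration, not anything from Turing's paper.

```python
# Minimal Turing machine simulator: tape + head + transition table.
from collections import defaultdict

def run(tape, rules, state="scan", blank="_"):
    """rules maps (state, symbol) -> (new_state, symbol_to_write, move)."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # unbounded tape
    head = 0
    while state != "halt":
        state, write, move = rules[(state, cells[head])]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Invented toy machine: invert each bit, moving right, until a blank is seen.
flip_bits = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("halt", "_", 0),
}

print(run("10110", flip_bits))   # prints 01001
```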
- Generation Zero
- Zuse
- 1930's
- An electromechanical descendant of Babbage's machine
- Used relays in place of gears
- The Nazi government would not fund the design, and his prototypes were destroyed when his house was bombed during the war
- Atanasoff
- Built (with Clifford Berry) a completely electronic calculator, the Atanasoff-Berry Computer (ABC)
- Could only solve systems of linear equations.
- Generation 1, Vacuum Tube Computers
- Mauchly and Eckert
- Mauchly wanted to predict weather
- This required many computations
- Done on early "computers"
- A job title, not a device
- A person who used an adding machine to perform computations
- Hence the name of the ACM, the Association for Computing Machinery
- Built ENIAC
- 17,468 vacuum tubes
- 1,800 square feet of space
- 30 tons
- Memory of 1000 bits
- Used base 10
- Built for the U.S. Army to compute ballistics firing tables
- Vacuum tubes
- A descendant of the light bulb
- Can be used to store a charge
- And, just like a light bulb:
- They generate heat (a lot of heat)
- They use a lot of power
- They burn out
- They take space
- They are also very slow
- Critics said such machines would never work because the tubes would burn out faster than they could be replaced; that turned out not to be quite true
- Generation 2, Transistors
- Transistor
- Transfer Resistor
- A solid-state version of the vacuum tube
- base, emitter, collector
- If a charge is applied to the base, electrons flow from the emitter to the collector.
- A simple switch (see the switch sketch at the end of this section)
- Much smaller, more reliable, and lower in power consumption than a vacuum tube
- This meant smaller, faster computers
- 1950's
- 10 million instructions per second
- 128K words of memory
- Also, no moving parts -> much faster.
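
As a hedged sketch of the "simple switch" idea above: model a transistor as a switch that conducts only while its base is driven, and wire two such switches in series so a signal passes only when both are on, which behaves like an AND gate. This is a logic-level abstraction invented for illustration, not an electrical model of a real device.

```python
# Treat a transistor as a switch: output is high only while the base is driven
# and power is present. Two switches in series behave like an AND gate.

def transistor(base: bool, supply: bool = True) -> bool:
    """Conduct from collector to emitter only while the base is high."""
    return supply and base

def and_gate(a: bool, b: bool) -> bool:
    # series connection: the second switch is fed by the output of the first
    return transistor(b, supply=transistor(a))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", and_gate(a, b))
```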
- Generation 3, Integrated Circuits
- "The Chip"
- Many transistors fabricated together on a single chip of silicon
- 1965-present
- Dozens of transistors on a chip
- (Today MUCH higher)
- Faster, smaller, cheaper
- Compatible systems
- Generation 4 (LSI, VLSI)
- "SI" stands for Scale Integration
- The number of transistors on a chip
- SSI - Small Scale Integration - the first generation of chips, with tens of transistors
- MSI - Medium Scale Integration - hundreds of transistors
- LSI - Large Scale Integration - several thousand transistors
- VLSI - Very Large Scale Integration - more than 10,000 transistors
- With this, you can put the entire CPU on a chip
- In 1996, students at the University of Pennsylvania built ENIAC on a single chip, using roughly 174,000 transistors
- Many small processors followed
- Moore's Law
- Observed in 1965 by Gordon Moore
- The number of transistors on a chip doubles every year
- In practice it was closer to every two years then
- And closer to every 18 months now
- Not a law but an observation
- Intel thinks they can pull this off for the next 10 years
- Essentially says speed will double every 18 months
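
As a hedged back-of-the-envelope sketch of what the doubling claim implies, the snippet below projects a transistor count forward under an assumed 18-month doubling period. The starting count (roughly the Intel 4004's ~2,300 transistors) and the doubling period are illustrative assumptions, not data from these notes.

```python
# Rough Moore's-Law projection: the count doubles every `period` months.
# The starting count and the 18-month period are illustrative assumptions.

def projected_count(start_count: int, months: float, period: float = 18.0) -> float:
    return start_count * 2 ** (months / period)

start = 2_300   # roughly the Intel 4004's transistor count (1971)
for years in (3, 6, 9, 12):
    print(f"after {years:2d} years: ~{projected_count(start, years * 12):,.0f} transistors")
```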
- Parallel Computers
- SISD - Single Instruction, Single Data - a normal, sequential computer
- SIMD - Single Instruction, Multiple Data - one instruction applied to many data items at once (see the sketch below)
- MIMD - Multiple Instruction, Multiple Data - multiple processors, each running its own instruction stream
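
As a hedged illustration of the SIMD style, the sketch below applies a single add operation across whole arrays at once, with NumPy standing in for data-parallel hardware, and compares it with the equivalent one-element-at-a-time loop. It shows the programming model only, not any particular machine.

```python
# SIMD flavor: a single operation applied to many data elements at once.
import numpy as np

a = np.arange(100_000, dtype=np.float64)
b = np.arange(100_000, dtype=np.float64)

# "Single instruction, multiple data": one add applied across the arrays.
c_simd = a + b

# The equivalent scalar (SISD-style) loop, one element at a time.
c_loop = np.empty_like(a)
for i in range(len(a)):
    c_loop[i] = a[i] + b[i]

assert np.array_equal(c_simd, c_loop)
```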