History is an objective-level thing, so ...
- I pieced this together from many sources.
- Let's take a look at Ben Eater's 8 bit computer.
- In this video he develops a program to run on the simple machine.
- Then he translates this to machine code.
- Around the 14-minute mark, he starts to program the computer.
- Watch this part.
- I have done some reading, and
- For the first computers, it could take days to enter the code this way.
- And you might have to move jumper cables as well.
- Apparently it later got easier to enter your code (like reading it off of a paper tape).
- But the run queue might have been people lined up outside the door for their turn at the computer.
- And if you had 15 minutes of machine time, 10 might be spent on setup.
- But this was all done in machine language
- By the way, the person using a computer was called an operator.
- By the early 50s
- Punched cards/card readers were introduced.
- Prewritten functions (not subroutines; think sequences of instructions) were available on paper or in books, but you had to punch your own.
- A specialist "operator" would run the code (or job) on the cards you handed to them.
- Compilers for FORTRAN became available, as did assembly language.
- The operator would load the FORTRAN compiler deck and then your program.
- Then (I assume) your machine code would be loaded and run.
- Then you would be handed a printout of the output.
- It is amazing how many articles mention "After waiting a day, a printout would appear stating you were missing a ; on line 4."
- In 1956 UNIVAC introduced the interrupt and the interrupt service routine.
- It seems that at some point things reached the stage where:
- A queue of jobs was available and could be scanned.
- There were enough other resources (mostly memory).
- And job control languages were introduced.
- So batch processing could occur.
- This software took the place of the human operator above (a toy sketch follows).
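- Here is a toy sketch of that idea in Python (the job names and the `job_queue` structure are invented for illustration; real systems read card decks described by a job control language):

```python
from collections import deque

# Hypothetical job queue: each entry is a job name plus the work it does.
job_queue = deque([
    ("payroll",   lambda: print("  ...computing payroll...")),
    ("inventory", lambda: print("  ...updating inventory...")),
])

# The batch monitor: run each queued job to completion, one after
# another, with no human in the loop between jobs.
while job_queue:
    name, run = job_queue.popleft()
    print(f"monitor: loading {name}")
    run()
    print(f"monitor: {name} done, routing output to the printer")
```

- The point is only that the monitor automates the load/run/print cycle the human operator used to perform.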
- As resources increased,
- And (I assume) as our knowledge of programming and of what we wanted to do improved,
- The natural next step was to introduce multiprogramming.
- When a process (or "job," in those days) was interrupted, another job would be allowed to run.
- I don't think they interrupted themselves, but use of the CPU became more efficient (see the sketch below).
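- And a toy simulation of the multiprogramming idea, in the same spirit (the jobs, burst lengths, and `IO_TIME` are all invented): when the running job blocks for I/O, the CPU is handed to the next ready job instead of sitting idle.

```python
from collections import deque

# Each job: a name plus a list of CPU bursts; after every burst except
# the last, the job blocks on I/O for IO_TIME time units.
IO_TIME = 2
ready = deque([
    ("job_a", [3, 2]),
    ("job_b", [1, 4]),
    ("job_c", [2, 2]),
])
blocked = []  # (time_io_completes, name, remaining_bursts)
clock = 0

while ready or blocked:
    # Move jobs whose I/O has completed back onto the ready queue.
    for entry in [b for b in blocked if b[0] <= clock]:
        blocked.remove(entry)
        ready.append((entry[1], entry[2]))

    if ready:
        name, bursts = ready.popleft()
        burst, rest = bursts[0], bursts[1:]
        print(f"t={clock:2}: {name} runs on the CPU for {burst}")
        clock += burst
        if rest:
            # The job starts I/O; give the CPU to someone else.
            blocked.append((clock + IO_TIME, name, rest))
        else:
            print(f"t={clock:2}: {name} finished")
    else:
        # Nothing is ready: the CPU idles until the next I/O completes.
        clock = min(b[0] for b in blocked)
```

- Running it shows job_b and job_c getting the CPU while job_a waits on its I/O, which is the efficiency win described above.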
- By the early 60s
- Multiprocessing systems had been introduced.
- Accounts, with storage protection and passwords, existed.
- Secondary storage and file systems were being developed.
- Inter-computer communication (over telephone lines) was possible.
- Multitasking became available in the 60s.
- Timesharing - multiple users work on a system at the same time; each has the illusion of being the only user on the system.
- In the 60s, prices came down and the diversity of hardware increased.
- Minicomputers were introduced ($20,000 in one source).
- Everything before this was a mainframe.
- So universities and smaller businesses stepped into the fray.
- Starting in 1964, multiple groups (MIT, GE, Bell Labs, ...) worked on an OS called MULTICS.
- One huge computer.
- Community-level use (like Boston).
- The idea was to sell computing power like electricity or phone service.
- I never knew, but multiple sources sing the praises of MULTICS.
- And MIT put it into production; it was used into 2000.
- And an open source version is available.
- UNIX
- Bell Labs dropped out of the MULTICS project in 1969
- But Ken Thompson and others decided to build a reduced version of MULTICS.
- The story is that Ken Thompson wanted to run a game he had written on a PDP-7.
- And his wife went away for a month, so he could program full time.
- I find the history of UNIX well documented and amusing; it is worth a read if you are interested.
- Among other things, Dennis Ritchie and others worked on a systems development language, C, which could be used to implement an OS.
- But eventually AT&T sold UNIX licenses to universities for very little.
- This led to
- Major contributions in many directions by university research groups.
- Many students studying UNIX source code in OS class.
- The OS becoming widely distributed and ported.
- From early on, UNIX was mostly coded in C, so it was highly portable to other hardware platforms.
- In the 80s AT&T was split up and started charging big time for UNIX licenses.
- This led to the Free Software Foundation and GNU.
- Which developed a ton of free system level utilities.
- This was important: a tool base was already established when Linux came along.
- Side note, read the GNU Manifesto
- Linus Torvalds, in 1991, developed a kernel as a student.
- This was open source.
- Developed independently of UNIX code, but UNIX-like.
- And led to Linux.
- Side note, read about the Tanenbaum - Torvalds debate
- DOS
- I don't know this as well.
- 86-DOS was a small project by Tim Paterson
- Based on Digital Research's CP/M
- Microsoft was working with IBM to develop software for the IBM-PC
- Hired Paterson and purchased 86-DOS.
- Rebranded it MS-DOS and started licensing it.
- And made it available on different platforms.