Introduction to Computers
Computer History
By: Hifza Afzal
Computer Science Dept.
BUITEMS
Abacus (300 B.C. by the Babylonians)
• The abacus was an early aid for mathematical computations. Its only value is that it aids the memory of the human performing the calculation.
• It doesn't compute anything; it merely helps the human to do so
Mechanical Counting Machine
• The Pascaline is a mechanical calculating
device invented by the French philosopher
and mathematician Blaise Pascal in 1642.
• The first calculator or adding machine to be
produced in any quantity and actually used.
• It could only do addition and subtraction, with
numbers being entered by manipulating its
dials.
Analytical Engine
• Charles Babbage (1791-1871)
• Creator of the Analytical Engine - the first general-
purpose mechanical digital computer (1833)
• He designed it and partially built it
• He worked on it until his death (1871)
• The Analytical Engine was never completed in his lifetime; its ideas were not realized in a working machine until 1943
The Analytical Engine
• A programmable, mechanical, digital machine
• Could carry out any calculation
• Was designed to be powered by steam
• Could make decisions based upon the results of the previous
calculation
• Components: input; memory; processor; output
Ada Augusta (1815-52)
• Babbage: the father of computing
Ada: the mother.
• Ada wrote a program for computing the Bernoulli numbers on the Analytical Engine - the world's 1st computer program
• A programming language specifically designed by the US Dept of Defense for developing military applications was named Ada to honor her contributions to computing
Diagram for Bernoulli's Process
Babbage theorized that his machine would be able to play chess
In 1997 Deep Blue, a supercomputer designed by IBM, beat Garry Kasparov, the World Chess Champion
It could analyze up to 300 billion chess moves in three minutes
That computer was exceptionally fast, did not get tired or bored. It just kept on analyzing the situation and searching until it found the perfect move from its list of possible moves
But first, why should we spend time recounting the events of the past?
Why not just talk about what is happening in
computing now and what is going to happen
in the future?
Why?
• If you do not learn from history, you are condemned to repeat it
• Recounting the events of the past
provides an excellent opportunity to:
– learn lessons
– discover patterns of evolution, and
– use them in the future
• If we learn from history well, we will:
– neither repeat the mistakes of the past
– nor waste time re-inventing what has already been invented
Babbage’s Analytical Engine - 1833
• Mechanical, digital, general-purpose
• Could store instructions
• Could perform mathematical calculations
• Could store information permanently on punched cards
Punched Cards - 1801
• Initially had no relationship with computers
• Invented by a Frenchman named Joseph-Marie Jacquard for storing weaving patterns for automated textile looms ("khuddian")
• Their value for storing computer-related
information was later realized by the early
computer builders
• Punched cards were replaced by magnetic storage only in the early 1950s
Turing Machine - 1936
• Alan Turing of Cambridge University presented his idea of a theoretically
simplified but fully capable computer, now known as the “Turing Machine”
• The concept of this machine, which could theoretically perform any
mathematical computation, was very important in the future development
of the computer
• The machine can simulate ANY computer algorithm, no matter how
complicated it is!
• Its basic operations are simple: read or write a symbol on the current square of a tape, and move the tape left or right by one square so that the machine can read and edit the symbol on a neighboring square (a minimal simulator sketch follows below)
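To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python; the function name, the rule-table format, and the bit-inverting example machine are illustrative assumptions, not material from the lecture.

```python
# A minimal Turing machine simulator (illustrative sketch only; the rule
# table and the bit-inverting example machine below are hypothetical).

def run_turing_machine(rules, tape, state="start", head=0, blank="_", halt="halt"):
    """rules maps (state, symbol) -> (new_state, new_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay)."""
    tape = dict(enumerate(tape))                 # sparse tape: position -> symbol
    while state != halt:
        symbol = tape.get(head, blank)           # read the current square
        state, tape[head], move = rules[(state, symbol)]  # write and change state
        head += move                             # move along the tape
    return "".join(tape.get(i, blank) for i in range(min(tape), max(tape) + 1))

# Example machine: invert a string of bits (0 -> 1, 1 -> 0), then halt.
rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(rules, "10110"))        # prints 01001_ (trailing blank)
```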
1st Generation: Vacuum Tube - 1904
• John Fleming, an English physicist, developed the very first one
• These electronic devices consist of 2 or more electrodes encased in a glass or metal tube
• They, along with electric relays, were used in the construction of early computers
• These early computers used vacuum tubes as circuitry and magnetic drums for
memory.
• These tubes have now been almost completely replaced by more reliable and less
costly transistors
ABC - 1939
• Atanasoff-Berry Computer (birth of the modern computer)
• John Atanasoff & Clifford Berry at Iowa State College
• World's first electronic digital computer
• The first computer that used binary numbers instead of decimal
• Helped graduate students solve systems of simultaneous linear equations
Harvard Mark 1 - 1943
• Howard Aiken of Harvard University
• The first program-controlled machine
• Included all the ideas proposed by Babbage for the Analytical Engine
• No steam engines were used
• The last famous electromechanical computer
ENIAC – 1946
• Electronic Numerical Integrator And Computer
• World’s first large-scale, general-purpose electronic
computer
• Built by John Mauchly & J. Presper Eckert at the University of Pennsylvania
• Developed for military applications
• 5,000 operations/sec, 19,000 tubes, 30 tons
• Consumed 150 kilowatts: reportedly dimmed the lights of Philadelphia when it ran
• Had to be rewired manually for each new task
2nd Generation: Transistor - 1947
• Invented by Shockley, Bardeen, and Brattain at Bell Labs in the US
• Compared to vacuum tubes, it offered:
– much smaller size
– better reliability
– much lower power consumption
– much lower cost
• All modern computers are made of miniaturized
transistors
• Vacuum tubes replaced mechanical parts
• Transistors replaced vacuum tubes
EDVAC – 1948
• Electronic Discrete Variable Automatic Computer
• Built by Eckert & Mauchly and included many design ideas proposed by von Neumann
• The first electronic computer design to incorporate a
program stored entirely within its memory
• First computer to use Magnetic Tape for storing programs.
Before this, computers needed to be re-wired each time a
new program was to be run
Floppy Disk - 1950
Invented at the Imperial University in Tokyo by
Yoshiro Nakamats
Provided faster access to programs and data as
compared with magnetic tape
Compiler - 1951
• Grace Hopper of the US Navy develops the very first high-level language compiler
• Before the invention of this compiler, developing a computer program was tedious and prone to errors
• A compiler translates a high-level language (that is easy for humans to understand) into a language that the computer can understand (see the toy illustration below)
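As a toy illustration of this idea (a hypothetical sketch, not Grace Hopper's actual compiler), the Python snippet below translates a human-readable arithmetic expression into instructions for an imaginary stack machine:

```python
# Toy "compiler": translate a high-level arithmetic expression into
# low-level instructions for an imaginary stack machine (hypothetical sketch).
import ast

def compile_expression(source):
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

    def emit(node):
        if isinstance(node, ast.BinOp):           # operator with two operands
            return emit(node.left) + emit(node.right) + [ops[type(node.op)]]
        if isinstance(node, ast.Constant):        # a literal number
            return [f"PUSH {node.value}"]
        raise ValueError("unsupported construct")

    return emit(ast.parse(source, mode="eval").body)

print(compile_expression("2 + 3 * 4"))
# ['PUSH 2', 'PUSH 3', 'PUSH 4', 'MUL', 'ADD']
```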
UNIVAC 1 - 1951
• UNIVersal Automatic Computer
• Eckert-Mauchly Computer Corporation
• First computer designed for commercial applications
• First computer that could manipulate not only numbers but text data as well
• Max speed: 1,905 operations/sec
• Cost: US$1,000,000
• 5,000 tubes. 943 cu ft. 8 tons. 100 kilowatts
• Between 1951 and 1957, 48 were sold
BASIC - 1965
• Beginner's All-purpose Symbolic Instruction Code
• Developed by Thomas Kurtz & John Kemeny at Dartmouth College
• The first programming language designed for non-techies
• The grandmother of the most popular programming language in the world today – Visual BASIC
Computer Mouse - 1965
• Invented by Douglas Engelbart
• Did not become popular until 1983, when Apple Computer adopted the concept
ARPANET - 1969
• A network of networks
• The grand-daddy of today's global Internet
• A network of around 60,000 computers developed
by the US Dept of Defense to facilitate
communications between research organizations
and universities
Intel 4004 - 1971
• The first microprocessor
• Microprocessor: a complete processor on a single chip
• Speed: 750 kHz
Altair 8800 - 1975
• The 1st commercially available PC
• Based on the Intel 8080
• Cost $397
• Had 256 bytes of memory; my PC at home has a
million times more RAM (Random Access Memory)
Cray 1 - 1976
• The first commercial supercomputer
• Supercomputers are state-of-the-art machines designed to
perform calculations as fast as the current technology
allows
• Used to solve extremely complex tasks: weather prediction,
simulation of atomic explosions; aircraft design; movie
animation
• Cray 1 could do 167 million calculations a second; current state-of-the-art machines can do many trillions (10^12) of calculations per second
IBM PC & MS DOS - 1981
• IBM PC: The tremendously popular PC; the grand-daddy of 95% of the PCs in use today
• MS DOS: The tremendously popular operating
system that came bundled with the IBM PC
TCP/IP Protocol - 1982
• Transmission Control Protocol/Internet Protocol
• The communications protocol used by computer networks, including the Internet
• A communication protocol is a set of rules that governs the flow of information over a network (a minimal sketch follows below)
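As a minimal sketch of two programs following agreed-upon rules over TCP/IP (the server name example.com and the plain-text HTTP request are assumptions for illustration, not part of the slides):

```python
# Open a TCP connection to a web server; TCP/IP delivers the bytes, while
# HTTP (the text we send) is the higher-level protocol both sides follow.
import socket

with socket.create_connection(("example.com", 80), timeout=10) as conn:
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = conn.recv(1024)                       # first chunk of the server's reply

print(reply.decode(errors="replace").splitlines()[0])   # e.g. "HTTP/1.1 200 OK"
```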
Apple Macintosh - 1984
• The first popular, user-friendly, WIMP-based
PC
• Based on the WIMP (Windows, Icons, Menus,
Pointing Device) ideas first developed for the
Star computer at Xerox PARC (1981)
World Wide Web - 1989
• Tim Berners-Lee – a British physicist
• 1989 – At the European Organization for Nuclear Research (CERN) in Geneva
• 1993 - The 1st major browser “Mosaic” was
developed at the National Center for
Supercomputing Applications at the University of
Illinois, Urbana-Champaign
Deep Blue -vs- Kasparov - 1997
In 1997 Deep Blue, a supercomputer designed by IBM, beat Garry Kasparov, the World Chess Champion
It could analyze up to 300 billion chess moves in three minutes
That computer was exceptionally fast, did not get tired or bored. It just kept on analyzing the situation and searching until it found the perfect move from its list of possible moves
Mobile Phone-Computer
• A small computer, no bigger than the handset of a desktop phone
• Can do whatever an Internet-capable computer can do, plus it can function as a regular phone
• The first consumer device formed by the fusion of computing and wireless telecommunication
What is the next major milestone?
1. Mechanical computing
2. Electro-mechanical computing
3. Vacuum tube computing
4. Transistor computing
(the current state-of-the-art)
5. Quantum computing
Quantum Mechanics
QUANTUM MECHANICS is the branch of physics that describes the behavior of subatomic particles, i.e. the particles that make up atoms
What is the next major milestone?
• Quantum computers may one day be millions of times more
efficient than the current state-of-the-art computers.
• They take advantage of the laws that govern the behavior of
subatomic particles.
• These laws allow quantum computers to examine all possible
answers to a question simultaneously
• For example, if you want to find the largest of a list of four numbers:
– Current computers require on average 2 to 3 steps to get to the answer
– Whereas a quantum computer may be able to do that in a single step (a classical sketch of this example follows below)
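For comparison, here is a short classical sketch of the four-number example (an assumption of this sketch: each pairwise comparison counts as one step, so a plain scan over four numbers takes three steps; the single-step quantum claim above is the slide's and is not modeled here):

```python
# Classical search for the largest of four numbers, counting comparisons.
def largest(numbers):
    comparisons = 0
    best = numbers[0]                 # assume the first number is the largest so far
    for n in numbers[1:]:
        comparisons += 1              # one comparison per remaining number
        if n > best:
            best = n
    return best, comparisons

print(largest([7, 3, 9, 5]))          # -> (9, 3): three comparisons classically
```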
For further info …
Read the following article that is available on the Web:
Quantum Computing with Molecules
by Neil Gershenfeld and Isaac L. Chuang
http://www.sciam.com/1998/0698issue/0698gershenfeld.html
The END