
Summarized History of Computer with References

HISTORY OF COMPUTER

A computer is an electronic machine that collects data or information, stores it, processes it according to user instructions, and then produces a result. A computer is also a programmable electronic device that performs arithmetic and logical operations automatically, using a set of instructions provided by the user.

Before computers were invented, devices like sticks, stones, and bones were used as counting tools. As technology advanced and human understanding improved, more capable computing devices were produced.

EARLY AGE COMPUTING DEVICES

Abacus

The abacus was invented by the Chinese around 4,000 years ago. It is a wooden rack holding metal rods on which beads are strung, and the operator moves the beads according to certain rules to perform arithmetic calculations. The abacus is still used in some countries, such as China, Russia, and Japan.

Napier’s Bone

Napier’s Bones was a manually operated calculating device devised by John Napier (1550-1617) of Merchiston. It was also the first machine to use the decimal point in calculation. The tool consisted of nine ivory strips, or “bones,” marked with numbers, which were used to multiply and divide; hence the name “Napier’s Bones.”
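To make the idea concrete, here is a minimal Python sketch of how the bones encode multiplication. The function names and digit-by-digit data layout are illustrative modern choices, not a historical reconstruction: each bone lists the multiples of one digit split into a tens part and a units part, and a product is read off by adding along the diagonals.

```python
# A minimal sketch of multiplication with Napier's Bones (illustrative only;
# names and data layout are modern assumptions, not Napier's markings).

def bone(digit):
    """The 'bone' for a digit: its multiples 1x..9x, each split into
    a tens digit and a units digit, as marked on the ivory strips."""
    return [divmod(digit * row, 10) for row in range(1, 10)]

def multiply_with_bones(number, single_digit):
    """Multiply `number` by a single digit by laying bones side by side,
    taking one row, and adding along the diagonals with carries."""
    cells = [bone(int(d))[single_digit - 1] for d in str(number)]
    digits, carry, prev_tens = [], 0, 0
    for tens, units in reversed(cells):       # read right to left
        carry, d = divmod(units + prev_tens + carry, 10)
        digits.append(d)
        prev_tens = tens
    carry, d = divmod(prev_tens + carry, 10)  # leftmost tens digit
    digits.append(d)
    if carry:
        digits.append(carry)
    return int("".join(str(d) for d in reversed(digits)))

print(multiply_with_bones(425, 6))  # -> 2550, read off the diagonals
```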

Pascaline

The Pascaline, also known as the Arithmetic Machine or Adding Machine, was invented between 1642 and 1644 by Blaise Pascal, a French mathematician and philosopher. It is believed to have been the first mechanical and automatic calculator. Pascal invented the machine to help his father, a tax accountant, and it could only perform addition and subtraction. The device was a wooden box containing a series of gears and wheels: when a wheel completed one revolution, it advanced the neighbouring wheel by one step, and a row of windows on top of the wheels displayed the totals.
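The wheel-and-carry behaviour described above is easy to model in code. Below is a minimal Python sketch (the function name and list-of-digits representation are illustrative assumptions): each “wheel” holds one decimal digit, and completing a revolution advances the neighbouring wheel, which is exactly a decimal carry.

```python
# A minimal model of the Pascaline's wheels and carries (illustrative only;
# the representation is a modern assumption, not Pascal's mechanism in detail).

def pascaline_add(wheels, position, steps):
    """Rotate the wheel at `position` forward by `steps` and propagate
    carries to the neighbouring wheels.

    `wheels` is a list of decimal digits, least significant wheel first.
    """
    wheels = wheels.copy()
    carry, wheels[position] = divmod(wheels[position] + steps, 10)
    position += 1
    while carry and position < len(wheels):
        carry, wheels[position] = divmod(wheels[position] + carry, 10)
        position += 1
    return wheels

# A machine showing 0995 (least significant wheel first), advanced by 7:
print(pascaline_add([5, 9, 9, 0], 0, 7))  # -> [2, 0, 0, 1], i.e. 1002
```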

Stepped Reckoner or Leibniz wheel


In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved
on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known
as the stepped reckoner because it used fluted drums instead of gears.

Difference Engine

In the early 1820s, Charles Babbage, known as the “Father of the Modern Computer,” created the Difference Engine. It was a mechanical, steam-powered calculating machine designed to compute numerical tables, such as logarithmic tables.

Analytical Engine

Charles Babbage designed another calculating machine, the Analytical Engine, in the 1830s. It was a mechanical computer that took input from punched cards, and it was intended to solve any mathematical problem and to store data in its own memory.

Tabulating machine

Herman Hollerith, an American statistician, invented this machine in 1890. The Tabulating Machine was a punched-card-based mechanical tabulator that could compute statistics and record and sort data. It was used in the 1890 U.S. Census. Hollerith began manufacturing these machines in his own company, which ultimately became International Business Machines (IBM) in 1924.

Differential Analyzer

Vannevar Bush introduced the Differential Analyzer, an early analog computer, in 1930. Rather than electronic switching, it used an arrangement of shafts, gears, and wheel-and-disc integrators, driven by electric motors, to solve differential equations, and it could perform 25 calculations in a matter of minutes.

Mark I

The next major change in the history of the computer began in 1937, when Howard Aiken planned to build a machine that could perform calculations involving enormous numbers. The Mark I computer was constructed in 1944 as a collaboration between IBM and Harvard, and it was the first programmable digital computer.

HISTORY OF COMPUTER GENERATION

The word ‘computer’ was first used in the 16th century for a person who computed, i.e. performed calculations, and it was used in that sense as a noun until the 20th century. By the late 19th century the word was also used to describe machines that did calculations. The modern use of the word generally describes programmable digital devices that run on electricity.
Humans have used devices to aid calculation for thousands of years; one of the earliest and most well-known was the abacus. Then, in 1822, Charles Babbage, the father of computers, began developing what would have been the first mechanical computer, and in 1833 he designed the Analytical Engine, a general-purpose computer. It contained an ALU, basic flow-control concepts, and the concept of integrated memory.

More than a century later in the history of computers came the first general-purpose electronic computer: the ENIAC, which stands for Electronic Numerical Integrator and Computer. Its inventors were John W. Mauchly and J. Presper Eckert.

As technology developed over time, computers became smaller and their processing faster. The first laptop arrived in 1981, introduced by Adam Osborne and EPSON.

GENERATIONS OF COMPUTERS

In the history of computers, we often refer to the advancements of modern computers as generations of computers. These generations fall into five categories, and we are presently in the fifth. A generation of computers refers to specific improvements in computer technology over time. In 1946, electronic pathways called circuits were developed to perform counting, replacing the gears and other mechanical parts used for counting in previous computing machines.

In each new generation, the circuits became smaller and more advanced than those of the previous generation. This miniaturization increased the speed, memory, and power of computers. The five generations of computers are described below.

1st Generation: First-generation computers date from 1946 to 1959, the period in which machine language was developed. They used vacuum tubes as the basic components of the CPU and memory, and magnetic drums for memory. These machines were complicated, slow, large, and expensive, and they relied mostly on batch operating systems and punch cards. Magnetic tape and paper tape were used as input and output devices. Some of the popular first-generation computers are:

ENIAC (Electronic Numerical Integrator and Computer)

EDVAC (Electronic Discrete Variable Automatic Computer)

UNIVAC I (Universal Automatic Computer)

IBM-701

IBM-650

2nd Generation: The second generation (1959-1965) was the era of transistor computers. Transistors were cheap, compact, and consumed less power, which made second-generation machines smaller, faster, and more energy-efficient than first-generation computers. These computers advanced from binary machine language to assembly language, and high-level programming languages such as COBOL and FORTRAN came into use; batch processing and multiprogramming operating systems were used in these machines. Some of the popular second-generation computers are the IBM 1620, IBM 7094, CDC 1604, CDC 3600, and UNIVAC 1108.

3rd Generation: The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) contains many transistors, which increased the power of a computer while simultaneously lowering its cost. Third-generation computers used remote processing, time-sharing, and multiprogramming operating systems, and they were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN-II to IV, COBOL, PASCAL, PL/1, and ALGOL-68 were utilized. Some of the popular third-generation computers are the IBM-360 series, the Honeywell-6000 series, the PDP (Programmed Data Processor) series, the TDC-316, and the IBM-370/168.

4th Generation: The invention of the microprocessor brought along the fourth generation of computers, which dominated the years 1971-1980. Fourth-generation computers used very-large-scale integration (VLSI) circuits: chips containing millions of transistors and other circuit elements, which made these computers more compact, powerful, fast, and affordable. They used real-time, time-sharing, and distributed operating systems, and C, C++, DBASE, and Java were among the programming languages utilized in this generation. Examples include the DEC 10, STAR 1000, PDP 11, CRAY-1 (supercomputer), CRAY X-MP, and Apple II. This was when we started producing computers for home use.

5th Generation: Fifth-generation computers have been in use from 1980 to the present; this is the present and the future of the computer world. The defining aspects of this generation are parallel processing hardware and artificial intelligence. VLSI technology was succeeded by ULSI (Ultra-Large-Scale Integration), making possible microprocessor chips with ten million electronic components, and the use of parallel processing and superconductors is helping to make this vision a reality. These are the most recent and sophisticated computers, programmed in languages such as C, C++, Java, .NET, and more. Examples include Intel Pentium-based PCs, desktops, laptops, notebooks, ultrabooks, and Chromebooks.

NOTABLE DATES AND INVENTORS IN THE GENERATIONS OF COMPUTERS


19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that
employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, invented a steam-powered calculating machine capable of computing tables of numbers. The “Difference Engine” project failed owing to a lack of technology at the time.

1843 – The world’s first computer program was written by Ada Lovelace, an English mathematician. Lovelace also included a step-by-step method for computing Bernoulli numbers using Babbage’s machine.

1890 – Herman Hollerith, an inventor, created the punched-card technique used to tabulate the 1890 U.S. census. He went on to start the corporation that would become IBM.

Early 20th Century

1930 – Differential Analyzer was the first large-scale automatic general-purpose mechanical
analogue computer invented and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing
machine, that could compute anything that could be computed.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California, by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the
world’s first digital computer. However, the machine was destroyed during a World War II
bombing strike on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devised a computer capable of solving 29 equations simultaneously. It was the first computer able to store data in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert created the Electronic Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” by reprogramming, earning it the title of “Grandfather of computers.”

1946 – Design work began on the UNIVAC I (Universal Automatic Computer), which became the first general-purpose electronic digital computer designed in the United States for business applications.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, was the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Grace Hopper, a computer scientist, created one of the first high-level computer languages, work that later evolved into COBOL (COmmon Business-Oriented Language). It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming
language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created independently by Jack Kilby and Robert Noyce.

1962 – The Atlas computer made its appearance. It was the fastest computer in the world at the time, and it pioneered the concept of “virtual memory.”

1964 – Douglas Engelbart proposed a modern computer prototype that combined a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an operating system that addressed program compatibility difficulties and was later rewritten in the C programming language.

1970 – The Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip, was unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same year, Xerox developed the first laser printer, which went on to generate billions of dollars and heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox’s research staff, created Ethernet, which is used to connect multiple computers and other hardware.

1974 – Personal computers were introduced to the market. Among the first were the Altair, the Scelbi, the Mark-8, the IBM 5100, and Radio Shack’s TRS-80.

1975 – In January, Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit. Paul Allen and Bill Gates offered to write software for the Altair in the BASIC language.

1976 – Apple Computers was founded by Steve Jobs and Steve Wozniak, who introduced the world to the Apple I, the first computer with a single circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak announced the Apple II. It had colour graphics and a cassette drive for storing data.

1978 – The first computerized spreadsheet program, VisiCalc, was introduced.

1979 – WordStar, a word-processing tool from MicroPro International, was released.


1981 – IBM unveiled its first personal computer, code-named “Acorn,” which had an Intel CPU, two floppy drives, and a colour display, and ran Microsoft’s MS-DOS operating system.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market.
This year also saw the release of the Gavilan SC, the first portable computer with a flip-form
design and the first to be offered as a “laptop.”

1984 – Apple launched the Macintosh, announcing it with a commercial during Super Bowl XVIII. It was priced at $2,500.

1985 – Microsoft introduced Windows, which enabled multitasking via a graphical user interface. In addition, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, created HyperText Markup Language, widely known as HTML. He also coined the term “WorldWideWeb”; his project included the first browser, a server, HTML, and URLs.

1993 – The Pentium CPU improved the use of graphics and music on personal computers.

1995 – Microsoft’s Windows 95 operating system was released, backed by a $300 million promotional campaign. Sun Microsystems introduced Java 1.0, followed by Netscape Communications’ JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduced the iMac, an all-in-one Macintosh desktop computer. These PCs cost $1,300 and came with a 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15-inch monitor.

1999 – Wi-Fi, an abbreviation for “wireless fidelity,” was developed, originally covering a range of up to 300 feet.

21st Century

2000 – The USB flash drive was first introduced. USB drives were faster and offered more storage space than other storage media options of the time.

2001 – Apple released Mac OS X, later renamed OS X and eventually simply macOS, as the successor to its conventional Mac operating system.

2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer
computers.

2004 – Facebook began as a social networking website.

2005 – Google acquired Android, a mobile phone OS based on Linux.

2006 – Apple’s MacBook Pro became available; the Pro was the company’s first dual-core, Intel-based mobile computer. Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3), was also launched.

2007 – The first iPhone was released by Apple, bringing many computer functions into the palm of our hands. Amazon also released the Kindle, one of the first electronic reading devices, in 2007.

2009 – Microsoft released Windows 7.

2011 – Google introduced the Chromebook, which runs Google Chrome OS.

2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, was
constructed.

2015 – Apple introduced the Apple Watch, and Microsoft released Windows 10.

2016 – The world’s first reprogrammable quantum computer was built.

TYPES OF COMPUTERS

Analog Computers – Analog computers are built from components such as gears and levers, often with no electronic parts. One advantage of analog computation is that designing and building an analog computer to tackle a specific problem can be quite straightforward.

Digital Computers – Information in digital computers is represented in discrete form, typically as sequences of 0s and 1s (binary digits, or bits).
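As a concrete illustration of what “discrete form” means in practice, here is a minimal Python sketch (not tied to any particular machine) showing how an integer and a short text string become sequences of bits:

```python
# Illustrative only: how ordinary values map to sequences of 0s and 1s.

number = 9
print(format(number, "08b"))  # '00001001' - the integer 9 as 8 bits

text = "Hi"
bits = " ".join(format(byte, "08b") for byte in text.encode("ascii"))
print(bits)                   # '01001000 01101001' - 'H' and 'i' as bytes
```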
A digital computer is a device that can process any type of information in a matter of seconds. Digital computers are categorized into many different types, as follows:

Mainframe computers – A mainframe is a computer generally utilized by large enterprises for mission-critical activities such as massive data processing. Mainframe computers were distinguished by massive storage capacities, quick components, and powerful computational capabilities. Because they were complicated systems, they were managed by a team of systems programmers who had sole access to the computer. These machines are now often referred to as servers rather than mainframes.

Supercomputers – The most powerful computers to date are commonly referred to as supercomputers. They are enormous systems purpose-built to solve complicated scientific and industrial problems: quantum mechanics, weather forecasting, oil and gas exploration, molecular modelling, physical simulations, aerodynamics, nuclear fusion research, and cryptanalysis are all run on supercomputers.

Minicomputers – A minicomputer is a type of computer that has many of the same features
and capabilities as a larger computer but is smaller in size. Minicomputers, which were
relatively small and affordable, were often employed in a single department of an
organization and were often dedicated to a specific task or shared by a small group.

Microcomputers – A microcomputer is a small computer that is based on a microprocessor
integrated circuit, often known as a chip. A microcomputer is a system that incorporates at a
minimum a microprocessor, program memory, data memory, and input-output system
(I/O). A microcomputer is now commonly referred to as a personal computer (PC).

Embedded processors – These are miniature computers that control electrical and
mechanical processes with basic microprocessors. Embedded processors are often simple
in design, have limited processing capability and I/O capabilities, and need little power.
Ordinary microprocessors and microcontrollers are the two primary types of embedded
processors. Embedded processors are employed in systems that do not require the
computing capability of traditional devices such as desktop computers, laptop computers,
or workstations.

TURING’S VISION

Alan Turing’s mathematical theory gave rise to modern computer science and to applications from desktops to cell phones.

In 1936, when he was just 24 years old, Alan Turing wrote a remarkable paper in which he
outlined the theory of computation, laying out the ideas that underlie all modern computers.
This groundbreaking and powerful theory now forms the basis of computer science.

In Turing’s Vision, Chris Bernhardt explains the theory for the general reader, beginning
with its foundations and systematically building to its surprising conclusions. He also views
Turing’s theory in the context of mathematical history, other views of computation
(including those of Alonzo Church), Turing’s later work, and the birth of the modern
computer.

Turing wanted to show that there were problems beyond any computer’s ability to solve; in particular, he wanted to find a decision problem that he could prove was undecidable. To explain Turing’s ideas, Bernhardt examines three well-known decision problems to explore the concept of undecidability; investigates theoretical computing machines, including Turing machines; explains universal machines; and proves that certain problems are undecidable, including Turing’s problem concerning computable numbers.
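To make the notion of a theoretical computing machine concrete, below is a minimal Python sketch of a Turing machine simulator. The dictionary-based transition table and the toy bit-flipping machine are illustrative assumptions, not Turing’s own notation; note also that the simulator must impose a step limit, since deciding in general whether an arbitrary machine will ever halt is precisely the kind of problem Turing proved undecidable.

```python
# A minimal Turing machine simulator (illustrative sketch only).
# The transition table maps (state, symbol) to
# (new_state, symbol_to_write, head_move).

def run_turing_machine(transitions, tape, state="start", steps=1000):
    cells = {i: s for i, s in enumerate(tape)}  # sparse tape, blank = '_'
    head = 0
    # The step bound is how a simulator sidesteps the halting problem:
    # we cannot decide in general whether a machine will ever halt.
    for _ in range(steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy machine: walk right, flipping 0s and 1s, halt at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flip, "1011"))  # -> '0100_' (trailing blank shown)
```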
