
Computer

For other uses, see Computer (disambiguation).

Computers and computing devices from different eras (left to right, top to bottom):
- Early vacuum tube computer (ENIAC)
- Mainframe computer (IBM System 360)
- Smartphone (LYF Water 2)
- Desktop computer (IBM ThinkCentre S50 with monitor)
- Video game console (Nintendo GameCube)
- Supercomputer (IBM Summit)
A computer is a machine that can be programmed to
automatically carry out sequences of arithmetic or logical
operations (computation). Modern digital electronic computers
can perform generic sets of operations known as programs.
These programs enable computers to perform a wide range of
tasks. The term computer system may refer to a nominally
complete computer that includes the hardware, operating
system, software, and peripheral equipment needed and used
for full operation; or to a group of computers that are linked
and function together, such as a computer
network or computer cluster.
A broad range of industrial and consumer products use
computers as control systems, including simple special-purpose
devices like microwave ovens and remote controls, and factory
devices like industrial robots. Computers are at the core of
general-purpose devices such as personal
computers and mobile devices such as smartphones.
Computers power the Internet, which links billions of computers
and users.
Early computers were meant to be used only for calculations.
Simple manual instruments like the abacus have aided people
in doing calculations since ancient times. Early in the Industrial
Revolution, some mechanical devices were built to automate
long, tedious tasks, such as guiding patterns for looms. More
sophisticated electrical machines did
specialized analog calculations in the early 20th century. The
first digital electronic calculating machines were developed
during World War II, both electromechanical and
using thermionic valves. The first semiconductor transistors in
the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated
circuit chip technologies in the late 1950s, leading to
the microprocessor and the microcomputer revolution in the
1970s. The speed, power, and versatility of computers have
been increasing dramatically ever since then, with transistor
counts increasing at a rapid pace (Moore's law noted that
counts doubled every two years), leading to the Digital
Revolution during the late 20th and early 21st centuries.
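The doubling claim can be made concrete with a few lines of arithmetic. The sketch below is a rough illustration only, assuming a strict two-year doubling and taking the roughly 2,300 transistors of the 1971 Intel 4004 as a baseline figure; real chips have tracked the trend only approximately.

```python
# Rough illustration of Moore's law read as a strict two-year doubling.
# Assumed baseline: ~2,300 transistors on the Intel 4004 in 1971.

def projected_transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming it doubles every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")
```

Fifty years of doubling every two years is a factor of about 2^25, or roughly 33 million, which is why a projection from a few thousand transistors reaches tens of billions.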
Conventionally, a modern computer consists of at least
one processing element, typically a central processing
unit (CPU) in the form of a microprocessor, together with some
type of computer memory, typically semiconductor
memory chips. The processing element carries out arithmetic
and logical operations, and a sequencing and control unit can
change the order of operations in response to
stored information. Peripheral devices include input devices
(keyboards, mice, joystick, etc.), output devices (monitor
screens, printers, etc.), and input/output devices that perform
both functions (e.g., the 2000s-era touchscreen). Peripheral
devices allow information to be retrieved from an external
source, and they enable the results of operations to be saved
and retrieved.
Etymology

A human computer, with microscope and calculator, 1952

It was not until the mid-20th century that the word acquired its
modern definition; according to the Oxford English Dictionary,
the first known use of the word computer was in a different
sense, in a 1613 book called The Yong Mans Gleanings by the
English writer Richard Brathwait: "I haue [sic] read the truest
computer of Times, and the best Arithmetician that euer [sic]
breathed, and he reduceth thy dayes into a short number." This
usage of the term referred to a human computer, a person who
carried out calculations or computations. The word continued to
have the same meaning until the middle of the 20th century.
During the latter part of this period, women were often hired as
computers because they could be paid less than their male
counterparts.[1] By 1943, most human computers were women.[2]
The Online Etymology Dictionary gives the first attested use
of computer in the 1640s, meaning 'one who calculates'; this is
an "agent noun from compute (v.)". The Online Etymology
Dictionary states that the use of the term to mean "'calculating
machine' (of any type) is from 1897." The Online Etymology
Dictionary indicates that the "modern use" of the term, to
mean 'programmable digital electronic computer' dates from
"1945 under this name; [in a] theoretical [sense] from 1937,
as Turing machine".[3] The name has remained, although
modern computers are capable of many higher-level functions.
History
Main articles: History of computing and History of computing
hardware
For a chronological guide, see Timeline of computing.
Pre-20th century

Ishango bone, a bone tool dating back to prehistoric Africa


Devices have been used to aid computation for thousands of
years, mostly using one-to-one correspondence with fingers.
The earliest counting device was most likely a form of tally
stick. Later record keeping aids throughout the Fertile
Crescent included calculi (clay spheres, cones, etc.) which
represented counts of items, likely livestock or grains, sealed in
hollow unbaked clay containers.[a][4] The use of counting
rods is one example.

A suanpan (算盘). The number represented on this abacus is 6,302,715,408.

The abacus was initially used for arithmetic tasks. The Roman
abacus was developed from devices used in Babylonia as early
as 2400 BCE. Since then, many other forms of reckoning boards
or tables have been invented. In a medieval European counting
house, a checkered cloth would be placed on a table, and
markers moved around on it according to certain rules, as an
aid to calculating sums of money.[5]

The Antikythera mechanism, dating back to ancient Greece circa 150–100 BCE, is an early analog computing device.

The Antikythera mechanism is believed to be the earliest
known mechanical analog computer, according to Derek J. de
Solla Price.[6] It was designed to calculate astronomical
positions. It was discovered in 1901 in the Antikythera wreck off
the Greek island of Antikythera, between Kythera and Crete,
and has been dated to approximately 100 BCE. Devices of
comparable complexity to the Antikythera mechanism would
not reappear until the fourteenth century.[7]
Many mechanical aids to calculation and measurement were
constructed for astronomical and navigation use.
The planisphere was a star chart invented by Abū Rayhān al-
Bīrūnī in the early 11th century.[8] The astrolabe was invented
in the Hellenistic world in either the 1st or 2nd centuries BCE
and is often attributed to Hipparchus. A combination of the
planisphere and dioptra, the astrolabe was effectively an
analog computer capable of working out several different kinds
of problems in spherical astronomy. An astrolabe incorporating
a mechanical calendar computer[9][10] and gear-wheels was
invented by Abi Bakr of Isfahan, Persia in 1235.[11] Abū Rayhān
al-Bīrūnī invented the first mechanical geared lunisolar
calendar astrolabe,[12] an early fixed-wired knowledge
processing machine[13] with a gear train and gear-wheels,[14]
around 1000 AD.
The sector, a calculating instrument used for solving problems
in proportion, trigonometry, multiplication and division, and for
various functions, such as squares and cube roots, was
developed in the late 16th century and found application in
gunnery, surveying and navigation.
The planimeter was a manual instrument to calculate the area
of a closed figure by tracing over it with a mechanical linkage.

A slide rule
The slide rule was invented around 1620–1630, by the English
clergyman William Oughtred, shortly after the publication of the
concept of the logarithm. It is a hand-operated analog
computer for doing multiplication and division. As slide rule
development progressed, added scales provided reciprocals,
squares and square roots, cubes and cube roots, as well
as transcendental functions such as logarithms and
exponentials, circular and hyperbolic trigonometry and
other functions. Slide rules with special scales are still used for
quick performance of routine calculations, such as
the E6B circular slide rule used for time and distance
calculations on light aircraft.
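The scales work because adding lengths proportional to logarithms multiplies the underlying numbers: log a + log b = log(ab). A minimal numeric sketch of that principle (an illustration of the idea, not a model of any particular rule) is:

```python
import math

# A slide rule multiplies by adding lengths proportional to logarithms:
# log(a) + log(b) = log(a * b), so sliding one log scale along another
# adds the lengths and the product is read back off the scale.

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule's scales do."""
    combined_length = math.log10(a) + math.log10(b)  # add the two scale lengths
    return 10 ** combined_length                      # read the result back off

print(slide_rule_multiply(2.5, 4.0))  # ~10.0, up to floating-point rounding
```

Division works the same way in reverse, by subtracting one length from the other.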
In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a
mechanical doll (automaton) that could write holding a quill
pen. By switching the number and order of its internal wheels
different letters, and hence different messages, could be
produced. In effect, it could be mechanically "programmed" to
read instructions. Along with two other complex machines, the
doll is at the Musée d'Art et d'Histoire
of Neuchâtel, Switzerland, and still operates.[15]
In 1831–1835, mathematician and engineer Giovanni
Plana devised a Perpetual Calendar machine, which, through a
system of pulleys and cylinders, could predict
the perpetual calendar for every year from 0 CE (that is, 1 BCE)
to 4000 CE, keeping track of leap years and varying day length.
The tide-predicting machine invented by the Scottish
scientist Sir William Thomson in 1872 was of great utility to
navigation in shallow waters. It used a system of pulleys and
wires to automatically calculate predicted tide levels for a set
period at a particular location.
The differential analyser, a mechanical analog computer
designed to solve differential equations by integration, used
wheel-and-disc mechanisms to perform the integration. In
1876, Sir William Thomson had already discussed the possible
construction of such calculators, but he had been stymied by
the limited output torque of the ball-and-disk integrators.[16] In
a differential analyzer, the output of one integrator drove the
input of the next integrator, or a graphing output. The torque
amplifier was the advance that allowed these machines to
work. Starting in the 1920s, Vannevar Bush and others
developed mechanical differential analyzers.
In the 1890s, the Spanish engineer Leonardo Torres
Quevedo began to develop a series of advanced analog
machines that could solve real and complex roots
of polynomials,[17][18][19][20] which were published in 1901 by
the Paris Academy of Sciences.[21]
First computer

A diagram of a portion of Babbage's Difference engine
The Difference Engine Number 2 at the Intellectual Ventures laboratory in Seattle

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer",[22] he conceptualized and invented the first mechanical computer in the early 19th century.
After working on his difference engine, which he designed to aid in navigational calculations, he announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables".[23] In 1833 he realized that a much more general design, an analytical engine, was possible.
The input of programs and data was to be provided to the
machine via punched cards, a method being used at the time
to direct mechanical looms such as the Jacquard loom. For
output, the machine would have a printer, a curve plotter and a
bell. The machine would also be able to punch numbers onto
cards to be read in later. The engine would incorporate
an arithmetic logic unit, control flow in the form of conditional
branching and loops, and integrated memory, making it the
first design for a general-purpose computer that could be
described in modern terms as Turing-complete.[24][25]
The machine was about a century ahead of its time. All the
parts for his machine had to be made by hand – this was a
major problem for a device with thousands of parts. Eventually,
the project was dissolved with the decision of the British
Government to cease funding. Babbage's failure to complete
the analytical engine can be chiefly attributed to political and
financial difficulties as well as his desire to develop an
increasingly sophisticated computer and to move ahead faster
than anyone else could follow. Nevertheless, his son, Henry
Babbage, completed a simplified version of the analytical
engine's computing unit (the mill) in 1888. He gave a
successful demonstration of its use in computing tables in
1906.
Electromechanical calculating machine

Leonardo Torres Quevedo.


In his work Essays on Automatics published in 1914, Leonardo
Torres Quevedo wrote a brief history of Babbage's efforts at
constructing a mechanical Difference Engine and Analytical
Engine. The paper contains a design for a machine capable of calculating formulas for a sequence of sets of values. The whole machine was to be controlled by a read-
only program, which was complete with provisions
for conditional branching. He also introduced the idea
of floating-point arithmetic.[26][27][28] In 1920, to celebrate the
100th anniversary of the invention of the arithmometer, Torres
presented in Paris the Electromechanical Arithmometer, which
allowed a user to input arithmetic problems through
a keyboard, and computed and printed the results,[29][30][31]
[32] demonstrating the feasibility of an electromechanical
analytical engine.[33]
Analog computers
Main article: Analog computer

Sir William Thomson's third tide-predicting machine design, 1879–81

During the first half of the 20th century, many
scientific computing needs were met by increasingly
sophisticated analog computers, which used a direct
mechanical or electrical model of the problem as a basis
for computation. However, these were not programmable and
generally lacked the versatility and accuracy of modern digital
computers.[34] The first modern analog computer was a tide-
predicting machine, invented by Sir William Thomson (later to
become Lord Kelvin) in 1872. The differential analyser, a
mechanical analog computer designed to solve differential
equations by integration using wheel-and-disc mechanisms,
was conceptualized in 1876 by James Thomson, the elder
brother of the more famous Sir William Thomson.[16]
The art of mechanical analog computing reached its zenith with
the differential analyzer, built by H. L. Hazen and Vannevar
Bush at MIT starting in 1927. This built on the mechanical
integrators of James Thomson and the torque amplifiers
invented by H. W. Nieman. A dozen of these devices were built
before their obsolescence became obvious. By the 1950s, the
success of digital electronic computers had spelled the end for
most analog computing machines, but analog computers
remained in use during the 1950s in some specialized
applications such as education (slide rule) and aircraft (control
systems).
Digital computers
Electromechanical
By 1938, the United States Navy had developed an
electromechanical analog computer small enough to use
aboard a submarine. This was the Torpedo Data Computer,
which used trigonometry to solve the problem of firing a
torpedo at a moving target. During World War II similar devices
were developed in other countries as well.

Konrad Zuse's Z3, the first fully automatic, digital (electromechanical) computer

Early digital computers were electromechanical; electric
switches drove mechanical relays to perform the calculation.
These devices had a low operating speed and were eventually
superseded by much faster all-electric computers, originally
using vacuum tubes. The Z2, created by German
engineer Konrad Zuse in 1939 in Berlin, was one of the earliest
examples of an electromechanical relay computer.[35]

Konrad Zuse, inventor of the modern computer[36][37]


In 1941, Zuse followed his earlier machine up with the Z3, the
world's first working electromechanical programmable, fully
automatic digital computer.[38][39] The Z3 was built with
2,000 relays, implementing a 22-bit word length that operated
at a clock frequency of about 5–10 Hz.[40] Program code was
supplied on punched film while data could be stored in 64
words of memory or supplied from the keyboard. It was quite
similar to modern machines in some respects, pioneering
numerous advances such as floating-point numbers. Rather
than the harder-to-implement decimal system (used in Charles
Babbage's earlier design), using a binary system meant that
Zuse's machines were easier to build and potentially more
reliable, given the technologies available at that time.[41] The
Z3 was not itself a universal computer but could be extended
to be Turing complete.[42][43]
Zuse's next computer, the Z4, became the world's first
commercial computer; after initial delay due to the Second
World War, it was completed in 1950 and delivered to the ETH
Zurich.[44] The computer was manufactured by Zuse's own
company, Zuse KG, which was founded in 1941 as the first
company with the sole purpose of developing computers in
Berlin.[44]
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their
mechanical and electromechanical equivalents, at the same
time that digital calculation replaced analog. The
engineer Tommy Flowers, working at the Post Office Research
Station in London in the 1930s, began to explore the possible
use of electronics for the telephone exchange. Experimental
equipment that he built in 1934 went into operation five years
later, converting a portion of the telephone exchange network
into an electronic data processing system, using thousands
of vacuum tubes.[34] In the US, John Vincent
Atanasoff and Clifford E. Berry of Iowa State
University developed and tested the Atanasoff–Berry
Computer (ABC) in 1942,[45] the first "automatic electronic
digital computer".[46] This design was also all-electronic and
used about 300 vacuum tubes, with capacitors fixed in a
mechanically rotating drum for memory.[47]

Colossus, the first electronic digital programmable computing device, was used to break German ciphers during World War II. It is seen here in use at Bletchley Park in 1943.

During World War II, the British code-breakers at Bletchley
Park achieved a number of successes at breaking encrypted
German military communications. The German encryption
machine, Enigma, was first attacked with the help of the
electro-mechanical bombes which were often run by women.
[48][49] To crack the more sophisticated German Lorenz SZ
40/42 machine, used for high-level Army communications, Max
Newman and his colleagues commissioned Flowers to build
the Colossus.[47] He spent eleven months from early February
1943 designing and building the first Colossus.[50] After a
functional test in December 1943, Colossus was shipped to
Bletchley Park, where it was delivered on 18 January
1944[51] and attacked its first message on 5 February.[47]
Colossus was the world's first electronic digital programmable
computer.[34] It used a large number of valves (vacuum tubes).
It had paper-tape input and was capable of being configured to
perform a variety of boolean logical operations on its data, but
it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding
process.[52][53]

ENIAC was the first electronic, Turing-complete device, and performed ballistics trajectory calculations for the United States Army.

The ENIAC[54] (Electronic Numerical Integrator and Computer)
was the first electronic programmable computer built in the
U.S. Although the ENIAC was similar to the Colossus, it was
much faster, more flexible, and it was Turing-complete. Like the
Colossus, a "program" on the ENIAC was defined by
the states of its patch cables and switches, a far cry from
the stored program electronic machines that came later. Once
a program was written, it had to be mechanically set into the
machine with manual resetting of plugs and switches. The
programmers of the ENIAC were six women, often known
collectively as the "ENIAC girls".[55][56]
It combined the high speed of electronics with the ability to be
programmed for many complex problems. It could add or
subtract 5000 times a second, a thousand times faster than
any other machine. It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words
(about 80 bytes). Built under the direction of John
Mauchly and J. Presper Eckert at the University of Pennsylvania,
ENIAC's development and construction lasted from 1943 to full
operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and containing over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.[57]
Modern computers
Concept of modern computer
The principle of the modern computer was proposed by Alan
Turing in his seminal 1936 paper,[58] On Computable Numbers.
Turing proposed a simple device that he called "Universal
Computing machine" and that is now known as a universal
Turing machine. He proved that such a machine is capable of
computing anything that is computable by executing
instructions (program) stored on tape, allowing the machine to
be programmable. The fundamental concept of Turing's design
is the stored program, where all the instructions for computing
are stored in memory. Von Neumann acknowledged that the
central concept of the modern computer was due to this paper.
[59] Turing machines are to this day a central object of study
in theory of computation. Except for the limitations imposed by
their finite memory stores, modern computers are said to
be Turing-complete, which is to say, they
have algorithm execution capability equivalent to a universal
Turing machine.
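The idea of computing by mechanically following a stored table of instructions over a tape can be illustrated with a tiny simulator. The sketch below is an illustrative toy, not Turing's own construction; the hypothetical transition table shown simply increments a binary number.

```python
# A minimal single-tape Turing machine simulator (illustrative toy).
# The transition table maps (state, symbol) -> (new state, symbol to write, head move).

def run_turing_machine(tape_string, transitions, start_state, halt_state, blank="_"):
    tape = dict(enumerate(tape_string))   # sparse tape: cell position -> symbol
    head, state = 0, start_state
    while state != halt_state:
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Transition table that adds one to a binary number written on the tape.
increment = {
    ("right", "0"): ("right", "0", +1),   # scan right over the digits
    ("right", "1"): ("right", "1", +1),
    ("right", "_"): ("carry", "_", -1),   # past the last digit: turn around
    ("carry", "1"): ("carry", "0", -1),   # 1 plus carry gives 0, keep carrying
    ("carry", "0"): ("done",  "1",  0),   # 0 plus carry gives 1, halt
    ("carry", "_"): ("done",  "1",  0),   # carried past the front: new leading 1
}

print(run_turing_machine("1011", increment, "right", "done"))  # prints 1100
```

A universal machine is one whose transition table, given a description of any other table on its tape, reproduces that table's behaviour; modern computers are equivalent to it up to their finite memory.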
Stored programs
Main article: Stored-program computer
Manchester Baby, the first electronic stored-
program computer
Early computing machines had fixed programs. Changing their function required rewiring and restructuring the machine.[47] With the proposal of the stored-program computer
this changed. A stored-program computer includes by design
an instruction set and can store in memory a set of instructions
(a program) that details the computation. The theoretical basis
for the stored-program computer was laid out by Alan Turing in
his 1936 paper. In 1945, Turing joined the National Physical
Laboratory and began work on developing an electronic stored-
program digital computer. His 1945 report "Proposed Electronic
Calculator" was the first specification for such a device. John
von Neumann at the University of Pennsylvania also circulated
his First Draft of a Report on the EDVAC in 1945.[34]
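The distinctive feature of the stored-program design is that instructions live in the same memory as data and are fetched, decoded and executed one after another, so changing the program means changing memory contents rather than rewiring. The sketch below uses a hypothetical toy instruction encoding (not the EDVAC's or any historical machine's) to illustrate that fetch-decode-execute cycle.

```python
# A toy stored-program machine: instructions and data share one memory.
# Each memory word encodes an instruction as opcode * 100 + address
# (a hypothetical encoding chosen only for illustration).

LOAD, ADD, STORE, HALT = 1, 2, 3, 0

def run(memory):
    pc, accumulator = 0, 0              # program counter and a single register
    while True:
        word = memory[pc]               # fetch the next instruction word from memory
        opcode, address = divmod(word, 100)
        pc += 1
        if opcode == HALT:              # decode and execute
            return memory
        elif opcode == LOAD:
            accumulator = memory[address]
        elif opcode == ADD:
            accumulator += memory[address]
        elif opcode == STORE:
            memory[address] = accumulator

# Program: load the word at address 10, add the word at 11, store the sum at 12, halt.
memory = [LOAD * 100 + 10, ADD * 100 + 11, STORE * 100 + 12, HALT * 100] + [0] * 6 + [7, 35, 0]
print(run(memory)[12])  # prints 42
```

Because the program is just data in memory, a new computation needs only new memory contents, which is exactly what made stored-program machines so much more flexible than their fixed-program predecessors.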
The Manchester Baby was the world's first stored-program
computer. It was built at the University of Manchester in
England by Frederic C. Williams, Tom Kilburn and Geoff Tootill,
and ran its first program on 21 June 1948.[60] It was designed
as a testbed for the Williams tube, the first random-
access digital storage device.[61] Although the computer was
described as "small and primitive" by a 1998 retrospective, it
was the first working machine to contain all of the elements
essential to a modern electronic computer.[62] As soon as the
Baby had demonstrated the feasibility of its design, a project
began at the university to develop it into a practically useful
computer, the Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for
the Ferranti Mark 1, the world's first commercially available
general-purpose computer.[63] Built by Ferranti, it was delivered
to the University of Manchester in February 1951. At least
seven of these later machines were delivered between 1953
and 1957, one of them to Shell labs in Amsterdam.[64] In
October 1947 the directors of British catering company J. Lyons
& Company decided to take an active role in promoting the
commercial development of computers. Lyons's LEO
I computer, modelled closely on the Cambridge EDSAC of 1949,
became operational in April 1951[65] and ran the world's first
routine office computer job.
Transistors
Main articles: Transistor and History of the transistor
Further information: Transistor computer and MOSFET

Bipolar junction transistor (BJT)


The concept of a field-effect transistor was proposed by Julius
Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain,
while working under William Shockley at Bell Labs, built the
first working transistor, the point-contact transistor, in 1947,
which was followed by Shockley's bipolar junction transistor in
1948.[66][67] From 1955 onwards, transistors replaced vacuum
tubes in computer designs, giving rise to the "second
generation" of computers. Compared to vacuum tubes,
transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Junction transistors were much more reliable than
vacuum tubes and had longer, indefinite, service life.
Transistorized computers could contain tens of thousands of
binary logic circuits in a relatively compact space. However,
early junction transistors were relatively bulky devices that
were difficult to manufacture on a mass-production basis,
which limited them to a number of specialized applications.[68]
At the University of Manchester, a team under the leadership
of Tom Kilburn designed and built a machine using the newly
developed transistors instead of valves.[69] Their
first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed
there in April 1955. However, the machine did make use of
valves to generate its 125 kHz clock waveforms and in the
circuitry to read and write on its magnetic drum memory, so it
was not the first completely transistorized computer. That
distinction goes to the Harwell CADET of 1955,[70] built by the
electronics division of the Atomic Energy Research
Establishment at Harwell.[70][71]

A MOSFET (MOS transistor), showing gate (G), body (B), source (S) and drain (D) terminals. The gate is separated from the body by an insulating layer (pink).

The metal–oxide–silicon field-effect transistor (MOSFET), also
known as the MOS transistor, was invented by Mohamed M.
Atalla and Dawon Kahng at Bell Labs in 1959.[72] It was the first
truly compact transistor that could be miniaturized and mass-
produced for a wide range of uses.[68] With its high scalability,
[73] and much lower power consumption and higher density
than bipolar junction transistors,[74] the MOSFET made it
possible to build high-density integrated circuits.[75][76] In
addition to data processing, it also enabled the practical use of
MOS transistors as memory cell storage elements, leading to
the development of MOS semiconductor memory, which
replaced earlier magnetic-core memory in computers. The
MOSFET led to the microcomputer revolution,[77] and became
the driving force behind the computer revolution.[78][79] The
MOSFET is the most widely used transistor in computers,[80]
[81] and is the fundamental building block of digital electronics.
[82]
Integrated circuits
Main articles: Integrated circuit and Invention of the integrated
circuit
Further information: Planar process and Microprocessor

The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated
circuit was first conceived by a radar scientist working for
the Royal Radar Establishment of the Ministry of
Defence, Geoffrey W.A. Dummer. Dummer presented the first
public description of an integrated circuit at the Symposium on
Progress in Quality Electronic Components in Washington, D.C.,
on 7 May 1952.[83]
The first working ICs were invented by Jack Kilby at Texas
Instruments and Robert Noyce at Fairchild Semiconductor.
[84] Kilby recorded his initial ideas concerning the integrated
circuit in July 1958, successfully demonstrating the first
working integrated example on 12 September 1958.[85] In his
patent application of 6 February 1959, Kilby described his new
device as "a body of semiconductor material ... wherein all the
components of the electronic circuit are completely
integrated".[86][87] However, Kilby's invention was a hybrid
integrated circuit (hybrid IC), rather than a monolithic
integrated circuit (IC) chip.[88] Kilby's IC had external wire
connections, which made it difficult to mass-produce.[89]
Noyce also came up with his own idea of an integrated circuit
half a year later than Kilby.[90] Noyce's invention was the first
true monolithic IC chip.[91][89] His chip solved many practical
problems that Kilby's had not. Produced at Fairchild
Semiconductor, it was made of silicon, whereas Kilby's chip was
made of germanium. Noyce's monolithic IC
was fabricated using the planar process, developed by his
colleague Jean Hoerni in early 1959. In turn, the planar process
was based on Mohamed M. Atalla's work on semiconductor
surface passivation by silicon dioxide in the late 1950s.[92][93]
[94]
Modern monolithic ICs are predominantly MOS (metal–oxide–
semiconductor) integrated circuits, built from MOSFETs (MOS
transistors).[95] The earliest experimental MOS IC to be
fabricated was a 16-transistor chip built by Fred Heiman and
Steven Hofstein at RCA in 1962.[96] General
Microelectronics later introduced the first commercial MOS IC in
1964,[97] developed by Robert Norman.[96] Following the
development of the self-aligned gate (silicon-gate) MOS
transistor by Robert Kerwin, Donald Klein and John Sarace at
Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned
gates was developed by Federico Faggin at Fairchild
Semiconductor in 1968.[98] The MOSFET has since become the
most critical device component in modern ICs.[95]
Die photograph of a MOS 6502, an early 1970s
microprocessor integrating 3500 transistors on a single chip
The development of the MOS integrated circuit led to the
invention of the microprocessor,[99][100] and heralded an
explosion in the commercial and personal use of computers.
While the subject of exactly which device was the first
microprocessor is contentious, partly due to lack of agreement
on the exact definition of the term "microprocessor", it is
largely undisputed that the first single-chip microprocessor was
the Intel 4004,[101] designed and realized by Federico Faggin
with his silicon-gate MOS IC technology,[99] along with Ted
Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b][103] In the
early 1970s, MOS IC technology enabled the integration of
more than 10,000 transistors on a single chip.[76]
Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin.[104] They may or may not have integrated RAM and flash memory. If not integrated, the RAM is usually placed directly above (known as package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC; this is all done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power.
Mobile computers
The first mobile computers were heavy and ran from mains
power. The 50 lb (23 kg) IBM 5100 was an early example. Later
portables such as the Osborne 1 and Compaq Portable were
considerably lighter but still needed to be plugged in. The first
laptops, such as the Grid Compass, removed this requirement
by incorporating batteries – and with the continued
miniaturization of computing resources and advancements in
portable battery life, portable computers grew in popularity in
the 2000s.[105] The same developments allowed manufacturers
to integrate computing resources into cellular mobile phones
by the early 2000s.
These smartphones and tablets run on a variety of operating
systems and recently became the dominant computing devices on the market.[106] They are powered by systems on a chip (SoCs), which are complete computers on a microchip the
size of a coin.[104]
