Computers

A computer is a programmable machine that performs arithmetic and logical operations, with modern computers capable of executing a wide range of tasks through various programs. The evolution of computers began with simple calculating devices and advanced through the development of digital electronic machines, leading to the Digital Revolution. The term 'computer' originally referred to a human calculator before it was applied to machines in the mid-20th century.


A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation; or to a group of computers that are linked and function together, such as a computer network or computer cluster.
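
As a purely illustrative sketch of what "programmable" means here (the operation names and functions below are invented for this example, not drawn from any real machine), the same fixed mechanism can carry out different sequences of arithmetic and logical operations simply by being given a different stored program:

```python
# A minimal sketch of programmability: the "machine" (run) is fixed,
# but swapping the stored program changes what it computes.
def run(program, value):
    """Apply a stored sequence of arithmetic/logical steps to a value."""
    ops = {
        "add": lambda x, n: x + n,   # arithmetic operation
        "mul": lambda x, n: x * n,   # arithmetic operation
        "and": lambda x, n: x & n,   # logical (bitwise) operation
    }
    for name, operand in program:    # execute the steps in sequence
        value = ops[name](value, operand)
    return value

doubler   = [("mul", 2)]                 # one program...
normalize = [("add", 3), ("and", 0xFF)]  # ...and a different one

print(run(doubler, 21))     # 42
print(run(normalize, 252))  # (252 + 3) & 255 = 255
```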

A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users.

Early computers were meant to be used only for calculations. Simple manual
instruments like the abacus have aided people in doing calculations since ancient
times. Early in the Industrial Revolution, some mechanical devices were built to
automate long, tedious tasks, such as guiding patterns for looms. More sophisticated
electrical machines did specialized analog calculations in the early 20th century. The
first digital electronic calculating machines were developed during World War II,
some electromechanical and some using thermionic valves (vacuum tubes). The
first semiconductor transistors in the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies
in the late 1950s, leading to the microprocessor and the microcomputer revolution in
the 1970s. The speed, power, and versatility of computers have increased dramatically ever since, with transistor counts growing at a rapid pace (Moore's law observed that counts roughly doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.
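
The interplay described above can be made concrete with a minimal sketch of a fetch-decode-execute loop. The toy instruction set here (LOAD, ADD, JNZ, HALT) is invented for illustration and does not correspond to any real CPU; the point is how a control unit can change the order of operations in response to stored information, via a conditional jump:

```python
# A toy fetch-decode-execute loop (illustrative only; real CPUs differ).
def execute(program):
    acc = 0   # accumulator: holds the result of arithmetic operations
    pc = 0    # program counter: index of the next instruction in memory
    while True:
        op, arg = program[pc]   # fetch and decode the next instruction
        pc += 1                 # default: proceed to the next instruction
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "JNZ":       # jump if the accumulator is non-zero:
            if acc != 0:        # the control unit alters the order of
                pc = arg        # operations based on stored information
        elif op == "HALT":
            return acc

# Count down from 3 to 0: repeatedly ADD -1, jumping back while non-zero.
countdown = [("LOAD", 3), ("ADD", -1), ("JNZ", 1), ("HALT", None)]
print(execute(countdown))  # 0
```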

Etymology
[Image: A human computer, with microscope and calculator, 1952]
It was not until the mid-20th century that the word acquired its modern definition;
according to the Oxford English Dictionary, the first known use of the
word computer was in a different sense, in a 1613 book called The Yong Mans
Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest
computer of Times, and the best Arithmetician that euer [sic] breathed, and he
reduceth thy dayes into a short number." This usage of the term referred to a human
computer, a person who carried out calculations or computations. The word
continued to have the same meaning until the middle of the 20th century. During the
latter part of this period, women were often hired as computers because they could
be paid less than their male counterparts.[1] By 1943, most human computers were
women.[2]

The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The same dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897", and that the "modern use" of the term, meaning 'programmable digital electronic computer', dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine".[3] The name has remained, although modern computers are capable of many higher-level functions.
