
A BRIEF COMPUTER HISTORY

The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage. He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based. Generally speaking, computers can be classified into three generations. Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to an existing one.

First generation (1937–1946): In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford Berry. It was called the Atanasoff–Berry Computer (ABC). In 1943 an electronic computer named Colossus was built for the military. Development continued until 1946, when the first general-purpose digital computer, the Electronic Numerical Integrator and Computer (ENIAC), was built. It is said that this computer weighed 30 tons and used 18,000 vacuum tubes for processing, and that when it was turned on for the first time lights dimmed in sections of Philadelphia. Computers of this generation could perform only a single task, and they had no operating system.

Second generation (1947–1962): This generation of computers used transistors instead of vacuum tubes, which made them more reliable. In 1951 the first computer for commercial use, the Universal Automatic Computer (UNIVAC 1), was introduced to the public. In 1953 the International Business Machines (IBM) 650 and 700 series computers made their mark in the computer world. During this generation over 100 computer programming languages were developed, and computers gained memory and operating systems. Storage media such as tape and disk were in use, as were printers for output.

Third generation (1963–present): The invention of the integrated circuit brought us the third generation of computers. With this invention computers became smaller, more powerful, and more reliable, and they could run many different programs at the same time. In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface, and the 1990s brought widespread adoption of the Windows operating system.

As a result of these various improvements, the computer has come to be used in all areas of life. It is a very useful tool that will continue to see new development as time passes.

What is a Computer?

In its most basic form a computer is any device which aids humans in performing various kinds of computations or calculations. In that respect the earliest computer was the abacus, used to perform basic arithmetic operations. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output, and processing are simply the acts of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell: we input information, the computer processes it according to its basic logic or the program currently running, and outputs the results. Modern computers do this electronically, which enables them to perform a vastly greater number of calculations in less time. Although we currently use computers to process images, sound, text, and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound, and so on are merely abstractions of the numbers being crunched within the machine; in digital computers these are the ones and zeros, representing electrical on and off states, and endless combinations of those. In other words, every image, every sound, and every word has a corresponding binary code. While the abacus may technically have been the first computer, most people today associate the word with the electronic computers invented in the last century, which have evolved into the modern computers we know today.
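
As a minimal illustration in Python, the following sketch walks through the input, processing, and output steps on a piece of text, printing the numeric code and the binary digits behind each character (the codes shown are standard ASCII/Unicode values):

    # Input -> processing -> output on a piece of text: every character
    # reduces to a number, and every number to binary ones and zeros.
    word = "Hi"                               # input
    for ch in word:                           # processing
        code = ord(ch)                        # the character's numeric code
        print(ch, code, format(code, "08b"))  # output: character, number, binary

Running it prints, for example, "H 72 01001000": even text is ultimately just numbers expressed in binary.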

ENIAC

First Generation Computers (1940s–1950s)

The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, and supported input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors. The first non-general-purpose electronic computer was the ABC (Atanasoff–Berry Computer), and other computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, and UNIVAC.

IBM 1401

Second Generation Computers (1955–1960)


The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller than first-generation machines, albeit still big by today's standards. The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.

Third Generation Computers (1960s)

IBM System/360

The invention of the integrated circuit (IC), also known as the microchip, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on. Minicomputers appeared first; the earliest of these were still based on non-microchip transistors, and later versions were hybrids based on both transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Fourth Generation Computers (1971–present)


The first microchip-based central processing units consisted of multiple microchips for the different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, in which all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004. The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

First Generation of Microcomputers (1971–1976)

Altair 8800

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers. It is arguable which of the early microcomputers should be called the first. The CTC Datapoint 2200 is one candidate, although it didn't actually contain a microprocessor (being based on a multi-chip CPU design instead) and wasn't meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became the basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days. However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn't come as a kit, then it would be the Micral N, which used the Intel 8008 microprocessor.

Popular early microcomputers that did come in kits include the MOS Technology KIM-1, the Altair 8800, and the Apple I. The Altair 8800 in particular spawned a large following among hobbyists and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered around personal computing, such as Microsoft and Apple.

Second Generation Microcomputers (1977–present)

Commodore PET 2001 (Image by Tomislav Medak, licensed under CC-BY-SA).

As microcomputers continued to evolve, they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could be easily connected to a TV, and they supported visual representation of text and numbers on the screen. In other words, lights and switches were replaced by screens and keyboards, and the necessity of understanding binary code diminished as they increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and, in the 80s, the IBM PC. The nature of the underlying electronic components didn't change between these computers and the modern computers we know today, but what did change was the number of circuits that could be put onto a single microchip. Intel's co-founder Gordon Moore predicted the doubling of the number of transistors on a single chip every two years, which became known as Moore's Law, and this trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs. The consequence was a predictable exponential increase in processing power that could be put into an ever smaller package, which had a direct effect on the possible form factors as well as the applications of modern computers; this is what most of the paradigm-shifting innovations in computing that followed were about.
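
As a rough illustration of that doubling, the following Python sketch projects transistor counts forward from 1971. The starting figure of about 2,300 transistors, a commonly cited count for the Intel 4004, is assumed here for the sake of the example:

    # Moore's Law modeled as a strict two-year doubling, starting from an
    # assumed base of about 2,300 transistors (commonly cited for the 4004).
    def transistors(year, base_year=1971, base_count=2300, period=2):
        return base_count * 2 ** ((year - base_year) / period)

    for year in (1971, 1981, 1991, 2001):
        print(year, round(transistors(year)))

Under this idealized schedule the count reaches roughly 75 million by 2001, the right order of magnitude for CPUs of that era, though real designs deviated from the strict curve.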

Graphical User Interface (GUI)

Macintosh 128k (Image by the All About Apple museum, licensed under CC-BY-SA-2.5-IT).

Possibly the most significant of those shifts was the invention of the graphical user interface, and of the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Institute developed the first mouse and a graphical user interface, demonstrated in 1968. They were just a few years short of the beginning of the personal computer revolution sparked by the Altair 8800, so their idea didn't take hold. Instead it was picked up and improved upon by researchers at the Xerox PARC research center, which in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn't ready to dive into the computer market and didn't see the potential of what it had early enough. It took Steve Jobs negotiating a stock deal with Xerox, in exchange for a tour of their research center, to finally bring the user-friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what the Xerox PARC team had developed and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse. Microsoft later caught on and produced Windows, and the historic competition between the two companies began, resulting in improvements to the graphical user interface to this day. Meanwhile IBM was dominating the PC market with the IBM PC, and Microsoft was riding on its coattails by producing and selling the operating system for the IBM PC, known as DOS, or Disk Operating System. The Macintosh, with its graphical user interface, was meant to dislodge IBM's dominance, but Microsoft made this more difficult with its PC-compatible Windows operating system and its own GUI.

Portable Computers

PowerBook 150 (Image by Dana Sibera, licensed under CC-BY-SA).

As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one. It was conceived at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer actually created was the Xerox NoteTaker, but only 10 were produced. The first commercialized laptop was the Osborne 1 in 1981, with a small 5-inch CRT monitor and a keyboard that sat inside the lid when closed. It ran CP/M, the operating system on which 86-DOS, which Microsoft bought and developed into MS-DOS, was modeled. Later portable computers included the Bondwell 2, released in 1985, also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC-compatible portable computer, and it ran MS-DOS, but it was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, the GRiD Compass, the Dulmont Magnum, the Kyotronic 85, the Commodore SX-64, the IBM PC Convertible, and the Toshiba T1100, T1000, and T1200. The first portable computers that resemble modern laptops in features were Apple's PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM's ThinkPad was largely inspired by the PowerBook's design, and the evolution of the two led to laptops and notebook computers as we know them. PowerBooks were eventually succeeded by the modern MacBook Pro line. Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology, and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.
