Information Technology (IT): Just As The Internet and The Web Have Affected All
The purpose of this book is to help you use and understand information technology. This involves two aspects: computer competency and computer knowledge. Computer competency refers to acquiring computer-related skills.
The term is commonly used as a synonym for computers and computer networks, but it
also encompasses other information distribution technologies such as television and
telephones. Several industries are associated with information
technology, including computer
hardware, software, electronics, semiconductors, internet, telecommunications
equipment, engineering, healthcare, e-commerce and computer services.
Humans have been storing, retrieving, manipulating and communicating information since
the Sumerians developed writing in about 3000 BC, but the term information technology
in its modern sense first appeared in a 1958 article
published in the Harvard Business Review; authors Harold J. Leavitt and Thomas L.
Whisler commented that "the new technology does not yet have a single established
name. We shall call it information technology (IT)." Their definition consists of three
categories: techniques for processing, the application of statistical and mathematical
methods to decision-making, and the simulation of higher-order thinking through computer
programs.
Zuse Z3 replica on display at Deutsches Museum in Munich. The Zuse Z3 is the first
programmable computer.
Devices have been used to aid computation for thousands of years, probably initially in
the form of a tally stick. The Antikythera mechanism, dating from about the beginning of
the first century BC, is generally considered to be the earliest known mechanical analog
computer, and the earliest known geared mechanism. Comparable geared devices did not
emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical
calculator capable of performing the four basic arithmetical operations was developed.
Electronic computers, using either relays or valves, began to appear in the early 1940s.
The electromechanical Zuse Z3, completed in 1941, was the world's first programmable
computer, and by modern standards one of the first machines that could be considered
a complete computing machine. Colossus,
developed during the Second World War to decrypt German messages, was the first
electronic digital computer. Although it was programmable, it was not general-purpose,
being designed to perform only a single task. It also lacked the ability to store its program
in memory; programming was carried out using plugs and switches to alter the internal
wiring. The first recognizably modern electronic digital stored-program computer was the
Manchester Small-Scale Experimental Machine (SSEM), which ran its first program on
21 June 1948.
The development of transistors in the late 1940s at Bell Laboratories allowed a new
generation of computers to be designed with greatly reduced power consumption. The
first commercially available stored-program computer, the Ferranti Mark I, contained
4050 valves and had a power consumption of 25 kilowatts. By comparison, the first
transistorized computer, developed at the University of Manchester and operational by
November 1953, consumed only 150 watts in its final version.
Computers are used in many fields of daily life. Engineers, doctors, students, teachers and government organizations all use computers to perform specific tasks, for entertainment, or simply to finish office work. Computers have made our lives easier. With greater precision and accuracy, a computer can finish in a short time a task that would take much longer to complete manually. Computers have taken industries and businesses to a whole new level. They are used at home for work and entertainment, in offices, in hospitals and in government organizations. Here we are going to discuss some of the uses of computers in various fields.
Uses Of Computer At Home
Home Budget
A computer can be used to manage the home budget. You can easily calculate your expenses and income: list all expenses in one column and income in another, then apply any calculation on these columns to plan your budget. There is also specialized software that can manage your income and expenses and generate useful reports.
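As a minimal sketch of this idea, the following Python snippet totals a hypothetical expense column and income column and reports the balance; the categories and amounts are invented for illustration.

    # Toy home budget: one "column" of income, one of expenses.
    incomes = {"salary": 2500.00, "freelance": 400.00}
    expenses = {"rent": 900.00, "groceries": 350.00, "utilities": 120.00}

    total_income = sum(incomes.values())
    total_expenses = sum(expenses.values())

    print(f"Income:   {total_income:8.2f}")
    print(f"Expenses: {total_expenses:8.2f}")
    print(f"Balance:  {total_income - total_expenses:8.2f}")

A spreadsheet performs the same arithmetic; the point is simply that a home budget reduces to sums over two columns.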
Working From Home
People can manage office work at home. The owner of a company can check the work of the employees from home and control the office while sitting at home.
Entertainment
People can find entertainment on the internet. They can watch movies, listen to songs, watch videos and download different content. They can also watch live matches on the internet.
Information
People can find any type of information on the internet. Educational and informative websites are available for downloading books, tutorials, etc., to improve one's knowledge and learn new things.
Chatting & Social Media
People can chat with friends and family on the internet using different software like
Skype etc. One can interact with friends over social media websites like Facebook, Twitter
& Google Plus. They can also share photos and videos with friends.
Computer Based Training (CBT) programs are supplied on CD-ROM. These programs include text, graphics and sound. Audio and video lectures are recorded on the CDs. CBT is a low-cost solution for educating people, and you can train a large number of people easily.
Benefits Of CBT
1. Students can learn new skills at their own pace. They can easily acquire knowledge at any available time of their own choosing.
Distance Learning
Distance learning is a new learning methodology, and the computer plays the key role in this kind of learning. Many institutes provide distance learning programs. The student does not need to come to the institute. The institute provides the reading material, and the student attends a virtual classroom. In a virtual classroom, the teacher delivers the lecture from his own workplace while the student attends it at home by connecting to a network. The student can also ask questions of the teacher.
Online Examination
• Lynda.com (For different Software training and Web development and CMS
tutorials)
Marketing
Stock Exchange
The stock exchange is a very important place for businessmen. Many stock exchanges use computers to conduct bids. The stockbrokers perform all trading activities electronically; they connect to the computer, where brokers match buyers with sellers. This reduces cost, as no paper or special building is required to conduct these activities.
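As an illustrative sketch only (real exchanges are far more sophisticated), the following Python fragment shows the core matching idea: a trade happens when a buyer's bid price meets a seller's ask price. All names and prices are invented.

    # Toy order matching: a buyer and a seller trade when bid >= ask.
    buy_orders = [("alice", 101.0), ("bob", 99.5)]     # (buyer, bid price)
    sell_orders = [("carol", 100.0), ("dave", 102.0)]  # (seller, ask price)

    for buyer, bid in buy_orders:
        for i, (seller, ask) in enumerate(sell_orders):
            if bid >= ask:
                print(f"Trade: {buyer} buys from {seller} at {ask:.2f}")
                del sell_orders[i]  # this order is filled and removed
                break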
Specialized hospital management software is used to automate the day-to-day procedures and operations at hospitals. These tasks may include online appointments, payroll, admittance and discharge records, etc.
Patient History
Hospital management systems can store data about patients: their diseases and symptoms, and the medicines that are prescribed.
Patient Monitoring
Monitoring systems are installed in medical wards and intensive care units to monitor patients continuously. These systems can monitor pulse, blood pressure and body temperature, and can alert medical staff about any serious situation.
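The alerting logic such systems rely on can be sketched as a simple range check; the vital signs and safe limits below are invented for illustration, not clinical values to rely on.

    # Toy patient monitor: alert when a vital sign leaves its safe range.
    safe_ranges = {
        "pulse": (60, 100),           # beats per minute
        "systolic_bp": (90, 140),     # mmHg
        "temperature": (36.1, 37.8),  # degrees Celsius
    }
    reading = {"pulse": 118, "systolic_bp": 125, "temperature": 37.2}

    for vital, value in reading.items():
        low, high = safe_ranges[vital]
        if not (low <= value <= high):
            print(f"ALERT: {vital} = {value} is outside the safe range {low}-{high}")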
Life Support Systems
Specialised devices are used to help impaired patients; hearing aids are one example.
Diagnosis Purpose
The line where the distinction should be drawn is not always clear. Many operating systems bundle application software. Such software is not considered system software when it can be uninstalled, usually without affecting the functioning of other software. Exceptions could be web browsers such as Internet Explorer, where Microsoft argued in court that it was system software that could not be uninstalled. Later examples are Chrome OS and Firefox OS, where the browser functions as the only user interface and the only way to run programs (and other web browsers cannot be installed in their place); these can well be argued to be (part of) the operating system and hence system software.
Another borderline example is cloud-based software. This software provides services
to a software client (usually a web browser or a JavaScript application running in the
web browser), not to the user directly, and is therefore systems software. It is also
developed using system programming methodologies and systems programming
languages. Yet from the perspective of functionality there is little difference between a word processing application and a word processing web application.
A kernel is the core part of the operating system. It defines an API for application programs (including some system software) and an interface to device drivers.
Device drivers, including the computer BIOS and device firmware, provide basic functionality to operate and control the hardware connected to or built into the computer.
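To make the kernel boundary concrete, here is a small Python sketch (any language would do): the os module's functions are thin wrappers over the kernel's system-call API, so even this trivial program is requesting services from the kernel, which in turn drives the hardware through device drivers. The file name is arbitrary.

    import os

    # os.open, os.write and os.read map onto the kernel's system calls.
    fd = os.open("example.txt", os.O_CREAT | os.O_RDWR)
    os.write(fd, b"hello, kernel\n")
    os.lseek(fd, 0, os.SEEK_SET)  # rewind to the start of the file
    print(os.read(fd, 64))        # b'hello, kernel\n'
    os.close(fd)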
A user interface "allows users to interact with a computer." Either a command- line
interface (CLI) or, since the 1980s a graphical user interface (GUI). Since this is the
part of the operating system the user directly interacts with, it may be considered an
application and therefore not system software.
For historical reasons, some organizations use the term systems programmer to describe a job function which is more accurately termed systems administrator. The software tools these employees use are then called system software. This so-called utility software helps to analyze, configure, optimize and maintain the computer; virus protection is one example. In some publications, the term system software also includes software development tools (like a compiler, linker or debugger).
Systems programming, or system programming, is the activity of programming
computer system software. The primary distinguishing characteristic of systems
programming when compared to application programming is that application
programming aims to produce software which provides services to the user directly
(e.g. word processor), whereas systems programming aims to produce software and
software platforms which provide services to other software, are performance
constrained, or both (e.g. operating systems, computational science applications,
game engines, industrial automation, and software as a service applications).[1]
Overview
• The programmer can make assumptions about the hardware and other
properties of the system that the program runs on, and will often exploit
those properties, for example by using an algorithm that is known to be
efficient when used with specific hardware.
• Often systems programs cannot be run in a debugger. Running the
program in a simulated environment can sometimes be used to reduce this
problem.
In systems programming, often limited programming facilities are available. The use of
automatic garbage collection is not common and debugging is sometimes hard to do.
The runtime library, if available at all, is usually far less powerful, and does less
error checking. Because of those
limitations, monitoring and logging are often used; operating systems may have
extremely elaborate logging subsystems.
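Because the passage notes that monitoring and logging often stand in for interactive debugging, here is a minimal Python sketch of the practice; the subsystem name and messages are invented for illustration.

    import logging

    # When a debugger cannot be attached, detailed logs are often the only
    # record of what the system was doing when something went wrong.
    logging.basicConfig(
        level=logging.DEBUG,
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    )
    log = logging.getLogger("scheduler")  # hypothetical subsystem name

    log.debug("entering run queue scan")
    log.warning("run queue empty for 500 ms; idling CPU")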
History
Alternate usage
For historical reasons, some organizations use the term systems programmer to
describe a job function which would be more accurately termed systems administrator.
This is particularly true in organizations whose computer resources have historically
been dominated by mainframes, although the term is even used to describe job
functions which do not involve mainframes. This usage arose because administration
of IBM mainframes often involved the writing of custom assembler code (IBM's Basic
Assembly Language (BAL)), which integrated with the operating system such as
OS/MVS, DOS/VSE or VM/CMS. Indeed, some IBM software products had substantial
code contributions from customer programming staff. This type of programming is
progressively less
common, but the term systems programmer is still the de facto job title for staff directly
administering IBM mainframes.
System languages, in contrast, are designed not for compatibility, but for performance
and ease of access to the underlying hardware while still providing high-level
programming concepts like structured programming. Examples include Systems
Programming Language (SPL or SPL/3000) and Executive Systems Problem
Oriented Language (ESPOL), both of which are similar to ALGOL in syntax but
tuned to their respective platforms. Others are cross-platform but designed to work close to the hardware, like JOVIAL and BCPL.
Some languages straddle the system and application domains, bridging the gap
between these uses. The canonical example is C, which is used widely for both system
and application programming. Some modern languages also do this such as Rust and
Swift.
History
The earliest system software was written in assembly language primarily because
there was no alternative, but also for reasons including efficiency of object code,
compilation time, and ease of debugging. Application languages such as
FORTRAN were used for system programming, although they usually still required
some routines to be written in assembly language.
Mid-level languages
Mid-level languages "have much of the syntax and facilities of a higher level language,
but also provide direct access in the language (as well as providing assembly
language) to machine features. The earliest of these was ESPOL on Burroughs
mainframes in about 1960, followed by Niklaus Wirth's PL360 (first written on a
Burroughs system as a cross compiler), which had the general syntax of ALGOL 60
but which statements directly manipulated CPU registers and memory. Other
languages in this category include MOL-360 and PL/S.
In PL360, for example, the statement R9 := R8 and R7 shl 8 or R6 ands the contents of register 8 with register 7 shifted left 8 bits, ors the result with the contents of register 6, and places the final result into register 9.
Higher-level languages
While PL360 is at the semantic level of assembly language, another kind of system
programming language operates at a higher semantic level, but has specific
extensions designed to make the language suitable for system programming. An early
example of this kind of language is LRLTRAN, which extended Fortran with features
for character and bit manipulation, pointers, and directly addressed jump tables.
Languages of this kind continued to be used for this purpose. Although many such languages were developed, C and C++ are the ones which survived.
System Programming Language (SPL) is also the name of a specific language on the HP
3000 computer series, used for its operating system HP Multi-Programming Executive
(MPE), and other parts of its system software.
The Xbox 360 system software or the Xbox 360 Dashboard is the updateable software
and operating system for the Xbox 360. It formerly resided in a 16 MB file
system. However, starting with the NXE Update, more storage became a requirement,
rectified by either having a Hard Drive installed, or one of the later revisions of the
console with adequate flash storage embedded within the console. The system
software has access to a maximum of 32 MB of the system's Random Access
Memory.[8] The updates can be downloaded from the Xbox Live service directly
to the Xbox 360 and subsequently installed. Microsoft has also provided the ability
to download system software updates from their respective official Xbox website to
their PCs and then storage media, from which the update can be installed to the
system.
The Xbox 360 game system allows users to download applications that add to the
functionality of the dashboard. Most apps required the user to be signed into a valid
Xbox Live Gold account in order to use the features advertised for the given app. But
as of the 2.0.16756.0 update, most apps do not require an Xbox Live Gold Subscription
to access them, although the app may have its own subscription to be able to use
it.[9][10] With the exception of a few early apps, Microsoft has added partners to
develop apps for the Xbox 360 system since the New Xbox Experience (NXE)
Dashboard update in 2008.[11]
Following the success of the Xbox One preview program launched in 2014, Microsoft announced the Xbox 360 preview program to the public in March 2015.
Microsoft released the Xbox 360 console on November 22, 2005, a whole year earlier
than both the Sony PlayStation 3 and Nintendo Wii. Having the advantage of the lead,
Microsoft was able to experiment with various customization options for the
consumer's individual consoles. The ability to customize the way the console looked
with various themes to fit the front and sides of it was something very different for
home console users. In addition, the Xbox 360 Dashboard supported multiple password-protected profiles on the same console, with each user being able to customize the dashboard to exactly fit their own unique style. There were premium
themes available for purchase on the Xbox Live Marketplace apart from the default
styles. Originally there were five tabs or sections known
as the "blades" for the Xbox 360 menu, namely the Marketplace, Xbox live, Games,
Media and System. In scrolling from left to right, each section would have a different-colored background signifying its own unique area, but users also had the option to change all sections to one background color. In 2008, however, when the gaming scene changed dramatically because of competition with the PlayStation 3 and the Wii, a new Xbox Dashboard titled the New Xbox Experience (NXE) was launched, which featured major changes in both the user interface and other functionality. The new user interface had a navigation system similar to that of Windows Media Center.
Multimedia features
While the Xbox 360 console is primarily designed to play games just like other video
game consoles, it can be used as a media player too. Similar to the PlayStation 3
from Sony, Xbox 360 has media center capabilities built in, so it is relatively easy to
set up. With the Xbox 360 users can also copy videos directly to the hard drive, or play
via a USB stick. There are two ways to watch videos on Xbox 360. The first is to
download videos from the Xbox Live Marketplace. Some of these videos are available for free while others have to be paid for. Microsoft is in control of what videos are available
through the Xbox Live Marketplace. The second is to stream videos from a Windows
Media Center PC by using Xbox 360 as a Media Center Extender. In this way users
are in control of what videos they want to watch, however there are restrictions on
what kind of video they can play back. More specifically, it only supports playback of DVR-MS, MPEG-1, MPEG-2 and WMV videos.[16] Every Xbox 360 can play DVD
movies out of the box using the built-in DVD drive, with no additional parts necessary,
although the user may control everything with an optional remote. There are other
improvements to the experience on the Xbox 360 over the original Xbox too, including
the ability to upscale the image so it will look better. Progressive scan is another
feature of the DVD output in the Xbox 360 that produces smoother output when playing
movies on televisions that support high definition, although using a dedicated DVD
player would offer even more features and sound quality.
Backward compatibility
The Xbox 360 system software includes built in software emulation support for the
original Xbox game system. Software emulation is achieved with downloadable
emulation profiles, which require a hard drive. Not all original Xbox games are
supported; the last reported update to the compatibility list was
in 2007 and support has since been discontinued for adding new titles. There are more
than 400 titles on the list which covers most of the big name titles, and as a requirement
for backwards compatibility the users have to have a hard drive for their Xbox 360,
specifically an official Microsoft-brand Xbox 360 hard drive. In contrast, Xbox 360's
successor, the Xbox One console was not backward compatible at launch, but after
applying the November 2015 "New Xbox One Experience" system update it also
supports a select group of Xbox 360 games using a software emulator, similar to Xbox
360's backward compatibility feature. However, there are also notable differences between their approaches to emulation: unlike Xbox 360's emulation of the original Xbox, with Xbox One's emulation the Xbox 360 games do not have to be specifically patched, but instead need to be repackaged in the Xbox One format.
Starting with the NXE Dashboard in November 2008, Larry Hryb (known on Xbox Live
as "Major Nelson") and other team members hosted a new segment using Microsoft
Connect to allow members of the Xbox Live community to get a chance to have a
preview of the next dashboard. Small bug fixes & minor improvements were not
included in the Preview Program; it was limited to major releases (NXE, Kinect, Metro)
released in November of some years. In 2009, the Preview Program returned in August
rather than November for a summer update.
The Preview Dashboard app is the place for Preview participants to give feedback about the program, get the latest news, change console enrollment settings, and report problems. If users decided that they no longer wanted Preview updates, they could opt out in the Xbox Preview Dashboard app. Details for the Preview Program were located on the official Xbox website.
The Xbox Live Preview Program for the Xbox 360 has since been discontinued.
History of updates
The first version of the Xbox 360 system software was 2.0.1888.0, released on
November 22, 2005, as shipped in the original Xbox 360 consoles, although the
version numbered "2.0" was available at product launch. Over the course of next a few
years saw the continuous updates of the system software. While early updates such
as version 2.0.4532.0 released on October 31, 2006 added supports for 1080p video
output and the external HD DVD drive attachment, version 2.0.7357.0 released on
November 19, 2008 was the first major upgrade of the system software, titled the New
Xbox Experience that had added many new features, [22] including a completely
redesigned GUI. It included changes in the menu system, featuring a more 3D style
vibe with more options and sections, new sound effects (menus only, notification
sounds remain the same), support for 1440×900 and 1680×1050 16:10
resolutions (letterboxed) over VGA, HDMI and DVI, as well as the ability to preview themes before setting them, to disable notifications (new messages, chat requests, etc.) or mute the notification sound, and to change to a QWERTY keyboard in place of the alphabetical keyboard.
Subsequent system software updates after this major upgrade continued to add
(although usually numerically smaller) new features or make other changes, including
bugfixes. An example of the new features introduced in version 2.0.8498.0 released
on August 11, 2009 was the addition of Display Discovery to allow the console to override
factory settings for HDTV resolutions and refresh rates as well as discovering the best
possible resolution and refresh rates that the HDTV is capable of displaying (Selected
HDTVs). Version 2.0.12611.0 released on November 1, 2010 also added features
such as the ability to install game updates to the HDD (select games only) and a visual
refresh to incorporate elements of Microsoft's Metro design style. It also featured a
new boot screen animation with a redesigned Xbox 360 orb and ribbons. A new anti-piracy 2.5 scheme for newly released games was also added in this version, later updated to
anti-piracy 2.6 in the version 2.0.13599.0 released on July 19, 2011. Version
2.0.14699.0 released on December 6, 2011 introduced a redesigned interface and a
fresh new take on a platform that has had more than half a decade of changes and
enhancements.[14] The releases after version 2.0.16197.0, released October 16, 2012, were typically minor, usually bugfixes or mandatory updates that prepared for subsequent growth of the service, though the system software continued to receive updates. On June 15, 2020, the advertisements included in the Xbox 360 software were removed via a server-side update without notice.
See also
• Xbox One system software, the operating system for the eighth-generation
home video game console, Xbox One
• Wii U system software, the official firmware version and operating system for
Nintendo's Wii U game console
Other gaming platforms from this generation:
The Wii system software is a discontinued set of updatable firmware versions and a software frontend on the Wii home video game console. Updates, which can be
downloaded over the Internet or read from a game disc, allowed Nintendo
to add additional features and software, as well as to patch security vulnerabilities
used by users to load homebrew software. When a new update became available,
Nintendo sent a message to the Wii Message Board of Internet-connected systems
notifying them of the available update.
Most game discs, including first-party and third-party games, include system software
updates so that systems that are not connected to the Internet can still receive
updates. The system menu will not start such games if their updates have not been
installed, so this has the consequence of forcing users to install updates in order to
play these games. Some games, such as online games like Super Smash Bros.
Brawl and Mario Kart Wii, contain specific extra updates, such as the ability to receive
Wii Message Board posts from game-specific addresses; therefore, these games
always require that an update be installed before their first time running on a given
console.
Technology
IOS
The Wii's firmware is in the form of IOSes, thought by the Wii homebrew developers to stand for "Input Output Systems" or "Internal Operating Systems". IOS runs on a separate ARM926EJ-S processor, unofficially nicknamed Starlet. The patent for the Wii U shows a similar device which is simply named "Input/Output Processor". IOS controls I/O between the code running on the main Broadway processor and the various Wii hardware that does not also exist on the GameCube.
Except for bug fixes, new IOS versions do not replace existing IOS versions. Instead,
Wii consoles have multiple IOS versions installed. All native Wii software (including
games distributed on Nintendo optical discs, the System Menu itself, Virtual Console
games, WiiWare, and Wii Channels), with the exception of certain homebrew
applications, have the IOS version hardcoded into the software.
When the software is run, the IOS that is hardcoded gets loaded by the Wii, which then
loads the software itself. If that IOS does not exist on the Wii, in the case of disc-based
software, it gets installed automatically (after the user is prompted). With downloaded
software, this should not theoretically happen, as the user cannot access the shop to
download software unless the player has all the IOS versions that they require.
However, if homebrew is used to forcefully install or run a piece of software when the
required IOS does not exist, the user is brought back to the system menu.
Nintendo created this system so that new updates would not unintentionally break
compatibility with older games, but it does have the side effect that it uses up space
on the Wii's internal NAND Flash memory. IOSes are referred to by their number,
which can theoretically be between 0 and 254, although many numbers are skipped,
presumably being development versions that were never completed.
Only one IOS version can run at any given time. The only time an IOS is not running
is when the Wii enters GameCube backward compatibility mode, during which the Wii
runs a variant of IOS specifically for GameCube games, MIOS, which contains a
modified version of the GameCube's IPL.
User interface
The system provides a graphical interface to the Wii's abilities. All games run directly on the Broadway processor, and either directly interface with the hardware (for the hardware common to the Wii and GameCube), or interface with IOS running on the ARM architecture processor (for Wii-specific hardware). The ARM processor does not have access to the screen, and therefore
neither does IOS. This means that while a piece of software is running, everything
seen on the screen comes from that software, and not from any operating system or
firmware. Therefore, the version number reported by the Wii is actually only the version
number of the System Menu. This is why some updates do not result in a change of
the version number: the System Menu itself is not updated, only (for example) IO uses
and channels. As a side effect, this means it is impossible for Nintendo to implement
any functions that would affect the games themselves, for example an in-game system
menu (similar to the Xbox 360's in-game Dashboard or the PlayStation 3's in-game
XMB).
The Wii Menu (known internally as the System Menu) is the name of the user interface for the Wii game console, and it is the first thing to be seen when the system boots up. Similar to many other video game consoles, the Wii is not only about games. For example, it is possible to install applications such as Netflix to stream media (without requiring a disc) on the Wii. The Wii Menu lets users access both game and non-game functions through built-in applications called Channels, which are designed to represent television channels. There are six primary channels: the Disc Channel, Mii Channel, Photo Channel, Wii Shop Channel, Forecast Channel and News Channel, although the latter two were not initially included and only became available via system updates. Some of the functions provided by these Channels used to be limited to a computer, such as a full-featured web browser and digital photo viewer; the Channels can be navigated using the pointer capability of the Wii Remote.[10] Users can also rearrange these Channels if they are not satisfied with how the Channels are originally organized on the menu.
Network features
The Wii system supports wireless connectivity with the Nintendo DS handheld
console with no additional accessories. This connectivity allows players to use the
Nintendo DS microphone and touch screen as inputs for Wii games. Pokémon Battle
Revolution is the first example Nintendo has given of a game using Nintendo DS-Wii
connectivity. Nintendo later released the Nintendo Channel for
the Wii allowing its users to download game demos or additional data to their Nintendo
DS.
Like many other video game consoles, the Wii console is able to connect to the
Internet, although this is not required for the Wii system itself to function. Each Wii has
its own unique 16-digit Wii Code for use with Wii's non-game features. With an Internet connection enabled, users are able to access the established Nintendo Wi-Fi Connection service. Wireless
encryption by WEP, WPA (TKIP/RC4) and WPA2 (CCMP/AES) is supported.[12]
AOSS support was added in System Menu version 3.0. As with the Nintendo DS,
Nintendo does not charge for playing via the service; the 12-digit Friend Code system
controls how players connect to one another. The service has a few features
for the console, including the Virtual Console, WiiConnect24 and several
Channels. The Wii console can also communicate and connect with other Wii systems
through a self-generated wireless LAN, enabling local wireless multiplayer on different
television sets. The system also implements console-based software, including the Wii
Message Board. One can connect to the Internet with third-party devices as well.
The Wii console also includes a web browser known as the Internet Channel, which is a version of the Opera 9 browser with menus. It is meant to be a convenient way to access the web on the television screen, although it falls short of a full desktop browsing experience.
Backward compatibility
The original designs of the Nintendo Wii console, more specifically the Wii models made
pre-2011 were fully backward compatible with GameCube devices including game
discs, memory cards and controllers. This was because the Wii hardware had ports
for both GameCube memory cards, and peripherals and its slot-loading drive was able
to accept and read the previous console's discs. GameCube games work with the Wii
without any additional configuration, but a GameCube controller is required to play GameCube titles; neither the Wii Remote nor the Classic Controller functions in this capacity. The Wii supports progressive-scan output in 480p-enabled GameCube titles. Peripherals can be connected via a set of four GameCube controller sockets and two Memory Card slots (concealed by removable flip-open panels). The console
retains connectivity with the Game Boy Advance and e-Reader through the Game Boy
Advance Cable, which is used in the same manner as with the GameCube; however,
this feature can only be accessed on select GameCube titles which previously utilized
it.
There are also a few limitations in the backward compatibility. For example, online
and LAN features of certain GameCube games were not available since the Wii
does not have serial ports for the Nintendo GameCube Broadband Adapter and
Modem Adapter. The Wii uses a proprietary port for video output, and is
incompatible with all Nintendo GameCube audio/video cables (composite video,
S-Video, component video and RGB SCART). The console also lacks the
GameCube footprint and high-speed port needed for Game Boy Player support.
Application software (app for short) is a program or group of programs designed for
end users. Examples of an application include a word processor, a
spreadsheet, an accounting application, a web browser, an email client, a
media player, a file viewer, simulators, a console game or a photo editor. The
collective noun application software refers to all applications collectively. This
contrasts with system software, which is mainly involved with running the
computer.
Applications may be bundled with the computer and its system software or
published separately, and may be coded as proprietary, open-source or university
projects.[2] Apps built for mobile platforms are called mobile apps.
Terminology
User-written software tailors systems to meet the user's specific needs. User-
written software includes spreadsheet templates, word processor macros,
scientific simulations, audio, graphics, and animation scripts. Even email filters
are a kind of user software. Users create this software themselves and often
overlook how important it is.
The delineation between system software such as operating systems and
application software is not exact, however, and is occasionally the object of
controversy. For example, one of the key questions in the United States v. Microsoft
Corp. antitrust trial was whether Microsoft's Internet Explorer web browser was
part of its Windows operating system or a separable piece of application software.
As another example, the GNU/Linux naming controversy is, in part, due to
disagreement about the relationship between the Linux kernel and the operating
systems built over this kernel. In some types of embedded systems, the application
software and the operating system software may be indistinguishable to the user,
as in the case of software used to control a VCR, DVD player or microwave oven.
The above definitions may exclude some applications that may exist on some
computers in large organizations. For an alternative definition of an app: see
Application Portfolio Management.
Metonymy
The word "application" used as an adjective is not restricted to the "of or pertaining
to application software" meaning.] For example, concepts such as application
programming interface (API), application server, application virtualization,
application lifecycle management and portable application apply to all computer
programs alike, not just application software.
Some applications are available in versions for several different platforms; others
only work on one and are thus called, for example, a
geography application for Microsoft Windows, or an Android application for
education, or a Linux game. Sometimes a new and popular application arises which
only runs on one platform, increasing the desirability of that platform. This is
called a killer application or killer app. For example, VisiCalc was the first modern spreadsheet software for the Apple II and helped sell the then-new personal computers into offices. For BlackBerry it was their email software.
In recent years, the shortened term "app" (coined in 1981 or earlier) has become
popular to refer to applications for mobile devices such as smart phones
and tablets, the shortened form matching their typically smaller scope compared
to applications on PCs. Even more recently, the shortened version has come to be used for desktop application software as well.
Classification
There are many different ways to classify application software.

From the legal point of view, application software is mainly classified with a black-box approach, in relation to the rights of its final end-users or subscribers (with eventual intermediate and tiered subscription levels).

Free and open-source software may be run, distributed, sold or extended for any purpose, and, being open, may be modified or reverse-engineered in the same way. FOSS applications released under a free license may be perpetual and also royalty-free. However, the owner, the holder or a third-party enforcer of any right (copyright, trademark, patent, or ius in re aliena) is entitled to add exceptions, limitations, time decays or expiring dates to the license terms of use.
By coding language
Web applications have indeed greatly increased in popularity for some uses, but the advantages of traditional applications make them unlikely to disappear soon, if ever. Furthermore, the two can be complementary, and even integrated.
• Departmental software is a type of enterprise software with a focus on smaller organizations or groups within a large organization. (Examples include travel expense management and IT helpdesk.)
• Enterprise infrastructure software provides common capabilities needed to
support enterprise software systems. (Examples include databases, email
servers, and systems for managing networks and security.)
• Application platform as a service (aPaaS) is a cloud computing service that offers
development and deployment environments for application services.
• Information worker software lets users create and manage information, often
for individual projects within a department, in contrast to enterprise
management. Examples include time management, resource management,
analytical, collaborative and documentation tools. Word processors,
spreadsheets, email and blog clients, personal information system, and
individual media editors may aid in multiple information worker tasks.
• Content access software is used primarily to access content without editing,
but may include software that allows for content editing. Such software
addresses the needs of individuals and groups to consume digital
entertainment and published digital content. (Examples include media
players, web browsers, and help browsers.)
• Educational software is related to content access software, but has the content or features adapted for use by educators or students. For example, it may deliver evaluations (tests), track progress through material, or include collaborative capabilities.
• Simulation software simulates physical or abstract systems for either
research, training or entertainment purposes.
• Media development software generates print and electronic media for others
to consume, most often in a commercial or educational setting. This
includes graphic-art software, desktop publishing software, multimedia
development software, HTML editors, digital-animation editors, digital
audio and video composition, and many others.
• Product engineering software is used in developing hardware and software
products. This includes computer-aided design (CAD), computer- aided
engineering (CAE), computer language editing and compiling tools,
integrated development environments, and application programmer
interfaces.
• Entertainment Software can refer to video games, screen savers, programs to
display motion pictures or play recorded music, and other forms of
entertainment which can be experienced through use of a computing
device.
The market has been undergoing considerable consolidation since the mid- 1990s,
with many suppliers ceasing to trade or being bought by larger groups.
A general ledger is a bookkeeping ledger that serves as a central repository for accounting
data transferred from all sub ledgers like accounts payable, accounts receivable, cash
management, fixed assets, purchasing and projects. Each account maintained by an
organization is known as a ledger account, and the collection of all these accounts is known as the general ledger. The general ledger is the backbone of any
accounting system which holds financial and non-financial data for an organization.
The listing of the account names is called the chart of accounts. The extraction of
account balances is called a trial balance. The purpose of the trial balance is, at a
preliminary stage of the financial statement preparation process, to ensure the
equality of the total debits and credits.
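As a minimal sketch of that debit/credit equality check, the following Python fragment totals the balances of a few hypothetical ledger accounts; the accounts and amounts are invented.

    # Toy trial balance: each account carries a (debit, credit) pair.
    ledger = {
        "Cash":             (5000.0,    0.0),
        "Accounts Payable": (   0.0, 2000.0),
        "Capital":          (   0.0, 3000.0),
    }

    total_debits = sum(debit for debit, _ in ledger.values())
    total_credits = sum(credit for _, credit in ledger.values())

    print(f"Debits {total_debits:.2f}  Credits {total_credits:.2f}")
    assert total_debits == total_credits, "the books do not balance"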
Spreadsheet users can adjust any stored value and observe the effects on calculated
values. This makes the spreadsheet useful for "what-if" analysis since many cases
can be rapidly investigated without manual recalculation. Modern spreadsheet
software can have multiple interacting sheets, and can display data either as text
and numerals, or in graphical form.
They now are used extensively in any context where tabular lists are built, sorted,
and shared.
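The "what-if" recalculation described above can be sketched as cells whose values are functions of other cells; in this hypothetical Python example, changing a stored value changes the calculated total automatically.

    # Toy spreadsheet: calculated cells are functions of stored cells.
    cells = {"B1": 120.0, "B2": 80.0}
    formulas = {"B3": lambda c: c["B1"] + c["B2"]}  # B3 = B1 + B2

    def value(name):
        return formulas[name](cells) if name in formulas else cells[name]

    print("B3 =", value("B3"))  # 200.0
    cells["B1"] = 150.0         # the "what-if": adjust a stored value
    print("B3 =", value("B3"))  # 230.0, recalculated automatically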
LANPAR, available in 1969, was the first electronic spreadsheet on mainframe and
time sharing computers. LANPAR was an acronym: LANguage for Programming
Arrays at Random. VisiCalc was the first electronic spreadsheet on a
microcomputer, and it helped turn the Apple II computer into a popular and
widely used system. Lotus 1-2-3 was the leading spreadsheet when DOS was the
dominant operating system. Excel now has the largest market share on the
Windows and Macintosh platforms. A spreadsheet program is a standard feature
of an office productivity suite; since the advent of web apps, office suites now also
exist in web app form. Web based spreadsheets are a relatively new category.
The acceptance of the IBM PC following its introduction in August, 1981, began
slowly, because most of the programs available for it were translations from other
computer models. Things changed dramatically with the introduction of Lotus 1-2-3 in November, 1982, and release for sale in January, 1983. Since it was written
especially for the IBM PC, it had good performance and became the killer app for
this PC. Lotus 1-2-3 drove sales of the PC due to the improvements in speed and
graphics compared to VisiCalc on the Apple II.
Lotus 1-2-3, along with its competitor Borland Quattro, soon displaced VisiCalc.
Lotus 1-2-3 was released on January 26, 1983, started outselling the then-most-popular VisiCalc the very same year, and for a number of years was the leading
spreadsheet for DOS.
Microsoft Excel
Microsoft released the first version of Excel for the Macintosh on September 30,
1985, and then ported it to Windows, with the first version being numbered 2.05
(to synchronize with the Macintosh version 2.2) and released in November 1987.
The Windows 3.x platforms of the early 1990s made it possible for Excel to take
market share from Lotus. By the time Lotus responded with usable Windows
products, Microsoft had begun to assemble their Office suite. By 1995, Excel was the market leader, edging out Lotus 1-2-3, and in 2013, IBM discontinued Lotus
1-2-3 altogether.
With the advent of advanced web technologies such as Ajax circa 2005, a new
generation of online spreadsheets has emerged. Equipped with a rich Internet
application user experience, the best web based online spreadsheets have many of
the features seen in desktop spreadsheet applications. Some of them, such as EditGrid, Google Sheets, Microsoft Excel Online, Smartsheet, or Zoho Sheet, also have strong multi-user collaboration features or offer real-time updates from remote sources such as stock prices and currency exchange rates.
Other spreadsheets
The database management system (DBMS) is the software that interacts with end
users, applications, and the database itself to capture and analyze the data. The
DBMS software additionally encompasses the core facilities provided to
administer the database. The sum total of the database, the DBMS and the
associated applications can be referred to as a "database system". Often the term
"database" is also used to loosely refer to any of the DBMS, the database system or
an application associated with the database.
1960s, navigational DBMS
The introduction of the term database coincided with the availability of direct-
access storage (disks and drums) from the mid-1960s onwards. The term
represented a contrast with the tape-based systems of the past, allowing shared
interactive use rather than daily batch processing. The Oxford English Dictionary
cites a 1962 report by the System Development Corporation of California as the
first to use the term "data-base" in a specific technical sense.
Edgar F. Codd worked at IBM in San Jose, California, in one of their offshoot
offices that was primarily involved in the development of hard disk systems. He
was unhappy with the navigational model of the CODASYL approach, notably the
lack of a "search" facility. In 1970, he wrote a number of papers that outlined a new
approach to database construction that eventually culminated in the
groundbreaking A Relational Model of Data for Large Shared Data Banks.
In this paper, he described a new system for storing and working with large
databases. Instead of records being stored in some sort of linked list of free-form
records as in CODASYL, Codd's idea was to organise the data as a number of
"tables", each table being used for a different type of entity. Each table would
contain a fixed number of columns containing the attributes of the entity.
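To make Codd's idea concrete, here is a small sketch using Python's built-in sqlite3 module: one table per type of entity, a fixed set of columns for its attributes, and a declarative search facility (the very capability Codd found missing from the navigational model). The table and rows are invented for illustration.

    import sqlite3

    con = sqlite3.connect(":memory:")  # throwaway in-memory database

    # One table for one entity type, with a fixed number of columns.
    con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
    con.executemany(
        "INSERT INTO customer VALUES (?, ?, ?)",
        [(1, "Ada", "London"), (2, "Grace", "New York")],
    )

    # Search declaratively instead of navigating linked records.
    for row in con.execute("SELECT name FROM customer WHERE city = ?", ("London",)):
        print(row)  # ('Ada',)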
Document automation (also known as document assembly) is the design of systems and workflows that assist in the creation of electronic documents, including logic-based systems that use segments of pre-existing text and/or data to assemble a new document. This process is increasingly used within certain industries to assemble legal documents, contracts and letters. Document automation systems can also be used to automate all conditional text, variable text, and data contained within a set of documents.
Automation systems allow companies to minimize data entry, reduce the time
spent proof-reading, and reduce the risks associated with human error. Additional
benefits include: time and financial savings due to decreased paper handling,
document loading, storage, distribution, postage/shipping, faxes, telephone, labor and waste.
Document assembly
The basic function is to replace the cumbersome manual filling-in of repetitive documents with template-based systems, where the user answers software-driven interview questions or fills in a data entry screen. The information collected then populates the document to form a good first draft. Today's more advanced document automation systems allow users to create their own data and rules (logic) without the need for programming (a minimal sketch follows the list below).
• promissory note
• environmental indemnity
• trust deed
• mortgage
• guaranty
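Here is a minimal sketch of that template-based assembly using Python's standard string.Template; the template text and the "interview answers" are invented for illustration.

    from string import Template

    # A document template with placeholders to be filled in.
    note = Template(
        "PROMISSORY NOTE\n"
        "$borrower promises to pay $lender the sum of $$${amount} "
        "on or before $due_date."
    )

    # Answers collected from a software-driven interview (hypothetical).
    answers = {"borrower": "Jane Doe", "lender": "Acme Bank",
               "amount": "10,000", "due_date": "January 1, 2030"}

    print(note.substitute(answers))  # a "good first draft" of the document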
A word processor (WP) is a device or computer program that provides for input,
editing, formatting and output of text, often with some additional features.
34
Early word processors were stand-alone devices dedicated to the function, but
current word processors are word processor programs running on general purpose
computers.
Background
Word processors did not develop out of computer technology. Rather, they evolved
from mechanical machines and only later did they merge with the computer field.
The history of word processing is the story of the gradual automation of the
physical aspects of writing and editing, and then of the refinement of the technology to make it available to corporations and individuals.
The term word processing appeared in American offices in the early 1970s, centered on the idea of streamlining the work of typists, but the meaning soon shifted toward
the automation of the whole editing cycle.
Mechanical word processing
The first word processing device (a "Machine for Transcribing Letters" that
appears to have been similar to a typewriter) was patented by Henry Mill for a
machine that was capable of "writing so clearly and accurately you could not
distinguish it from a printing press". More than a century later, another patent
appeared in the name of William Austin Burt for the typographer. In the late 19th century, Christopher Latham Sholes created the first recognizable typewriter which, although it was of a large size, was described as a "literary piano".
The only "word processing" these mechanical systems could perform was to
change where letters appeared on the page, to fill in spaces that were previously
left on the page, or to skip over lines. It was not until decades later that the
introduction of electricity and electronics into typewriters began to help the writer
with the mechanical part. The term “word processing” itself was created in the
1950s by Ulrich Steinhilper, a German IBM typewriter sales executive. However,
it did not make its appearance in 1960s office management or computing
literatures, though many of the ideas, products, and technologies to which it would
later be applied were already well known.
By the late 1960s, IBM had developed the IBM MT/ST (Magnetic Tape/Selectric Typewriter). This was a model of the IBM Selectric typewriter from earlier in that decade, but built into its own desk, and integrated with magnetic tape recording and playback facilities, with controls and a bank of electrical relays. The MT/ST automated word wrap, but it had no screen. The device allowed rewriting of text that had been written on another tape, and it allowed collaboration (the tape could be sent to another person for them to edit or make a copy). It was a revolution for the word processing industry. In 1969 the tapes were replaced by magnetic cards. These memory cards were inserted into the side of an extra device that accompanied the MT/ST, able to read and record the work.
In the early 1970s, word processing then became computer-based (although only
with single-purpose hardware) with the development of several innovations. Just
before the arrival of the personal computer (PC), IBM developed the floppy disk.
Also in the early 1970s, word-processing systems with CRT-screen display editing were designed.
Word processing software
The final step in word processing came with the advent of the personal computer
in the late 1970s and 1980s and with the subsequent creation of word processing
software. Word processing systems that would create much more complex and
capable text were developed and prices began to fall, making them more accessible
to the public.
The first word processing program for personal computers (microcomputers) was
Electric Pencil, from Michael Shrayer Software, which went on sale in December of
1976. In 1978 WordStar appeared and because of its many new features soon
dominated the market. However, WordStar was written for the early CP/M (Control
Program–Micro) operating system, and by the time it was rewritten for the newer MS-
DOS (Microsoft Disk Operating System), it was obsolete. WordPerfect and its
competitor Microsoft Word replaced it as the main word processing programs during
the MS-DOS era, although there were less successful programs such as XyWrite.
Desktop publishing (DTP) is the creation of documents using page layout software
on a personal ("desktop") computer. It was first used almost exclusively for print
publications, but now it also assists in the creation of various forms of online
content.[1] Desktop publishing software can generate layouts and produce
typographic-quality text and images comparable to traditional typography and
printing. Desktop publishing is also the main reference for digital typography. This
technology allows individuals, businesses, and other organizations to self-publish
a wide variety of content, from menus to magazines to books, without the expense
of commercial printing.
Desktop publishing often requires the use of a personal computer
and WYSIWYG page layout software to create documents for either large-scale
publishing or small-scale local multifunction peripheral output and distribution –
although a non-WYSIWYG system such as LaTeX could also be used for the creation
of highly structured and technically demanding documents as well.[3] Desktop
publishing methods provide more control over design, layout, and typography than
word processing. However, word processing software has evolved to include most, if
not all, capabilities previously available only with professional printing or desktop
publishing.
Most modern meeting rooms and conference halls are configured to include
presentation electronics, such as overhead projectors suitable for displaying
presentation slides, often driven by the presenter's own laptop, under direct
control of the presentation program used to develop the presentation. Often the
presenter will present a lecture using the slides as a visual aid for both the
presenter (to track the lecture's coverage) and the audience (especially when an audience member mishears or misunderstands the verbal component).
Ray Tomlinson is credited as the inventor of email in its modern form; in 1971, he developed the first system able to send mail between users on different hosts across the ARPANET, using the @ sign to link the user name with a destination server. By the mid-1970s, this was the form recognized as email.
Email operates across computer networks, primarily the Internet. Today's email
systems are based on a store-and-forward model. Email servers accept, forward,
deliver, and store messages. Neither the users nor their computers are required to be online simultaneously; they need to connect, typically to a mail server or a webmail interface, only to send or receive messages or download them.
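As a minimal sending-side sketch of the store-and-forward model, the following Python fragment hands a message to a mail server with the standard smtplib module; the host and addresses are placeholders. The receiving server stores the message until the recipient connects to fetch it.

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "alice@example.com"   # placeholder addresses
    msg["To"] = "bob@example.org"
    msg["Subject"] = "Store and forward"
    msg.set_content("The server keeps this until you fetch it.")

    # Hand the message to a mail server; delivery happens later,
    # whether or not the recipient is online right now.
    with smtplib.SMTP("mail.example.com") as server:  # placeholder host
        server.send_message(msg)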
Computers are so powerful and useful that they make our lives convenient and comfortable. They are used in scheduling airline flights, which makes ticket reservation easy and fast, not to mention accurate and convenient. They are even used to predict or forecast the weather, making us well informed about an incoming typhoon (or even a tidal wave). Automated Teller Machines (ATMs) are computers that allow us to withdraw cash anytime, anywhere. These are but a few of the benefits we have enjoyed because of the invention of computers. Imagine living without computers: how lousy life could be.

But what can the computer really do? Why can it accomplish such incredible tasks? The truth is that computers can do only four simple things: receive input, process information, produce output, and store information. The question, then, is why they can accomplish such tremendous tasks. The answer is: man made them so. Behind every computer's power and tremendous capability is an intelligent human. Man's boundless creativity and brilliance are the driving forces that power up the computer, and they are expressed in terms of programs. A program is a set of commands or instructions for a computer to follow. We usually call this end product software; it will be our topic in the next chapter.
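Those four simple tasks can be seen in even the smallest of programs. The Python sketch below receives an input, processes it, produces an output, and stores the result in a file; the details are invented purely for illustration.

    # The four basic tasks of a computer, in miniature:
    name = input("What is your name? ")           # 1. receive input
    greeting = f"Hello, {name.strip().title()}!"  # 2. process information
    print(greeting)                               # 3. produce output
    with open("greetings.txt", "a") as f:         # 4. store information
        f.write(greeting + "\n")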
Every computer system contains hardware components that are used for receiving input, processing information, producing output and storing information. Let us start with the input devices. The most common input devices are the keyboard, the mouse, and the joystick (used in computer games), and the most common output devices are the monitor (screen) and the printer.
Let us go on to the microprocessor. The microprocessor is used to process information, such as performing arithmetic computations and making decisions based on given choices or alternatives. Technically speaking, the microprocessor is also called the central processing unit (CPU), the brain of the computer.

The most common storage devices used for storing information are floppy diskettes, hard disks, CD-ROMs, DVD-ROMs, USB flash drives and backup tapes. A computer's memory (both temporary and permanent storage), such as RAM (Random Access Memory), is also considered a storage device, technically speaking, because it is capable of holding information temporarily (in the case of RAM chips) or permanently (in the case of ROM chips). The computer is made up of these major components.
The first aids to computation were purely mechanical devices which required the
operator to set up the initial values of an elementary arithmetic operation, then
manipulate the device to obtain the result. Later, computers represented numbers
in a continuous form (e.g. distance along a scale, rotation of a shaft, or a voltage).
Numbers could also be represented in the form of digits, automatically
manipulated by a mechanism. Although this approach generally required more
complex mechanisms, it greatly increased the precision of results. The
development of transistor technology and then the integrated circuit chip led to a
series of breakthroughs, starting with transistor computers and then integrated
circuit computers, causing digital computers to largely replace analog computers.
Metal-oxide-semiconductor (MOS) large-scale integration (LSI) then enabled
semiconductor memory and the microprocessor, leading to another key
breakthrough, the miniaturized personal computer (PC), in the 1970s. The cost of
computers gradually became so low that personal computers by the 1990s, and
then mobile computers (smartphones and tablets) in the 2000s, became
ubiquitous.
Contents
• Early devices
o Mechanical calculators
o Calculators
• Analog computers
• Electromechanical computers
• Digital computation
• Stored-program computer
o Theory
o Manchester Baby
o Manchester Mark 1
o EDSAC
o EDVAC
o Commercial computers
o Microprogramming
• Magnetic memory
• Transistor computers
o Transistor peripherals
o Transistor supercomputers
• Semiconductor memory
• Microprocessor computers
• Epilogue
The Ishango bone is thought to be a Paleolithic tally stick.
Devices have been used to aid computation for thousands of years, mostly using
one-to-one correspondence with fingers. The earliest counting device was
probably a form of stick. The Lebombo bone from the mountains between
Swaziland and South Africa may be the oldest known mathematical artifact. It
dates from 35,000 BCE and consists of 29 distinct notches that were deliberately
cut into a baboon's fibula. Later record keeping aids throughout the Fertile
Crescent included calculi (clay spheres, cones, etc.) which represented counts of
items, probably livestock or grains, sealed in hollow unbaked clay containers. The
use of counting rods is one example. The abacus was used early on for arithmetic
tasks. What we now call the Roman abacus was used in Babylonia as
early as c. 2700–2300 BC. Since then, many other forms of reckoning boards or
tables have been invented. In a medieval European counting house, a checkered
cloth would be placed on a table, and markers moved around on it according to
certain rules, as an aid to calculating sums of money.
Ramon Llull's Lullian Circle: a notional machine for calculating answers to
philosophical questions (in this case, to do with Christianity) via logical
combinatorics. This idea was taken up by Leibniz centuries later, and is thus
one of the founding elements in computing and information science.
A slide rule. Slide rules were used by generations of engineers and other
mathematically involved professional workers, until the invention of the pocket
calculator.
Mechanical calculators
View through the back of Pascal's calculator. Pascal invented his machine in 1642.
In 1642, while still a teenager, Blaise Pascal started some pioneering work on
calculating machines and after three years of effort and 50 prototypes he invented
a mechanical calculator. He built twenty of these machines (called Pascal's
calculator or Pascaline) in the following ten years. Nine Pascalines have survived,
most of which are on display in European museums. A continuing debate exists
over whether Schickard or Pascal should be regarded as the "inventor of the
mechanical calculator" and the range of issues to be considered is discussed
elsewhere.
Gottfried Wilhelm von Leibniz invented the stepped reckoner and his famous
stepped drum mechanism around 1672. He attempted to create a machine that
could be used not only for addition and subtraction but would utilise a moveable
carriage to enable long multiplication and division. Leibniz once said "It is
unworthy of excellent men to lose hours like slaves in the labor of calculation
which could safely be relegated to anyone else if machines were used." However,
Leibniz did not incorporate a fully successful carry mechanism. Leibniz also
described the binary numeral system, a central ingredient of all modern
computers. However, up to the 1940s, many subsequent designs (including Charles
Babbage's machines and even the ENIAC of 1945) were based on the decimal system.
Punched-card data processing
In 1804, French weaver Joseph Marie Jacquard developed a loom in which the
pattern being woven was controlled by a paper tape constructed from punched
cards. The paper tape could be changed without changing the mechanical design
of the loom. This was a landmark achievement in programmability. His machine
was an improvement over similar weaving looms. Punched cards were preceded
by punch bands, as in the machine proposed by Basile Bouchon. These bands
would inspire information recording for automatic pianos and more recently
numerical control machine tools.
In the late 1880s, the American Herman Hollerith invented data storage on
punched cards that could then be read by a machine. To process these punched
cards, he invented the tabulator and the keypunch machine. His machines used
electromechanical relays and counters. Hollerith's method was used in the 1890
United States Census. That census was processed two years faster than the prior
census had been. Hollerith's company eventually became the core of IBM.
Punched cards became ubiquitous in industry and government for accounting and
administration.
The book IBM and the Holocaust by Edwin Black outlines the ways in which IBM's
technology helped facilitate Nazi genocide through generation and tabulation of
punch cards based on national census data.
Calculators
Lewis Fry Richardson's interest in weather prediction led him to propose human computers
and numerical analysis to model the weather; to this day, the most powerful
computers on Earth are needed to adequately model its weather using the Navier–
Stokes equations.
The world's first all-electronic desktop calculator was the British Bell Punch
ANITA, released in 1961. It used vacuum tubes, cold-cathode tubes and
Dynatrons in its circuits, with 12 cold-cathode "Nixie" tubes for its display. The
ANITA sold well since it was the only electronic desktop calculator available,
and was silent and quick. The tube technology was superseded in June 1963 by the
U.S. manufactured Friden EC-130, which had an all-transistor design, a stack of
four 13-digit numbers displayed on a 5-inch (13 cm) CRT, and introduced reverse
Polish notation (RPN).
Charles Babbage, an English mechanical engineer and polymath, originated the
concept of a programmable computer. Considered the "father of the computer",
he conceptualized and invented the first mechanical computer in the early 19th
century. After working on his revolutionary difference engine, designed to aid in
navigational calculations, in 1833 he realized that a much more general design, an
Analytical Engine, was possible. The input of programs and data was to be
provided to the machine via punched cards, a method being used at the time to
direct mechanical looms such as the Jacquard loom. For output, the machine
would have a printer, a curve plotter and a bell. The machine would also be able to
punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-
point arithmetic.
The Engine incorporated an arithmetic logic unit, control flow in the form of
conditional branching and loops, and integrated memory, making it the first
design for a general-purpose computer that could be described in modern terms
as Turing-complete.
Trial model of a part of the Analytical Engine, built by Babbage, as displayed
at the Science Museum (London).
The programming language to be employed by users was akin to modern day
assembly languages. Loops and conditional branching were possible, and so the
language as conceived would have been Turing-complete as later defined by Alan
Turing. Three different types of punch cards were used: one for arithmetical
operations, one for numerical constants, and one for load and store operations,
transferring numbers from the store to the arithmetical unit or back. There were
three separate readers for the three types of cards.
The machine was about a century ahead of its time. However, the project was
slowed by various problems including disputes with the chief machinist building
parts for it. All the parts for his machine had to be made by hand—this was a major
problem for a machine with thousands of parts. Eventually, the project
was dissolved with the decision of the British Government to cease funding.
Babbage's failure to complete the analytical engine can be chiefly attributed to
difficulties not only of politics and financing, but also to his desire to develop an
increasingly sophisticated computer and to move ahead faster than anyone else
could follow. Ada Lovelace translated and added notes to the "Sketch of the
Analytical Engine" by Luigi Federico Menabrea. This appears to be the first
published description of programming, so Ada Lovelace is widely regarded as the
first computer programmer.
Analog computers
In the first half of the 20th century, analog computers were considered by many to
be the future of computing. These devices used the continuously changeable
aspects of physical phenomena such as electrical, mechanical,
or hydraulic quantities to model the problem being solved, in contrast to digital
computers that represented varying quantities symbolically, as their numerical
values change. As an analog computer does not use discrete values, but rather
continuous values, processes cannot be reliably repeated with exact equivalence,
as they can with Turing machines.
The first modern analog computer was a tide-predicting machine, invented by Sir
William Thomson, later Lord Kelvin, in 1872. It used a system of pulleys and wires
to automatically calculate predicted tide levels for a set period at a particular
location and was of great utility to navigation in shallow waters. His device was the
foundation for further developments in analog computing.
A Mk. I Drift Sight. The lever just in front of the bomb aimer's fingertips sets the
altitude, the wheels near his knuckles set the wind and airspeed.
An important advance in analog computing was the development of the first fire-
control systems for long range ship gun laying. When gunnery ranges increased
dramatically in the late 19th century it was no longer a simple matter of calculating
the proper aim point, given the flight times of the shells. Various spotters on board
the ship would relay distance measures and observations to a central plotting
station. There the fire direction teams fed in the location, speed
and direction of the ship and its target, as well as various adjustments for
Coriolis effect, weather effects on the air, and other adjustments; the computer
would then output a firing solution, which would be fed to the turrets for laying.
In 1912, British engineer Arthur Pollen developed the first electrically powered
mechanical analogue computer (called at the time the Argo Clock). It was used by
the Imperial Russian Navy in World War I. The alternative Dreyer Table fire
control system was fitted to British capital ships by mid-1916.
Mechanical devices were also used to aid the accuracy of aerial bombing. Drift
Sight was the first such aid, developed by Harry Wimperis in 1916 for the Royal
Naval Air Service; it measured the wind speed from the air, and used that
measurement to calculate the wind's effects on the trajectory of the bombs. The
system was later improved with the Course Setting Bomb Sight and reached a
climax with the World War II bomb sights: the Mark XIV bomb sight (RAF Bomber
Command) and the Norden (United States Army Air Forces).
The art of mechanical analog computing reached its zenith with the differential
analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927, which
built on the mechanical integrators of James Thomson and the torque amplifiers
invented by H. W. Nieman. A dozen of these devices were built before their
obsolescence became obvious; the most powerful was constructed at the
University of Pennsylvania's Moore School of Electrical Engineering, where the
ENIAC was built.
Electromechanical computers
The era of modern computing began with a flurry of development before and during
World War II. Most digital computers built in this period were electromechanical –
electric switches drove mechanical relays to perform the calculation. These devices
had a low operating speed and were eventually
superseded by much faster all-electric computers, originally using vacuum tubes.
During the war, electro-mechanical devices called bombes were built by British
cryptologists to help decipher German Enigma-machine-encrypted secret
messages. The bombe's initial design was created in 1939 at
the UK Government Code and Cypher School (GC&CS) at Bletchley Park by
Alan Turing, with an important refinement devised in 1940 by Gordon Welchman.
The engineering design and construction was the work of Harold Keen of the
British Tabulating Machine Company. It was a substantial development from a
device that had been designed in 1938 by Polish Cipher Bureau cryptologist
Marian Rejewski, and known as the "cryptologic bomb" (Polish: "bomba
kryptologiczna").
In 1941, Zuse followed his earlier machine up with the Z3, the world's first working
electromechanical programmable, fully automatic digital computer. The Z3 was
built with 2000 relays, implementing a 22-bit word length that operated at a clock
frequency of about 5–10 Hz. Program code and data were stored on punched film.
It was quite similar to modern machines in some respects, pioneering numerous
advances such as floating point numbers. Replacement of the hard-to-implement
decimal system (used in Charles Babbage's earlier design) by the simpler binary
system meant that Zuse's machines were easier to build
and potentially more reliable, given the technologies available at that time. The Z3
was probably a Turing-complete machine. In two 1936 patent applications, Zuse
also anticipated that machine instructions could be stored in the same storage
used for data—the key insight of what became known as the von Neumann
architecture, first implemented in 1948 in America in the
electromechanical IBM SSEC and in Britain in the fully electronic
Manchester Baby.
Digital computation
The term digital was first suggested by George Robert Stibitz and refers to where a
signal, such as a voltage, is not used to directly represent a value (as it would be in
an analog computer), but to encode it. In November 1937, George Stibitz, then
working at Bell Labs (1930–1941), completed a relay-based calculator he later
dubbed the "Model K" (for "kitchen table", on which he had assembled it), which
became the first binary adder. Typically, signals have two states – low (usually
representing 0) and high (usually representing 1), but sometimes three-valued logic
is used, especially in high-density memory. Modern computers generally use binary
logic, but many early machines were decimal computers. In these machines, the basic
unit of data was the decimal digit, encoded in one of several schemes, including
binary-coded decimal (BCD), bi-quinary, excess-3, and two-out-of-five code.
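To illustrate the idea of encoding values rather than representing them directly, here is a small sketch (in Python, purely for illustration) that adds two numbers by chaining one-bit full adders built from Boolean operations, in the spirit of Stibitz's relay-based binary adder, and then prints a number in binary-coded decimal:

    # A one-bit full adder built from Boolean operations.
    def full_adder(a, b, carry_in):
        sum_bit = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return sum_bit, carry_out

    # Add two integers by chaining full adders, one bit position at a time.
    def add_binary(x, y, width=8):
        carry, result = 0, 0
        for i in range(width):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    # Binary-coded decimal: each decimal digit is encoded in four bits.
    def to_bcd(n):
        return " ".join(format(int(d), "04b") for d in str(n))

    print(add_binary(23, 19))   # 42
    print(to_bcd(1937))         # 0001 1001 0011 0111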
Atanasoff–Berry Computer replica on the first floor of the Durham Center, Iowa State
University
Computers whose logic was primarily built using vacuum tubes are now known as
first generation computers.
The electronic programmable computer
Colossus was the first electronic digital programmable computing device, and was
used to break German ciphers during World War II. It remained unknown, as a
military secret, well into the 1970s
During World War II, British code breakers at Bletchley Park, 40 miles (64 km)
north of London, achieved a number of successes at breaking encrypted enemy
military communications. The German encryption machine, Enigma, was first
attacked with the help of the electro-mechanical bombes. Women often operated
these bombe machines. They ruled out possible Enigma settings by performing
chains of logical deductions implemented electrically. Most possibilities led to a
contradiction, and the few remaining could be tested by hand.
Wartime photo of Colossus No. 10
Colossus was the world's first electronic digital programmable computer. It used
a large number of valves (vacuum tubes). It had paper-tape input and was
capable of being configured to perform a variety of Boolean logical operations on
its data, but it was not Turing-complete. Data input to Colossus was by
photoelectric reading of a paper tape transcription of the enciphered
intercepted message. This was arranged in a continuous loop so that it could be
read and re-read multiple times – there being no internal store for the data. The
reading mechanism ran at 5,000 characters per second with the paper tape
moving at 40 ft./s (12.2 m/s; 27.3 mph). Colossus Mark 1 contained 1500
thermionic valves (tubes), but Mark 2, with 2400 valves and five processors in
parallel, was both 5 times faster and simpler to operate than Mark 1, greatly
speeding the decoding process. Mark 2 was designed while Mark 1 was being
constructed. Allen Coombs took over leadership of the Colossus Mark 2 project
when Tommy Flowers moved on to other projects.[91] The first Mark 2 Colossus
became operational on 1 June 1944, just in time for the Allied Invasion of
Normandy on D-Day.
Most of the use of Colossus was in determining the start positions of the Tunny
rotors for a message, which was called "wheel setting". Colossus included the first-
ever use of shift registers and systolic arrays, enabling five simultaneous tests, each
involving up to 100 Boolean calculations. This enabled five different possible start
positions to be examined for one transit of the paper tape. As well as wheel setting
some later Colossi included mechanisms intended to help determine pin patterns
known as "wheel breaking". Both models were programmable using switches and
plug panels in a way their predecessors had not been. Ten Mk 2 Colossi were
operational by the end of the war.
ENIAC was the first Turing-complete electronic device, and performed ballistics
trajectory calculations for the United States Army.
Without the use of these machines, the Allies would have been deprived of the very
valuable intelligence that was obtained from reading the vast quantity of
enciphered high-level telegraphic messages between the German High Command
(OKW) and their army commands throughout occupied Europe. Details of their
existence, design, and use were kept secret well into the 1970s. Winston
Churchill personally issued an order for their destruction into pieces no larger than
a man's hand, to keep secret that the British were capable of cracking Lorenz SZ
cyphers (from German rotor stream cipher machines) during the oncoming Cold
War. Two of the machines were transferred to the newly formed GCHQ and the
others were destroyed. As a result, the machines were not included in many
histories of computing. A reconstructed working copy of one of the Colossus
machines is now on display at Bletchley Park.
ENIAC combined the high speed of electronics with the ability to be programmed for
many complex problems. It could add or subtract 5000 times a second, a thousand
times faster than any other machine. It also had modules to multiply, divide, and
square root. High-speed memory was limited to 20 words (equivalent to about 80
bytes). Built under the direction of John Mauchly and J. Presper Eckert at the
University of Pennsylvania, ENIAC's development and construction lasted from
1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons,
using 200 kilowatts of electric power and contained over 18,000 vacuum tubes,
1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.
One of its major engineering feats was to minimize the effects of tube burnout,
which was a common problem in machine reliability at that time. The machine
was in almost constant use for the next ten years.
Stored-program computer
Early computing machines were programmable in the sense that they could follow
the sequence of steps they had been set up to execute, but the "program", or steps
that the machine was to execute, were set up usually by changing how the wires
were plugged into a patch panel or plug board. "Reprogramming", when it was
possible at all, was a laborious process, starting with engineers working out
flowcharts, designing the new setup, and then the often-exacting process of physically
re-wiring patch panels. Stored-program computers, by contrast, were designed to
store a set of instructions (a program), in memory – typically the same memory as
stored data.
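To illustrate the stored-program idea, here is a toy fetch-decode-execute machine in Python in which instructions and data share a single memory. The opcodes and memory layout are invented for illustration only, not modeled on any historical instruction set.

    # A toy stored-program machine: instructions and data share one memory,
    # and a fetch-decode-execute loop runs until it meets a HALT opcode.
    LOAD, ADD, STORE, HALT = 0, 1, 2, 3

    memory = [
        (LOAD, 8),     # 0: acc = memory[8]
        (ADD, 9),      # 1: acc += memory[9]
        (STORE, 10),   # 2: memory[10] = acc
        (HALT, 0),     # 3: stop
        None, None, None, None,
        20,            # 8: data
        22,            # 9: data
        0,             # 10: result goes here
    ]

    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]      # fetch and decode
        pc += 1
        if op == LOAD:
            acc = memory[addr]
        elif op == ADD:
            acc += memory[addr]
        elif op == STORE:
            memory[addr] = acc
        elif op == HALT:
            break
    print(memory[10])              # 42

Because the program lives in ordinary memory, changing it is a matter of writing new values rather than re-wiring a plugboard.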
Theory
The theoretical basis for the stored-program computer had been proposed by
Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical
Laboratory and began his work on developing an electronic stored-program digital
computer. His 1945 report ‘Proposed Electronic Calculator’ was the first
specification for such a device.
59
Turing's theoretical work received less publicity than von Neumann's First
Draft of a Report on the EDVAC, despite the latter's incomplete nature and
questionable lack of attribution of the sources of some of its ideas.
Manchester Baby
The Manchester Baby was the world's first electronic stored-program computer. It
was built at the Victoria University of Manchester by Frederic C. Williams, Tom
Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
The machine was not intended to be a practical computer but was instead designed
as a test bed for the Williams tube, the first random-access digital storage device.
Invented by Freddie Williams and Tom Kilburn at the University of Manchester in 1946
and 1947, it was a cathode ray tube that used an effect called secondary emission to temporarily
store electronic binary data, and was used successfully in several early computers.
Although the computer was small and primitive by the standards of its time, it was the first
working machine to contain all of the elements essential to a modern electronic computer. As soon
as the Baby had demonstrated the feasibility of its design, a project was initiated at the university
to develop it into a more usable computer, the Manchester Mark 1. The Mark 1 in turn quickly
became the prototype for the Ferranti Mark 1, the world's first commercially available general-
purpose computer.
The Baby had a 32-bit word length and a memory of 32 words. As it was designed to be the simplest
possible stored-program computer, the only arithmetic operations implemented in hardware were
subtraction and negation; other
arithmetic operations were implemented in software. The first of three programs written for the
machine found the highest proper divisor of 2^18 (262,144), a calculation that was known would
take a long time to run – and so prove the computer's reliability – by testing every integer from
2^18 − 1 downwards, as division was implemented by repeated subtraction of the divisor. The program
consisted of 17 instructions and ran for 52 minutes before reaching the correct answer of 131,072,
after the Baby had performed 3.5 million operations (for an effective CPU speed of 1.1 kIPS).
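For concreteness, the sketch below (Python, purely illustrative) mirrors the Baby's approach: it searches downward from 2^18 − 1 for the highest proper divisor of 2^18, performing each trial division by repeated subtraction, since the hardware implemented no division instruction:

    # Find the highest proper divisor of 2**18 the way the Baby's first
    # program did: test candidates downward, dividing by repeated
    # subtraction because the hardware could only subtract and negate.
    def divides_by_repeated_subtraction(n, d):
        while n >= d:
            n -= d                 # subtract the divisor until n < d
        return n == 0              # a zero remainder means d divides n

    n = 2 ** 18                    # 262,144
    candidate = n - 1
    while not divides_by_repeated_subtraction(n, candidate):
        candidate -= 1
    print(candidate)               # 131,072, i.e. 2**17

On modern hardware this finishes in a blink; on the Baby, the equivalent 17-instruction program ran for 52 minutes.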
Manchester Mark 1
The Experimental machine led on to the development of the Manchester Mark 1 at the University
of Manchester.[104] Work began in August 1948, and the first version was operational by April 1949;
a program written to search for Mersenne primes ran error-free for nine hours on the night of
16/17 June 1949. The machine's successful operation was widely reported in the British press,
which used the phrase "electronic brain" in describing it to their readers.
The computer is especially historically significant because of its pioneering inclusion of index
registers, an innovation which made it easier for a program to read sequentially through an array
of words in memory. Thirty-four patents resulted from the machine's development, and many of the
ideas behind its design were incorporated in subsequent commercial products such as the IBM
701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom
Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in
scientific roles than in pure mathematics. In 1951 they started development work on Meg,
the Mark 1's successor, which would include a floating point unit.
EDSAC
The other contender for being the first recognizably modern digital stored-
program computer was the EDSAC, designed and constructed by Maurice Wilkes
and his team at the University of Cambridge Mathematical Laboratory in
England in 1949. The machine was inspired by John von Neumann's seminal First
Draft of a Report on the EDVAC and was one of the first usefully operational
electronic digital stored-program computers.
EDSAC ran its first programs on 6 May 1949, when it calculated a table of
squares and a list of prime numbers. The EDSAC also served as the basis for the first
commercially applied computer, the LEO I, used by food manufacturing
company J. Lyons & Co. Ltd. EDSAC 1 was finally shut down on 11 July
1958, having been superseded by EDSAC 2, which stayed in use until 1965.
EDVAC
ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's
construction in August 1944, and design work for the EDVAC commenced at the
University of Pennsylvania's Moore School of Electrical Engineering, before the
ENIAC was fully operational. The design implemented a number of important
architectural and logical improvements conceived during the ENIAC's
construction, and a high-speed serial-access memory. However, Eckert and
Mauchly left the project and its construction floundered.
Commercial computers
The first commercial computer was the Ferranti Mark 1, built by Ferranti and
delivered to the University of Manchester in February 1951. It was based on the
Manchester Mark 1. The main improvements over the Manchester Mark 1 were
in the size of the primary storage (using random access Williams tubes),
secondary storage (using a magnetic drum), a faster multiplier, and additional
instructions. The basic cycle time was 1.2 milliseconds, and a multiplication
could be completed in about 2.16 milliseconds. The multiplier used almost a
quarter of the machine's 4,050 vacuum tubes (valves). [112] A second machine
was purchased by the University of Toronto, before the design was revised into
the Mark 1 Star. At least seven of these later machines were delivered between
1953 and 1957, one of them to Shell labs in Amsterdam.[113]
In October 1947, the directors of J. Lyons & Company, a British catering company
famous for its teashops but with strong interests in new office management
techniques, decided to take an active role in promoting the commercial
development of computers. The LEO I computer became operational in April
1951[114] and ran the world's first regular routine office computer job. On 17
November 1951, the J. Lyons Company began weekly operation of a bakery
valuations job on the LEO (Lyons Electronic Office). This was the first business
application to go live on a stored program computer.[115]
In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the
U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than
US$1 million each ($9.85 million as of 2020). UNIVAC was the first "mass
produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of
power. Its primary storage was serial-access mercury delay lines capable of
storing 1,000 words of 11 decimal digits plus sign (72-bit words).
IBM introduced a smaller, more affordable computer in 1954 that proved very
popular. The IBM 650 weighed over 900 kg, the attached power supply weighed
around 1350 kg and both were held in separate cabinets of roughly 1.5 meters by
0.9 meters by 1.8 meters. It cost US$500,000 ($4.76 million as of 2020) or could
be leased for US$3,500 a month ($30 thousand as of 2020). Its drum memory was
originally 2,000 ten-digit words, later expanded to 4,000 words. Memory
limitations such as this were to dominate programming for decades afterward.
Microprogramming
In 1951, British scientist Maurice Wilkes developed the concept of microprogramming
from the realisation that the central processing unit of a computer could be
controlled by a miniature, highly specialised computer program in high-speed ROM.
Microprogramming was widely used in the CPUs and floating-point units of mainframe and other
computers; it was implemented for the first time in EDSAC 2, which also used
multiple identical "bit slices" to simplify design. Interchangeable, replaceable tube
assemblies were used for each bit of the processor.
Magnetic memory
Diagram of a 4×4 plane of magnetic core memory in an X/Y line coincident-
current setup. X and Y are drive lines, S is sense, Z is inhibit. Arrows indicate the
direction of current for writing.
Magnetic drum memories were developed for the US Navy during WW II with the
work continuing at Engineering Research Associates (ERA) in 1946 and 1947.
ERA, then a part of Univac, included a drum memory in its 1103, announced in
February 1953. The first mass-produced computer, the IBM 650, also announced
in 1953, had about 8.5 kilobytes of drum memory.
Magnetic core memory was patented in 1949, with its first usage demonstrated for the
Whirlwind computer in August 1953. Commercialization followed quickly.
Magnetic core was used in peripherals of the IBM 702 delivered in July 1955, and
later in the 702 itself. The IBM 704 (1955) and the Ferranti Mercury (1957) used
magnetic-core memory. It went on to dominate the field into the 1970s, when it
was replaced with semiconductor memory. Magnetic core peaked in volume about
1975 and declined in usage and market share thereafter.
Defining characteristics of some early digital computers of the 1940s (in the
history of computing hardware)

Name | First operational | Numeral system | Computing mechanism | Programming | Turing complete
---- | ----------------- | -------------- | ------------------- | ----------- | ---------------
Arthur H. Dickinson IBM (US) | Jan 1940 | Decimal | Electronic | Not programmable | No
Joseph Desch NCR (US) | March 1940 | Decimal | Electronic | Not programmable | No
Zuse Z3 (Germany) | May 1941 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock (but no conditional branch) | In theory (1998)
Colossus Mark 1 (UK) | February 1944 | Binary | Electronic | Program-controlled by patch cables and switches | No
Harvard Mark I – IBM ASCC (US) | May 1944 | Decimal | Electro-mechanical | Program-controlled by 24-channel punched paper tape (but no conditional branch) | Debatable
Colossus Mark 2 (UK) | June 1944 | Binary | Electronic | Program-controlled by patch cables and switches | In theory (2011) [126]
Zuse Z4 (Germany) | March 1945 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock | Yes
ENIAC (US) | February 1946 | Decimal | Electronic | Program-controlled by patch cables and switches | Yes
ARC2 (SEC) (UK) | May 1948 | Binary | Electronic | Stored-program in rotating drum memory | Yes
Manchester Baby (UK) | June 1948 | Binary | Electronic | Stored-program in Williams cathode ray tube memory | Yes
Modified ENIAC (US) | September 1948 | Decimal | Electronic | Read-only stored programming mechanism using the Function Tables as program ROM | Yes
Manchester Mark 1 (UK) | April 1949 | Binary | Electronic | Stored-program in Williams cathode ray tube memory and magnetic drum memory | Yes
EDSAC (UK) | May 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
CSIRAC (Australia) | November 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
Transistor computers
The bipolar transistor was invented in 1947. From 1955 onward transistors
replaced vacuum tubes in computer designs, giving rise to the "second generation"
of computers. Compared to vacuum tubes, transistors have many advantages: they
are smaller and require less power, so they give off less heat. Silicon
junction transistors were much more reliable than vacuum tubes and had longer
service life. Transistorized computers could contain tens of thousands of binary
logic circuits in a relatively compact space. Transistors greatly reduced computers'
size, initial cost, and operating cost. Typically, second-generation computers were
composed of large numbers of printed circuit boards such as the IBM Standard
Modular System, each carrying one to four logic gates or flip-flops.
The Harwell CADET, built by the electronics division of the Atomic Energy
Research Establishment, used 324 point-contact transistors provided by the UK company
Standard Telephones and Cables; 76 junction transistors were used for the first
stage amplifiers for data read from the drum, since point-contact transistors were
too noisy. From August 1956 CADET was offering a regular computing service,
during which it often executed continuous computing runs of 80 hours or more.
Problems with the reliability of early batches of point contact and alloyed junction
transistors meant that the machine's mean time between failures was about 90
minutes, but this improved once the more reliable bipolar junction transistors
became available.
The Manchester University Transistor Computer's design was adopted by the local
engineering firm of Metropolitan-Vickers in their Metrovick 950, the first
commercial transistor computer anywhere. Six Metrovick 950s were built, the first
completed in 1956. They were successfully deployed within various departments
of the company and were in use for about five years. A second generation
computer, the IBM 1401, captured about one third of the world market. IBM
installed more than ten thousand 1401s between 1960 and 1964.
Transistor peripherals
Transistorized electronics improved not only the CPU (Central Processing Unit),
but also the peripheral devices. The second generation disk data storage units
were able to store tens of millions of letters and digits. Next to the fixed disk
storage units, connected to the CPU via high-speed data transmission, were
removable disk data storage units. A removable disk pack could be easily exchanged
with another pack in a few seconds. Even though the removable disks' capacity was
smaller than that of fixed disks, their interchangeability guaranteed a nearly
unlimited quantity of data close at hand. Magnetic tape provided archival
capability for this data, at
a lower cost than disk.
Transistor supercomputers
The early 1960s saw the advent of supercomputing. The Atlas was a joint development
between the University of Manchester, Ferranti, and Plessey, and was first installed
at Manchester University and officially commissioned in 1962 as one of the world's
first supercomputers – considered to be the most powerful computer in the world at
that time. It was said that whenever Atlas went offline half of the United Kingdom's
computer capacity was lost. It was a second-generation machine, using discrete
germanium transistors. Atlas also pioneered the Atlas Supervisor, "considered by
many to be the first recognizable modern operating system".
The idea of an integrated circuit was conceived by a radar scientist working for the
Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer.
The first working integrated circuits were invented by Jack Kilby at Texas
Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his
initial ideas concerning the integrated circuit in July 1958, successfully
demonstrating the first working integrated example on 12 September 1958.
Kilby's invention was a hybrid integrated circuit (hybrid IC). It had external wire
connections, which made it difficult to mass-produce.
Third generation (integrated circuit) computers first appeared in the early 1960s
in computers developed for government purposes, and then in commercial
computers beginning in the mid-1960s.
Semiconductor memory
Microprocessor computers
The subject of exactly which device was the first microprocessor is contentious,
partly due to lack of agreement on the exact definition of the term
"microprocessor". The earliest multi-chip microprocessors were the Four-Phase
Systems AL-1 in 1969 and Garrett AiResearch MP944 in 1970, developed with
multiple MOS LSI chips. The first single-chip microprocessor was the Intel 4004,
developed on a single PMOS LSI chip. It was designed and realized by Ted Hoff,
Federico Faggin, Masatoshi Shima and Stanley Mazor at Intel, and released in 1971.
Tadashi Sasaki and Masatoshi Shima at Busicom, a calculator manufacturer, had
the initial insight that the CPU could be a single MOS LSI chip, supplied by Intel.
The die from an Intel 8742, an 8-bit microcontroller that includes a CPU running
at 12 MHz, RAM, EPROM, and I/O.
While the earliest microprocessor ICs literally contained only the processor, i.e.
the central processing unit, of a computer, their progressive development
naturally led to chips containing most or all of the internal electronic parts of a
computer. The integrated circuit in the image on the right, for example, an
Intel 8742, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128
bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.
During the 1960s there was considerable overlap between second and third
generation technologies.[169] IBM implemented its IBM Solid Logic Technology
modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry
Univac continued the manufacture of second-generation machines such as the
UNIVAC 494. The Burroughs large systems such as the B5000 were stack
machines, which allowed for simpler programming. These pushdown
automatons were also implemented in minicomputers and microprocessors later,
which influenced programming language design. Minicomputers served as low-
cost computer centers for industry, business and universities. [170] It became
possible to simulate analog circuits with the simulation program with
integrated circuit emphasis, or SPICE (1971) on minicomputers, one of the programs
for electronic design automation (EDA). The microprocessor led to the
development of microcomputers: small, low-cost computers that could be
owned by individuals and small businesses.
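Returning to the stack machines mentioned above: the essence of a stack architecture (and of the reverse Polish notation that the Friden EC-130 introduced) can be sketched in a few lines. This Python toy is illustrative only, not the Burroughs B5000 instruction set.

    # A miniature stack machine: evaluate a reverse Polish notation
    # expression by pushing operands and applying each operator to the
    # top two items of the stack.
    def eval_rpn(tokens):
        stack = []
        ops = {
            "+": lambda a, b: a + b,
            "-": lambda a, b: a - b,
            "*": lambda a, b: a * b,
            "/": lambda a, b: a / b,
        }
        for tok in tokens:
            if tok in ops:
                b = stack.pop()          # right operand is on top
                a = stack.pop()
                stack.append(ops[tok](a, b))
            else:
                stack.append(float(tok)) # operand: push it
        return stack.pop()

    # (2 + 3) * 4 written in RPN:
    print(eval_rpn("2 3 + 4 *".split()))   # 20.0

Because operands and intermediate results live on the stack, no explicit registers or operand addresses are needed, which is what made such architectures simpler to program.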
Microcomputers, the first of which appeared in the 1970s, became ubiquitous in
the 1980s and beyond.
Altair 8800
In April 1975 at the Hannover Fair, Olivetti presented the P6060, the world's first
complete, pre-assembled personal computer system. The central processing unit
consisted of two cards, code named PUCE1 and PUCE2, and unlike most other
personal computers was built with TTL components rather than a microprocessor.
It had one or two 8" floppy disk drives, a 32-character plasma display, 80-column
graphical thermal printer, 48 Kbytes of RAM, and BASIC language. It
weighed 40 kg (88 lbs.). As a complete system, this was a significant step from the
Altair, though it never achieved the same success. It was in competition with a
similar product by IBM that had an external floppy disk drive.
Microprocessor architectures, with features added from their larger brethren,
are now dominant in most market segments.
A NeXT Computer and its object-oriented development tools and libraries were used
by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web
server software, CERN httpd, and also used to write the first web browser,
WorldWideWeb.
An indication of the rapidity of development of this field can be inferred from the
history of the seminal 1947 article by Burks, Goldstine and von Neumann. By the
time that anyone had time to write anything down, it was obsolete. After 1945, others
read John von Neumann's First Draft of a Report on the EDVAC, and immediately started
implementing their own systems. To this day, the rapid pace of development has
continued, worldwide.
A 1966 article in Time predicted: "By 2000, the machines will be producing so
much that everyone in the U.S. will, in effect, be independently wealthy. How to use
leisure time will be a major problem."