2 Hardware and Software
Mainframe Computers
Mainframe computers are large, powerful computers that have higher processing power
than normal computers and can serve several terminals
Mainframes use parallel processing
They are multitasking, multi-user computers that allow multiple people to work on
different problems at the same time
A mainframe can have hundreds of processor cores which all share one operating system
Supercomputers
They are large computers which use massively parallel processing that allows them to
complete highly complex tasks quickly
They can have more than 100,000 processor cores, where each core has its own
operating system
Supercomputers use more than one GPU
A supercomputer runs only a very few computer programs, and its focus is on executing
instructions as quickly as possible for one purpose so that it is used at maximum
capacity
Mainframe computers have great longevity, or life span; they can run continuously for
very long periods of time with minimal downtime. Shutting them down and then
disposing of their hardware is very expensive
They are made to last for decades and still work well
Supercomputers have a lifespan of about 5 years; they tend to be used by organizations
such as meteorological agencies until a much faster supercomputer comes to market
RAS
Used when referring to mainframe computers; stands for reliability, availability and
serviceability (this characteristic is not used for supercomputers)
Reliability: mainframes are the most reliable computers since their processors are able
to check themselves for errors and recover without any undue effect on the mainframe's
operation. The system software is also very reliable as it is thoroughly tested and
updates are made quickly to overcome any errors
Availability: refers to mainframes being available at all times and for extended periods.
Mean time between failures (MTBF) is a measure of this; it is the average time that
elapses between periods of downtime during a system's normal operation. Mainframes
give months, or years, of system availability between downtimes. The period of
downtime is also short, as mainframes are able to recover quickly. This is because
mainframes usually come with spare components, like an extra CPU, that are
automatically used when a failure occurs; the operator is alerted and the failed
component is replaced at a later time, but throughout, the system continues to work
normally
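The MTBF figure above is a simple average. A minimal Python sketch (the uptime and
failure figures are illustrative, not from the source):

```python
def mtbf(total_uptime_hours, failures):
    """Mean time between failures: average operating time between breakdowns."""
    if failures == 0:
        raise ValueError("MTBF is undefined with zero failures")
    return total_uptime_hours / failures

# A mainframe that ran 26,280 hours (about three years) with 2 failures:
print(mtbf(26_280, 2))  # 13140.0 hours between failures
```

The higher the MTBF, the better the availability of the system.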
Serviceability: this is the ability of a mainframe to discover why a failure occurred; it
means that hardware and software components can be replaced without having too
great an effect on the mainframe's operation
Security
Mainframes have greater security than other types of computer system, which enables
them to share a company's data among several users while still being able to protect it.
A mainframe has many layers of security including:
o User identification and authentication, including multi-factor authentication such
as a password, physical token, biometric identifier or time-restricted PIN
o Levels of access, which means that users can only see sets of data depending on
their level of security
o Encryption of transmitted data and data within the system
o Secure operating systems
o Continual monitoring by the system for unauthorized access attempts
Supercomputers use end-to-end encryption, which means that only the sender or
recipient is able to decrypt and understand the data. This is vital when they are used for
storing sensitive data like DNA profiles
Performance Metrics
Performance metrics are the measures used to determine how well or how fast a
processor deals with data
The speed of a mainframe's CPU is measured in millions of instructions per second
(MIPS)
MIPS is not the best measure, as not all instructions used by a mainframe are the same:
some are simple and straightforward and take less time, while others are more complex
and slower to process
MIPS is often linked to cost by calculating how much a mainframe costs per one
million instructions per second
Supercomputers make use of FLOPS, which is how many floating point operations can be
carried out per second, as they are used mainly for scientific calculations
Modern supercomputer performance is measured in petaflops (quadrillions of FLOPS)
FLOPS can be an unreliable measure, as it doesn't take into account the CPU's clock
speed, the bus speed or the amount of RAM available
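The two metrics can be put into formulas. A minimal Python sketch using hypothetical
figures (the prices, core counts and clock rates below are illustrative only):

```python
def cost_per_mips(price_dollars, mips):
    """Mainframe cost per one million instructions per second."""
    return price_dollars / mips

def peak_flops(cores, clock_hz, flops_per_cycle):
    """Theoretical peak FLOPS: cores x clock rate x operations per cycle."""
    return cores * clock_hz * flops_per_cycle

# A $5m mainframe rated at 10,000 MIPS:
print(cost_per_mips(5_000_000, 10_000))  # 500.0 dollars per MIPS

# 100,000 cores at 2.5 GHz, 16 floating point operations per cycle:
print(peak_flops(100_000, 2.5e9, 16) / 1e15)  # 4.0 petaflops
```

Note that this peak figure is exactly the kind of theoretical maximum the text warns
about: it ignores bus speed and available RAM.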
Mainframes have specialized hardware, called peripheral processors, that deals with all
input/output operations, leaving the CPU to concentrate on the processing of data
This enables mainframes to deal with large amounts of data being input, records being
accessed and large volumes of output being generated. Modern mainframes can carry
out billions of transactions every day
The large number of simultaneous transactions and large volumes of input and output in
a given period is referred to as throughput
A supercomputer is designed for maximum processing power and speed, whereas high
throughput is the distinctive mainframe characteristic
Fault Tolerance
A computer with fault tolerance can continue to operate even when one or
more of its components has failed; it may operate at a reduced level but does not fail
completely
Mainframe computers are fault tolerant in terms of their hardware
If a processor fails to function, the system automatically switches to another processor
without disrupting the processing of data. The same can be done with a software
error/failure by having two different versions of the software; in case of an error,
the other version is automatically run
Supercomputers have more components than a mainframe, with a lot more processors,
which means failure is more likely to occur and consequently interrupt the operation of
the system
The approach to fault tolerance is the same as for a mainframe, but with millions of
components the system can go down at any time, even though it tends to be up and
running again quite quickly
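The switch-over idea can be sketched in a few lines of Python. This is an illustrative
model only, not real mainframe firmware; the processor functions are stand-ins:

```python
def run_with_failover(task, processors):
    """Try each processor in turn; switch to a spare if one fails."""
    for cpu in processors:
        try:
            return cpu(task)
        except RuntimeError:
            continue  # this processor failed; fall back to the next spare
    raise RuntimeError("all processors failed")

def faulty(task):
    raise RuntimeError("hardware fault")   # simulated failed processor

def spare(task):
    return f"processed {task}"             # simulated healthy spare

print(run_with_failover("payroll", [faulty, spare]))  # processed payroll
```

The caller never sees the first failure; processing continues on the spare, mirroring
how a mainframe keeps working while the failed component awaits replacement.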
Operating System
Most mainframes run more than one operating system at any given time
The OS on a mainframe divides a task into various sub-tasks, assigning each one to a
different processor core
When each sub-task has been processed, the results are recombined to provide a
meaningful output; this is called parallel processing
Supercomputers tend to use only one OS, but most supercomputers utilize massively
parallel processing in that they have many processor cores, each one with its own OS
Type of processor
Early mainframes had just one processor (CPU), but modern mainframes have multiple
processors, called CPU complexes
The number of processor cores found in a mainframe is now measured in hundreds;
supercomputers have hundreds of thousands of processor cores
Modern supercomputers, unlike mainframes, use more than one GPU (graphics
processing unit)
Heat Maintenance
Transaction Processing:
Industry Statistics:
These are statistics that are recorded regarding trends in different industries such as
those that process raw material, make goods in factories or provide services
Businesses in certain sectors of industry need mainframes to process the vast amounts
of data which help identify their major competitors
The statistics show their competitors' share of the market and the trends in sales, and
help identify those products which could compete profitably with other businesses
Companies can also obtain reports from organizations that collect the data for them,
and those organizations use mainframes for this purpose
Consumer Statistics:
Consumer statistics allow businesses to assess the demand for their product.
They can inform businesses about the range of household incomes and the employment
status of those consumers who might be interested in the product, so that a price can
be set
This data can also inform businesses of the main geographical locations of consumers
for local sales, or how inclined they are to use the internet for shopping
It also allows businesses to know how many similar products are available to consumers
and what price they pay for them
These statistics produce an incredible amount of data and the organizations that
produce these statistics use mainframes
Supercomputer Uses
Quantum Mechanics:
It is the study of the behavior of matter and light on the atomic and subatomic scale
It attempts to describe the properties of the constituent parts of an atom, such as
electrons, protons and neutrons, and how they interact with each other
These calculations are very large in number and need great accuracy, and thus
require the use of a supercomputer
Weather Forecasting:
It is based on the use of very complex computer models; data from the sensors at
weather stations around the world is input into the model and then many calculations
are performed
These systems also store records of previous weather conditions over very long periods
Using the past weather readings, the computer examines patterns of weather similar to
those being experienced at the moment and is able to predict the resulting weather
Variables such as atmospheric pressure, humidity, rainfall, wind speed and wind
direction are recorded using computerized weather stations around the world
These readings, together with observations from radar satellites, soundings from
space and information from ships and aircraft, help the supercomputer to produce a
3-D model of the earth's atmosphere
Because of the complexity of the calculations and the very large number of them that
need to be carried out, they can only effectively run on supercomputers
Climate Research:
Advantages of Mainframes
They are very reliable and rarely have any system downtime. This is why organizations
such as banks use them for 24/7 operations throughout the week
Hardware and software upgrades can occur while the mainframe system is still up and
running
Mainframes are getting faster and more powerful every year, outperforming PCs, laptops
and other devices
Mainframes can deal with the huge amounts of data that organizations need to store and
process. A mainframe's ability to run different operating systems allows it to cope
with data coming in a variety of database formats, which other platforms would find
problematic
Mainframes have stronger security than other systems, with complex encryption
systems and authorization procedures in place
Disadvantages of Mainframes
Mainframes are very expensive and can only be afforded by large organizations such as
multinational banks
There is also a high cost for the personnel required to run and maintain them
They require large rooms to house the system, which isn't needed with other systems
Recent mainframes are becoming very advanced, and hence more heat maintenance is
required, which can be expensive as the cooling systems that need to be installed and
run for them are complex and costly
The software required to run a mainframe system is more expensive to buy than using
the cloud, so more organizations are shifting to cloud services, as they also don't need
to hire experts for them
Advantages of Supercomputers
Disadvantages of Supercomputers
Compilers
Advantages:
Disadvantages:
While a program is being compiled, the programmer has to wait, doing nothing, before
they can correct errors. This could take a long while if it's a major application
A compiler outputs a list of errors at the end of compilation, which could make locating
the errors and finding their cause tricky. The whole program needs to be compiled again
after an error is corrected
A compiled program can only run on a computer with the same operating system as the
computer it was originally compiled on
Compiling a program makes use of more memory than interpreting, as the whole
program must be loaded before translation, whereas with an interpreter only a few
statements of the program have to be in memory at any given time
Unlike an interpreter, it doesn't allow small pieces of code to be tested to make sure
they work before continuing with the rest of the program
A compiled program is more likely to crash the computer, as it runs directly on the CPU
A cross-compiled program can run slower on the target machine than if it had been
produced on the native machine; cross compilation also produces more errors and
mistakes than a native compiler
Interpreters
An interpreter translates a high level language program one statement at a time into an
intermediate form
It then executes that line/statement before moving on to the next one
It reports errors as lines of source code are processed and stops as soon as it
encounters one
The error has to be corrected before it can continue with the next instruction
An interpreter has to be resident in memory in order for the program to run
Only a few lines of the program need to be in memory at any one time, saving memory
space
An interpreted program can be transferred between computers with different operating
systems because it remains in the form of source code, but it needs to be translated on
each computer it is moved to
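The statement-at-a-time loop can be sketched in Python. This toy interpreter handles a
hypothetical one-command language (only `PRINT <number>`), invented purely to show the
execute-then-stop-at-first-error behavior:

```python
def interpret(source):
    """Translate and execute one statement at a time, stopping at the
    first error, exactly as an interpreter does."""
    output = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        parts = line.split()
        if len(parts) != 2 or parts[0] != "PRINT" or not parts[1].isdigit():
            return output, f"error on line {lineno}: {line!r}"
        output.append(int(parts[1]))  # execute the statement immediately
    return output, None

out, err = interpret("PRINT 1\nPRINT 2\nOOPS\nPRINT 4")
print(out, err)  # [1, 2] error on line 3: 'OOPS'
```

Note that lines 1 and 2 were executed before the error was reported, and line 4 never
ran: the interpreter stops as soon as it encounters the faulty statement.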
Advantages:
Interpreters are able to execute each statement as it is entered and are able to generate
helpful error reports, which is useful during program development
Debugging is easier with interpreters, as error messages are output as soon as an error
in a statement is encountered, which gives the opportunity to correct the error there
and then
Interpreted programs remain in their original source code, so they can run on any
system with the appropriate interpreter
An interpreter makes use of less memory space when interpreting, as only a few
statements of the program need to be in memory; this allows small pieces of code to be
tested before continuing with the rest of the program
It can run partially complete programs when developing
Disadvantages:
Linkers
A linker, or link editor, is a system program that combines object files or modules that
have been created using a compiler into one single executable file
It also replaces symbolic addresses with real addresses
Most programs are written in modular form to help simplify the development process; a
linker helps join these modules back into a single program file
A larger program may be compiled in small parts due to insufficient space in RAM to
hold the whole program and the compiler program
The parts/modules of the program can be stored on backing storage one at a time and
each brought into RAM and compiled
The resulting object code is then saved to the backing storage
When all parts have been compiled, the pieces of object code can be brought into
RAM and the linker can be used to combine them into the complete program
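Both jobs of the linker (combining modules and replacing symbolic addresses with real
ones) can be modeled in a short sketch. The module format here is invented for
illustration: each module is a name plus a list of (opcode, operand) pairs:

```python
def link(modules):
    """Combine object modules into one image and replace each symbolic
    reference with the real (numeric) address of its target."""
    image, symbols = [], {}
    # First pass: lay modules end to end and record where each one lands.
    for name, code in modules:
        symbols[name] = len(image)  # real start address of this module
        image.extend(code)
    # Second pass: patch symbolic operands like ('CALL', 'helper').
    return [(op, symbols[arg]) if arg in symbols else (op, arg)
            for op, arg in image]

modules = [("main",   [("CALL", "helper"), ("HALT", None)]),
           ("helper", [("RET", None)])]
print(link(modules))  # [('CALL', 2), ('HALT', None), ('RET', None)]
```

After linking, the symbolic name "helper" has been replaced by its real address (2),
and the two modules form one contiguous program image.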
Advantages:
Programs written in modules require less RAM, so money is saved by not having to
buy extra memory
The whole program and the compiler don't need to be in memory at the same time,
which requires less RAM and again saves the cost of memory
A number of programmers can be used to write separate modules, which can save time
compared to one person writing the whole code
If there is an error in the code, only that module has to be corrected, which avoids
recompilation of the other modules
Disadvantages:
There could be problems with different variable names being used for the same variable
in different modules
Documentation also has to be more detailed and takes longer to write, and to read when
completed
Device Drivers
A device driver is a small program that enables the operating system and application
software to communicate with a hardware device, acting as an interface
It controls the device connected to the computer
Printer drivers, mouse drivers and keyboard drivers are all examples
Upon installation it detects and identifies the peripheral device
It wakes up the device when it is needed and puts it back to sleep when it is not needed
It handles the translation of the request between a device and the computer
It defines where outgoing data must be stored before it can be sent
A printer driver acts as interface between the operating system and the printer
When a document is to be printed, the application tells the printer driver and the
printer driver tells the printer
The user appears to have control of the device throughout the entire process
Without the required printer driver the printer fails to work, as the software used by
the computer tends to be created by different companies from those that manufacture
printers
The printer driver is needed to convert the instruction set so the software is able to
communicate with the hardware
Operating Systems and Utilities
An interpreted program will still be in its original source code, so it will work on any
system, but that system must have an appropriate interpreter
This only works for high level languages which can be interpreted
With languages which are compiled, a cross compiler can be used so programs can run
on a computer with a different operating system
A cross compiler can be a small version of the compiler which is normally used on the
host/native computer
A problem with using a cross compiler is that the compiled program will no longer run
on the host computer the program was written on
The cross-compiled code can run more slowly than if it had been originally compiled on
the target machine
A cross compiler produces more errors and mistakes than a native compiler
sector: The smallest storage unit on a hard disk, typically 512 bytes
block: A logical data unit composed of multiple sectors
track: One of the concentric circles on a disk's surface that organize the sectors
cylinder: The set of corresponding tracks across all disk platters, enhancing data
access efficiency
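These geometry terms combine into the classic disk capacity formula. A minimal Python
sketch; the platter, track and sector counts are illustrative, not taken from any real
drive:

```python
def disk_capacity_bytes(platters, tracks_per_surface, sectors_per_track,
                        bytes_per_sector=512):
    """Capacity from the classic geometry: each platter has two recording
    surfaces, each surface the same tracks, each track the same sectors."""
    surfaces = platters * 2
    return surfaces * tracks_per_surface * sectors_per_track * bytes_per_sector

# An illustrative disk: 4 platters, 10,000 tracks per surface, 500 sectors/track
print(disk_capacity_bytes(4, 10_000, 500))  # 20480000000 bytes (~20 GB)
```

Modern drives use zoned recording (more sectors on outer tracks), so this uniform-track
formula is a simplification, but it shows how the units nest.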
Disadvantages:
The signature based method is only capable of dealing with known threats; a new,
unknown virus can do untold damage to the software or data stored on a hard disk
because it is not in the database
False positives can occur in heuristic based detection; this is when the detection
algorithm is so general that matches can be made with files that don't contain viruses
but just happen to contain a small part of the sequence of bytes which makes up a
virus
Behavioral based malware detection can also generate false positives
Disk Defragmentation
Data stored on a disk may consist of several blocks which might not be next to each
other, as there may be empty sectors between them; this means that the data file
is fragmented around the disk
Data deleted from the disk also leaves empty sectors between series of stored data/files
Defragmentation software is used to organize the data on the disk by moving the data
blocks around to bring all the parts of a file together so they are contiguous
As a result, data retrieval is made easier and quicker
It also attempts to create larger regions of free space
With fragmented files it takes longer for the read-write heads to move over the surfaces
to find all the different fragments of files than if the data is held in sequence
The software provides additional areas of free space and more usable storage capacity
It can also attempt to keep smaller files which belong in the same folder/directory
together by reorganizing other files
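The rearrangement can be modeled on a toy disk. This sketch uses an invented
representation: a list of sectors, each holding either a (file name, block number) pair
or None for a free sector:

```python
def defragment(disk):
    """Rearrange sectors so each file's blocks are contiguous and all free
    space (None) is gathered into one region at the end."""
    files = {}
    for sector in disk:
        if sector is not None:
            name, _ = sector
            files.setdefault(name, []).append(sector)
    compacted = [b for name in sorted(files) for b in files[name]]
    return compacted + [None] * (len(disk) - len(compacted))

# Two files, "a" and "b", scattered with empty sectors between their blocks:
disk = [("a", 0), None, ("b", 0), ("a", 1), None, ("b", 1)]
print(defragment(disk))
# [('a', 0), ('a', 1), ('b', 0), ('b', 1), None, None]
```

After defragmentation each file's blocks sit next to each other and the free sectors
form one larger region, so the read-write heads travel less to read a whole file.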
Formatting
Disk formatting is the configuring of a data storage medium, such as a hard disk or SSD,
for initial use
There are two levels of formatting: low level formatting and high level formatting
The first stage is low level formatting, which divides the disk surface into tracks,
sectors and cylinders
This is done by magnetizing the disk areas using the write heads
Tracks are numbered starting from 0
When the head goes from one track to the next, it leaves a gap
Each track is organized into numbered sectors, starting at 1 and separated by gaps
The purpose of low level formatting is to prepare the disk surface to receive data/allow
the user to save data
High level formatting generates a new file system on the disk, which allows the
operating system to use the disk space to store and access files
It does not permanently erase data but deletes the pointers on the disk that tell the OS
where to find the files
Advantages:
Disadvantages:
If individuals rather than the manufacturer carry out low level formatting, it can
become almost impossible to restore data after erasing all the files, and if done
repeatedly it will shorten the life of the medium
In low level formatting, the files aren't retrievable
Anti-Virus
Anti-virus software can be a program or set of programs whose function is to detect and
remove viruses
It monitors a computer in a bid to prevent attacks from many types of malware such as
viruses, worms, Trojan horses, adware, etc
It is important to keep anti-virus software up to date, because new viruses and other
malware appear at frequent intervals and old versions of anti-virus software will not
detect them
Anti-virus software will either remove the virus or malware or quarantine it, asking the
user if they want to delete the file or 'clean' it
It can also do background scans of files, attachments and folders, as well as the whole
disk or computer, and inform the user if anything is found
These scans can take place automatically or be scheduled to take place at a set time
There are different methods employed by an anti-virus to detect viruses
Signature based detection is a method used by anti-virus software which is based on
recognizing existing viruses. When a virus is identified, anti-virus manufacturers add
its signature to a database of known viruses. The software compares the contents of a
file with its database of known malware signatures
The heuristic based detection method detects malware based on characteristics
typically found in malware code
Behavioral based malware detection looks for abnormal or suspicious behavior, such as
sending a large number of emails. It can generate false positives. It is only able to
detect malware after it has started its malicious behavior
Sandbox detection is based on behavioral based detection but doesn't detect the
behavioral fingerprint at run time. It executes the program in a virtual environment
within the computer, whereby the suspected virus-infected code is executed in the
sandbox so that it can do no real harm
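Signature based detection is essentially a byte-pattern lookup. A minimal Python
sketch; the signature names and byte sequences below are invented for illustration and
are not real malware signatures:

```python
# Hypothetical signature database: name -> byte pattern known to appear
# in that piece of malware.
KNOWN_SIGNATURES = {
    "demo-trojan": b"EVIL_PAYLOAD",
    "demo-worm":   b"\xde\xad\xbe\xef",
}

def scan(file_bytes):
    """Signature based scan: report every known signature found in the file."""
    return [name for name, sig in KNOWN_SIGNATURES.items()
            if sig in file_bytes]

print(scan(b"hello world"))               # [] -- clean file
print(scan(b"junk\xde\xad\xbe\xefjunk"))  # ['demo-worm']
```

This also illustrates the weakness noted earlier: a brand-new virus whose bytes are not
in `KNOWN_SIGNATURES` passes the scan untouched.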
Backup
Backup software is a program that is used to keep copies of files from a computer or to
copy the contents of a server's backing storage
The backup is an exact duplicate of the files
It can be used for restoring the original files in case of accidental or deliberate
corruption or deletion
Backup software allows the user to select the type of backup, its time and frequency,
and where they want it to take place
Additionally, the user can also set up automated backups and retrievals of data
Backups can be stored on the same drive or on an external storage device; they can
also be stored in the cloud
Most backup software also allows different types of backup, such as an incremental
backup, where only data that has been added or changed since a specific date is backed
up
A differential backup only backs up the data which has changed since the last full
backup. Restoring the system requires the use of two backups in this case
Users can also opt to verify their backup to make sure the data matches, and choose
whether or not to encrypt the backup
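The difference between the two backup types comes down to which timestamp each one
compares against. A minimal sketch using invented file names and integer timestamps:

```python
def incremental(files, last_backup_time):
    """Back up only files changed since the most recent backup of any kind."""
    return {name for name, mtime in files.items() if mtime > last_backup_time}

def differential(files, last_full_backup_time):
    """Back up every file changed since the last FULL backup."""
    return {name for name, mtime in files.items() if mtime > last_full_backup_time}

# name -> last-modified time; full backup at t=20, incremental backup at t=30
files = {"a.txt": 10, "b.txt": 25, "c.txt": 40}
print(incremental(files, last_backup_time=30))        # {'c.txt'}
print(differential(files, last_full_backup_time=20))  # {'b.txt', 'c.txt'}
```

The differential set keeps growing until the next full backup, which is why a restore
needs exactly two backups (the full one plus the latest differential), whereas a restore
from incrementals needs the full backup plus every incremental since.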
Data Compression
Data compression is the modifying of data so that it occupies less storage space on a
disk; it can be lossy or lossless
A lot of data in computer files is redundant, with the same information listed over and
over again; file compression programs remove this redundancy
Lossless compression is where, after compression, the file can be converted back to its
original state without loss of quality
In it, a repeated sequence of bits is replaced with a single character; the repeat
count along with the original character is stored, making the original file size
smaller
When decompressed, the codes are replaced with the original text. This type of
compression is common with spreadsheets, word processed files and databases, where
any loss of quality would matter
In lossy compression repeated bits are permanently deleted, but only those data bits
which would have little effect
JPEG is an image file format that supports lossy image compression. GIF and PNG use
lossless compression
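The repeated-character scheme described above is run-length encoding, one simple form
of lossless compression. A minimal sketch (character-level rather than bit-level, for
readability):

```python
def rle_encode(text):
    """Lossless run-length encoding: replace each run of repeated
    characters with a [character, count] pair."""
    runs = []
    for ch in text:
        if runs and runs[-1][0] == ch:
            runs[-1][1] += 1
        else:
            runs.append([ch, 1])
    return runs

def rle_decode(runs):
    """Rebuild the original text exactly -- no loss of quality."""
    return "".join(ch * count for ch, count in runs)

encoded = rle_encode("aaaabbbcca")
print(encoded)                               # [['a', 4], ['b', 3], ['c', 2], ['a', 1]]
print(rle_decode(encoded) == "aaaabbbcca")   # True
```

Because decoding restores the input byte for byte, this is suitable for the
spreadsheets, documents and databases mentioned above; a lossy scheme could not make
that guarantee.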
Disadvantages:
Data compression software uses a lot of computer memory during the compression
process
The process of loading a compressed file takes a lot longer than opening the original
file
Lossy compression causes a slight lowering of the quality of the resulting sound and
video files
File-copying
Deleting
The delete utility is a piece of software which deletes the pointers that tell the OS
where to find a file
It removes the file name and address from a table which stores data about the file
The space on the disk where the file was located is now available for future use by
other files
Moving a file to the trash/recycle bin simply alters the directory path to indicate that
the file has been temporarily deleted
It is only deleted in as much as the software shows it in the recycle bin rather than in
a normal folder
Only when the file is deleted from the recycle bin is the file removed from the table
and the file's disk space marked as available for reuse
Until the operating system writes new data over these sectors, the file is still
available
A file recovery program can scan a hard drive for deleted files and restore them
If the file has been partially overwritten, the file recovery program can only recover
part of the file
It works by reinstating the pointer in the File Allocation Table (FAT), as long as this
is done before any data is overwritten
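The pointer-versus-data distinction can be modeled in a short sketch. This toy class is
an invented simplification of a FAT-style table, not a real file system implementation:

```python
class SimpleFAT:
    """Deleting removes only the table entry (the pointer); the data stays
    on disk until overwritten, so recovery just reinstates the pointer."""
    def __init__(self):
        self.table = {}   # file name -> start sector (the "pointers")
        self.disk = {}    # sector -> data actually stored on the platter

    def write(self, name, sector, data):
        self.table[name] = sector
        self.disk[sector] = data

    def delete(self, name):
        return self.table.pop(name)   # data in self.disk is untouched

    def recover(self, name, sector):
        if sector in self.disk:       # works only if not yet overwritten
            self.table[name] = sector
            return self.disk[sector]
        return None

fat = SimpleFAT()
fat.write("notes.txt", 7, "important data")
sector = fat.delete("notes.txt")          # file "gone", data still on disk
print(fat.recover("notes.txt", sector))   # important data
```

Until sector 7 is reused for another file, recovery succeeds; once it is overwritten,
only the overwriting data remains, which is why recovery tools must run early.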
Custom-Written Software
This is software that is written for a specific task and is developed for a
specific company or business
It is made from scratch and owned by the business
The company might employ someone to write software that solves their specific
problems or meets their preferences
The development costs have to be paid for by the client/company commissioning it
Support is obtained directly from the creators
Advantages:
Disadvantages:
It costs more to pay programmers to write code specifically for the task/user's needs
Testing is limited to what the programmers think may be required, based on how they
think the software will be used
Support is limited to the team of developers only
It can take a long time to develop the software
There may be a lot of bugs, as the software has not been tested thoroughly/has not been
used before
Off-the-Shelf Software
This is software which already exists and is available straight away to organizations,
businesses and the general public, ready for use
It is owned by the company that created it; customers purchasing it don't have the
rights to sell it
It has to be adapted to fit the business's needs and preferences, and even with
modifications the purchasing company can't sell it to others
This means it can occupy a large amount of storage space, with unnecessary features
The development costs are spread amongst several customers, so it is relatively
cheap compared to custom-written software
Support can be obtained from help desks with experienced operators, and there are
likely to be user forums for the software made available by its producers
Advantages:
Disadvantages:
Command Line Interface (CLI)
The CLI is a means of interacting with a computer using commands in the form of
successive lines of text
A prompt appears on screen, after which the command is typed in
The output from the computer could be a list, or the computer may take some other
action
This interface is mostly used by software developers, system administrators and more
advanced users
It requires fewer resources than a GUI
It requires a keyboard and a screen
It is possible to use a CLI within a GUI
Advantages:
Disadvantages:
Users with physical handicaps may not be able to use a keyboard or mouse
For reasons of hygiene, a doctor or restaurant chef may not be allowed to touch a
display or a device such as a keyboard or mouse, so gesture based or dialogue
interfaces are more suitable in this instance
A CLI requires the user to learn many commands, unlike the other interfaces
CLI commands are more difficult to edit
With a CLI it is more difficult to view different items on one screen when multitasking
Graphical User Interface (GUI)
Advantages:
GUIs tend to be more accurate than dialogue and gesture based interfaces
Background noise and movement can interfere with dialogue and gesture based
interfaces, but this is not an issue with a GUI
GUIs tend to be more user friendly
Incorrect commands entered into a GUI are far easier to correct than in other
interfaces
Disadvantages:
Users with physical handicaps may not be able to use a keyboard or mouse
For reasons of hygiene, a doctor or restaurant chef may not be allowed to touch a
display or a device such as a keyboard or mouse, so gesture based or dialogue
interfaces are more suitable in this instance
GUIs tend to change regularly, which can make it problematic for certain social groups,
like older people, to learn how to use a new system
Greater storage space is required for a GUI
Dialogue Interface
A dialogue interface allows a user to communicate with a computer or device using
their voice
The computer uses speech recognition software to convert the spoken words into
commands it can understand
It requires a microphone to capture the user's speech; the user can load and run
software packages and files by speaking into the microphone and saying the commands
It requires the device to learn the way the speaker talks by asking the user to repeat
certain sentences until it has learnt the way they talk
The computer matches the user's speech with the data it has stored in a database; the
speech is converted into commands and the required action takes place
The computer can also respond with spoken words after its text has been
converted into speech
Noise in the background while the user is speaking, and the ability to recognize only a
limited vocabulary, can cause problems
Many cars have such a system to allow the driver to control their phone or features in
the car, like the radio, without touching them
Advantages:
For hygienic reasons, a doctor/restaurant chef may not be allowed to touch a display or
device; speaking into a microphone is a more hygienic way to control the device
A dialogue interface allows hands-free control, which is beneficial in scenarios like
driving a car, as no hand has to leave the steering wheel
A dialogue interface doesn't require the user to be sat in front of the computer/device;
they can operate the device/computer remotely while doing other tasks at the same
time
Users with a physical disability may not be able to use a keyboard or mouse or control
their limbs accurately; in this instance a dialogue interface would be the best type for
them to use
Disadvantages:
Gesture Based Interface
A gesture based interface is designed to interpret human gestures and convert these
into commands
Gestures can be made with any part of the body, but it is usually the face or hands that
make the gestures the computer can interpret, such as hand waving, head nodding,
finger pointing and eye rolling/blinking
A camera, in conjunction with an infrared sensor, detects the movements being made in
front of it
The computer, using a special type of software, searches through all the gestures it
has stored in a database to match one with the input
Each stored gesture is linked to a specific command, which is then executed after the
gesture has been matched
Advantages:
Gestures may be a quicker way of initiating a response from a device in comparison with
other user interfaces
People who have a speech impediment can find a gesture based interface more useful
than a dialogue interface
Gestures will still be reliable if there is background noise or the user has a
cold/strong dialect/accent, which would affect a dialogue interface
A dialogue interface often requires a training session with the user, whereas gestures
can be taught through manuals
Users with physical handicaps may find it easier to make gestures rather than gripping
a mouse
Users don't have to learn as many commands as with a CLI
For reasons of hygiene, doctors/health workers/restaurant chefs may not be allowed to
touch a display or a device using a GUI; appropriate gestures are a more hygienic way
to control the device
Disadvantages:
Users with physical disabilities may not be able to make gestures and might find a
dialogue interface much easier
Certain gestures may not be socially acceptable and may be judged as inappropriate
Certain gestures could be misunderstood by the computer, particularly if the user has
made one without realizing
Gesture based interfaces are less effective when several users or background activity
are involved
A gesture based interface requires line of sight, which isn't necessary with a dialogue
interface