AS & A Level Information Technology
Chapter 2: Hardware and Software
Mainframes and Supercomputers
Mainframe computers
Mainframe computers are large, powerful and highly reliable computers that are designed
to handle massive amounts of data processing and perform complex tasks at high speeds.
Mainframes typically have multiple processors, large amounts of memory, and are
designed to handle multiple tasks concurrently. They are used for applications that require
high-speed processing of large amounts of data, such as financial transaction processing,
airline reservations, scientific research, and large-scale data processing.
Supercomputers
Supercomputers are high-performance computing machines that are designed to perform
complex calculations and handle large-scale data processing tasks. They are typically much
faster and more powerful than traditional computers and are used for scientific,
engineering, and research applications that require enormous computational resources.
• Mainframes, which are typically used for large-scale data processing in organizations
such as banks and government agencies, are designed for reliability and resilience.
• They are built with redundant components such as power supplies and storage
systems, which means that if one component fails, the system can continue to operate
without interruption.
• Mainframes also typically have a longer lifespan than other types of computers due to
their high cost and the significant investment required to replace them.
• Supercomputers, on the other hand, are designed for high-performance computing
and are used in scientific research, weather forecasting, and other complex
applications.
• They are typically built with the latest hardware and software technologies, and as a
result, they tend to become obsolete more quickly than mainframes.
RAS
RAS stands for Reliability, Availability, and Serviceability, and it refers to a set of features
and capabilities that aim to ensure that computer systems remain operational and recover
quickly in the event of hardware or software failures.
In mainframe and supercomputers, RAS is a critical requirement because these systems are
used for mission-critical tasks, such as processing large volumes of data, scientific research,
and financial transactions. Therefore, the downtime of such systems can have severe
consequences, including financial losses, reputational damage, and even risk to human life.
To achieve high levels of RAS, mainframe and supercomputer systems employ various
hardware and software features, including redundant components (such as duplicate power
supplies and storage), error-correcting memory, hot-swappable parts that can be replaced
without shutting the system down, and continuous hardware monitoring.
Overall, RAS is an essential requirement for mainframe and supercomputer systems, and
significant efforts are made to ensure that these systems remain highly reliable, available,
and serviceable.
Security
Mainframes and supercomputers are designed to handle large-scale data processing and
storage for critical applications in industries such as finance, healthcare, and government.
Due to the sensitive nature of the data being handled, security is of utmost importance.
Key security measures include strict user authentication and access control, encryption of
data both at rest and in transit, audit logging of access to sensitive data, and physical
security of the machines themselves.
Performance metrics
Performance metrics for mainframe and supercomputers can vary depending on the
specific system and its intended use. Some common performance metrics used for each
are:
Mainframe
• MIPS (Million Instructions Per Second): A measure of the raw processing power of a
mainframe, calculated by counting how many instructions it can execute in one second.
• IOPS (Input/Output Operations Per Second): A measure of the rate at which a
mainframe can input or output data to/from storage devices, such as hard drives or
tape drives.
• TPS (Transactions Per Second): A measure of the rate at which a mainframe can
process transactions, such as database updates or financial transactions.
• Availability: A measure of how often a mainframe is available and accessible to users,
typically measured as a percentage of uptime over a given period (a simple calculation is
sketched after this list).
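As a simple illustration of the availability metric, the short Python sketch below converts an assumed amount of downtime over one year into an availability percentage; the figures are invented purely for illustration.

    # Availability as a percentage of uptime over a period (hypothetical figures).
    HOURS_PER_YEAR = 365 * 24

    downtime_hours = 5.0                           # assumed total downtime in one year
    uptime_hours = HOURS_PER_YEAR - downtime_hours

    availability = uptime_hours / HOURS_PER_YEAR * 100
    print(f"Availability: {availability:.3f}%")    # about 99.943% uptime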
Supercomputers
• FLOPS (Floating Point Operations Per Second): A measure of the raw processing power
of a supercomputer, calculated by counting how many floating-point operations it can
perform in one second (a rough peak-FLOPS estimate is sketched after this list).
• Memory bandwidth: A measure of the rate at which data can be transferred between
a supercomputer's processor and its memory.
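As a rough illustration, a supercomputer's theoretical peak FLOPS can be estimated by multiplying the number of cores by the clock speed and by the number of floating-point operations each core can complete per clock cycle. The Python sketch below uses made-up figures purely for illustration.

    # Theoretical peak FLOPS = cores x clock speed (Hz) x FLOPs per core per cycle.
    cores = 100_000                 # hypothetical core count
    clock_hz = 2.5e9                # 2.5 GHz
    flops_per_cycle = 16            # assumed FLOPs per core per clock cycle

    peak_flops = cores * clock_hz * flops_per_cycle
    print(f"Peak: {peak_flops:.2e} FLOPS")   # 4.00e+15, i.e. about 4 petaFLOPS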
Mainframe computers and supercomputers are designed to handle very large volumes of
data and to provide enormous processing power.
• Input volume: Both mainframes and supercomputers can handle a vast amount of
input. Input can come from various sources such as sensors, databases, user input, and
other computing devices. The input volume can range from several gigabytes to
petabytes or even more.
• Output volume: Similarly, mainframe and supercomputers can generate a massive
volume of output, which can include results, reports, logs, and other data formats. The
output volume can also range from several gigabytes to petabytes or more.
• Throughput: The throughput of a computer system refers to the amount of data that
can be processed in a given amount of time. Mainframe and supercomputers are
optimized for high throughput, and they can process billions of instructions per second
(BIPS) or even more. The throughput of a system depends on several factors such as
processor speed, memory bandwidth, and I/O performance.
Fault tolerance
Fault tolerance is the ability of a system to continue functioning even in the presence of
hardware or software failures. Mainframes and supercomputers are designed to provide
high levels of fault tolerance, as they are used in mission-critical applications where
downtime can be very costly.
Mainframe
• Mainframes achieve fault tolerance largely through redundant hardware: duplicate power
supplies, processors, and storage mean that the system can keep running when a single
component fails, and failed parts can often be replaced without shutting the system down.
Supercomputers
• Supercomputers also employ fault tolerance techniques, but the emphasis is often on
data redundancy rather than redundant hardware components. This is because
supercomputers run very long calculations across thousands of processors, so techniques
such as checkpointing, where the state of a computation is saved at regular intervals so
that it can be resumed after a failure, are more practical than duplicating every piece of
hardware (a minimal checkpointing sketch follows).
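A minimal sketch of the checkpointing idea in Python, assuming a long-running loop whose partial result is saved to disk at intervals with the standard pickle module so that the computation can resume after a failure rather than restarting from scratch; the file name and the calculation are purely illustrative.

    import os
    import pickle

    CHECKPOINT = "checkpoint.pkl"    # hypothetical checkpoint file

    # Resume from the last checkpoint if one exists, otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            start, total = pickle.load(f)
    else:
        start, total = 0, 0

    for i in range(start, 1_000_000):
        total += i                   # stand-in for a long-running calculation
        if i % 100_000 == 0:         # save progress periodically
            with open(CHECKPOINT, "wb") as f:
                pickle.dump((i + 1, total), f)

    print("Result:", total)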
Operating system
Mainframes and supercomputers use specialized operating systems that are designed to
handle the unique requirements of these types of computing systems.
For mainframes, the most widely used operating systems are IBM's z/OS and z/VM. These
operating systems are highly scalable and can handle massive amounts of data and
processing power. They are designed to provide high levels of reliability, availability, and
security, and are commonly used in industries such as banking, finance, and government.
Supercomputers, on the other hand, typically use operating systems designed for the
massive parallel processing these machines require. Historically these included specialized
systems such as Cray's UNICOS and IBM's AIX; today almost all supercomputers run
Linux-based operating systems, tuned to optimize performance for specific types of
scientific and engineering applications.
Heat maintenance
• To maintain the appropriate temperature, several cooling techniques are used, such as
air cooling, water cooling, and immersion cooling. Air cooling is the most common
technique, and it involves using fans and heat sinks to dissipate the heat generated by
the components.
• Water cooling is more effective than air cooling and involves circulating water through
the system to remove heat.
• Immersion cooling is the most efficient cooling technique and involves immersing the
entire system in a dielectric fluid that absorbs the heat generated by the components.
In addition to these cooling techniques, the temperature of mainframes and supercomputers
is also managed through careful airflow design, continuous temperature monitoring, and
housing the machines in climate-controlled data centers.
Census
Mainframe computers have been used extensively in census operations due to their ability
to handle large amounts of data efficiently and reliably. The following are some of the ways
in which mainframes are used in census operations:
• Data storage: Mainframes are used to store massive amounts of data collected during
the census, such as population counts, demographic information, and housing data.
• Data processing: Mainframes are used to process and analyze census data. Census
data needs to be cleaned, standardized, and formatted for use in reports and statistical
analyses, which are typically performed using mainframe software.
• Data security: Mainframes are often used to ensure the security and privacy of census
data. Mainframes can be configured with robust security protocols to protect sensitive
information from unauthorized access or theft.
• Reporting: Mainframes are used to generate reports and statistics from census data.
The data is analyzed to generate reports on population demographics, employment
rates, and other important metrics.
Industry statistics
Mainframe computers are commonly used in industry statistics for a variety of purposes,
including data processing, storage, and analysis. Here are some specific ways in which
mainframe computers are used in industry statistics:
• Data processing: Mainframes are often used for large-scale data processing tasks, such
as processing transactional data for financial institutions or processing large volumes of
customer data for retail companies. Mainframes are particularly well-suited to these
types of tasks because they can handle large volumes of data quickly and efficiently.
• Storage: Mainframes are also commonly used for data storage. Many companies store
their critical business data on mainframes because they are highly reliable and secure.
Mainframes can also handle large volumes of data and provide fast access to that data
when needed.
• Analysis: Mainframes can be used for data analysis tasks, such as statistical analysis,
data mining, and predictive modeling. Mainframes are particularly useful for these
types of tasks because they can process large amounts of data quickly and efficiently.
Consumer statistics
Mainframe computers have historically been used in consumer statistics to process large
amounts of data related to consumer behavior, preferences, and demographics. With their
powerful processing capabilities, mainframes can efficiently handle massive amounts of
data, making them well-suited for processing and analyzing consumer statistics.
Transaction processing
• Mainframe computers are often used for transaction processing due to their high
processing power, reliability, and scalability. Transaction processing is the process of
handling data that represents individual transactions, such as sales, orders, or financial
transactions, and storing that data in a secure and efficient manner. Mainframes are
ideal for transaction processing because they can handle large volumes of data quickly
and efficiently.
• Mainframe computers are designed to handle high volumes of transactions with low
latency and high availability. They use advanced hardware and software to optimize
processing speed and ensure data integrity. They also have features like fault tolerance,
backup and recovery, and security that make them ideal for handling mission-critical
transactions (a simple all-or-nothing transaction is sketched after this list).
• Mainframe transaction processing systems are typically used by large organizations,
such as financial institutions, airlines, and government agencies, that need to process
high volumes of transactions quickly and securely. They are often used for applications
such as online banking, airline reservations, and inventory management.
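The all-or-nothing behavior that transaction processing relies on can be illustrated with Python's built-in sqlite3 module. The sketch below uses a hypothetical accounts table rather than a real mainframe system: either both updates of a funds transfer are committed, or the whole transaction is rolled back so the data stays consistent.

    import sqlite3

    conn = sqlite3.connect(":memory:")          # throwaway in-memory database
    conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                     [("alice", 500.0), ("bob", 200.0)])
    conn.commit()

    try:
        # Both updates belong to one transaction: all or nothing.
        conn.execute("UPDATE accounts SET balance = balance - 100 WHERE name = 'alice'")
        conn.execute("UPDATE accounts SET balance = balance + 100 WHERE name = 'bob'")
        conn.commit()
    except sqlite3.Error:
        conn.rollback()                          # undo any partial changes on failure

    print(conn.execute("SELECT * FROM accounts").fetchall())
    # Expected output: [('alice', 400.0), ('bob', 300.0)]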
Uses of supercomputers
Quantum mechanics
Supercomputers are used to simulate quantum-mechanical systems, for example modeling
the behavior of molecules and materials at the atomic level. Such simulations involve
enormous numbers of calculations and are far beyond the capability of conventional
computers.
Weather forecasting
Supercomputers are essential in weather forecasting as they are capable of processing vast
amounts of data at incredibly high speeds. These powerful machines enable meteorologists
to simulate and model weather patterns with high accuracy and precision. Here are some
ways in which supercomputers are used in weather forecasting:
• Data assimilation: Supercomputers can collect data from multiple sources such as
satellite imagery, radar, and weather balloons, and assimilate this data to produce
accurate weather forecasts.
• Numerical weather prediction: Supercomputers can run complex numerical weather
prediction models that simulate the earth's atmosphere, ocean, and land surface.
These models help forecasters predict weather patterns and extreme weather events
such as hurricanes, typhoons, and tornadoes.
Climate research
Supercomputers allow climate scientists to run long-term simulations of the Earth's climate
system, helping researchers study trends such as global warming, ice-sheet melting, and
sea-level rise over decades or centuries.
System software
System software refers to a category of computer programs that are designed to manage
and control the hardware and software resources of a computer system. This type of
software provides a platform for application software to run on, and enables
communication between the hardware and software components of a computer system.
Operating system
An operating system (OS) is a program that manages the hardware and software resources
of a computer system. It provides a platform for other software applications to run on top
of it and acts as an intermediary between the user and the computer hardware.
Common examples of operating systems include Windows, macOS, Linux, and Android.
Each operating system has its own unique features and capabilities, and users can choose
an OS based on their specific needs and preferences.
The main functions of an operating system include:
1. Process Management: The operating system manages the processes (i.e., programs)
running on the computer. It schedules processes, assigns system resources, and
provides mechanisms for inter-process communication (a small example of creating a
process follows this list).
2. Memory Management: It allocates main memory to processes, keeps track of which
parts of memory are in use, and frees memory when it is no longer needed.
3. File Management: It organizes data into files and folders, controls access to them, and
keeps track of where they are stored on disk.
4. Device Management: It communicates with hardware devices through device drivers
and manages input and output operations.
5. Security and User Interface: It controls access to the system through user accounts and
permissions, and provides an interface (graphical or command line) through which
users interact with the computer.
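As a small illustration of process management seen from a program's point of view, the sketch below asks the operating system to create a new process using Python's subprocess module and waits for it to finish; the command run by the child process is just an example.

    import subprocess
    import sys

    # Ask the operating system to create a new process running another Python interpreter.
    result = subprocess.run(
        [sys.executable, "-c", "print('hello from a child process')"],
        capture_output=True,
        text=True,
    )

    print("Child exit code:", result.returncode)    # 0 means the child ran successfully
    print("Child output:", result.stdout.strip())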
Device drivers
• Device drivers are software programs that allow operating systems to communicate
with and control hardware devices.
• They act as an interface between the hardware and the software, enabling the
operating system to access and use the device's functionality.
• Device drivers are essential for hardware devices to function correctly and efficiently.
Without drivers, the operating system would not be able to communicate with
hardware components such as printers, scanners, network adapters, graphics cards,
sound cards, and other peripherals.
• In general, device drivers are developed and provided by the device manufacturer, but
some drivers are included with the operating system or can be downloaded from the
manufacturer's website or other sources.
• The process of installing device drivers is usually automatic, but in some cases, it may
require manual installation or configuration.
Translators
A translator is a program that converts program code from one form into another, typically
from a high-level language written by a programmer into machine code that the processor
can execute.
Compilers
A compiler translates source code written in a high-level programming language into
machine code that can be executed by the processor. The compiler performs a number of
tasks, including lexical analysis, parsing, semantic analysis, optimization, and code
generation (a small example of the lexical-analysis stage is sketched after this list).
• During lexical analysis, the compiler identifies and categorizes the different elements of
the source code, such as keywords, identifiers, and operators.
• During parsing, the compiler checks that the code is syntactically correct.
• During semantic analysis, the compiler checks that the code is semantically correct and
generates an intermediate representation.
• During optimization, the compiler applies various optimizations to the code to improve
its performance.
• Finally, during code generation, the compiler produces machine code that can be
executed on the target machine.
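To make the lexical-analysis stage more concrete, here is a minimal Python sketch (not a real compiler) that splits a line of source code into categorized tokens using the standard re module; the token categories are deliberately simplified.

    import re

    # Very simplified token categories for an illustrative mini-language.
    TOKEN_SPEC = [
        ("KEYWORD",    r"\b(?:if|else|while|return)\b"),
        ("NUMBER",     r"\d+"),
        ("IDENTIFIER", r"[A-Za-z_]\w*"),
        ("OPERATOR",   r"[+\-*/=<>]"),
        ("SKIP",       r"\s+"),
    ]
    MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

    def tokenize(source):
        """Yield (category, text) pairs for each token in the source string."""
        for match in MASTER.finditer(source):
            if match.lastgroup != "SKIP":           # ignore whitespace
                yield match.lastgroup, match.group()

    print(list(tokenize("if count < 10 return count + 1")))
    # [('KEYWORD', 'if'), ('IDENTIFIER', 'count'), ('OPERATOR', '<'), ('NUMBER', '10'),
    #  ('KEYWORD', 'return'), ('IDENTIFIER', 'count'), ('OPERATOR', '+'), ('NUMBER', '1')]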
Interpreters
• An interpreter translates and executes source code one statement at a time, rather than
producing a separate machine-code file in advance.
• Interpreters are often used in scripting languages, where developers need to write
short programs quickly and do not want to go through the lengthy process of compiling
them (a toy interpreter is sketched after this list).
• Interpreted languages are also commonly used in web development, where server-
side scripts are executed on the fly in response to user requests.
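As a toy illustration of interpretation, the made-up language below (invented purely for this example) is executed one line at a time: each line is read, decoded, and carried out immediately, with no separate compilation step.

    def run(script):
        """Execute a tiny made-up language line by line."""
        variables = {}
        for line in script.strip().splitlines():
            parts = line.split()
            if parts[0] == "SET":                  # SET x 5
                variables[parts[1]] = int(parts[2])
            elif parts[0] == "ADD":                # ADD x 3
                variables[parts[1]] += int(parts[2])
            elif parts[0] == "PRINT":              # PRINT x
                print(parts[1], "=", variables[parts[1]])

    run("""
    SET total 10
    ADD total 5
    PRINT total
    """)
    # Prints: total = 15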
Linkers
Computer programs often consist of several modules of programming code. Each module
carries out a specific task within the program. Each module will have been compiled into a
separate object file.
Linkers, also known as link editors, are programs that are part of the software development
process. They are responsible for linking together the object files produced by the compiler
to create the final executable file or library.
• During the compilation process, the source code is first converted into object code by
the compiler.
• The object code contains machine instructions and data, but it is not yet executable.
• The linker then takes the object files produced by the compiler and combines them
into a single executable file or library that can be executed or linked to by other
programs.
Utility software
Utility software refers to a type of software designed to perform specific tasks that are
related to system management, optimization, and maintenance of a computer. These
programs typically help users manage their computer's hardware, software, and data more
efficiently.
Anti-virus
• An anti-virus utility scans files and programs for known malware signatures and monitors
the system for suspicious behavior; infected files can be quarantined or removed, and the
virus definitions are updated regularly to protect against new threats.
Backup
• A backup utility is a software program that enables users to create backup copies of
their important data, files, and software applications.
• The primary purpose of a backup utility is to provide a means of restoring data in case
of loss or damage due to system failure, user error, virus attack, or other unforeseen
events.
• Backup utilities can be used to create full backups, incremental backups, or differential
backups.
• A full backup creates a copy of all the data and files on a system.
• An incremental backup only copies the data that has changed since the last backup (a
minimal sketch of this idea follows this list).
• A differential backup, on the other hand, copies all the data that has changed since the
last full backup.
• Backup utilities can also be used to schedule backups automatically, so that users do
not have to remember to back up their data manually.
• They can be set to run at specific times or intervals, and can be configured to back up
specific folders or entire drives.
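A minimal sketch of the incremental-backup idea in Python, assuming a hypothetical "documents" folder to back up and a "backup" destination; only files modified since the previous run, recorded in a small timestamp file, are copied.

    import os
    import shutil
    import time

    SOURCE = "documents"                 # hypothetical folder to back up
    BACKUP = "backup"                    # hypothetical backup destination
    STAMP = os.path.join(BACKUP, ".last_backup")

    os.makedirs(BACKUP, exist_ok=True)

    # Time of the previous backup; 0 means "never backed up", so everything is copied.
    last_backup = os.path.getmtime(STAMP) if os.path.exists(STAMP) else 0

    for name in os.listdir(SOURCE):
        src = os.path.join(SOURCE, name)
        # Incremental: only copy files changed since the last backup.
        if os.path.isfile(src) and os.path.getmtime(src) > last_backup:
            shutil.copy2(src, os.path.join(BACKUP, name))

    # Record when this backup ran, ready for the next incremental pass.
    with open(STAMP, "w") as f:
        f.write(str(time.time()))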
Data compression
• A data compression utility is a software tool that is used to reduce the size of
computer files or data in order to save storage space or to reduce the time required to
transfer data over a network.
• Data compression is the process of encoding data in such a way that it requires less
space to store or less time to transmit. Compression is achieved by eliminating
redundancy in the data or by representing the data in a more efficient format.
• Data compression utilities typically use one of two types of compression: lossless
compression and lossy compression.
• Lossless compression algorithms reduce the size of the data without losing any
information, while lossy compression algorithms sacrifice some information in order to
achieve greater compression ratios (a short lossless example is sketched after this list).
• The purpose of data compression is to reduce the storage space required for files, and
to make it easier to transfer them over the internet or other networks.
• Compressed files take less time to transfer and require less bandwidth, which can be
particularly useful for users with limited internet connectivity.
• Some common data compression utilities include WinZip, 7-Zip and WinRAR.
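To make lossless compression concrete, the short Python sketch below uses the built-in zlib module: the repetitive text shrinks considerably, and decompressing it returns exactly the original bytes.

    import zlib

    original = b"AS and A Level IT. " * 100        # repetitive data compresses well

    compressed = zlib.compress(original)           # lossless compression
    restored = zlib.decompress(compressed)         # recover the exact original

    print("Original size:  ", len(original), "bytes")
    print("Compressed size:", len(compressed), "bytes")
    print("Identical after decompression:", restored == original)   # True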
Disk formatting
• A disk formatting utility is a software tool used to prepare a disk for use by creating a
file system on it.
• Formatting is the process of organizing a disk to store data by setting up the necessary
structures, such as a boot sector, file allocation table (FAT), or master file table (MFT).
• Formatting is typically done when a disk is first purchased, but it may also be
necessary when the disk becomes corrupted or needs to be prepared for a different
operating system.
• Disk formatting utilities can be either built-in to an operating system or third-party
applications, and they usually offer options for selecting the type of file system to be
created.
• Formatting a disk erases all data stored on the device, so it is important to back up any
important files before running the utility.
• Once the formatting process is complete, the storage device will be completely empty
and ready to be used for storing new data.
Disk defragmentation
• Over time, as files are added, deleted, and modified on a hard drive, the data can
become fragmented, meaning it is scattered across the drive in non-contiguous
chunks.
• This fragmentation can slow down access times, as the disk drive has to seek out each
piece of the file separately, which can lead to longer load times and decreased system
performance.
• A disk defragmentation utility is a software program designed to reorganize the data
on a hard drive to minimize fragmentation and improve performance (a toy illustration
follows this list).
• The utility scans the hard drive, identifies fragmented files, and moves the fragments
so that they are contiguous.
• This process can take some time, but once it is complete, the computer can access
files more quickly and efficiently.
• Many modern operating systems, such as Windows and macOS, include built-in disk
defragmentation utilities, which can be scheduled to run automatically or manually
initiated by the user.
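Purely as a toy model of what defragmentation achieves (real defragmenters work on disk blocks, not Python lists), the sketch below gathers each file's scattered blocks together so that they become contiguous, with the free space collected at the end.

    # Each entry names the file that owns that disk block; None represents free space.
    disk = ["A", None, "B", "A", None, "B", "A", None, "B"]

    def defragment(blocks):
        """Group each file's blocks together and move free space to the end."""
        used = sorted(block for block in blocks if block is not None)
        free = [None] * (len(blocks) - len(used))
        return used + free

    print(defragment(disk))
    # ['A', 'A', 'A', 'B', 'B', 'B', None, None, None]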
File copying
• A file copying utility is a software program or tool that allows users to duplicate or
move files from one location to another.
• This tool is often used to make backups of important files or to transfer files between
different devices or storage media.
• File copying utilities typically offer a range of features, such as the ability to copy
entire directories, select specific files to copy, preserve file attributes and permissions,
verify the integrity of the copied files, and handle errors or conflicts that may arise
during the copying process (a short sketch follows).
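A short sketch of what a basic file-copying utility does, using Python's standard shutil and hashlib modules; the file names are hypothetical. shutil.copy2 copies the file's contents along with its timestamps and permissions, and comparing SHA-256 hashes verifies that the copy is identical to the original.

    import hashlib
    import shutil

    def sha256(path):
        """Return the SHA-256 hash of a file's contents."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    source = "report.docx"                   # hypothetical file names
    destination = "backup_report.docx"

    shutil.copy2(source, destination)        # copies data and preserves file attributes
    print("Copy verified:", sha256(source) == sha256(destination))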
Deleting
• A file-deletion utility removes files that are no longer needed in order to free up storage
space. Ordinary deletion usually only removes the reference to the file, so secure-deletion
utilities overwrite the data itself so that it cannot be recovered.
Off-the-shelf software
Off-the-shelf software is ready-made software that is bought and used as supplied, in
contrast to custom software that an organization develops for itself. The two approaches
have trade-offs:
• When an organization develops its own software, it has complete control over the
development process, which can result in higher quality software and a more reliable
final product.
• On the other hand, custom software is more prone to errors and bugs than off-the-shelf
solutions, as it has not been tested and used by a large number of users.
Open source software
• Open source software refers to computer software whose source code is available to
anyone for viewing, modifying, and distributing.
• This means that anyone can access, use, and modify the source code of the software
without having to pay for it or ask for permission from the original creator.
• Open source software is typically developed collaboratively by a community of
developers who share a common goal of creating high-quality, free software that can
be used by anyone.
• This collaborative approach can lead to software that is more secure, stable, and
adaptable than proprietary software that is developed by a single company or
individual.
Proprietary software
• Proprietary software refers to software that is privately owned and distributed under a
specific license that limits its use, modification, and distribution.
• This means that the source code of the software is not freely available, and users must
agree to the terms of the license to use the software.
• Proprietary software is usually developed and sold by companies, and they retain full
control over its development, distribution, and support.
• One of the main characteristics of proprietary software is that the license agreement
typically restricts users from modifying, copying, or distributing the software without
permission from the owner.
• Additionally, the software is usually not free and users must pay a license fee to use it.
User Interfaces
Command line interface
• A command line interface (CLI) is a text-based interface in which the user types
commands at a prompt and the computer responds with text output; examples include the
Windows Command Prompt and the Linux shell. Commands must be typed exactly, which
makes a CLI harder for beginners to use, but it uses very few system resources and lets
experienced users work quickly and automate tasks.
Graphical user interface
• A Graphical User Interface (GUI) is a type of user interface that allows users to interact
with electronic devices such as computers, smartphones and other digital devices
through graphical elements such as icons, buttons, and windows, instead of using
text-based commands.
• GUIs provide an intuitive and visually appealing way for users to perform tasks on a
computer, making it easier for users to operate and navigate various applications and
programs.
• In a GUI, users can perform actions such as opening, closing, and manipulating files
and folders, and accessing different applications and settings through menus and
icons displayed on the screen.
Dialogue interface
• A dialogue interface allows the user to interact with a device by speaking to it: speech
recognition software interprets the spoken commands and the system often replies with
synthesized speech. Voice assistants such as Siri and Alexa, and voice control in cars, are
common examples.
Gesture-based interface
• A gesture-based interface is a user interface (UI) that allows users to interact with a
device or system through physical movements or gestures, instead of using traditional
input devices like a keyboard, mouse, or touchpad.
• This type of interface can be found in various devices, such as smartphones, tablets,
gaming consoles, and even some household appliances. For example, a common
gesture-based interface is touchscreen technology, where users interact with the
device by tapping, swiping, pinching or zooming.
• Another example is the Kinect sensor for Xbox, which uses a camera and microphone
to detect and respond to user gestures and voice commands.
• With this type of interface, users can control games, browse the web, and perform
various tasks without the need for physical input devices.