Cryptography Module 4
Ashwin Mysore

The document discusses the CNSS Security Model, also known as the McCumber Cube, which provides a framework for securing information systems through its three axes: security goals, information states, and security countermeasures. It also outlines the components of information system security, the Security Systems Development Life Cycle (SecSDLC), key security concepts, and critical characteristics of information. Additionally, it highlights various attacks on e-commerce applications, the business needs for information security, and common threats to intellectual property and software.


Module 4

CNSS Security Model (McCumber Cube)


• Definition: The CNSS Security Model is a comprehensive framework for understanding and
securing information systems, based on the NSTISSI No. 4011 document.
• Origin: Developed by John McCumber in 1991; now known as the McCumber Cube.
• Structure: A 3D cube model with three axes, forming 27 cells (3 × 3 × 3), each representing a
critical area of security focus.

Axes of the McCumber Cube:


1. Security Goals:
o Confidentiality: Ensuring only authorized access to information.
o Integrity: Ensuring information is accurate and unaltered.
o Availability: Ensuring reliable access to information and systems.
2. Information States:
o Storage: Data at rest (e.g., databases, disks).
o Transmission: Data in transit across networks.
o Processing: Data being used or manipulated.
3. Security Countermeasures:
o Technology: Tools and software/hardware solutions.
o Policy: Rules and procedures governing security.
o Education: Training and awareness for users.
• Purpose: All 27 intersections must be addressed to ensure full information system security.
• Example: Using technology to ensure integrity of stored data, such as host intrusion detection
systems (HIDS).
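
To make one cell of the cube concrete, the sketch below (an illustrative stand-in for a full HIDS, not a product implementation) applies technology to protect the integrity of data in storage: it records SHA-256 baselines for a set of files and later flags any file whose hash has changed. The file paths shown are hypothetical.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(files, baseline_path="baseline.json"):
    """Record a trusted hash for each monitored file (data at rest)."""
    baseline = {str(p): sha256_of(Path(p)) for p in files}
    Path(baseline_path).write_text(json.dumps(baseline, indent=2))

def check_integrity(baseline_path="baseline.json"):
    """Compare current hashes against the baseline and flag changes."""
    baseline = json.loads(Path(baseline_path).read_text())
    for name, expected in baseline.items():
        actual = sha256_of(Path(name))
        status = "OK" if actual == expected else "MODIFIED"
        print(f"{status}: {name}")

# Example usage (hypothetical paths):
# build_baseline(["/etc/passwd", "/srv/app/config.ini"])
# check_integrity()
```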



Enumerate and explain the various components of an Information System.
Components of Information System Security
Information systems consist of six interdependent components, each requiring tailored security strategies:
1. Software:
o Includes applications, OS, and utilities.
o Most vulnerable due to bugs and rushed development.
o Security often treated as an afterthought.
2. Hardware:
o Physical tech that stores, processes, and interfaces data.
o Must be protected from theft or damage.
o Example: Laptop theft in public places like airports.
3. Data:
o Most valuable and targeted asset.
o Needs strong access control and encryption.
o Improper database security can lead to vulnerabilities.
4. People:
o Often the weakest link due to human error or manipulation (social engineering).
o Requires policy enforcement, training, and awareness.
5. Procedures:
o Documented instructions for tasks.
o Can be exploited if leaked (e.g., wire transfer procedures).
o Should be protected and shared only on a need-to-know basis.
6. Networks:
o Connect systems, creating more vulnerabilities.
o Must implement firewalls, IDS/IPS, and encryption.
o Physical security alone is not enough once systems are networked.



Briefly describe the Security System Development Life Cycle
Security Systems Development Life Cycle (SecSDLC)
The SecSDLC adapts the traditional Systems Development Life Cycle (SDLC) to focus on information
security, providing a structured process to identify threats and implement controls.

Phases of SecSDLC
1. Investigation
• Initiated by top management through an Enterprise Information Security Policy (EISP).
• Defines project goals, scope, budget, and constraints.
• Involves assembling a team and conducting an organizational feasibility analysis.
Unique to SecSDLC: Establishing the security policy framework and assessing management commitment.

2. Analysis
• Study documents from investigation phase.
• Review existing security policies, current threats, and controls.
• Evaluate legal and regulatory issues, especially data privacy laws.
• Begin risk management: identify, assess, and evaluate risks.
Unique to SecSDLC: Legal analysis and initiation of risk assessment.

3. Logical Design
• Develop a security blueprint.
• Plan for:
o Business Continuity
o Incident Response
o Disaster Recovery
• Evaluate feasibility of continuing or outsourcing the project.
Unique to SecSDLC: Focus on continuity planning, incident handling, and risk response strategies.

4. Physical Design
• Choose specific technologies and hardware to support the blueprint.
• Consider make or buy decisions for components.
• Design physical security measures.
• Conduct final feasibility analysis and seek approval from stakeholders.
Unique to SecSDLC: Designing both technological and physical controls for security.

5. Implementation
• Acquire or develop security solutions.
• Perform rigorous testing, installation, and training.
• Ensure user awareness and system documentation.
• Present the fully tested system to management.
Unique to SecSDLC: Continuous testing of security-specific features and user security training.

6. Maintenance and Change


• Ongoing monitoring, updating, and patching to counter emerging threats.
• Regular compliance testing and threat response.
• Must adapt to a constantly evolving threat landscape.
Unique to SecSDLC: Treat security as a continuous, defensive process rather than a one-time deployment.



Key Security Concepts
1. Access:
The ability to interact with a system, resource, or data (e.g., read, write, modify, execute).

2. Asset:
Anything valuable to an organization that requires protection—data, hardware, software, networks,
personnel, etc.

3. Attack:
An attempt to exploit a system’s vulnerability to breach security, gain unauthorized access, or cause
disruption.

4. Control / Safeguard / Countermeasure:


Security mechanisms used to reduce risk and protect assets (e.g., firewalls, encryption, MFA).

5. Exploit:
A method or technique used to take advantage of a vulnerability to compromise a system.

6. Exposure:
A state where an asset lacks protection, making it more susceptible to threats.

7. Loss:
The damage or negative consequence resulting from a security incident (e.g., financial loss,
reputational harm).

8. Protection Profile / Security Posture:


The overall security readiness of an organization, including policies, controls, and risk mitigation
strategies.

9. Risk:
The possibility of damage or loss when a threat exploits a vulnerability; calculated by likelihood ×
impact.
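
As an illustration of this formula, the minimal sketch below scores a few hypothetical threats; the 1-5 rating scales, the threat entries, and the HIGH/MEDIUM/LOW thresholds are assumptions made for the example, not values from the source.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Risk = likelihood x impact, each rated on an assumed 1-5 scale."""
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

# Illustrative threat register (values are assumptions for the example).
threats = {
    "Laptop theft": (3, 4),
    "SQL injection on web app": (4, 5),
    "Data center flood": (1, 5),
}

for name, (likelihood, impact) in threats.items():
    score = risk_score(likelihood, impact)
    level = "HIGH" if score >= 15 else "MEDIUM" if score >= 8 else "LOW"
    print(f"{name}: {score} ({level})")
```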

10. Subjects and Objects:

• Subjects: Active entities (e.g., users, applications) that perform actions.

• Objects: Passive entities (e.g., files, databases) that are acted upon.

11. Threat:
A potential cause of an unwanted incident that may result in harm to a system or asset.

12. Threat Agent:


An actor or entity (e.g., hacker, malware, insider) that initiates a threat.

13. Vulnerability:
A weakness in a system that can be exploited by a threat agent to cause harm.



Critical Characteristics of Information
1. Confidentiality
Ensures that sensitive information is accessed only by authorized individuals.
Example: Encryption, access controls.

2. Integrity
Assures that information remains accurate and unaltered except by authorized sources.
Example: Hashing, checksums, digital signatures.

3. Availability
Ensures that information and systems are accessible to authorized users when needed.
Example: Redundancy, failover systems, backups.

Expanded Characteristics (Beyond the core C.I.A. triangle):

4. Authenticity
Confirms the source of information is genuine.
Example: Digital certificates, user authentication.

5. Accountability
Ensures actions can be traced to a specific user or system component.
Example: Audit logs, user IDs.

6. Non-repudiation
Prevents a party from denying previous actions or commitments.
Example: Digital signatures, secure logging.
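
As a brief illustration (assuming the third-party Python cryptography package; any asymmetric signature scheme would serve), the sketch below signs a log record with a private key so that the signer cannot later deny producing it, and any tampering with the record causes verification to fail.

```python
# pip install cryptography  (third-party package, assumed available)
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The signer generates a key pair and signs a record with the private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

record = b"2024-01-15 10:42 user=alice action=approve_payment id=1234"
signature = private_key.sign(record)

# Anyone holding the public key can verify the record later: a valid
# signature ties the record to the private-key holder (non-repudiation),
# and any alteration of the record makes verification fail (integrity).
try:
    public_key.verify(signature, record)
    print("Signature valid: signer cannot deny producing this record")
except InvalidSignature:
    print("Signature invalid: record altered or not signed by this key")
```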

7. Privacy
Protects personal or sensitive data from unauthorized access or disclosure.
Example: Data masking, GDPR compliance.

8. Accuracy
Ensures data is correct, precise, and free from error.

9. Utility
Refers to the usefulness or relevance of the data for a specific purpose.

10. Possession (or Control)


Describes the physical or logical control over data or an asset. Loss of possession may not mean loss
of confidentiality, but it is still a threat.

When any of these characteristics is compromised, the value and trustworthiness of the information
decrease. Protecting these characteristics is the foundation of information security.



Your team has developed an e-commerce application for the XYZ company selling shoes. Identify the various attacks
possible on this application.
Software Attacks
1. Malware-Based Attacks
Malware (malicious software) is designed to harm, steal, or disrupt computer systems. Common types
include:
• Virus: Attaches to files and spreads when the file is opened. Can delete or corrupt data.
• Worm: Self-replicating and spreads through networks without user action.
• Trojan Horse: Disguises itself as legitimate software but has hidden malicious functions.
• Ransomware: Encrypts files and demands ransom to unlock them (e.g., WannaCry).
• Spyware: Secretly monitors user actions to steal data like passwords and keystrokes.

2. Network-Based Attacks
These attacks target communication over networks:
• Man-in-the-Middle (MitM): The attacker intercepts communication between two parties.
• Denial-of-Service (DoS) / Distributed DoS (DDoS): Overloads a server with traffic, making it
unavailable.
• Session Hijacking: Takes over an active session between a user and a system to gain access.

3. Code Injection Attacks


These attacks inject malicious code into applications:
• SQL Injection: Injects SQL commands into input fields to access or modify databases (see the sketch after this list).
• Cross-Site Scripting (XSS): Injects malicious scripts into web pages to steal data from users.
• Command Injection: Executes arbitrary system commands on a server through user inputs.
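
For a shoe-store application like the one above, the standard defence against SQL injection is to pass user input as query parameters rather than concatenating it into SQL strings. A minimal sketch using Python's built-in sqlite3 module (the table, columns, and payload are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password_hash TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'x1')")

user_input = "alice' OR '1'='1"   # a typical injection payload

# VULNERABLE: user input is concatenated directly into the SQL string,
# so the payload changes the query's logic and matches every row.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE username = '" + user_input + "'"
).fetchall()

# SAFE: a parameterized query treats the input purely as data.
safe = conn.execute(
    "SELECT * FROM users WHERE username = ?", (user_input,)
).fetchall()

print("concatenated query returned:", len(vulnerable), "rows")  # 1 (every row)
print("parameterized query returned:", len(safe), "rows")       # 0
```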

4. Exploit-Based Attacks
These attacks take advantage of software flaws:
• Zero-Day Exploit: Targets unknown or unpatched vulnerabilities.
• Buffer Overflow: Overloads program memory to crash or run malicious code.
• Privilege Escalation: Gains higher access rights than intended (e.g., becoming an admin).

5. Credential & Identity Attacks


These attacks focus on stealing or guessing login credentials:
• Phishing: Fake emails or websites trick users into revealing passwords or personal information.
• Brute Force Attack: Tries every possible password combination until it succeeds.
• Credential Stuffing: Uses leaked login details from previous breaches to access other accounts.
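
A common mitigation against brute force and credential stuffing on the store's login is to throttle or temporarily lock accounts after repeated failures. The sketch below is a minimal in-memory illustration; the thresholds and function names are assumptions, and a production system would persist counters and combine this with MFA or CAPTCHAs.

```python
import time
from collections import defaultdict

MAX_FAILURES = 5            # illustrative threshold
LOCKOUT_SECONDS = 15 * 60   # illustrative lockout window

failures = defaultdict(list)  # username -> timestamps of recent failures

def allowed_to_attempt(username: str) -> bool:
    """Reject the login attempt if the account is temporarily locked."""
    now = time.time()
    recent = [t for t in failures[username] if now - t < LOCKOUT_SECONDS]
    failures[username] = recent
    return len(recent) < MAX_FAILURES

def record_failure(username: str) -> None:
    failures[username].append(time.time())

# Usage inside a hypothetical login handler:
# if not allowed_to_attempt(user):
#     return "Account temporarily locked"
# if not check_password(user, supplied_password):
#     record_failure(user)
```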

6. Insider Threats
Not all attacks come from outside—some are internal:
• Malicious Insider: Authorized users misuse their access to steal or harm data.
• Negligent Insider: Careless actions (like weak passwords or losing devices) lead to data exposure.



Business Needs for Information Security
Information security is essential to sustain business functionality and protect assets in a threat-prone
digital environment.
1. Protecting the Organization’s Functionality
• It’s a management issue, not just a technical one.
• Requires policies and strategic enforcement.
• Ensures business continuity and reduces interruption risk.
2. Enabling Safe Operation of Applications
• Applications such as operating systems, email, and instant messaging must run securely.
• Security must be integrated into IT operations, not left solely to the IT department.
3. Protecting Organizational Data
• Safeguards data at rest and in motion.
• Maintains data integrity, protects against theft, sabotage, corruption.
• Data is a core organizational asset.
4. Safeguarding Technology Assets
• Security should scale with organizational growth.
• Examples of scalable security tools:
o PKI (Public Key Infrastructure) – for secure communications.
o Firewalls – to filter network traffic.
o Caching appliances – to improve speed and reduce risk.

Threats to Information Security


In the realm of cybersecurity, a threat refers to any object, person, or event that poses a potential danger to
an asset. Drawing from Sun Tzu's principle in The Art of War, protecting information requires understanding
both the systems to be protected and the threats they face.
As organizations become more interconnected via the Internet, the number and complexity of threats have
grown. For instance, a 2009 CSI survey revealed that 64% of organizations experienced malware attacks,
although overall losses were decreasing, indicating improving security practices.
Common Threat Categories
Threats can be grouped into 14 broad categories, each posing distinct risks:
1. Compromises to Intellectual Property: Piracy, copyright infringement
2. Software Attacks: Viruses, worms, DoS attacks
3. Service Quality Deviations: Power or ISP issues
4. Espionage or Trespass: Unauthorized access or data collection
5. Natural Disasters: Fires, floods, earthquakes
6. Human Error or Failure: Mistakes by users or employees
7. Information Extortion: Blackmail using stolen data
8. Inadequate Planning: Poor recovery plans, backup failure
9. Inadequate Controls: Lack of firewalls, weak policies
10. Sabotage or Vandalism: System or data destruction
11. Theft: Stolen hardware or data
12. Hardware Failures: Equipment malfunctions
13. Software Failures: Bugs or programming errors
14. Technological Obsolescence: Outdated systems and tools



Compromises to Intellectual Property (IP)
Organizations often create or use intellectual property (IP)—including trade secrets, copyrights,
trademarks, and patents—as part of their business. IP involves the ownership and control of ideas and
their representations, and using someone else’s IP without permission is a serious threat to information
security.
Common Threats to IP:
• Piracy: Unauthorized copying or distribution of software.
• Copyright Infringement: Using protected work without proper licensing or credit.
Software Piracy:
A widespread issue where organizations or individuals use software without purchasing the proper licenses.
Even installing licensed software on multiple systems without additional licenses is a violation.
Case Study: Violating Software Licenses
A community college in Las Vegas was caught using unlicensed software. After an anonymous tip, the
Software Publishers Association (SPA) investigated and found violations. The college faced fines and had
to destroy illegal copies, repurchase software at double the retail price, and improve internal controls.
Blame was placed on unauthorized downloads by faculty/students and poor IT supervision.
Anti-Piracy Enforcement:
• Watchdog Organizations:
o SIIA (Software & Information Industry Association)
o BSA (Business Software Alliance)
• Tools Used:
o Digital watermarks, embedded codes, corrupted sectors
o License agreements and online registration to restrict unauthorized use
Despite such efforts, a 2006 BSA survey found that one-third of global software use was pirated. Online
registration and license pop-ups help enforce rules but raise privacy concerns.



Deliberate Software Attacks – Summary
Deliberate software attacks involve intentional use of malicious software (malware) to harm, disable, or gain
unauthorized access to computer systems. These attacks can disrupt services, steal information, or damage
hardware/software components.
Common Types of Malware:
1. Viruses:
o Code segments that attach to legitimate programs and spread by replicating themselves.
o Often spread through email attachments or infected media.
o Examples: Macro viruses (embedded in word processors/spreadsheets), Boot viruses (infect
boot sectors).
o Prevention includes antivirus software (e.g., Norton, McAfee) and blocking risky
attachments.
2. Worms:
o Standalone programs that replicate across systems and networks without user intervention.
o Can consume system resources and spread through email or network shares.
o Examples: Code Red, Sircam, Nimda, Klez, MS-Blaster, MyDoom.
o Often exploit OS/application vulnerabilities.
3. Trojan Horses:
o Malicious code hidden inside seemingly useful programs (e.g., Happy99.exe).
o Do not replicate but can install other malware or steal data.
4. Back Doors / Trap Doors:
o Secret entry points into a system installed by attackers or malicious software.
o Allow repeated unauthorized access, often used in conjunction with other malware.
o Examples: Back Orifice, Subseven.
5. Polymorphic Threats:
o Malware that constantly changes its code to avoid detection by antivirus software.
o Makes signature-based detection difficult.
6. Virus and Worm Hoaxes:
o False alarms about non-existent threats.
o Cause wasted resources and panic.
o Best handled by verifying with official sources like www.cert.org or Hoax-Slayer.
Notable Incidents:
• Mafiaboy (2000): Targeted major websites (Amazon, Yahoo, Dell) using DoS attacks.
• Cloudnine ISP (2002): Believed to be the first company shut down by a DoS attack.
• Robert Morris Worm (1988): Caused massive disruption due to a self-replicating worm exploiting
known vulnerabilities.



Deviations in Quality of Service (QoS)
Organizations depend on many interdependent support systems—like power, telecom, internet, vendors, and
janitorial services—to ensure smooth functioning. Any disruption in these services, even from minor
incidents such as weather or accidents (e.g., a backhoe cutting a fiber-optic line), can lead to availability
issues.
• Internet Service Failures: These disrupt communication and remote access. Even with backup
providers, bandwidth may be insufficient. When hosted services fail to meet agreed service levels
(SLAs), penalties may apply but often don’t cover business losses.
• Utility Failures: Services like water, waste, and even custodial work are critical. For example, no
water may mean no air conditioning, halting business operations.
• Power Issues: Fluctuations such as:
o Spikes/Surges: Damage equipment.
o Sags/Brownouts: Cause system resets or slowdowns.
o Faults/Blackouts: Lead to complete system shutdowns.
• Solutions include surge suppressors and Uninterruptible Power Supplies (UPS) to protect
systems and maintain availability.

Espionage or Trespass
This refers to unauthorized access to sensitive or confidential information. It can be either legal
(competitive intelligence) or illegal (industrial espionage).
• Shoulder Surfing: A low-tech method where attackers gather information by observing users in
public settings (e.g., ATMs or phones).
• Trespassing: Virtual or physical entry into systems or premises without authorization, bypassing
authentication and authorization boundaries.
• Hackers:
o Expert Hackers (Elite): Skilled in programming, OS, and networking. They develop tools
used by less skilled hackers.
o Script Kiddies & Packet Monkeys: Use pre-made scripts/tools to conduct attacks like
DoS/DDoS without deep technical knowledge.
• Case Study – Hack PCWeek: A hacker exploited a CGI script vulnerability on a Linux web server
in 1999. Using tools like port scanning, HTTP header tricks, and analysis of exposed web content,
the attacker bypassed controls—not by breaching the OS, but through an insecure add-on script.



Forces of Nature as Threats to Information Systems
Forces of Nature, also known as force majeure or acts of God, are unpredictable natural events that can
significantly disrupt individuals, infrastructure, and information systems. These threats are often sudden and
beyond human control, making them particularly dangerous. Although such events cannot be prevented,
organizations can mitigate damage through insurance and proactive contingency planning.
Common Natural Threats:
1. Fire: Typically structural, causing damage to computer equipment and infrastructure. It may also
include smoke or water damage from firefighting. Mitigation: Fire casualty and business
interruption insurance.
2. Flood: Overflowing water that damages systems or disrupts access. Common in low-lying areas.
Mitigation: Flood and business interruption insurance.
3. Earthquake: Sudden ground movements damaging buildings and equipment. Can restrict access.
Mitigation: Specific casualty and business interruption insurance.
4. Lightning: Sudden electrical discharges that damage electronic components or cause fires.
Mitigation: Casualty insurance and surge protectors.
5. Landslides/Mudslides: Earth movement causing physical damage or blocked access to buildings.
Mitigation: Casualty and business interruption insurance.
6. Tornadoes/Windstorms: High-speed rotating winds that damage infrastructure and systems.
Mitigation: Casualty and business interruption insurance.
7. Hurricanes/Typhoons: Large tropical storms causing wind and flood damage. Disrupts operations
and access. Mitigation: Casualty and business interruption insurance.
8. Tsunamis: Massive ocean waves triggered by underwater earthquakes/eruptions. Affects coastal
areas severely. Mitigation: Casualty and business interruption insurance.
9. Electrostatic Discharge (ESD): Static electricity damaging sensitive electronics. Can cause costly
interruptions. Mitigation: ESD protection methods; not usually insurable.
10. Dust Contamination: Shortens equipment lifespan, causing failures and downtime. More prevalent
in poor environmental conditions.

Mitigation Strategies:
Since these events are uncontrollable, organizations must:
• Invest in appropriate insurance policies.
• Maintain disaster recovery plans (DRP) and business continuity plans (BCP).
• Implement incident response plans (IRP).
• Design facilities with environmental protections (e.g., surge protectors, clean rooms, waterproofing).
Proper planning and risk assessment are essential to minimizing the impact of natural forces on information
systems.

Human Error or Failure


• Involves unintentional mistakes by authorized users.
• Causes: inexperience, lack of training, incorrect assumptions.
• Example: A keyboard error in 1997 led to global Internet outages affecting 45% of users.
• Employee mistakes can lead to:
o Disclosure of confidential data
o Accidental data modification or deletion
o Poor data storage practices
• Solution: Training, awareness programs, user verification steps, and expert systems.



Information Extortion
• Occurs when attackers or insiders steal data and demand compensation to avoid exposing or
returning it.
• Example: Russian hacker “Maxus” stole credit card data from CD Universe.
• Another case: Express Scripts hacked, extorted, and forced to offer rewards and customer
notifications.
• Impact: Legal costs, reputation loss, and regulatory consequences.

Missing/Inadequate Policy or Planning


• Absence of strategic security governance makes the organization vulnerable.
• Issue: No clear security strategy or poorly defined responsibilities.
• Impact: Other attacks become more damaging due to weak policy foundations.

Missing/Inadequate Controls
• Poorly designed or missing safeguards (e.g., outdated hardware, misconfigured security systems).
• Example: Using SOHO equipment in growing businesses without proper upgrades.
• Solution: Regular audits and control reviews to ensure continuous protection.

Sabotage or Vandalism
• Intentional acts to damage systems or reputations.
• Examples:
o SANS Institute’s site defaced by Fluffi Bunni hackers.
o Cyberactivism (e.g., Greenpeace campaigns, anti-fascist hacktivists).
• Rising Threat: Hacktivism and cyberterrorism aiming to disrupt systems politically or ideologically.

Theft
• Can be physical, electronic, or intellectual.
• Physical theft: Easier to detect and prevent.
• Electronic theft: Harder to detect; can happen without knowledge.
• Prevention: Surveillance, access control, and intrusion detection systems.

Technical Hardware Failures or Errors


• Includes manufacturing defects and hardware bugs.
• Example: Intel Pentium floating-point division bug (FDIV).
• Failures can be:
o Terminal (equipment loss)
o Intermittent (inconsistent malfunction)
• Lesson: Failures are inevitable—design systems with resilience in mind.



Attacks
1. Malicious Code
• Viruses, Worms, Trojan Horses, and Web Scripts: These attacks involve malicious code designed
to steal or destroy data.
• Multivector Worms: Modern worms that use multiple attack vectors to exploit vulnerabilities, such as the
Nimda outbreak in 2001.
• Bots, Spyware, and Adware: Covert software used for gathering user data without consent,
including web bugs and tracking cookies.
2. Hoaxes
• Virus Hoaxes: Malicious hoaxes that trick users into distributing a virus by misrepresenting it as a
warning.
3. Backdoors
• Trapdoors: Hidden access points intentionally left by system designers or attackers for unauthorized
system entry.
4. Password Attacks
What are the types of password attacks? What can a systems administrator do to protect against them? (A hardening sketch follows the list below.)
• Cracking: Attempting to reverse-engineer hashed passwords to gain unauthorized access.
• Brute Force: Trying all possible password combinations until one works.
• Dictionary Attacks: Using common passwords to gain access, often targeting weak passwords.
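
To protect against these attacks, an administrator can enforce strong password rules, lock accounts after repeated failures, and ensure passwords are stored only as salted, slow hashes so that cracking and dictionary attacks against a stolen password file become expensive. A minimal sketch using Python's standard library (the iteration count is an assumed work factor):

```python
import hashlib
import hmac
import secrets

ITERATIONS = 600_000  # illustrative work factor; tune for your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash) using salted PBKDF2-HMAC-SHA256."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)  # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("123456", salt, stored))                        # False
```
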
5. Denial-of-Service (DoS) and Distributed Denial-of-Service (DDoS)
• DoS: Overloading a system with excessive requests, causing it to crash or become unresponsive.
• DDoS: A large-scale attack from multiple compromised systems, making it harder to defend against.
6. Spoofing
• IP Spoofing: Forging the source IP address of packets to make them appear to come from a trusted
source.
7. Man-in-the-Middle Attacks
• TCP Hijacking: Attacks where the attacker intercepts and alters data between two communicating
parties.
8. Spam
• Unsolicited Commercial E-mail: Spam can be a minor nuisance or a medium for delivering
malicious code. It's mostly a resource drain for organizations.
9. Mail Bombing
• DoS via Email: Sending excessive unsolicited emails to overload the recipient’s email system.
10. Sniffers
• Packet Sniffers: Unauthorized tools used to capture and monitor data across a network, often
stealing sensitive information.
11. Social Engineering: Discuss the various social engineering attacks
Social engineering in the context of information security is the manipulation of people to divulge
confidential information or perform actions that compromise security. Attackers often use deception and
psychological manipulation to exploit the natural tendency of individuals to trust others. The most common
forms include:
• Impersonation: Attackers may pose as a trusted figure (e.g., an executive or IT support) to obtain
sensitive information.
• Phishing: Attackers send fraudulent messages, often via email, that appear legitimate to deceive
victims into revealing personal or financial information.
• Pretexting: The attacker creates a fabricated scenario to convince the victim to release confidential
information, such as claiming to be from a bank requesting verification.
• Baiting: The attacker offers something enticing (e.g., free software) to trick the victim into
downloading malicious software or disclosing personal details.



12. Pharming:
Pharming is a cyberattack technique that redirects legitimate website traffic to fraudulent sites without the
user's knowledge. It can occur in two main ways:
• Browser Pharming: Attackers use malware (e.g., Trojans, worms) to alter the victim’s browser or local host
settings so that a legitimate address typed into the address bar leads to a fake website.
• DNS Cache Poisoning: In this attack, the attacker manipulates the Domain Name System (DNS)
server to direct users to a malicious website when they type a legitimate URL.
The goal of pharming is typically to steal personal data, such as login credentials or financial information,
by making the fraudulent site appear authentic.
13. Timing Attack:
A timing attack involves analyzing the time it takes a system to perform operations in order to extract
sensitive information. It is commonly used to exploit systems that perform cryptographic operations, such as
encryption or decryption.
• Web-based Timing Attack: An attacker can study the response times of a website to gain insights
into the data being processed, potentially exposing passwords or other sensitive information stored in
the browser cache.
• Cryptographic Timing Attack: Attackers measure the time it takes to execute encryption
algorithms to find patterns and recover cryptographic keys or break encryption schemes.
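
A standard countermeasure is to compare secrets in constant time so that response time does not reveal how many leading bytes of a guess were correct. The sketch below contrasts a naive comparison with Python's hmac.compare_digest; the token value is hypothetical.

```python
import hmac

SECRET_TOKEN = b"s3cr3t-api-token"   # hypothetical secret

def naive_equal(a: bytes, b: bytes) -> bool:
    """Returns as soon as a byte differs, so timing reveals how many
    leading bytes of the guess were correct."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    """Runs in time independent of where the first mismatch occurs."""
    return hmac.compare_digest(a, b)

print(naive_equal(b"s3cr3t-api-tokem", SECRET_TOKEN))          # False, leaks timing
print(constant_time_equal(b"s3cr3t-api-tokem", SECRET_TOKEN))  # False, no timing leak
```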



Professional, Legal, and Ethical Issues in Information Security
1. Ethics in Information Technology
• Lack of a Binding Code: Unlike fields such as law or medicine, the IT and information security
fields do not have a binding code of ethics. However, various professional associations and
certification agencies, like the ACM, ISSA, and (ISC)², work to set ethical standards.
• Ethical Violations in Information Security: Unlike professions that can expel violators (e.g.,
doctors or lawyers), information security professionals face challenges in enforcing ethical conduct
because the regulatory bodies lack the authority to remove violators from practice.
2. The Ten Commandments of Computer Ethics
• Summary: These are guidelines designed to govern the ethical use of computers. Some notable
commandments:
o Do not harm others using computers.
o Do not steal or snoop around other people's data.
o Do not use others' resources without permission.
o Be mindful of social consequences when writing programs or designing systems.
o Respect intellectual property rights.
o Use computers in a way that shows respect for others.
3. Ethical Differences Across Cultures
• Cultural Perspectives on Ethics: Different cultures perceive computer use and ethics differently.
This can cause conflicts, especially regarding intellectual property and software piracy.
• Example: In Western cultures, software piracy is generally condemned, while in some Asian
cultures, there’s more tolerance for piracy due to a tradition of collective ownership.
• Global Study: A 1999 study showed differences in attitudes toward software piracy, illicit use, and
misuse of corporate resources across various countries like the US, Netherlands, Singapore, Hong
Kong, and Australia.
4. Software License Infringement
• Software Piracy: The study revealed that attitudes toward software piracy varied across countries.
For example:
o The US showed a stricter stance on piracy.
o The Netherlands was more permissive, yet still ranked high in piracy rates.
o Asian countries like Singapore and Hong Kong had moderate views on piracy, showing a
lack of legal enforcement or punitive measures.
5. Illicit Use of Computers
• Hacking, Viruses, and System Abuse: Participants in the study were universally opposed to
hacking, virus creation, and other forms of system abuse. However, there were cultural differences in
the tolerance for illicit activities. For example, respondents from Singapore and Hong Kong were
more tolerant than those from Western countries like the US or Australia.
6. Misuse of Corporate Resources
• Personal Use of Company Resources: The study explored whether personal use of corporate
resources (computers, time, etc.) was seen as ethical. Findings:
o Generally, people believe it's acceptable to use company resources for personal reasons,
unless explicitly prohibited by the organization.
o Only participants from Singapore and Hong Kong had a more rigid view, seeing personal use
of company resources as unethical.



7. Ethical Decision Evaluation Scenarios
• Several case studies are presented for ethical evaluation:
o Scientist Not Acknowledging Programmer: Failure to acknowledge the programmer is
unethical.
o Programmer Not Pointing Out Design Flaws: Failure to address flaws in the design is also
unethical.
o Student Accessing Records: Accessing others’ records without permission is unethical, even
if reported.
o User Keeping Extra Software: Keeping software sent by mistake is unethical.
o Programmer Modifying Bank Account: Modifying a bank’s system for personal gain is
highly unethical.
o Programmer Using Company Computer for Personal Projects: Using company resources
without permission for personal gain is unethical, unless explicitly allowed by the
organization.
8. Viruses and Hacking
• Creating and Spreading a Virus: Writing and spreading a virus, especially one that destroys data or
disrupts systems, is highly unethical and illegal. Even if the virus message is harmless, infecting
other users' systems is wrong.
• Engineer Asking Programmer to Assume Liability: The ethical position of the programmer, who
refuses to take responsibility for calculation errors made by the engineer, is justifiable. The engineer
should not pass on liability for their own mistakes.
9. Ethics in Education
• Importance of Education in Ethics: Ethical behavior in information security is heavily influenced
by education. Employees may not always recognize when their actions are unethical or illegal due to
a lack of proper training.
• Training: It’s essential for organizations to train employees in ethical standards, especially in
information security, to prevent risky or illegal behavior.



What are the five essential criteria that a policy must meet to become enforceable?
1. Clarity: The policy must be clear, concise, and easy to understand.
2. Consistency: The policy must align with the organization's goals, objectives, and other policies.
3. Communication: The policy must be effectively communicated to all relevant stakeholders.
4. Compliance: The policy must comply with applicable laws and regulations.
5. Enforceability: The policy must be backed by mechanisms for monitoring and enforcement.

What are the various types of force majeure? Which type might be of greatest concern to an
organization in Bangalore? Chennai? Mumbai? Kolkata?
Types of Force Majeure:
• Natural Events: Earthquakes, floods, hurricanes, etc.
• Human Actions: War, terrorism, riots, strikes.
• Government Actions: Changes in laws, embargoes, government shutdowns.
• Industrial or Technological Disasters: Power outages, system failures.
Greatest Concerns:
• Bangalore: Flooding and water-related issues, as well as occasional protests or strikes.
• Chennai: Cyclones and floods are major concerns.
• Mumbai: Flooding during monsoon season, especially due to heavy rains.
• Kolkata: Flooding due to heavy rains, as well as occasional strikes.

What is the difference between law and ethics?


• Law: Laws are formal rules set by governments that are legally binding. They are enforceable by
legal institutions such as courts, and failure to comply with them results in legal penalties.
• Ethics: Ethics are moral principles or values that guide behavior, often based on societal norms.
Unlike laws, ethics are not legally enforceable, but violations can lead to social or professional
consequences.

What is civil law, and what does it accomplish?


• Civil Law: A branch of law dealing with disputes between individuals or organizations, often
concerning rights, obligations, and liabilities. It does not involve criminal sanctions but focuses on
remedies such as compensation or injunctions.
• Accomplishments: Civil law aims to provide resolution for disputes, protect individual rights, and
offer compensation for harm or loss caused by others.

What are the primary examples of public law?


• Constitutional Law: Governs the structure of government and its relationship with individuals.
• Administrative Law: Governs the actions of government agencies.
• Criminal Law: Defines offenses and prescribes punishments.
• Tax Law: Regulates the taxation system and compliance.
• Environmental Law: Covers the regulations related to the environment.

What is due care? Why should an organization make sure to exercise due care in its usual course of
operations?
• Due Care: The effort made by an organization to prevent harm by taking necessary precautions and
measures to protect its assets, employees, and stakeholders.
• Importance: Exercising due care helps avoid negligence claims and ensures the organization
complies with legal and ethical standards. It also minimizes risks related to data breaches, accidents,
and other potential liabilities.



How is due diligence different from due care? Why are both important?
• Due Diligence: The investigation and research an organization conducts to assess the risks associated
with decisions, such as mergers, acquisitions, or investments. It involves gathering and analyzing
relevant information to make informed decisions.
• Due Care: The ongoing actions to ensure safety, compliance, and the protection of assets in daily
operations.
• Importance: Both are crucial because due diligence ensures that informed decisions are made, while
due care ensures those decisions are responsibly implemented to avoid negligence or harm.

What is a policy? How is it different from a law?


• Policy: A policy is a guiding principle or rule set by an organization to direct actions and decisions in
specific circumstances.
• Difference: A policy is internal to an organization and is not legally binding, whereas a law is a
legally enforceable rule established by a governing authority.

What are the three general categories of unethical and illegal behavior?
• Fraud and Theft: Dishonestly taking assets or funds.
• Corruption and Bribery: Offering or receiving bribes to influence decisions or actions.
• Violating Rights: Breaching privacy, intellectual property, or personal rights.

What is the best method for preventing illegal or unethical activity?


• Education and Training: Regular training on legal and ethical standards helps prevent unethical
behavior.
• Clear Policies: Ensuring employees are aware of policies and the consequences of violations.
• Whistleblower Protection: Encouraging employees to report unethical activities without fear of
retaliation.
• Effective Enforcement: Monitoring compliance and applying consistent consequences for
violations.

What is intellectual property (IP)? Is it afforded the same protection in every country of the world?
What laws currently protect it in India?
• Intellectual Property (IP): A legal concept that protects the creations of the mind, such as
inventions, designs, trademarks, and copyrights.
• Protection Across Countries: IP protection is not the same in every country. International treaties
like the TRIPS Agreement attempt to standardize IP laws globally, but local regulations vary.
• Laws in India: Key IP laws in India include:
o Patents Act, 1970: Protects inventions.
o Copyright Act, 1957: Protects creative works like literature, music, and art.
o Trade Marks Act, 1999: Protects brand names and logos.
o Designs Act, 2000: Protects the visual design of objects.

