Unit 1 | HTCS401 Notes | Information Theory for Cybersecurity Notes | Aktu Notes


    Shannon’s Foundation of Information Theory

    - Developed by Claude Shannon in 1948, it provides a mathematical framework for quantifying information and communication.
    - Fundamental to understanding data compression, transmission, and encryption.

    Key Concepts:
      - Entropy: Measures the uncertainty or information content of a message.
      - Mutual Information: Measures the amount of information gained about one random variable through another.

    Example:
    In a communication system, Shannon's theory helps determine the optimal coding scheme to minimize transmission errors and maximize data efficiency.
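The two key concepts above can be computed directly. Below is a minimal sketch (the joint distribution is a made-up example of a noisy binary channel) showing entropy and mutual information for a transmitted bit X and a received bit Y:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(X, Y) for a noisy binary channel:
# X is the transmitted bit, Y the received bit (values are illustrative).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

h_x = entropy(px)                       # uncertainty about X alone
h_xy = entropy(list(joint.values()))    # joint uncertainty H(X, Y)
mutual_info = h_x + entropy(py) - h_xy  # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X) = {h_x:.3f} bits, I(X;Y) = {mutual_info:.3f} bits")
```

Here observing the channel output Y removes about 0.28 bits of the 1 bit of uncertainty about X; a noisier channel would yield lower mutual information.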

    Random Variables

    - Random variables represent outcomes of uncertain events in a probabilistic framework.
    - Central to modeling unpredictable phenomena in cybersecurity contexts.

    Types:
    - Discrete: Finite or countably infinite outcomes (e.g., dice rolls).
    - Continuous: Uncountably many possible outcomes within a range (e.g., measurement errors).

    Example:
    A random variable might represent the outcome of a firewall decision: allow or deny traffic based on predefined rules.
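The firewall example is a discrete random variable over the outcomes {allow, deny}. A small simulation sketch (all probabilities here are invented for illustration) estimates its distribution empirically:

```python
import random

random.seed(42)

# Hypothetical discrete random variable: a firewall's allow/deny decision.
# Assume 20% of packets are malicious, the rule set denies malicious traffic
# with probability 0.95, and false-positives benign traffic 1% of the time.
def firewall_decision():
    malicious = random.random() < 0.2
    if malicious:
        return "deny" if random.random() < 0.95 else "allow"
    return "allow" if random.random() < 0.99 else "deny"

samples = [firewall_decision() for _ in range(10_000)]
p_deny = samples.count("deny") / len(samples)
print(f"Empirical P(deny) ~= {p_deny:.3f}")  # theory: 0.2*0.95 + 0.8*0.01 = 0.198
```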

    Probability Distributions and Their Parameters

    - Describes the likelihood of different outcomes of a random variable.
    - Helps assess risks and vulnerabilities in cybersecurity scenarios.

    Parameters:
    - Mean (Expected Value): Average outcome weighted by probabilities.
    - Variance: Measure of the spread or dispersion of outcomes.
    - Skewness and Kurtosis: Higher moments describing asymmetry and tail behavior.

    Example:
    Understanding the distribution of network traffic patterns helps in detecting anomalies or potential cyber attacks.
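The four parameters listed above can be estimated from sample data. The sketch below uses a made-up sample of per-second request counts, drawn from a right-skewed distribution as network traffic often is:

```python
import math
import random

random.seed(0)
# Hypothetical sample: per-second request counts on a server, modeled
# as an exponential (right-skewed) distribution with mean 50.
data = [random.expovariate(1 / 50) for _ in range(10_000)]

n = len(data)
mean = sum(data) / n
var = sum((x - mean) ** 2 for x in data) / n
std = math.sqrt(var)
skew = sum((x - mean) ** 3 for x in data) / (n * std ** 3)
kurt = sum((x - mean) ** 4 for x in data) / (n * var ** 2) - 3  # excess kurtosis

print(f"mean={mean:.1f} variance={var:.1f} skew={skew:.2f} excess kurtosis={kurt:.2f}")
```

A large positive skew (an exponential has skewness 2) signals a long right tail: most seconds are quiet, but bursts occur, which matters when setting anomaly-detection thresholds.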

    Uncertainty and Entropy as Information Measures

    - Entropy quantifies the amount of uncertainty or randomness in data.
    - Key for designing secure cryptographic algorithms and protocols.

    Calculation:
    Shannon Entropy: \( H(X) = -\sum_{i} p_i \log_2 p_i \), where \( p_i \) is the probability of outcome \( i \).

    Example:
      - Assessing the entropy of passwords generated under a policy helps ensure they are robust against dictionary and brute-force attacks.
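For a password chosen uniformly at random from an alphabet of size \( N \), each of its \( L \) characters contributes \( \log_2 N \) bits, so \( H = L \log_2 N \). A quick sketch (alphabet sizes and the 12-character length are illustrative assumptions):

```python
import math

# Entropy of a uniformly random password: H = L * log2(|alphabet|).
alphabets = {
    "digits only": 10,
    "lowercase letters": 26,
    "mixed case + digits": 62,
    "full printable ASCII": 94,
}

length = 12  # assumed password length
for name, size in alphabets.items():
    bits = length * math.log2(size)
    print(f"{name:22s}: {bits:5.1f} bits for a {length}-char password")
```

Note this is an upper bound: human-chosen passwords are far from uniform, so their real entropy is much lower than the formula suggests.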

    Leakage

    - Unintended disclosure of sensitive information in a system.
    - Can occur through side-channel attacks, insecure protocols, or poor data handling practices.

    Types:
    - Direct Information Leakage: Disclosure of sensitive data through insecure storage, transmission, or handling.
    - Side-Channel Leakage: Inferring secrets from indirect physical signals; timing attacks, for example, exploit variations in execution time.

    Example:
    Monitoring power consumption to deduce cryptographic keys in smart cards is an example of a side-channel attack.
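Timing leakage is easy to reproduce in software. The sketch below (the secret token and the exaggerated per-byte delay are artificial, purely for demonstration) contrasts an early-exit comparison, whose runtime reveals how many prefix bytes of a guess are correct, with the constant-time comparison from Python's standard library:

```python
import hmac
import time

SECRET = b"s3cr3t-token"

def insecure_compare(a: bytes, b: bytes) -> bool:
    """Returns early on the first mismatch, so runtime leaks the
    length of the matching prefix (a timing side channel)."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
        time.sleep(0.0001)  # exaggerate per-byte cost for demonstration
    return True

def secure_compare(a: bytes, b: bytes) -> bool:
    """Constant-time comparison from the standard library."""
    return hmac.compare_digest(a, b)

# A guess sharing a longer prefix with the secret takes measurably longer:
for guess in (b"x" * 12, b"s3cx" + b"x" * 8):
    t0 = time.perf_counter()
    insecure_compare(guess, SECRET)
    print(f"{guess!r}: {time.perf_counter() - t0:.5f}s")
```

By timing many guesses, an attacker can recover the secret one byte at a time; `hmac.compare_digest` removes this channel by making the runtime independent of where the inputs differ.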

    Quantifying Leakage and Partitions

    - Methods to measure and mitigate information leakage in systems.
    - Partitioning data and applying access controls limit exposure to sensitive information.

    Techniques:
    - Data Masking: Hiding sensitive information in datasets.
    - Access Controls: Role-based access and authentication mechanisms.

    Example:
    Encrypting sensitive files and restricting access based on user roles ensures confidentiality and limits data leakage.
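Data masking from the techniques above can be sketched with simple pattern rewriting. The patterns below are illustrative only (real masking tools handle far more formats and edge cases): they hide all but the last four digits of a 16-digit card number and the local part of an email address:

```python
import re

def mask_record(text: str) -> str:
    """Minimal data-masking sketch: not production-grade."""
    # Keep only the last 4 digits of 16-digit card-like numbers.
    text = re.sub(r"\b(\d{12})(\d{4})\b", lambda m: "*" * 12 + m.group(2), text)
    # Keep only the first character of an email's local part.
    text = re.sub(r"\b(\w)[\w.]*(@[\w.]+)\b", r"\1***\2", text)
    return text

record = "user alice.smith@example.com paid with card 4111111111111111"
print(mask_record(record))
# -> "user a***@example.com paid with card ************1111"
```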

    Lower Bounds on Key Size: Secrecy, Authentication, and Secret Sharing

    - Theoretical limits on the size of cryptographic keys ensure adequate security.
    - Key size impacts resistance against brute-force attacks and cryptographic strength.

    Considerations:
    - Cryptographic Strength: Larger key sizes increase security but also computational overhead.
    - Key Management: Balancing security needs with operational efficiency.

    Example:
    RSA encryption typically uses key sizes from 2048 to 4096 bits; 1024-bit RSA keys are now considered insecure and have been deprecated.
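For an ideal symmetric cipher, exhausting an \( n \)-bit key space costs \( 2^n \) trials, so security grows exponentially with key size. A back-of-the-envelope sketch (the attacker's rate of \( 10^{12} \) keys per second is an assumption for illustration):

```python
# Brute-force cost scales as 2**n for an n-bit key space.
RATE = 10**12              # assumed keys tested per second (illustrative)
SECONDS_PER_YEAR = 31_557_600

for bits in (56, 80, 128, 256):
    years = 2**bits / RATE / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: ~{years:.2e} years to exhaust the key space")
```

This is why 56-bit DES fell to brute force decades ago while 128-bit keys remain far beyond any foreseeable exhaustive search.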

    Provable Security, Computationally-Secure, Symmetric Cipher

    - These concepts specify what resilience a cryptographic system guarantees and under which assumptions.
    - Distinguishes information-theoretic (unconditional) security, which holds even against computationally unbounded adversaries, from computational security, which relies on problems assumed to be infeasible for any practical attacker.

    Examples:
    - Provable Security: Mathematical proofs demonstrating resistance against specific types of attacks.
    - Computationally Secure: Security relies on the difficulty of computational problems, such as factoring large numbers (RSA).

    Application:
    AES (Advanced Encryption Standard) is widely used due to its computational security and resistance to known cryptographic attacks.
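The contrast above can be made concrete with the one-time pad, the classic information-theoretically secure symmetric cipher: with a truly random key as long as the message and never reused, the ciphertext reveals nothing about the plaintext. A minimal sketch using only the standard library:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # random key, same length, used once

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)   # XOR with the same key decrypts

assert recovered == message
print(ciphertext.hex())
```

The pad's impractical key-management burden (a fresh key as long as every message) is exactly what computationally secure ciphers like AES trade away: a short reusable key, secure only under assumptions about the attacker's computational power.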

    These detailed notes provide a comprehensive understanding of information theory concepts in the context of cybersecurity, essential for analyzing and designing secure systems and protocols.
