The document discusses the implications of quantum cryptography in light of recent NSA spying revelations and outlines the evolution of computing from classical to quantum systems. It highlights the limitations of Moore's Law, the challenges faced in traditional cryptography, and the advantages of quantum cryptography based on the principles of quantum mechanics. Additionally, it touches on fundamental concepts of quantum mechanics such as the uncertainty principle and quantum entanglement.
Why quantum cryptography?
• "Élysée spying": according to WikiLeaks documents published by Libération and Mediapart, the American NSA is said to have spied on three French presidents, at least between 2006 and May 2012: Jacques Chirac, Nicolas Sarkozy and François Hollande. President Hollande convened a Defense Council at the Élysée on the issue.
• Geometric optics (11th to 18th century): images, optical instruments
• Wave optics, 19th century: interference, spectroscopy, etc.
• Quantum optics, 20th century: particle-photon interaction

What Is Moore's Law?
• In 1965, Gordon Moore observed that the number of transistors in a dense integrated circuit doubles roughly every 18 months.
• Moore's Law definition: Moore's Law refers to the observation that the number of transistors in a dense integrated circuit doubles about every two years. A quick numeric check of this doubling claim appears at the end of this section.

Process nodes:
• 20 μm – 1968 • 10 μm – 1971 • 6 μm – 1974 • 3 μm – 1977 • 1.5 μm – 1981 • 1 μm – 1984 • 800 nm – 1987 • 600 nm – 1990 • 350 nm – 1993 • 250 nm – 1996 • 180 nm – 1999 • 130 nm – 2001 • 90 nm – 2003 • 65 nm – 2005 • 45 nm – 2007 • 32 nm – 2009 • 28 nm – 2010 • 22 nm – 2012 • 14 nm – 2014 • 10 nm – 2016 • 7 nm – 2018 • 5 nm – 2020 • 3 nm – 2022 • Future: 2 nm ~ 2025

• Physical limitations: as transistors get smaller (approaching the size of atoms), it becomes more challenging to shrink them further without encountering physical and technical limitations. Quantum effects start to play a significant role at these scales, leading to issues like electron tunneling, which can cause transistors to behave unpredictably.
• Heat dissipation problems: smaller transistors mean more power in a smaller space, leading to significant heat generation. Efficiently dissipating this heat is a major challenge, and failure to do so can affect performance and reliability.
• Economic challenges: the cost of building cutting-edge fabrication facilities, known as fabs, is rising exponentially. These fabs are required to produce smaller and more complex chips. As a result, the economic benefits of shrinking transistors (as per Moore's Law) are diminishing.
• Material limitations: the materials currently used in chip manufacturing have their limits. Finding new materials or methods that allow for continued transistor miniaturization is a significant challenge.
• Increased design complexity: designing and manufacturing extremely small and complex chips requires sophisticated and expensive technology, making it more challenging and costly to keep up with the pace of Moore's Law.
• Emerging computing models: there is growing interest in alternative computing models like quantum computing and neuromorphic computing, which may eventually supplant traditional transistor-based computing for certain applications.
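As the numeric check promised above, here is a minimal sketch (illustrative, not from the slides; it assumes a clean two-year doubling starting from the Intel 4004's roughly 2,300 transistors in 1971):

```python
# Idealized Moore's Law projection: transistor count doubles every `period` years.
# The 1971 starting point (~2,300 transistors) is an illustrative assumption.

def moores_law(start_year: int, start_count: int, year: int, period: float = 2.0) -> float:
    """Transistor count predicted by doubling every `period` years."""
    return start_count * 2 ** ((year - start_year) / period)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{moores_law(1971, 2300, year):,.0f}")

# By 2021 the idealized curve predicts tens of billions of transistors,
# which is the right order of magnitude for flagship chips of that era.
```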
Quantum computer

Classical computing versus quantum computing
• Quantum computing is built on the principles of quantum mechanics, which describe how subatomic particles behave differently from macro-level physics. And because quantum mechanics provides the foundational laws for our entire universe, at the subatomic level every system is a quantum system.
• For this reason, we can say that while conventional computers are also built on top of quantum systems, they fail to take full advantage of quantum mechanical properties during their calculations. Quantum computers take better advantage of quantum mechanics to conduct calculations that even high-performance computers cannot.

What is a classical computer?
• From antiquated punch-card adders to modern supercomputers, traditional (or classical) computers essentially function in the same way. These machines generally perform calculations sequentially, storing data using binary bits of information. Each bit represents either a 0 or a 1.
• When combined into binary code and manipulated using logic operations, bits let us build everything from simple operating systems to the most advanced supercomputing calculations. A minimal sketch of bits and logic gates follows below.
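To make the "binary bits plus logic operations" point concrete, here is a minimal illustrative sketch (the half adder and gate names are textbook constructions, not something from the original slides):

```python
# Classical bits are definite 0/1 values; logic gates combine them deterministically.

def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits using only logic gates; returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```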
What is a quantum computer?
• Quantum computers function similarly to classical computers, but instead of bits, quantum computing uses qubits. Qubits are special systems that act like subatomic particles; they can be made of atoms, superconducting electric circuits, or other physical systems, and they encode data in a set of amplitudes applied to both 0 and 1, rather than just two states (0 or 1). This quantum mechanical concept is called superposition (a small state-vector sketch appears at the end of this section). Through a process called quantum entanglement, those amplitudes can apply to multiple qubits simultaneously.
• Quantum computers are able to solve certain types of problems faster than classical computers by taking advantage of quantum mechanical effects such as superposition and quantum interference. Applications where quantum computers can provide such a speed boost include machine learning (ML), optimization, and the simulation of physical systems. Eventual use cases could be portfolio optimization in finance or the simulation of chemical systems, solving problems that are currently impossible for even the most powerful supercomputers on the market.
• IBM has installed the first quantum computer in the world to be uniquely dedicated to healthcare research, with the aim of helping Cleveland Clinic accelerate biomedical discoveries.

• Alice wants to send a message to Bob without Eve being able to intercept it, rewrite it, or modify it.
• Alice must therefore verify Bob's identity (and vice versa) and encrypt the message.
• There are two types of encryption: with a private key, and with public keys.
• Most encryption protocols are known and public: security therefore lies in the protection of the encryption keys.

A different approach:
• Cryptographers have always directed their research towards algorithms that make cracking the code as difficult as possible (factorization and prime numbers, as in RSA).
• Quantum cryptography does not seek to prevent cracking but to stop or detect the interception of messages.
• Classical cryptography is based on mathematics, while quantum cryptography is based on the laws of physics.

Problems posed by classical cryptography:
• In public-key algorithms based on factorization and the laws of large numbers, the strength lies in the size of the keys and the computing power necessary to break the codes: cracking a 128-bit key means searching about 2^128 ≈ 3.4 × 10^38 possibilities, which would take billions of years with current computers; but quantum computers are coming, and around four minutes could potentially suffice. It is also possible to find workarounds or shortcuts using new mathematical tools, and some keys may be weaker than others.
• In private-key algorithms, it is the key distribution process that is the riskiest step.
• Based on Shannon's proof (published 1949) that the one-time pad, or Vernam cipher, is unbreakable, the principle of interest concerns the distribution of the encryption keys (private-key encryption): this is where quantum key distribution comes in. A sketch of the Vernam cipher follows at the end of this section.

Basic notions of quantum mechanics
• The quantum state: in quantum mechanics, the state of a system is described by an appropriate mathematical structure, associated with a complex vector space, called a Hilbert space.

Heisenberg uncertainty principle
• This principle was announced by Werner Heisenberg in 1927. It states that it is not possible to simultaneously obtain arbitrarily precise information about both the position and the momentum of a particle: measuring one disturbs the other. That is, if we pin down a particle's position x, we lose precision on its momentum p, and vice versa; formally, Δx · Δp ≥ ħ/2.
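Here is the state-vector sketch promised in the qubit discussion above: a minimal, illustrative simulation of a single qubit in plain Python (not a real quantum SDK), showing amplitudes on both 0 and 1 and Born-rule measurement:

```python
import random

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.

def hadamard(state):
    """Apply the Hadamard gate: sends |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the superposition: return 0 or 1 with the Born-rule probabilities."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1 + 0j, 0 + 0j)            # start in |0>
qubit = hadamard(qubit)             # amplitudes 1/sqrt(2) on both 0 and 1
samples = [measure(qubit) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~0.5: about half the measurements give 1
```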
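And the Vernam cipher sketch promised above: a one-time pad is just an XOR of the message with a random key of equal length; the key distribution step, which quantum key distribution targets, is the hard part. This is illustrative code, assuming Python's standard secrets module for randomness:

```python
import secrets

# Vernam cipher / one-time pad: XOR each message byte with a key byte.
# Shannon proved this unbreakable *if* the key is random, secret, and never reused.

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

message = b"RENDEZVOUS AT DAWN"
key = secrets.token_bytes(len(message))        # the hard part: sharing this securely
ciphertext = otp_encrypt(message, key)
assert otp_encrypt(ciphertext, key) == message  # XOR is its own inverse
print(ciphertext.hex())
```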
Single slit experiment
There is a simple experiment physicists commonly use to show the uncertainty principle in action. It is called the single slit experiment, and it goes as follows: a laser beam is fired through a single wide vertical slit and projected onto a screen. With the wide slit we see exactly what we expect: a dot projected on the screen. Now, if we make the slit narrower and narrower, the sides of the dot start to get narrower too. Nevertheless, at around 1/100 of an inch, the uncertainty principle kicks in and, according to Heisenberg, the direction of the beam becomes uncertain: we now observe the light spreading, becoming wider and wider. It sounds crazy: how can the light become wider if we are making the slit narrower? It is extremely non-intuitive, but that is how things work. A rough numeric estimate of the effect follows below.
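A rough order-of-magnitude estimate (the wavelength and slit widths are illustrative assumptions, not from the slides): confining the photon's transverse position to the slit width Δy forces a momentum spread Δp ≳ ħ/(2Δy), so the beam spreads by roughly Δp/p radians, and the spread grows as the slit narrows:

```python
# Back-of-the-envelope single-slit spread from the uncertainty principle.
HBAR = 1.054e-34             # reduced Planck constant, J*s
WAVELENGTH = 633e-9          # m, a red laser (illustrative choice)
P = 6.626e-34 / WAVELENGTH   # photon momentum p = h / lambda

for slit_width in (1e-3, 2.5e-4, 1e-5):   # meters; 2.5e-4 m ~ 1/100 inch
    dp = HBAR / (2 * slit_width)           # minimum transverse momentum spread
    angle = dp / P                         # small-angle spread in radians
    print(f"slit {slit_width:.0e} m -> angular spread ~ {angle:.2e} rad")

# The narrower the slit, the larger the spread: the dot on the screen widens.
```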
Wave-particle duality
Einstein devised a box that he thought would be able to register the precise moment a particle of light was emitted through a small opening in the side of the box and, at the same time, measure its weight: Einstein's experimental box to disprove the uncertainty principle.

Bell's Inequality: A Test for Entanglement