The concept of quantum mechanics: a different look

The word “quantum” comes from the Latin quantum (“how much, how great”) and the English quantum (“quantity, portion, quantum”). “Mechanics” has long been the name of the science of the motion of matter. Accordingly, the term “quantum mechanics” means the science of the motion of matter in portions (or, in modern scientific language, the science of the motion of quantized matter). The term “quantum” was introduced by the German physicist Max Planck (see Planck's constant) to describe the interaction of light with atoms.

Quantum mechanics often contradicts our common-sense notions. And all because common sense draws on everyday experience, and in everyday experience we deal only with large objects and phenomena of the macroworld, while at the atomic and subatomic level material particles behave completely differently. The Heisenberg uncertainty principle precisely outlines the meaning of these differences. In the macroworld we can reliably and unambiguously determine the location (spatial coordinates) of any object (for example, this book). It does not matter whether we use a ruler, radar, sonar, photometry or any other measurement method: the results will be objective and independent of the position of the book (provided, of course, that we are careful in the measurement process). Some uncertainty and inaccuracy are possible, but only because of the limited capabilities of measuring instruments and observational errors. To get more accurate and reliable results, we need only take a more accurate measuring device and use it without errors.

But if instead of the coordinates of a book we need to measure the coordinates of a microparticle, for example an electron, we can no longer neglect the interaction between the measuring device and the object of measurement. The force a ruler or other measuring device exerts on a book is negligible and does not affect the measurement results; but to measure the spatial coordinates of an electron we have to launch a photon, another electron or some other elementary particle of comparable energy toward it and measure the deflection. In doing so, the electron being measured will itself change its position in space as a result of the interaction. Thus the very act of measurement changes the position of the measured object, and the inaccuracy of the measurement is determined by the very fact of measurement, not by the accuracy of the measuring device used. This is the situation we have to accept in the microworld: measurement is impossible without interaction, and interaction is impossible without influencing the measured object and thereby distorting the results.

Only one thing can be stated about the results of this interaction:

uncertainty of spatial coordinates × uncertainty of particle velocity > h/m,

or, in mathematical terms:

Δx × Δv > h/m

where Δx and Δv are the uncertainties of the particle's spatial position and velocity, respectively, h is Planck's constant, and m is the particle's mass.

Accordingly, uncertainty arises in determining the spatial coordinates not only of an electron but of any subatomic particle, and not only the coordinates but other properties as well, such as velocity. The measurement error of any such pair of mutually related characteristics is determined in a similar way (another example of such a pair is the energy emitted by an electron and the time interval during which it is emitted). That is, if we have managed to measure the spatial position of an electron with high accuracy, then at that same moment we have only the vaguest idea of its velocity, and vice versa. Naturally, real measurements never reach either of these two extremes, and the situation always lies somewhere in between. That is, if we managed, for example, to measure the position of an electron with an accuracy of 10⁻⁶ m, then we can simultaneously measure its velocity, at best, with an accuracy of about 650 m/s.
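The order of magnitude here is easy to check numerically. A minimal sketch, using the popularized form Δx·Δv ≈ h/m quoted above (textbook statements of the principle differ by factors of 2 or 4π, and the text's figure of 650 m/s corresponds to one such convention):

```python
# Numeric check of the popularized uncertainty relation dx * dv ~ h/m
h = 6.626e-34        # Planck's constant, J*s
m_e = 9.109e-31      # electron mass, kg

dx = 1e-6            # position measured to 10^-6 m, as in the text
dv = h / (m_e * dx)  # best attainable velocity accuracy, m/s
print(f"minimum velocity uncertainty ~ {dv:.0f} m/s")  # ~ 727 m/s
```

With the exact quantum-mechanical bound Δx·Δp ≥ ħ/2 the number comes out smaller still; either way, pinning the electron down to a micrometre leaves its velocity uncertain by tens to hundreds of metres per second.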

Because of the uncertainty principle, the description of objects in the quantum microworld differs in nature from the familiar description of objects in the Newtonian macroworld. Instead of the spatial coordinates and velocity we are used to in describing mechanical motion (for example, of a ball on a billiard table), objects in quantum mechanics are described by a so-called wave function. The crest of the “wave” corresponds to the maximum probability of finding the particle in space at the moment of measurement. The motion of such a wave is described by the Schrödinger equation, which tells us how the state of a quantum system changes over time.

The picture of quantum events in the microworld drawn by the Schrödinger equation is one in which particles are likened to individual tidal waves propagating over the surface of an ocean of space. Over time, the crest of the wave (corresponding to the peak probability of finding a particle, such as an electron, in space) moves through space in accordance with the wave function, which is the solution of this differential equation. Accordingly, what we traditionally think of as a particle exhibits, at the quantum level, a number of properties characteristic of waves.

Reconciling the wave and corpuscular properties of microworld objects (see de Broglie relation) became possible after physicists agreed to consider quantum objects not as particles or waves, but as something intermediate possessing both wave and corpuscular properties; Newtonian mechanics has no analogues for such objects. Although even with this solution there are still plenty of paradoxes in quantum mechanics (see Bell's theorem), no one has yet proposed a better model for describing the processes occurring in the microworld.

Representations in the physics of the atomic nucleus

The emergence of quantum mechanics.

Quantum mechanics is a physical theory that studies motion at the micro level.

Even at the end of the 19th century, most scientists inclined to the view that the physical picture of the world was basically complete and would remain unshakable in the future; only details remained to be clarified. But in the first decades of the 20th century physical views changed radically. This was the consequence of a “cascade” of scientific discoveries made during an extremely short historical period spanning the last years of the 19th century and the first decades of the 20th.

In 1896 the French physicist Antoine Henri Becquerel (1852-1908) discovered the phenomenon of spontaneous radiation by a uranium salt.

His research was continued by the French physicists Pierre Curie (1859-1906) and Marie Skłodowska-Curie (1867-1934). In 1898 they discovered new elements that also possessed the property of emitting “Becquerel rays”: polonium and radium. The Curies called this property radioactivity.

A year earlier, in 1897, at the Cavendish Laboratory in Cambridge, while studying electric discharge in gases (cathode rays), the English physicist Joseph John Thomson (1856-1940) had discovered the first elementary particle, the electron.

In 1911, the famous English physicist Ernest Rutherford (1871-1937) proposed his own model of the atom, which was called planetary.

N. Bohr, taking Rutherford's model as his starting point, developed the quantum theory of atomic structure in 1913.

Principles of Quantum Mechanics

Heisenberg uncertainty principle: “It is impossible to simultaneously determine exactly both the coordinates and the velocity of a quantum particle.”

In the first quarter of the twentieth century, physicists encountered precisely this puzzle when they began to study the behavior of matter at the atomic and subatomic levels.

The Heisenberg principle plays a key role in quantum mechanics, if only because it quite clearly explains how and why the microworld differs from the material world we are familiar with.

To find, for example, a book upon entering a room, you glance around it until your gaze stops on the book. In the language of physics, this means that you have made a visual measurement (found the book by looking) and obtained a result: you have recorded its spatial coordinates (determined the location of the book in the room).



In the early 1920s, during the explosion of creative thought that led to the creation of quantum mechanics, the young German theoretical physicist Werner Heisenberg was the first to recognize this problem. He formulated the uncertainty principle now named after him: it is impossible to simultaneously determine exactly both the coordinates and the velocity of a quantum particle.

The term “uncertainty of spatial coordinates” means precisely that we do not know the exact location of the particle. For example, if you use the global GPS system to determine the location of the book, the system will calculate its coordinates to within 2-3 meters. And here we come to the most fundamental difference between the microworld and our everyday physical world. In the ordinary world, when measuring the position and velocity of a body in space, we have practically no influence on it. Thus, ideally, we can simultaneously measure both the velocity and the coordinates of an object absolutely exactly (in other words, with zero uncertainty). Now suppose we need to fix the spatial location of an electron. We still need a measuring tool that will interact with the electron and return a signal to the detectors with information about its location.

If we manage to determine one of the measured quantities with zero error (absolutely exactly), the uncertainty of the other quantity will be infinite, and we will know nothing about it at all. In other words, if we could establish the coordinates of a quantum particle absolutely exactly, we would not have the slightest idea of its velocity; if we could record the velocity of a particle exactly, we would have no idea where it is.

The uncertainty principle does not prevent us from measuring either of these quantities separately with any desired accuracy. It only asserts that we are unable to know both reliably at the same time.

The key to the Heisenberg relation is the interaction between the particle being measured and the measuring instrument, which influences the results.

N. Bohr's principle of complementarity: “Objects of the microworld are described both as particles and as waves, and one description complements the other.”

In everyday life there are two ways to transfer energy through space: by particles or by waves. To knock off the table, say, a domino balanced on its edge, you can give it the necessary energy in two ways. First, you can throw another domino at it (that is, transfer a point impulse by means of a particle). Second, you can build a row of dominoes in a chain leading to the one on the edge of the table and topple the first onto the second: the impulse will then be transmitted along the chain, the second domino toppling the third, the third the fourth, and so on. This is the wave principle of energy transfer. In everyday life there are no visible contradictions between the two mechanisms: a basketball is a particle, sound is a wave, and everything is clear.

In quantum mechanics, however, things are not so simple. Even the simplest experiments with quantum objects very soon make it clear that the familiar principles and laws of the macroworld do not apply in the microworld. Light, which we are accustomed to think of as a wave, sometimes behaves as if it consists of a stream of particles (photons), while elementary particles such as the electron or even the massive proton often exhibit the properties of a wave. If you “shoot” electrons one at a time, each of them will leave a clear mark on the screen, that is, behave like a particle. Most interestingly, the same thing happens if instead of a beam of electrons you take a beam of photons: in the beam they behave like waves, but individually like particles.

In other words, objects in the microworld that behave like particles seem at the same time to “remember” their wave nature, and vice versa. This strange property of microworld objects is called wave-particle duality.

The principle of complementarity is a simple statement of this fact. According to it, if we measure the properties of a quantum object as a particle, we see that it behaves like a particle; if we measure its wave properties, it behaves for us like a wave. The two pictures do not contradict each other at all: they precisely complement one another, which is reflected in the name of the principle.

The structure of the atom.

The planetary model of the atom was proposed by Rutherford as a result of his discovery of the atomic nucleus:
1. At the center of the atom there is a positively charged nucleus, occupying an insignificant part of the space inside the atom.
2. The entire positive charge and almost the entire mass of the atom are concentrated in its nucleus (the mass of the electron is 1/1823 amu).
3. Electrons revolve around the nucleus in closed orbits; their number is equal to the charge of the nucleus.
Atomic nucleus

The nucleus of an atom consists of protons and neutrons (collectively called nucleons). It is characterized by three parameters: A, the mass number; Z, the nuclear charge, equal to the number of protons; and N, the number of neutrons in the nucleus. These parameters are related by:
A = Z + N.
The number of protons in the nucleus is equal to the atomic number of the element.
The nuclear charge is usually written at the bottom left of the element symbol and the mass number at the top left (the nuclear charge is often omitted).
Example: ⁴⁰₁₈Ar. The nucleus of this atom contains 18 protons and 22 neutrons.
Atoms whose nuclei contain the same number of protons but different numbers of neutrons are called isotopes, for example ¹²₆C and ¹³₆C. The hydrogen isotopes have special symbols and names: ¹H (protium), ²D (deuterium), ³T (tritium). The chemical properties of isotopes are identical, and some physical properties differ very slightly.
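The bookkeeping in the relation A = Z + N can be sketched in a few lines (the helper function below is illustrative, not part of the original text):

```python
# Nucleon bookkeeping: A = Z + N, so the neutron count is N = A - Z.
def neutrons(A, Z):
    """Neutrons in a nucleus with mass number A and nuclear charge Z."""
    return A - Z

print(neutrons(40, 18))                  # argon-40 from the example: 22
print(neutrons(12, 6), neutrons(13, 6))  # the carbon isotopes: 6 and 7
```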

Radioactivity

Radioactivity is the spontaneous transformation of unstable atomic nuclei into the nuclei of other elements, accompanied by the emission of particles. The corresponding elements are called radioactive, and their nuclei radionuclides.

In 1899, as a result of his experiments, E. Rutherford discovered that radioactive radiation is inhomogeneous and, under the influence of a strong magnetic field, splits into two components, α- and β-rays. The third component, γ-rays, was discovered by the French physicist P. Villard in 1900.

Gamma rays cause the ionization of atoms of matter. The main processes that occur when gamma radiation passes through matter:

Photoelectric effect: the energy of a gamma ray is absorbed by an electron in the shell of an atom, and the electron, performing the work function, leaves the atom (which becomes ionized, i.e. turns into an ion).
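The energy balance behind this description is E_k = hν − W: part of the quantum's energy hν pays for the work function W, and the remainder becomes the electron's kinetic energy. A small sketch with illustrative numbers (the frequency and work function below are assumptions, not values from the text):

```python
# Photoelectric energy balance: E_k = h*nu - W.
h = 6.626e-34    # Planck's constant, J*s
eV = 1.602e-19   # joules per electron-volt

nu = 1.5e15      # assumed frequency of the incident quantum, Hz
W = 4.3 * eV     # assumed work function, J

E_k = h * nu - W                   # kinetic energy of the ejected electron
print(f"E_k ~ {E_k / eV:.2f} eV")  # positive, so the electron escapes
```

If hν were below W, E_k would come out negative and no electron would be emitted at all, which is exactly the frequency threshold characteristic of the photoelectric effect.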

The knocking out of electrons from the surface of conducting materials by light is a phenomenon widely used in everyday life. For example, some alarm systems work by transmitting visible or infrared light beams to a photovoltaic cell, from which electrons are knocked out, providing electrical conductivity in the circuit into which the cell is included. If an obstacle appears in the path of the light beam, the light stops reaching the sensor, the flow of electrons stops, the circuit is broken, and the alarm is triggered.

Irradiation with γ-rays, depending on the dose and duration, can cause chronic and acute radiation sickness. The effects of radiation include various types of cancer. At the same time, gamma irradiation suppresses the growth of cancer and other rapidly dividing cells. Gamma radiation is a mutagenic factor.

Applications of gamma radiation:

Gamma flaw detection, inspection of products by transillumination with γ-rays.

Food preservation.

Sterilization of medical materials and equipment.

Radiation therapy.

Level gauges

Gamma altimeters, measuring the distance to the surface when landing spacecraft.

Gamma sterilization of spices, grains, fish, meat and other products to increase shelf life.

Types of radioactivity

The fission of an atomic nucleus can be spontaneous or forced (as a result of interaction with other particles, primarily neutrons). The fission of heavy nuclei is an exothermic process in which a large amount of energy is released as kinetic energy of the reaction products and as radiation. Nuclear fission serves as the source of energy in nuclear reactors and nuclear weapons. It has been established that all chemical elements with atomic number greater than 82 (that is, starting with bismuth) are radioactive, as are some lighter elements (promethium and technetium have no stable isotopes at all, while some elements, such as indium, potassium or calcium, have some stable natural isotopes and others that are radioactive).

In the spring of 1913, Soddy formulated the displacement rules:

The emission of an α-particle reduces the atomic mass by 4 and shifts the element 2 places to the left in the periodic system.

The emission of a β-particle shifts the element 1 place to the right, almost without changing its mass.
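Soddy's rules are mechanical enough to express as a tiny function; the sketch below simply encodes the two shifts stated above (A is the mass number, Z the charge, i.e. the place in the periodic system):

```python
# Soddy's displacement rules: alpha-decay takes (A, Z) -> (A-4, Z-2),
# beta-decay takes (A, Z) -> (A, Z+1).
def decay(A, Z, kind):
    if kind == "alpha":
        return A - 4, Z - 2
    if kind == "beta":
        return A, Z + 1
    raise ValueError(f"unknown decay kind: {kind}")

# Example: uranium-238 alpha-decays to thorium-234.
print(decay(238, 92, "alpha"))  # (234, 90)
```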

PLAN

INTRODUCTION

1. HISTORY OF THE CREATION OF QUANTUM MECHANICS

2. THE PLACE OF QUANTUM MECHANICS AMONG OTHER SCIENCES ABOUT MOTION

CONCLUSION

LITERATURE

Introduction

Quantum mechanics is a theory that establishes the method of description and the laws of motion of microparticles (elementary particles, atoms, molecules, atomic nuclei) and of their systems (for example, crystals), as well as the connection between the quantities characterizing particles and systems and the physical quantities directly measured in macroscopic experiments. The laws of quantum mechanics (hereinafter QM) form the foundation for the study of the structure of matter. They made it possible to clarify the structure of atoms, establish the nature of chemical bonds, explain the periodic system of the elements, understand the structure of atomic nuclei, and study the properties of elementary particles.

Since the properties of macroscopic bodies are determined by the motion and interaction of the particles of which they are composed, the laws of quantum mechanics underlie the understanding of most macroscopic phenomena. QM made it possible, for example, to explain the temperature dependence of the heat capacity of gases and solids and to calculate it, and to determine the structure and understand many properties of solids (metals, dielectrics, semiconductors). Only on the basis of quantum mechanics was it possible to explain consistently such phenomena as ferromagnetism, superfluidity and superconductivity, to understand the nature of such astrophysical objects as white dwarfs and neutron stars, and to elucidate the mechanism of the thermonuclear reactions in the Sun and stars. There are also phenomena (for example, the Josephson effect) in which the laws of quantum mechanics manifest themselves directly in the behavior of macroscopic objects.

Thus, quantum mechanical laws underlie the operation of nuclear reactors, determine the possibility of thermonuclear reactions under terrestrial conditions, manifest themselves in a number of phenomena in metals and semiconductors used in the latest technology, etc. The foundation of such a rapidly developing field of physics as quantum electronics is the quantum mechanical theory of radiation. The laws of quantum mechanics are used in the targeted search and creation of new materials (especially magnetic, semiconductor, and superconducting materials). Quantum mechanics is becoming to a large extent an “engineering” science, the knowledge of which is necessary not only for research physicists, but also for engineers.

1. History of the creation of quantum mechanics

At the beginning of the 20th century, two (seemingly unrelated) groups of phenomena were discovered that indicated the inapplicability of the usual classical theory of the electromagnetic field (classical electrodynamics) to the interaction of light with matter and to the processes occurring in the atom. The first group of phenomena was connected with the experimental establishment of the dual nature of light (light dualism); the second with the impossibility of explaining, on the basis of classical concepts, the stable existence of the atom, as well as the spectral regularities discovered in the study of the emission of light by atoms. The establishment of connections between these groups of phenomena and attempts to explain them on the basis of a new theory ultimately led to the discovery of the laws of quantum mechanics.

For the first time, quantum concepts (including the quantum constant h) were introduced into physics in the work of M. Planck (1900), devoted to the theory of thermal radiation.

The theory of thermal radiation that existed at that time, built on the basis of classical electrodynamics and statistical physics, led to the meaningless result that thermal (thermodynamic) equilibrium between radiation and matter could not be achieved, because all energy must sooner or later turn into radiation. Planck resolved this contradiction, obtaining results in excellent agreement with experiment, on the basis of an extremely bold hypothesis. In contrast to the classical theory of radiation, which considers the emission of electromagnetic waves a continuous process, Planck suggested that light is emitted in definite portions of energy, quanta. The magnitude of such a quantum of energy depends on the light frequency ν and equals E = hν. From this work of Planck two interconnected lines of development can be traced, culminating in the final formulation of quantum mechanics in its two forms (1927).

The first begins with the work of Einstein (1905), in which the theory of the photoelectric effect was given - the phenomenon of light ejecting electrons from matter.

Developing Planck's idea, Einstein suggested that light is not only emitted and absorbed in discrete portions, radiation quanta, but also propagates in such quanta; that is, discreteness is inherent in light itself, which consists of separate portions, light quanta (later called photons). The photon energy E is related to the oscillation frequency ν of the wave by Planck's relation E = hν.

Further evidence of the corpuscular nature of light was obtained in 1922 by A. Compton, who showed experimentally that the scattering of light by free electrons obeys the laws of elastic collision of two particles, a photon and an electron. The kinematics of such a collision is determined by the laws of conservation of energy and momentum; along with the energy E = hν, the photon must be assigned the momentum p = h/λ = hν/c, where λ is the wavelength of the light.

The energy and momentum of a photon are related by E = cp, the relation valid in relativistic mechanics for a particle of zero mass. Thus it was proved experimentally that, along with its known wave properties (manifested, for example, in diffraction), light also has corpuscular properties: it consists, as it were, of particles, photons. This reveals the dualism of light, its complex wave-corpuscular nature.
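The three photon relations are mutually consistent: E = hν and p = h/λ = hν/c give E = cp identically. A quick numeric sketch for visible light:

```python
# Consistency of the photon relations E = h*nu, p = h/lam, E = c*p.
h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s

lam = 500e-9    # green light, wavelength in metres
nu = c / lam    # frequency, Hz
E = h * nu      # photon energy, J
p = h / lam     # photon momentum, kg*m/s

assert abs(E - c * p) / E < 1e-12  # E = c*p holds identically here
print(E, p)
```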

Dualism is already contained in the formula E = hν, which does not allow a choice of either of the two concepts: on the left side of the equality the energy E refers to a particle, while on the right the frequency ν is a characteristic of a wave. A formal logical contradiction arose: to explain some phenomena it was necessary to assume that light has a wave nature, and to explain others, a corpuscular one. In essence, the resolution of this contradiction led to the creation of the physical foundations of quantum mechanics.

In 1924, L. de Broglie, seeking an explanation of the conditions for the quantization of atomic orbits postulated in 1913 by N. Bohr, put forward the hypothesis of the universality of wave-particle duality. According to de Broglie, with every particle, regardless of its nature, there should be associated a wave whose length λ is related to the particle's momentum p by λ = h/p. By this hypothesis, not only photons but all “ordinary particles” (electrons, protons, etc.) possess wave properties, which should manifest themselves, in particular, in diffraction.

In 1927, C. Davisson and L. Germer first observed electron diffraction. Later, wave properties were discovered in other particles, and the validity of de Broglie's formula was confirmed experimentally.
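A short calculation shows why electrons diffract on crystals: at the energies of the Davisson-Germer experiment (about 54 eV), the de Broglie wavelength λ = h/p is comparable to interatomic distances. A sketch, treating the electron non-relativistically:

```python
# De Broglie wavelength lam = h/p for a 54 eV electron.
h = 6.626e-34    # Planck's constant, J*s
m_e = 9.109e-31  # electron mass, kg
eV = 1.602e-19   # joules per electron-volt

E = 54 * eV                 # kinetic energy
p = (2 * m_e * E) ** 0.5    # momentum from E = p^2 / (2m)
lam = h / p                 # de Broglie wavelength, metres
print(f"lambda ~ {lam * 1e9:.2f} nm")  # ~0.17 nm, about one atomic spacing
```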

In 1926, E. Schrödinger proposed an equation describing the behavior of such “waves” in external force fields. This is how wave mechanics arose. The Schrödinger wave equation is the basic equation of nonrelativistic quantum mechanics.

In 1928, P. Dirac formulated a relativistic equation describing the motion of an electron in an external force field; The Dirac equation became one of the basic equations of relativistic quantum mechanics.

The second line of development begins with Einstein's work of 1907 on the theory of the heat capacity of solids (it, too, is a generalization of Planck's hypothesis). Electromagnetic radiation, being a set of electromagnetic waves of different frequencies, is dynamically equivalent to a certain set of oscillators (oscillatory systems). The emission or absorption of waves is equivalent to the excitation or damping of the corresponding oscillators. The fact that the emission and absorption of electromagnetic radiation by matter occurs in energy quanta hν means that the energy of the radiation oscillators is quantized. Einstein generalized this idea of quantizing the energy of an electromagnetic-field oscillator to an oscillator of arbitrary nature. Since the thermal motion of a solid reduces to vibrations of its atoms, a solid is dynamically equivalent to a set of oscillators. The energy of such oscillators is also quantized, i.e. the difference between neighboring energy levels (the energies the oscillator may have) must equal hν, where ν is the vibration frequency of the atoms.
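The quantization described here means the oscillator's allowed energies form a ladder with step hν. A minimal sketch (the vibration frequency below is an assumed, illustrative value, and the zero-point energy is ignored, as in the original Planck-Einstein picture):

```python
# Quantized oscillator: allowed energies E_n = n*h*nu, spaced by h*nu.
h = 6.626e-34   # Planck's constant, J*s
nu = 5e12       # assumed atomic vibration frequency, Hz

levels = [n * h * nu for n in range(4)]          # first few allowed energies
gaps = [b - a for a, b in zip(levels, levels[1:])]
assert all(abs(g - h * nu) < 1e-33 for g in gaps)  # neighbors differ by h*nu
print(levels)
```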

Einstein's theory, refined by P. Debye, M. Born and T. Carman, played an outstanding role in the development of the theory of solids.

In 1913, N. Bohr applied the idea of ​​energy quantization to the theory of atomic structure, the planetary model of which followed from the results of experiments by E. Rutherford (1911). According to this model, at the center of the atom there is a positively charged nucleus, in which almost the entire mass of the atom is concentrated; Negatively charged electrons rotate in orbits around the nucleus.

Consideration of such motion on the basis of classical concepts led to a paradoxical result, the impossibility of the stable existence of atoms: according to classical electrodynamics, an electron cannot move stably in an orbit, since a revolving electric charge must emit electromagnetic waves and therefore lose energy. The radius of its orbit should decrease, and in a time of about 10⁻⁸ s the electron should fall onto the nucleus. This meant that the laws of classical physics are not applicable to the motion of electrons in the atom, for atoms exist and are extremely stable.

To explain the stability of atoms, Bohr suggested that of all the orbits allowed by Newtonian mechanics for the motion of an electron in the electric field of an atomic nucleus, only those that satisfy certain quantization conditions are actually realized. That is, in an atom there are (as in an oscillator) discrete energy levels.

These levels obey a certain regularity, derived by Bohr from a combination of the laws of Newtonian mechanics with quantization conditions requiring that the magnitude of the action for a classical orbit be an integer multiple of Planck's constant.

Bohr postulated that, being at a certain energy level (i.e., performing the orbital motion allowed by the quantization conditions), the electron does not emit light waves.

Radiation occurs only when the electron passes from one orbit to another, i.e. from one energy level Eᵢ to another with lower energy Eₖ; a light quantum is then born with energy equal to the difference of the energies of the levels between which the transition occurs:

hν = Eᵢ − Eₖ. (1)

This is how a line spectrum arises, the main feature of atomic spectra. Bohr obtained the correct formula for the frequencies of the spectral lines of the hydrogen atom (and of hydrogen-like atoms), covering a set of previously discovered empirical formulas.
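Formula (1), combined with the hydrogen level scheme Eₙ = −13.6 eV/n² (the form Bohr's result takes in modern notation), reproduces the observed lines. A sketch computing the first Balmer line:

```python
# Bohr transition frequencies for hydrogen: h*nu = E_i - E_k,
# with level energies E_n = -13.6 eV / n^2.
h = 6.626e-34    # Planck's constant, J*s
eV = 1.602e-19   # joules per electron-volt

def level(n):
    """Energy of level n of the hydrogen atom, in joules."""
    return -13.6 * eV / n**2

# First Balmer line: transition from n=3 to n=2 (the red H-alpha line).
nu = (level(3) - level(2)) / h
print(f"nu ~ {nu:.3e} Hz")  # ~4.57e14 Hz, i.e. a wavelength near 656 nm
```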

The existence of energy levels in atoms was directly confirmed by the Franck-Hertz experiments (1913-14). It was found that electrons bombarding a gas lose, in collisions with atoms, only definite portions of energy, equal to the differences between the energy levels of the atom.

N. Bohr, using the quantum constant h, which reflects the dualism of light, showed that this quantity also determines the motion of electrons in the atom (and that the laws of this motion differ essentially from the laws of classical mechanics). This fact was later explained on the basis of the universality of the wave-particle duality contained in de Broglie's hypothesis. The success of Bohr's theory, like the earlier successes of quantum theory, was achieved at the cost of the logical integrity of the theory: on the one hand Newtonian mechanics was used, on the other, artificial quantization rules alien to it, which moreover contradicted classical electrodynamics. In addition, Bohr's theory could not explain the motion of electrons in complex atoms or the formation of molecular bonds.

Bohr’s “semiclassical” theory also could not answer the question of how an electron moves when transitioning from one energy level to another.

Further intensive development of questions of atomic theory led to the conviction that, while preserving the classical picture of the motion of an electron in orbit, it is impossible to construct a logically coherent theory.

Awareness of the fact that the movement of electrons in an atom is not described in terms (concepts) of classical mechanics (as movement along a certain trajectory) led to the idea that the question of the movement of an electron between levels is incompatible with the nature of the laws that determine the behavior of electrons in an atom, and that a new theory is needed, which would include only quantities related to the initial and final stationary states of the atom.

In 1925, W. Heisenberg managed to construct a formal scheme in which, instead of the coordinates and velocities of the electron, certain abstract algebraic quantities - matrices - appeared; the connection between matrices and observable quantities (energy levels and intensities of quantum transitions) was given by simple consistent rules. Heisenberg's work was developed by M. Born and P. Jordan. This is how matrix mechanics arose. Soon after the appearance of the Schrödinger equation, the mathematical equivalence of wave (based on the Schrödinger equation) and matrix mechanics was shown. In 1926 M. Born gave a probabilistic interpretation of de Broglie waves (see below).

The works of Dirac dating back to the same time played a major role in the creation of quantum mechanics. The final formation of quantum mechanics as a consistent physical theory with clear foundations and a harmonious mathematical apparatus occurred after the work of Heisenberg (1927), in which the uncertainty relation was formulated - the most important relationship that illuminates the physical meaning of the equations of quantum mechanics, its connection with classical mechanics and other fundamental issues and qualitative results of quantum mechanics. This work was continued and generalized in the works of Bohr and Heisenberg.

A detailed analysis of atomic spectra led to the concept (first introduced by G. E. Uhlenbeck and S. Goudsmit and developed by W. Pauli) that the electron, in addition to charge and mass, must be assigned one more internal characteristic (quantum number): spin.

An important role was played by the so-called exclusion principle discovered by W. Pauli (1925), which has fundamental significance in the theory of the atom, molecule, nucleus, and solid body.

Within a short time, quantum mechanics was successfully applied to a wide range of phenomena. The theories of atomic spectra, molecular structure, chemical bonding, D.I. Mendeleev's periodic system, metallic conductivity and ferromagnetism were created. These and many other phenomena have become (at least qualitatively) clear.

A. Shishlova, based on materials from the journals "Advances in Physical Sciences" and "Scientific American".

The quantum mechanical description of the physical phenomena of the microworld is considered the only correct one and the most fully consistent with reality. Objects of the macroworld obey the laws of a different, classical mechanics. The boundary between the macro- and microworld is blurred, and this gives rise to a number of paradoxes and contradictions. Attempts to eliminate them lead to other views on quantum mechanics and the physics of the microworld. The American theorist David Joseph Bohm (1917-1992) was perhaps able to express them best.

1. A thought experiment on measuring a component of the spin (intrinsic angular momentum) of an electron using a certain device, a "black box".

2. Consecutive measurement of two spin components. The “horizontal” spin of the electron is measured (on the left), then the “vertical” spin (on the right), then again the “horizontal” spin (below).

3A. Electrons with a “right” spin after passing through a “vertical” box move in two directions: up and down.

3B. In the same experiment, we will place a certain absorbing surface on the path of one of the two beams. Further, only half of the electrons participate in the measurements, and at the output, half of them have a “left” spin, and half have a “right” spin.

4. The state of any object in the microworld is described by the so-called wave function.

5. Thought experiment by Erwin Schrödinger.

6. The experiment proposed by D. Bohm and Ya. Aharonov in 1959 was supposed to show that a magnetic field inaccessible to a particle affects its state.

To understand what difficulties modern quantum mechanics is experiencing, we need to remember how it differs from classical, Newtonian mechanics. Newton created a general picture of the world in which mechanics acted as a universal law of motion of material points or particles, small lumps of matter. Any objects could be built from these particles. It seemed that Newtonian mechanics was capable of theoretically explaining all natural phenomena. However, at the end of the 19th century it became clear that classical mechanics is unable to explain the laws of thermal radiation of heated bodies. This seemingly particular question led to the need to revise physical theories and demanded new ideas.

In 1900, the work of the German physicist Max Planck appeared, in which these new ideas appeared. Planck suggested that radiation occurs in portions, quanta. This idea contradicted classical views, but perfectly explained the results of experiments (in 1918 this work was awarded the Nobel Prize in Physics). Five years later, Albert Einstein showed that not only radiation, but also absorption of energy should occur discretely, in portions, and was able to explain the features of the photoelectric effect (Nobel Prize 1921). According to Einstein, a light quantum - photon, having wave properties, at the same time in many ways resembles a particle (corpuscle). Unlike a wave, for example, it is either completely absorbed or not absorbed at all. This is how the principle of wave-particle duality of electromagnetic radiation arose.

In 1924, the French physicist Louis de Broglie put forward a rather “crazy” idea, suggesting that all particles without exception - electrons, protons and entire atoms - have wave properties. A year later, Einstein said of this work: “Although it seems like it was written by a madman, it was written solidly,” and in 1929 de Broglie received the Nobel Prize for it...

At first glance, everyday experience rejects de Broglie's hypothesis: there seems to be nothing "wavelike" about the objects around us. Calculations, however, show that the de Broglie wavelength of an electron accelerated to an energy of 100 electron-volts is 10⁻⁸ centimeters. This wave can easily be detected experimentally by passing a stream of electrons through a crystal: diffraction of their waves on the crystal lattice produces a characteristic striped pattern. But for a speck of dust weighing 0.001 grams moving at the same speed, the de Broglie wavelength is 10²⁴ times smaller and cannot be detected by any means.
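The two numbers quoted above can be checked directly from the de Broglie relation λ = h/p. A minimal sketch (using rounded values of the physical constants):

```python
import math

# de Broglie wavelength: lambda = h / p, with p = sqrt(2 m E) for a
# non-relativistic particle of mass m and kinetic energy E.
h = 6.626e-34          # Planck's constant, J*s
m_e = 9.109e-31        # electron mass, kg
E = 100 * 1.602e-19    # 100 eV expressed in joules

p_e = math.sqrt(2 * m_e * E)
lam_e = h / p_e
print(f"electron: {lam_e:.2e} m")        # ~1.2e-10 m, i.e. ~1e-8 cm

# A 0.001-gram dust speck moving at the same speed as that electron:
v = p_e / m_e                            # electron speed, ~6e6 m/s
m_dust = 1e-6                            # 0.001 g in kg
lam_dust = h / (m_dust * v)
print(f"dust speck: {lam_dust:.2e} m")
print(f"ratio: {lam_e / lam_dust:.1e}")  # ~1e24, as stated in the text
```

The electron wavelength is comparable to interatomic spacings in a crystal, which is exactly why crystal lattices act as diffraction gratings for electron beams, while the dust speck's wavelength is hopelessly beyond any measurement.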

De Broglie waves are unlike mechanical waves - vibrations of matter propagating in space. They characterize the probability of detecting a particle at a given point in space. Any particle appears to be “smeared” in space, and there is a non-zero probability of finding it anywhere. A classic example of a probabilistic description of objects in the microworld is the experiment on electron diffraction by two slits. An electron passing through the slit is recorded on a photographic plate or on a screen in the form of a speck. Each electron can pass through either the right slit or the left slit in a completely random manner. When there are a lot of specks, a diffraction pattern appears on the screen. The blackening of the screen turns out to be proportional to the probability of an electron appearing in a given location.

De Broglie's ideas were deepened and developed by the Austrian physicist Erwin Schrödinger. In 1926, he derived an equation for the wave function, which describes the behavior of quantum objects in time depending on their energy (Nobel Prize 1933). It follows from the equation that any impact on a particle changes its state. And since measuring a particle's parameters inevitably involves such an impact, the question arises: what does the measuring device actually record when it introduces unpredictable disturbances into the state of the measured object?

Thus, the study of elementary particles has made it possible to establish at least three extremely surprising facts regarding the general physical picture of the world.

Firstly, it turned out that the processes occurring in nature are controlled by pure chance. Secondly, it is not always possible in principle to indicate the exact position of a material object in space. And thirdly, what is perhaps most strange, the behavior of such physical objects as a “measuring device” or an “observer” is not described by fundamental laws that are valid for other physical systems.

For the first time, the founders of quantum theory themselves - Niels Bohr, Werner Heisenberg, Wolfgang Pauli - came to such conclusions. Later, this point of view, called the Copenhagen interpretation of quantum mechanics, was accepted in theoretical physics as official, which was reflected in all standard textbooks.

It is quite possible, however, that such conclusions were made too hastily. In 1952, the American theoretical physicist David Bohm created a deeply developed quantum theory, different from the generally accepted one, which also explains well all the currently known features of the behavior of subatomic particles. It represents a unified set of physical laws that avoids any randomness in describing the behavior of physical objects, as well as any uncertainty in their position in space. Despite this, Bohm's theory was almost completely ignored until very recently.

To better appreciate the difficulty of describing quantum phenomena, let us conduct several thought experiments to measure the spin (intrinsic angular momentum) of an electron. Thought experiments, because no one has yet succeeded in creating a measuring device that can accurately measure both components of spin. Equally unsuccessful are attempts to predict which electrons will change their spin during the experiment described and which will not.

These experiments involve measuring two spin components, which we will conventionally call the "vertical" and "horizontal" spins. Each component, in turn, can take one of two values, which we will also conventionally call the "up" and "down", "right" and "left" spins, respectively. The measurement is based on the spatial separation of particles with different spins. The devices that carry out the separation can be pictured as "black boxes" of two types, "horizontal" and "vertical" (Fig. 1). It is known that the different spin components of a free particle are completely independent (physicists say they do not correlate with each other). However, during the measurement of one component, the value of the other may change, in a completely uncontrollable manner (Fig. 2).

Trying to explain the results obtained, traditional quantum theory came to the conclusion that it is necessary to completely abandon the deterministic description of microworld phenomena, that is, a description that completely determines the state of the object. The behavior of electrons is subject to the uncertainty principle, according to which the spin components cannot be accurately measured simultaneously.

Let's continue our thought experiments. Now we will not only split electron beams, but also make them reflect from certain surfaces, intersect and reconnect into one beam in a special "black box" (Fig. 3).

The results of these experiments contradict conventional logic. Indeed, consider the behavior of any single electron in the case when there is no absorbing wall (Fig. 3A). Where will it go? Say, down. Then, if the electron initially had a "right" spin, it will keep that spin until the end of the experiment. However, applying the results of the other experiment (Fig. 3B) to this electron, we see that its "horizontal" spin at the output should be "right" in half the cases and "left" in half the cases. An obvious contradiction. Could the electron have gone up? No, for the same reason. Perhaps it moved neither down nor up but along some other path? But by blocking the upper and lower routes with absorbing walls, we get nothing at all at the exit. It remains to assume that the electron moves in two directions at once. Then, given the opportunity to fix its position at different times, in half the cases we would find it on the way up, and in half on the way down. The situation is quite paradoxical: a material particle can neither split in two nor "jump" from one trajectory to another.

What does traditional quantum theory say in this case? It simply declares all the situations considered impossible, and the very formulation of the question about a certain direction of motion of the electron (and, accordingly, the direction of its spin) is incorrect. The manifestation of the quantum nature of the electron lies in the fact that in principle there is no answer to this question. The electron state is a superposition, that is, the sum of two states, each of which has a certain value of “vertical” spin. The concept of superposition is one of the fundamental principles of quantum mechanics, with the help of which for more than seventy years it has been possible to successfully explain and predict the behavior of all known quantum systems.

To describe the states of quantum objects mathematically, a wave function is used, which in the case of a single particle is simply a function of its coordinates. The square of the modulus of the wave function equals the probability of detecting the particle at a given point in space. Thus, if a particle is located in a certain region A, its wave function is zero everywhere outside this region. Similarly, a particle localized in region B has a wave function that is nonzero only in B. If the state of the particle turns out to be a superposition of its presence in A and B, then the wave function describing this state is nonzero in both regions of space and zero everywhere outside them. However, if we set up an experiment to determine the position of such a particle, each measurement will give us only one value: in half the cases we will find the particle in region A, and in half in B (Fig. 4). This means that when the particle interacts with its surroundings and only one of its states is fixed, its wave function seems to collapse, contracting into a point.
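The half-and-half statistics described above follow from the Born rule: the probability of each outcome is the squared modulus of the corresponding amplitude. A toy numerical sketch (the two-state basis and the equal-weight superposition are our own illustrative choices):

```python
import numpy as np

# Two-state toy model: amplitude for "particle in A" and "particle in B".
# The equal superposition (|A> + |B>)/sqrt(2) gives 50/50 statistics.
rng = np.random.default_rng(0)

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # amplitudes for |A> and |B>
probs = np.abs(psi) ** 2                  # Born rule: P = |amplitude|^2
print(probs)                              # [0.5 0.5]

# Each individual measurement yields exactly one outcome ("collapse");
# only the long-run frequencies reproduce the Born-rule probabilities.
outcomes = rng.choice(["A", "B"], size=10_000, p=probs)
print((outcomes == "A").mean())           # close to 0.5
```

Note that the simulation only mimics the measurement statistics; quantum mechanics itself offers no mechanism picking the outcome of any single run, which is precisely the point the article is making.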

One of the fundamental claims of quantum mechanics is that physical objects are completely described by their wave functions. Thus, the whole point of the laws of physics comes down to predicting changes in wave functions over time. These laws fall into two categories depending on whether the system is left to itself or whether it is directly observed and measured.

In the first case, we are dealing with linear differential “equations of motion”, deterministic equations that completely describe the state of microparticles. Therefore, knowing the wave function of a particle at some point in time, one can accurately predict the behavior of the particle at any subsequent moment. However, when trying to predict the results of measurements of any properties of the same particle, we will have to deal with completely different laws - purely probabilistic ones.

A natural question arises: how do we tell when each group of laws applies? The creators of quantum mechanics point to the need for a clear division of all physical processes into "measurements" and "physical processes proper", that is, into "observers" and "observed", or, in philosophical terminology, into subject and object. However, the difference between these categories is not fundamental but purely relative. Thus, according to many physicists and philosophers, quantum theory in this interpretation becomes ambiguous and loses its objectivity and fundamentality. The "measurement problem" has become a major stumbling block for quantum mechanics. The situation is somewhat reminiscent of the famous "heap" paradox (sorites). One grain is clearly not a heap, but a thousand (or, if you prefer, a million) is a heap. Two grains are also not a heap, but 999 (or 999,999) are a heap. This chain of reasoning leads to some number of grains at which the concepts "heap" and "not a heap" become vague. They will depend on the subjective assessment of the observer, that is, on the method of measurement, even if only by eye.

All the macroscopic bodies around us are assumed to be point (or extended) objects with fixed coordinates, obeying the laws of classical mechanics. But this means that the classical description can be continued down to the smallest particles. On the other hand, coming from the side of the microworld, the wave description should extend to objects of ever larger size, up to the Universe as a whole. The boundary between the macro- and microworld is not defined, and attempts to define it lead to a paradox. The clearest example of this is the so-called "Schrödinger's cat problem", a thought experiment proposed by Erwin Schrödinger in 1935 (Fig. 5).

A cat sits in a closed box. The box also contains a bottle of poison, a radiation source and a charged-particle counter connected to a device that breaks the bottle the moment a particle is detected. If the poison spills, the cat will die. Whether the counter has registered a particle or not, we cannot know in principle: the laws of quantum mechanics are probabilistic. From this point of view, until a measurement has been made, the counter is in a superposition of two states, "registration" and "non-registration". But then at this moment the cat finds itself in a superposition of the states of life and death.

In reality, of course, there can be no real paradox here. Registration of a particle is an irreversible process. It is accompanied by a collapse of the wave function, followed by a mechanism that breaks the bottle. However, orthodox quantum mechanics does not consider irreversible phenomena. The paradox that arises in full agreement with its laws clearly shows that between the quantum microworld and the classical macroworld there is a certain intermediate region in which quantum mechanics does not work.

So, despite the undoubted success of quantum mechanics in explaining experimental facts, at the moment it can hardly claim to be a complete and universal description of physical phenomena. One of the most daring alternatives to quantum mechanics was the theory proposed by David Bohm.

Having set out to build a theory free from the uncertainty principle, Bohm proposed to consider a microparticle as a material point capable of occupying an exact position in space. Its wave function receives the status not of a characteristic of probability, but of a very real physical object, a kind of quantum mechanical field that exerts an instantaneous force effect. In the light of this interpretation, for example, the “Einstein-Podolsky-Rosen paradox” (see “Science and Life” No. 5, 1998) ceases to be a paradox. All laws governing physical processes become strictly deterministic and have the form of linear differential equations. One group of equations describes the change in wave functions over time, the other - their effect on the corresponding particles. The laws apply to all physical objects without exception - both “observers” and “observed”.

Thus, if at some moment the position of all particles in the Universe and the complete wave function of each are known, then in principle it is possible to accurately calculate the position of the particles and their wave functions at any subsequent moment in time. Consequently, there can be no talk of any randomness in physical processes. Another thing is that we will never be able to have all the information necessary for accurate calculations, and the calculations themselves turn out to be insurmountably complex. Fundamental ignorance of many system parameters leads to the fact that in practice we always operate with certain averaged values. It is this “ignorance,” according to Bohm, that forces us to resort to probabilistic laws when describing phenomena in the microworld (a similar situation arises in classical statistical mechanics, for example, in thermodynamics, which deals with a huge number of molecules). Bohm's theory provides certain rules for averaging unknown parameters and calculating probabilities.

Let us return to the experiments with electrons shown in Fig. 3 A and B. Bohm's theory gives them the following explanation. The direction of motion of the electron at the exit from the “vertical box” is completely determined by the initial conditions - the initial position of the electron and its wave function. While the electron moves either up or down, its wave function, as follows from the differential equations of motion, will split and begin to propagate in two directions at once. Thus, one part of the wave function will be “empty”, that is, it will propagate separately from the electron. Having been reflected from the walls, both parts of the wave function will reunite in the “black box”, and at the same time the electron will receive information about that part of the path where it was not. The content of this information, for example about an obstacle in the path of the “empty” wave function, can have a significant impact on the properties of the electron. This removes the logical contradiction between the results of the experiments shown in the figure. It is necessary to note one curious property of “empty” wave functions: being real, they nevertheless do not affect foreign objects in any way and cannot be recorded by measuring instruments. And the “empty” wave function exerts a force on “its” electron regardless of the distance, and this influence is transmitted instantly.

Many researchers have attempted to "correct" quantum mechanics or to explain the contradictions that arise in it. De Broglie, for example, who agreed with Einstein that "God does not play dice", tried to build a deterministic theory of the microworld. And the prominent Russian theorist D. I. Blokhintsev believed that the peculiarities of quantum mechanics stem from the impossibility of isolating a particle from the surrounding world. At any temperature above absolute zero, bodies emit and absorb electromagnetic waves. From the standpoint of quantum mechanics, this means that their position is continuously being "measured", causing wave functions to collapse. "From this point of view, there are no isolated 'free' particles left to themselves," wrote Blokhintsev. "It is possible that in this connection between particles and the environment lies the nature of the impossibility of isolating a particle, which manifests itself in the apparatus of quantum mechanics."

And yet, why has the interpretation of quantum mechanics proposed by Bohm still not received due recognition in the scientific world? And how to explain the almost universal dominance of traditional theory, despite all its paradoxes and “dark places”?

For a long time the new theory was not taken seriously, on the grounds that in predicting the outcomes of specific experiments it coincides completely with quantum mechanics without leading to significantly new results. Werner Heisenberg, for example, believed that "for any experiment his (Bohm's) results coincide with the Copenhagen interpretation. Hence the first consequence: Bohm's interpretation cannot be refuted by experiment..." Some consider the theory erroneous because it gives a predominant role to the particle's position in space. In their opinion, this contradicts physical reality, since phenomena in the quantum world fundamentally cannot be described by deterministic laws. There are many other, no less controversial, arguments against Bohm's theory that themselves require serious justification. In any case, no one has yet managed to refute it completely. Moreover, many researchers, including Russian ones, continue to work on improving it.


Quantum mechanics

What is quantum mechanics?

Quantum mechanics (QM; also known as quantum physics or quantum theory), including quantum field theory, is a branch of physics that studies the laws of nature at small distances and at low energies of atoms and subatomic particles. Classical physics, the physics that existed before quantum mechanics, follows from quantum mechanics as a limiting case valid only at large (macroscopic) scales. Quantum mechanics differs from classical physics in that energy, momentum and other quantities are often restricted to discrete values (quantization), objects have characteristics of both particles and waves (wave-particle duality), and there are limits to the precision with which quantities can be determined (the uncertainty principle).

Quantum mechanics grew successively out of Max Planck's 1900 solution of the black-body radiation problem (posed in 1859) and Albert Einstein's 1905 paper proposing a quantum theory to explain the photoelectric effect (observed in 1887). Early quantum theory was deeply rethought in the mid-1920s.

The rethought theory is formulated in the language of specially developed mathematical formalisms. In one, a mathematical function (wave function) provides information about the probability amplitude of the particle's position, momentum, and other physical characteristics.

Important areas of application of quantum theory are: quantum chemistry, superconducting magnets, light-emitting diodes, as well as laser, transistor and semiconductor devices such as the microprocessor, medical and research imaging such as magnetic resonance imaging and electron microscopy, and explanations of many biological and physical phenomena.

History of quantum mechanics

Scientific research into the wave nature of light began in the 17th and 18th centuries, when Robert Hooke, Christiaan Huygens and Leonhard Euler proposed wave theories of light based on experimental observations. In 1803, Thomas Young, an English polymath, conducted the famous double-slit experiment, which he later described in a paper entitled "On the Nature of Light and Colours". This experiment played an important role in the general acceptance of the wave theory of light.

In 1838, Michael Faraday discovered cathode rays. These studies were followed by Gustav Kirchhoff's formulation of the black-body radiation problem in 1859, Ludwig Boltzmann's suggestion in 1877 that the energy states of a physical system could be discrete, and Max Planck's quantum hypothesis of 1900. Planck's hypothesis that energy is emitted and absorbed in discrete "quanta" (or packets of energy) precisely matched the observed patterns of black-body radiation.

In 1896, Wilhelm Wien empirically determined the distribution law of black-body radiation named after him, Wien's law. Ludwig Boltzmann arrived at this result independently by analyzing Maxwell's equations. However, the law held only at high frequencies and underestimated the emission at low frequencies. Planck later corrected this model using a statistical interpretation of Boltzmann's thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics.
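The failure of Wien's form at low frequencies is easy to see numerically: Wien's expression for the spectral radiance is the high-frequency limit of Planck's law, and the two diverge once hν is no longer large compared with kT. A sketch comparing the two (the temperature chosen below, roughly that of the Sun's surface, is our own illustrative choice):

```python
import math

# Spectral radiance of a black body at temperature T:
#   Planck:  B = (2 h nu^3 / c^2) / (exp(h nu / k T) - 1)
#   Wien:    B = (2 h nu^3 / c^2) *  exp(-h nu / k T)
# Wien's form is accurate when h*nu >> k*T but underestimates
# the radiance at low frequencies.
h, c, k = 6.626e-34, 2.998e8, 1.381e-23
T = 5800.0   # roughly the Sun's surface temperature, K

def planck(nu):
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

def wien(nu):
    return (2 * h * nu**3 / c**2) * math.exp(-h * nu / (k * T))

for nu in (1e12, 1e14, 1e15):   # low -> high frequency
    print(f"nu = {nu:.0e} Hz   Wien/Planck = {wien(nu) / planck(nu):.3f}")
```

At ν = 10¹² Hz the ratio is below 0.01 (Wien misses almost all the radiance), while at ν = 10¹⁵ Hz the two laws agree to better than 0.1 percent.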

Following Max Planck's solution in 1900 of the black-body radiation problem (posed in 1859), Albert Einstein proposed a quantum theory to explain the photoelectric effect (1905; the effect itself had been observed in 1887). In the years 1900-1910, atomic theory and the corpuscular theory of light first became widely accepted as scientific fact. Accordingly, these latter theories can be considered quantum theories of matter and electromagnetic radiation.

Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman and Pieter Zeeman, each of whom has several quantum effects named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, on the basis of which Niels Bohr developed his theory of atomic structure, later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure by introducing elliptical orbits, a concept also proposed by Arnold Sommerfeld. This stage in the development of physics is known as the old quantum theory.

According to Planck, the energy E of a radiation quantum is proportional to the radiation frequency ν:

E = hν,

where h is Planck's constant.

Planck carefully insisted that this was simply a mathematical expression for the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself. In fact, he regarded his quantum hypothesis as a mathematical trick performed to get the right answer rather than as a major fundamental discovery. However, in 1905 Albert Einstein gave Planck's quantum hypothesis a physical interpretation and used it to explain the photoelectric effect, in which shining light on certain substances can cause electrons to be emitted from the substance. For this work Einstein received the 1921 Nobel Prize in Physics.

Einstein then expanded on this idea to show that an electromagnetic wave, which is what light is, can also be described as a particle (later called a photon), with discrete quantum energy that depends on the frequency of the wave.
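Einstein's explanation of the photoelectric effect reduces to simple arithmetic with E = hν: an electron is ejected only if the photon energy exceeds the metal's work function φ, and the excess hν − φ becomes the electron's kinetic energy. A minimal sketch, where the work-function value for sodium is an approximate figure used for illustration:

```python
# Photoelectric effect in numbers: a photon of frequency nu carries
# energy E = h*nu = h*c/lambda; an electron is ejected only if E
# exceeds the work function phi, with kinetic energy E - phi.
h = 6.626e-34            # Planck's constant, J*s
c = 2.998e8              # speed of light, m/s
eV = 1.602e-19           # joules per electron-volt
phi_sodium = 2.28 * eV   # approximate work function of sodium

for lam_nm in (650, 550, 300):      # red, green, ultraviolet light
    E = h * c / (lam_nm * 1e-9)     # photon energy in joules
    if E > phi_sodium:
        ke = (E - phi_sodium) / eV
        print(f"{lam_nm} nm: electron ejected, KE = {ke:.2f} eV")
    else:
        print(f"{lam_nm} nm: no emission")
```

Running this shows the threshold behavior that classical wave theory cannot explain: red and green light eject no electrons from sodium no matter how intense the beam, while ultraviolet light does, with a kinetic energy fixed by the frequency alone.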

During the first half of the 20th century, Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wilhelm Wien, Satyendra Nath Bose, Arnold Sommerfeld and others laid the foundations of quantum mechanics. Niels Bohr's Copenhagen interpretation gained the widest acceptance.

In the mid-1920s, developments in quantum mechanics made it the standard formulation for atomic physics. In the summer of 1925, Bohr and Heisenberg published results that closed the old quantum theory. In recognition of their particle-like behavior in certain processes and measurements, light quanta came to be called photons (1926). From Einstein's simple postulate arose a flurry of discussion, theoretical construction and experiment. Thus entire fields of quantum physics emerged, leading to its broad acceptance at the Fifth Solvay Conference in 1927.

It was found that subatomic particles and electromagnetic waves are neither simply particles nor waves, but have certain properties of each. This is how the concept of wave-particle duality arose.

By 1930, quantum mechanics had been further unified and formalized in the work of David Hilbert, Paul Dirac and John von Neumann, which placed great emphasis on measurement, the statistical nature of our knowledge of reality, and philosophical reflection on the "observer". It has since penetrated many disciplines, including quantum chemistry, quantum electronics, quantum optics and quantum information science. Its modern theoretical developments include string theory and theories of quantum gravity. Quantum mechanics also provides a satisfactory explanation of many features of the modern periodic table of elements, describes the behavior of atoms during chemical reactions and the motion of electrons in computer semiconductors, and therefore plays a critical role in many modern technologies.

Although quantum mechanics was built to describe the microscopic world, it is also needed to explain some macroscopic phenomena such as superconductivity and superfluidity.

What does the word quantum mean?

The word quantum comes from the Latin "quantum", which means "how much" or "how many". In quantum mechanics, a quantum is a discrete unit associated with certain physical quantities, such as the energy of an atom at rest. The discovery that particles are discrete packets of energy with wave-like properties led to the creation of the branch of physics dealing with atomic and subatomic systems that is now called quantum mechanics. It provides the mathematical foundation for many areas of physics and chemistry, including condensed-matter physics, solid-state physics, atomic physics, molecular physics, computational physics, computational chemistry, quantum chemistry, particle physics, nuclear chemistry and nuclear physics. Some fundamental aspects of the theory are still being actively studied.

The meaning of quantum mechanics

Quantum mechanics is essential for understanding the behavior of systems at atomic and smaller distance scales. If the physical nature of the atom were described solely by classical mechanics, electrons could not "orbit" the nucleus: orbiting electrons would emit radiation (because of their circular motion) and would eventually fall into the nucleus through loss of energy to radiation. Such a system could not explain the stability of atoms. Instead, electrons reside in uncertain, non-deterministic, smeared-out, probabilistic wave-particle orbitals around the nucleus, contrary to the traditional notions of classical mechanics and electromagnetism.

Quantum mechanics was originally developed to better explain and describe the atom, especially the differences in the spectra of light emitted by different isotopes of the same chemical element, as well as to describe subatomic particles. In short, the quantum mechanical model of the atom has been amazingly successful in an area where classical mechanics and electromagnetism have failed.

Quantum mechanics includes four classes of phenomena that classical physics cannot explain:

  • quantization of individual physical properties
  • quantum entanglement
  • uncertainty principle
  • wave-particle duality

Mathematical foundations of quantum mechanics

In the mathematically rigorous formulation of quantum mechanics developed by Paul Dirac, David Hilbert, John von Neumann, and Hermann Weyl, the possible states of a quantum mechanical system are represented by unit vectors (called state vectors). Formally, these belong to a complex separable Hilbert space, known as the state space or associated Hilbert space of the system, and are defined only up to multiplication by a complex number of unit modulus (a phase factor). In other words, the possible states are points in the projectivization of Hilbert space, usually called complex projective space. The exact nature of this Hilbert space depends on the system: for example, the state space for position and momentum is the space of square-integrable functions, while the state space for the spin of a single proton is just the product of two complex planes, C². Each physical quantity is represented by a Hermitian (more precisely, self-adjoint) linear operator acting on the state space. Each eigenstate of a physical quantity corresponds to an eigenvector of the operator, and the associated eigenvalue corresponds to the value of the physical quantity in that eigenstate. If the spectrum of the operator is discrete, the physical quantity can take only discrete eigenvalues.
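
As a small numerical illustration (a sketch using NumPy; the 2×2 matrix below is an arbitrary example, not tied to any particular physical system), a Hermitian matrix is guaranteed to have real eigenvalues, which is why such operators can represent measurable quantities:

```python
import numpy as np

# An arbitrary 2x2 Hermitian matrix (equal to its own conjugate transpose),
# standing in for the operator of some physical quantity.
A = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)  # Hermiticity

# Its eigenvalues are the only values a measurement of this quantity can
# return; for a Hermitian operator they are guaranteed to be real.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)  # two real numbers, here 2 - sqrt(6) and 2 + sqrt(6)
```

The eigenvectors returned alongside the eigenvalues are the corresponding eigenstates mentioned in the text.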

In the formalism of quantum mechanics, the state of a system at a given moment is described by a complex wave function, also called a state vector in a complex vector space. This abstract mathematical object allows one to calculate the probabilities of the outcomes of specific experiments. For example, it allows one to calculate the probability of finding an electron in a particular region around the nucleus at a particular time. Unlike classical mechanics, simultaneous predictions of arbitrary precision can never be made for conjugate variables such as position and momentum. For example, electrons can be assumed (with some probability) to be located somewhere within a given region of space, but their exact location is unknown. One can draw regions of constant probability, often called "clouds", around the nucleus of an atom to represent where the electron is most likely to be. The Heisenberg uncertainty principle quantifies the impossibility of precisely localizing a particle with a given momentum, the conjugate variable to position.
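
The Born rule behind these probabilities can be sketched with a standard textbook wave function, the ground state of a particle in a one-dimensional box (the box and the chosen region are illustrative, not tied to any statement in the text):

```python
import numpy as np

# Ground state of a particle in a 1D box of width L (a standard textbook
# wave function): psi(x) = sqrt(2/L) * sin(pi*x/L) for 0 <= x <= L.
L = 1.0
x = np.linspace(0.0, L, 100001)
dx = x[1] - x[0]
psi = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)

# Born rule: |psi|^2 is a probability density, so it integrates to 1, and
# the probability of finding the particle in a region is the integral of
# |psi|^2 over that region (here: the left quarter of the box).
total = np.sum(psi**2) * dx
left_quarter = np.sum(psi[x <= L / 4]**2) * dx
print(total, left_quarter)  # ~1.0 and ~0.091
```

The exact value for the left quarter is 1/4 - 1/(2π) ≈ 0.091, which the numerical sum reproduces.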

According to one interpretation, as a result of a measurement the wave function containing the probability information about the state of the system collapses from a given initial state to a particular eigenstate. The possible results of the measurement are the eigenvalues of the operator representing the physical quantity, which explains the choice of Hermitian operators: all of their eigenvalues are real numbers. The probability distribution of a physical quantity in a given state can be found by computing the spectral decomposition of the corresponding operator. The Heisenberg uncertainty principle is expressed by the statement that the operators corresponding to certain pairs of quantities do not commute.

Measurement in quantum mechanics

The probabilistic nature of quantum mechanics thus follows from the act of measurement. This is one of the most difficult aspects of quantum systems to understand, and was a central theme in Bohr's famous debate with Einstein, in which both scientists attempted to clarify these fundamental principles through thought experiments. In the decades following the formulation of quantum mechanics, the question of what constitutes a “measurement” was widely studied. New interpretations of quantum mechanics have been formulated to do away with the concept of wave function collapse. The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wave functions become entangled, so that the original quantum system ceases to exist as an independent entity.

The probabilistic nature of quantum mechanics predictions

As a rule, quantum mechanics does not assign definite values. Instead, it makes predictions in the form of probability distributions; that is, it describes the probability of obtaining each possible result from the measurement of a physical quantity. The location of an electron, for instance, is described by a probability density cloud: an approximation (though a better one than the Bohr model) in which the electron's location is given by a probability function derived from the wave function, such that the probability of finding it at a point is the squared modulus of the complex amplitude. Naturally, these probabilities depend on the quantum state at the "moment" of measurement. Consequently, uncertainty is introduced into the measured value. There are, however, states associated with definite values of a particular physical quantity. These are called eigenstates of that quantity ("eigen" can be translated from German as "own" or "inherent").

It is natural and intuitive that everything in everyday life (every physical quantity) has a definite value. Everything seems to have a definite position, a definite momentum, a definite energy, and a definite time of occurrence. However, quantum mechanics does not specify exact values for a particle's position and momentum (since they are a conjugate pair) or for its energy and time (also a conjugate pair); it provides only a probability distribution over the values such a particle may have. It is therefore useful to distinguish between states that have uncertain values and states that have definite values (eigenstates). As a rule, we are not interested in a system in which a particle has no definite value of a physical quantity. However, when a physical quantity is measured, the wave function instantly takes on an eigenstate (or "generalized" eigenstate) of that quantity. This process is called wave function collapse, a controversial and much-discussed process in which the system under study is enlarged by adding the measuring device to it. If the corresponding wave function immediately before the measurement is known, one can calculate the probability that it will collapse into each of the possible eigenstates. For example, a free particle is typically described by a wave function that is a wave packet centered around some average position x0 (an eigenstate of neither position nor momentum). When the position of such a particle is measured, the result cannot be predicted with certainty. It is likely, but not certain, to be near x0, where the amplitude of the wave function is large. After the measurement has yielded some result x, the wave function collapses into an eigenfunction of the position operator centered at x.
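
This collapse rule can be sketched numerically. Assuming an arbitrary illustrative three-level state (the amplitudes and outcome values below are invented for the example, not taken from any physical system), repeated simulated measurements reproduce the Born-rule probabilities:

```python
import numpy as np

# A normalized state written in the eigenbasis of some physical quantity:
# amplitudes c_n, with possible measurement outcomes lambda_n.
amplitudes = np.array([0.6, 0.8j, 0.0])
outcomes = np.array([-1.0, 0.0, 1.0])
probs = np.abs(amplitudes)**2          # Born rule: P(n) = |c_n|^2
assert np.isclose(probs.sum(), 1.0)    # the state is normalized

# Each simulated measurement picks one eigenvalue at random with these
# probabilities; afterwards the state has "collapsed" to that eigenstate.
rng = np.random.default_rng(0)
samples = rng.choice(outcomes, size=10000, p=probs)
print(probs, samples.mean())  # sample mean approaches sum P(n)*lambda_n
```

Over many runs the sample mean converges to the expectation value 0.36·(−1) + 0.64·0 = −0.36, even though each individual outcome is random.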

Schrödinger equation in quantum mechanics

The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian (the operator corresponding to the total energy of the system) generates the time evolution. The time evolution of wave functions is deterministic in the sense that - given what the wave function was at the initial time - one can make a clear prediction of what the wave function will be at any time in the future.

On the other hand, during a measurement, the change from the original wave function to another, later wave function is not deterministic but unpredictable (i.e. random).

Wave functions change over time. The Schrödinger equation describes this change and plays a role similar to that of Newton's second law in classical mechanics. Applied to the free-particle example above, the Schrödinger equation predicts that the center of the wave packet moves through space at constant velocity (like a classical particle in the absence of forces acting on it). However, the wave packet also spreads out as time passes, which means the position becomes more and more uncertain. This also has the effect of turning a position eigenfunction (which can be thought of as an infinitely sharp peak of the wave packet) into a broadened wave packet that no longer represents a definite position eigenstate.
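
This spreading can be quantified with the standard analytic result for a free Gaussian wave packet; the numbers below (an electron initially localized to 1 nanometre) are illustrative choices, not figures from the text:

```python
import numpy as np

# Spreading of a free Gaussian wave packet: a standard analytic result
# gives the position uncertainty at time t as
#   sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0**2))**2).
hbar = 1.054571817e-34      # reduced Planck constant, J*s
m = 9.1093837015e-31        # electron mass, kg
sigma0 = 1e-9               # initial width, m (about one nanometre)

def width(t):
    return sigma0 * np.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0**2))**2)

# After just one picosecond the packet is dozens of times wider than it
# started, illustrating how quickly position uncertainty grows.
print(width(0.0), width(1e-12))
```

For this choice of parameters the width grows from 1 nm to roughly 58 nm in a picosecond.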

Some wave functions produce probability distributions that are constant, or independent of time: for example, in a stationary state of constant energy, time drops out of the squared modulus of the wave function. Many systems that are treated as dynamic in classical mechanics are described in quantum mechanics by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular path around the atomic nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric wave function surrounding the nucleus (Fig. 1) (note, however, that only the lowest orbital angular momentum states, denoted s, are spherically symmetric).

The Schrödinger equation acts on the entire probability amplitude, not just its absolute value. While the absolute value of the probability amplitude carries information about probabilities, its phase carries information about the interference between quantum states. This gives rise to the "wave-like" behavior of quantum states. As it turns out, analytical solutions of the Schrödinger equation exist only for a very small number of relatively simple model Hamiltonians, of which the quantum harmonic oscillator, the particle in a box, the hydrogen molecular ion, and the hydrogen atom are the most important representatives. Even the helium atom, which contains just one more electron than hydrogen, has defied all attempts at a purely analytical solution.

However, there are several methods for obtaining approximate solutions. An important technique known as perturbation theory takes the analytical result for a simple quantum mechanical model and generates from it a result for a more complex model that differs from the simple one (for example) by the addition of a weak potential energy. Another approach is the "quasi-classical approximation", which applies to systems for which quantum mechanics produces only small deviations from classical behavior. These deviations can then be computed from the classical motion. This approach is especially important in the study of quantum chaos.
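
A minimal sketch of first-order perturbation theory on a toy two-level system (the matrices and coupling strength below are arbitrary illustrative choices, not a specific physical model):

```python
import numpy as np

# Perturbation theory on a toy 2-level system: H = H0 + lam * V.
H0 = np.diag([0.0, 1.0])                  # exactly solvable "simple model"
V = np.array([[0.5, 0.2],
              [0.2, -0.5]])               # weak Hermitian perturbation
lam = 0.01                                # small coupling strength

# First-order correction to level n is the diagonal matrix element
# E_n^(1) = <n|V|n>, so for the ground state: E0 ~ 0 + lam * V[0, 0].
E0_approx = 0.0 + lam * V[0, 0]

# Compare with the exact ground-state energy of the full Hamiltonian.
E0_exact = np.linalg.eigvalsh(H0 + lam * V)[0]
print(E0_approx, E0_exact)  # agree up to a correction of order lam**2
```

The residual difference is of order λ²|V01|²/(E0−E1) ≈ −4×10⁻⁶, exactly the size that second-order perturbation theory predicts.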

Mathematically equivalent formulations of quantum mechanics

There are numerous mathematically equivalent formulations of quantum mechanics. One of the oldest and most commonly used formulations is the "transformation theory" proposed by Paul Dirac, which combines and generalizes the two earliest formulations of quantum mechanics - matrix mechanics (created by Werner Heisenberg) and wave mechanics (created by Erwin Schrödinger).

Given that Werner Heisenberg was awarded the Nobel Prize in Physics in 1932 for the creation of quantum mechanics, Max Born's role in its development was overlooked until he was awarded the Nobel Prize in 1954. That role is discussed in a 2005 biography of Born, which recounts his part in the matrix formulation of quantum mechanics and in the use of probability amplitudes. In 1940, Heisenberg himself acknowledged, in a commemorative volume honoring Max Planck, that he had learned about matrices from Born. In the matrix formulation, the instantaneous state of a quantum system determines the probabilities of its measurable properties, or physical quantities. Examples of such quantities include energy, position, momentum, and orbital angular momentum. Physical quantities can be continuous (e.g. the position of a particle) or discrete (e.g. the energy of an electron bound to a hydrogen atom). Feynman's path integrals are an alternative formulation of quantum mechanics, in which a quantum mechanical amplitude is treated as a sum over all possible classical and non-classical paths between the initial and final states. This is the quantum mechanical counterpart of the principle of least action in classical mechanics.

Laws of quantum mechanics

The laws of quantum mechanics are fundamental. They assert that the state space of a system is a Hilbert space and that the physical quantities of the system are Hermitian operators acting on that space, although they do not tell us which Hilbert space or which operators. These must be chosen appropriately to obtain a quantitative description of a given quantum system. An important guide for making these choices is the correspondence principle, which states that the predictions of quantum mechanics reduce to those of classical mechanics when the system moves into the region of high energies or, equivalently, of large quantum numbers: while an individual particle exhibits a degree of randomness, in systems containing millions of particles averages dominate and, as the high-energy limit is approached, the statistical probability of random behavior tends to zero. In other words, classical mechanics is simply the quantum mechanics of large systems. This "high energy" limit is known as the classical or correspondence limit. One can even begin with an established classical model of a particular system and then try to guess the underlying quantum model that would reproduce the classical one in the correspondence limit.

When quantum mechanics was originally formulated, it was applied to models whose limit of correspondence was non-relativistic classical mechanics. For example, the well-known quantum harmonic oscillator model uses an explicitly non-relativistic expression for the kinetic energy of the oscillator and is thus a quantum version of the classical harmonic oscillator.
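
For the harmonic oscillator this correspondence can be checked numerically: diagonalizing a finite-difference version of the oscillator Hamiltonian (a sketch in dimensionless units, with grid size chosen arbitrarily for the example) reproduces the exact ladder of levels E_n = n + 1/2:

```python
import numpy as np

# Quantum harmonic oscillator, solved numerically: in dimensionless units
# (hbar = m = omega = 1) the Hamiltonian is H = -1/2 d^2/dx^2 + x^2/2,
# and its exact energy levels are E_n = n + 1/2.
N = 1500
x = np.linspace(-10.0, 10.0, N)
dx = x[1] - x[0]

# Tridiagonal finite-difference approximation of -1/2 d^2/dx^2.
kinetic = (np.diag(np.full(N, 2.0)) -
           np.diag(np.ones(N - 1), 1) -
           np.diag(np.ones(N - 1), -1)) / (2.0 * dx**2)
H = kinetic + np.diag(0.5 * x**2)

energies = np.linalg.eigvalsh(H)[:4]
print(energies)  # close to [0.5, 1.5, 2.5, 3.5]
```

The discreteness of this spectrum is exactly the "quantization of individual physical properties" listed earlier among the phenomena classical physics cannot explain.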

Interaction with other scientific theories

Early attempts to combine quantum mechanics with special relativity involved replacing the Schrödinger equation with covariant equations such as the Klein-Gordon equation or the Dirac equation. Although these theories successfully explained many experimental results, they had certain unsatisfactory features stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which quantizes a field (rather than a fixed set of particles). The first full-fledged quantum field theory, quantum electrodynamics, provides a complete quantum description of the electromagnetic interaction. The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, used since the creation of quantum mechanics, is to treat charged particles as quantum mechanical objects acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using the classical expression for the Coulomb potential:

V(r) = −e²/(4πε0r)

This “quasi-classical” approach does not work if quantum fluctuations of the electromagnetic field play an important role, for example, when photons are emitted by charged particles.
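
With this classical Coulomb potential, the hydrogen bound-state energies come out as E_n = −m_e·e⁴/(8·ε0²·h²·n²), which can be evaluated directly from the standard constants (the code below is an illustrative evaluation, not part of the original text):

```python
# Hydrogen energy levels from the Schrodinger treatment with the classical
# Coulomb potential: E_n = -m_e * e^4 / (8 * eps0^2 * h^2 * n^2).
m_e = 9.1093837015e-31    # electron mass, kg
e = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
h = 6.62607015e-34        # Planck constant, J*s

def E_n(n):
    """Energy of level n, converted from joules to electronvolts."""
    return -m_e * e**4 / (8.0 * eps0**2 * h**2 * n**2) / e

# Ground state is about -13.6 eV; levels scale as 1/n^2.
print(E_n(1), E_n(2))
```

The familiar −13.6 eV ground state and the 1/n² scaling of the levels both follow from this one formula.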

Quantum field theories for strong and weak nuclear forces were also developed. The quantum field theory for strong nuclear interactions is called quantum chromodynamics and describes the interactions of subnuclear particles such as quarks and gluons. The weak nuclear and electromagnetic forces were unified in their quantized forms into a unified quantum field theory (known as the electroweak force) by physicists Abdus Salam, Sheldon Glashow and Steven Weinberg. For this work, all three received the Nobel Prize in Physics in 1979.

It has proven difficult to build quantum models of the fourth remaining fundamental force, gravity. Semiclassical approximations have been worked out, leading to predictions such as Hawking radiation. However, the formulation of a complete theory of quantum gravity is hampered by apparent incompatibilities between general relativity (the most accurate theory of gravity currently known) and some of the fundamental principles of quantum theory. Resolving these incompatibilities is an area of active research; string theory is one of the possible candidates for a future theory of quantum gravity.

Classical mechanics was also extended into the complex field, with complex classical mechanics beginning to behave similarly to quantum mechanics.

The connection between quantum mechanics and classical mechanics

The predictions of quantum mechanics have been confirmed experimentally with a very high degree of accuracy. According to the correspondence principle between classical and quantum mechanics, all objects obey the laws of quantum mechanics, and classical mechanics is only an approximation for large systems of objects (or statistical quantum mechanics for a large set of particles). Thus, the laws of classical mechanics follow from the laws of quantum mechanics as a statistical average when tending to a very large limiting value of the number of elements of the system or the values ​​of quantum numbers. However, chaotic systems lack good quantum numbers, and quantum chaos studies the connection between classical and quantum descriptions of these systems.

Quantum coherence is an essential difference between classical and quantum theories, exemplified by the Einstein–Podolsky–Rosen (EPR) paradox, which became an attack on the established philosophical interpretation of quantum mechanics through an appeal to local realism. Quantum interference involves adding probability amplitudes, whereas classical "waves" involve adding intensities. For microscopic bodies, the extent of the system is much smaller than the coherence length, which leads to long-distance entanglement and other nonlocal phenomena characteristic of quantum systems. Quantum coherence does not usually appear on macroscopic scales, although an exception to this rule can occur at extremely low temperatures (i.e., approaching absolute zero), where quantum behavior can manifest itself macroscopically. This is in accordance with the following observations:

Many macroscopic properties of a classical system are a direct consequence of the quantum behavior of its parts. For example, the stability of the bulk of matter (consisting of atoms and molecules, which under the influence of electrical forces alone would quickly collapse), the rigidity of solids, and the mechanical, thermal, chemical, optical and magnetic properties of matter are all results of the interaction of electric charges according to the rules of quantum mechanics.

While the seemingly "exotic" behavior of matter postulated by quantum mechanics and relativity becomes apparent when dealing with very small particles or with speeds approaching the speed of light, the laws of classical, often called "Newtonian", physics remain accurate for predicting the behavior of the overwhelming majority of "large" objects (on the order of the size of large molecules or bigger) at speeds much lower than the speed of light.

What is the difference between quantum mechanics and classical mechanics?

Classical and quantum mechanics are very different in that they use very different kinematic descriptions.

According to the well-established view of Niels Bohr, the study of quantum mechanical phenomena requires experiments with a complete description of all the devices of the system: preparatory, intermediate, and final measurements. The descriptions are given in macroscopic terms, expressed in ordinary language and supplemented by the concepts of classical mechanics. The initial conditions and final state of the system are each described by a position in configuration space, for example coordinate space, or some equivalent space such as momentum space. Quantum mechanics does not admit a completely precise description, in terms of both position and momentum, of an initial condition or "state" (in the classical sense of the word) that would support a precisely deterministic and causal prediction of the final state. In this sense, advocated by Bohr in his mature writings, a quantum phenomenon is a process of transition from an initial to a final state, not an instantaneous "state" in the classical sense. Thus there are two kinds of processes in quantum mechanics: stationary and transitional. For stationary processes, the initial and final conditions are the same; for transitional ones, they differ. Obviously, by definition, if only the initial condition is given, the process is not determined. Given the initial conditions, prediction of the final state is possible, but only at the probabilistic level, since the Schrödinger equation is deterministic for the evolution of the wave function, and the wave function describes the system only probabilistically.

In many experiments, it is possible to take the initial and final states of the system to be a particle. In some cases it turns out that there are potentially several spatially distinct paths or trajectories along which the particle might pass from the initial to the final state. An important feature of the quantum kinematic description is that it does not allow us to determine unambiguously which of these paths produced the transition. Only the initial and final conditions are defined and, as noted in the previous paragraph, only as precisely as the description in terms of spatial configuration (or its equivalent) allows. In every case for which a quantum kinematic description is needed, there is a good reason for this limitation of kinematic precision: for a particle to be found experimentally at a definite position, it must be stationary; for a particle to be found experimentally with a definite momentum, it must be in free motion; and these two requirements are logically incompatible.

Classical kinematics, by contrast, does not primarily require an experimental description of its phenomena. This makes it possible to describe the instantaneous state of a system with complete precision by a position (point) in phase space, the Cartesian product of configuration and momentum spaces. This description simply assumes, or imagines, the state as a physical entity, without concern for its experimental measurability. Such a description of the initial state, together with Newton's laws of motion, permits a precise deterministic and causal prediction of the final state, along with a definite trajectory of the system's evolution. Hamiltonian dynamics can be used for this purpose. Classical kinematics also permits a description of a process analogous to the initial-and-final-state description used in quantum mechanics; Lagrangian mechanics makes this possible. For processes in which the magnitude of the action is of the order of a few Planck constants, classical kinematics is unsuitable and quantum mechanics is required.

General theory of relativity

Even though the defining postulates of both Einstein's general relativity and quantum theory are unequivocally supported by rigorous and repeatable empirical evidence, and although the two do not contradict each other theoretically (at least with respect to their primary claims), they have proven extremely difficult to integrate into one coherent, unified model.

Gravity can be neglected in many areas of particle physics, so unifying general relativity with quantum mechanics is not a pressing issue in those particular applications. However, the lack of a correct theory of quantum gravity is an important problem in physical cosmology and in physicists' search for an elegant "theory of everything" (ToE). Resolving the inconsistencies between the two theories has therefore been one of the main goals of 20th- and 21st-century physics. Many eminent physicists, including Stephen Hawking, have labored for years in the attempt to discover the theory underlying everything. Such a ToE would not only combine the different models of subatomic physics but also derive the four fundamental forces of nature - the strong force, electromagnetism, the weak force, and gravity - from a single force or phenomenon. While Stephen Hawking initially believed in a ToE, after considering Gödel's incompleteness theorem he concluded that such a theory is not attainable, and said so publicly in his lecture "Gödel and the End of Physics" (2002).

Basic theories of quantum mechanics

The quest to unify the fundamental forces through quantum mechanics is still ongoing. Quantum electrodynamics (or "quantum electromagnetism"), currently (at least in the perturbative regime) the most accurately tested physical theory in competition with general relativity, has been successfully merged with the weak nuclear force into the electroweak force, and work is underway to merge the electroweak and strong interactions into an electrostrong interaction. Current predictions state that at around 10¹⁴ GeV these three forces fuse into a single unified field. Beyond this "grand unification", it is conjectured that gravity too can be unified with the other three gauge symmetries, which is expected to occur at about 10¹⁹ GeV. However - and while special relativity is carefully incorporated into quantum electrodynamics - general relativity, currently the best theory describing gravitational forces, has not been fully incorporated into quantum theory. One of those developing a coherent theory of everything, the theoretical physicist Edward Witten, formulated M-theory, an attempt to expound supersymmetry on the basis of superstring theory. M-theory suggests that our apparently 4-dimensional space-time is actually an 11-dimensional space-time continuum containing ten space dimensions and one time dimension, although the 7 extra space dimensions are, at low energies, completely "compactified" (or infinitely curved) and not easily measured or probed.

Another popular theory is loop quantum gravity (LQG), first proposed by Carlo Rovelli, which describes the quantum properties of gravity. It is also a theory of quantum space and quantum time, since in general relativity the geometric properties of space-time are a manifestation of gravity. LQG is an attempt to merge and adapt standard quantum mechanics and standard general relativity. The main result of the theory is a physical picture in which space is granular. The granularity is a direct consequence of quantization: it is the same kind of granularity as that of photons in the quantum theory of electromagnetism, or of the discrete energy levels of atoms, but here it is space itself that is discrete. More precisely, space can be viewed as an extremely fine fabric or network "woven" of finite loops. These loop networks are called spin networks, and the evolution of a spin network over time is called a spin foam. The predicted scale of this structure is the Planck length, approximately 1.616 × 10⁻³⁵ m. According to the theory, a length shorter than this has no physical meaning. LQG therefore predicts that not just matter, but space itself, has an atomic structure.
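
The Planck length quoted above is fixed by three fundamental constants, l_P = √(ħG/c³), which can be checked in a few lines:

```python
import numpy as np

# The Planck length follows directly from three constants:
# l_P = sqrt(hbar * G / c**3).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

l_P = np.sqrt(hbar * G / c**3)
print(l_P)  # ~1.616e-35 m
```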

Philosophical aspects of quantum mechanics

Since its inception, the many paradoxical aspects and results of quantum mechanics have provoked intense philosophical debate and many interpretations. Even fundamental matters, such as Max Born's basic rules concerning probability amplitudes and probability distributions, took decades to be appreciated by society and by many leading scientists. Richard Feynman once said, "I think I can safely say that nobody understands quantum mechanics." In the words of Steven Weinberg, "There is now, in my opinion, no entirely satisfactory interpretation of quantum mechanics."

The Copenhagen interpretation, due largely to Niels Bohr and Werner Heisenberg, has remained the most widely accepted among physicists in the 75 years since its enunciation. According to this interpretation, the probabilistic nature of quantum mechanics is not a temporary feature that will eventually be replaced by a deterministic theory, but must be seen as a final renunciation of the classical idea of "causality". It further holds that any well-defined application of the quantum mechanical formalism must always refer to the experimental arrangement, because of the complementary nature of evidence obtained under different experimental conditions.

Albert Einstein, though one of the founders of quantum theory, did not accept some of its more philosophical or metaphysical interpretations, such as the rejection of determinism and causality. His most famous and oft-quoted response to this approach was: "God does not play dice." He rejected the concept that the state of a physical system depends on the experimental measurement arrangement, holding instead that natural phenomena occur according to their own laws regardless of whether and how they are observed. In that view he is supported by the currently accepted definition of a quantum state, which remains invariant under an arbitrary choice of the configuration space for its representation, that is, of the manner of observation. He also held that underlying quantum mechanics there should be a theory that thoroughly and directly expresses the rule rejecting action at a distance; in other words, he insisted on the principle of locality. He considered, but rejected on theoretical grounds, a particular proposal of hidden variables to escape the indeterminism or acausality of quantum mechanical measurement. He regarded quantum mechanics as a currently valid, but not permanently definitive, theory of quantum phenomena, believing that its future replacement would require profound conceptual advances and would not come quickly or easily. The Bohr-Einstein debates offer a vivid critique of the Copenhagen interpretation from an epistemological point of view.

John Bell showed that this "EPR" paradox led to experimentally testable differences between quantum mechanics and theories that rely on the addition of hidden variables. Experiments have been carried out to prove the accuracy of quantum mechanics, thereby demonstrating that quantum mechanics cannot be improved by adding hidden variables. Alain Aspect's initial experiments in 1982 and many subsequent experiments since then have definitively confirmed quantum entanglement.
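
The experimentally testable difference Bell identified is often stated in CHSH form: any local hidden-variable theory obeys |S| ≤ 2, while quantum mechanics allows up to 2√2. A short sketch of the quantum prediction for the spin singlet state (the standard textbook correlation E(a, b) = −cos(a − b) and the usual optimal angles):

```python
import numpy as np

# Quantum correlation between spin measurements at analyzer angles a and b
# for the singlet state: E(a, b) = -cos(a - b).
def E(a, b):
    return -np.cos(a - b)

# Standard choice of measurement angles that maximizes the CHSH violation.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.83, exceeding the local-realist bound of 2
```

It is exactly this excess over 2, measured by Aspect and successors, that rules out the hidden-variable theories discussed above.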

Entanglement, as Bell's experiments showed, does not violate cause-and-effect relationships, since no information transfer occurs. Quantum entanglement forms the basis of quantum cryptography, which is proposed for use in highly secure commercial applications in banking and government.

Everett's many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory arise simultaneously in a multiverse composed mostly of independent parallel universes. This is achieved not by introducing a "new axiom" into quantum mechanics, but on the contrary by removing the axiom of wave-packet collapse. All possible successive states of the measured system and of the measuring apparatus (including the observer) are present in a real physical - not merely formal mathematical, as in other interpretations - quantum superposition. Such a superposition of successive combinations of states of different systems is called an entangled state. While the multiverse is deterministic, we perceive non-deterministic, random behavior, since we can observe only the universe (i.e., the consistent state contribution to the above superposition) that we, as observers, inhabit. Everett's interpretation is perfectly consistent with John Bell's experiments and makes them intuitively understandable. However, according to the theory of quantum decoherence, these "parallel universes" will never be accessible to us. The inaccessibility can be understood as follows: once a measurement is made, the measured system becomes entangled both with the physicist who measured it and with a huge number of other particles, some of which are photons flying away at the speed of light toward the other end of the universe. To prove that the wave function has not collapsed, one would have to bring all these particles back and measure them again, together with the system originally measured. Not only is this completely impractical, but even if it could in principle be done, it would destroy any evidence that the original measurement took place (including the physicist's memory). In light of these Bell experiments, Cramer formulated his transactional interpretation in 1986.
In the late 1990s, relational quantum mechanics emerged as a modern derivative of the Copenhagen interpretation.

Quantum mechanics has had enormous success in explaining many features of our Universe. It is often the only available tool that can reveal the individual behavior of the subatomic particles that make up all forms of matter (electrons, protons, neutrons, photons, etc.). It has also greatly influenced string theory, a contender for the Theory of Everything.

Quantum mechanics is also critical to understanding how individual atoms form covalent bonds to build molecules. The application of quantum mechanics to chemistry is called quantum chemistry. Relativistic quantum mechanics can, in principle, describe most of chemistry mathematically. Quantum mechanics also provides a quantitative understanding of ionic and covalent bonding by showing explicitly which molecules are energetically favorable to bond with which others, and at what energies. In addition, most calculations in modern computational chemistry rely on quantum mechanics.

In many industries, modern technologies operate at scales where quantum effects are significant.

Quantum physics in electronics

Many modern electronic devices are designed using quantum mechanics: the laser, the transistor (and thus the microchip), the electron microscope and magnetic resonance imaging (MRI). The study of semiconductors led to the invention of the diode and the transistor, which are essential components of modern electronic systems, computers and telecommunication devices. Another application is the light-emitting diode, a highly efficient light source.

Many electronic devices operate by means of quantum tunneling. It is present even in a simple light switch: the switch would not work if electrons could not quantum-tunnel through the oxide layer on the metal contact surfaces. Flash memory chips, the main component of USB storage devices, use quantum tunneling to erase information in their cells. Some negative-differential-resistance devices, such as the resonant tunneling diode, also rely on the effect: unlike in a classical diode, the current flows by resonant tunneling through two potential barriers. Its negative-resistance mode of operation can only be explained by quantum mechanics: as the energy of the bound-carrier state approaches the Fermi level, the tunneling current increases, and as it moves away from the Fermi level, the current decreases. Quantum mechanics is vital to understanding and designing such electronic devices.

Quantum cryptography

Researchers are currently looking for reliable methods to directly manipulate quantum states. Efforts are being made to fully develop quantum cryptography, which theoretically will guarantee the secure transmission of information.

Quantum computing

A more distant goal is the development of quantum computers, which are expected to perform certain computational tasks exponentially faster than classical computers. Instead of classical bits, quantum computers use qubits, which can exist in a superposition of states. Another active research topic is quantum teleportation, which deals with methods for transmitting quantum information over arbitrary distances.
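The difference between a classical bit and a qubit can be illustrated with a minimal numerical sketch. The state vector, the Hadamard gate, and the Born rule used below are standard quantum-computing textbook material, not something specific to this article:

```python
import math

# A qubit as a 2-component complex state vector: amplitudes of |0> and |1>.
ket0 = [1 + 0j, 0 + 0j]

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into an
    equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: each measurement probability is the squared
    magnitude of the corresponding amplitude."""
    return [abs(amp) ** 2 for amp in state]

plus = hadamard(ket0)
print(probabilities(plus))  # outcomes 0 and 1, each with probability 0.5
```

A classical bit is always exactly 0 or 1; the superposition state above only collapses to a definite value when measured, which is what quantum algorithms exploit.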

Quantum effects

While quantum mechanics primarily applies to atomic systems with small amounts of matter and energy, some systems exhibit quantum mechanical effects on larger scales. Superfluidity, the ability of a fluid to flow without friction at temperatures near absolute zero, is one well-known example. Closely related is superconductivity, the flow of an electron gas (electric current) without resistance in a conducting material at sufficiently low temperatures. The fractional quantum Hall effect is a topologically ordered state corresponding to models of long-range quantum entanglement. States with different topological order (or different configurations of long-range entanglement) cannot change into one another without a phase transition.

Quantum theory

Quantum theory also provides precise descriptions of many previously unexplained phenomena, such as blackbody radiation and the stability of electron orbits in atoms. It has also given insight into the workings of many biological systems, including olfactory receptors and protein structures. Recent research on photosynthesis has shown that quantum correlations play an important role in this fundamental process in plants and many other organisms. Nevertheless, classical physics can often provide good approximations to the results of quantum physics, typically for large numbers of particles or large quantum numbers. Because classical formulas are much simpler and easier to compute than quantum ones, classical approximations are preferred when the system is large enough to make quantum effects negligible.
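Blackbody radiation is a convenient place to see both points at once: where the classical approximation works and where it fails. The sketch below compares Planck's law with the classical Rayleigh-Jeans formula; the constants are standard physical constants, and the Sun's surface temperature is used only as an illustrative value:

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(nu, T):
    """Planck spectral radiance B(nu, T) of a blackbody."""
    return (2 * h * nu**3 / c**2) / (math.exp(h * nu / (k * T)) - 1)

def rayleigh_jeans(nu, T):
    """Classical approximation, valid only when h*nu << k*T."""
    return 2 * nu**2 * k * T / c**2

T = 5800.0  # roughly the Sun's surface temperature, K
for nu in (1e11, 1e14, 1e15):  # microwave -> infrared -> ultraviolet
    print(nu, planck(nu, T), rayleigh_jeans(nu, T))
```

At low frequencies (h*nu much smaller than k*T) the two formulas agree to a fraction of a percent, but in the ultraviolet the classical formula diverges wildly from the quantum result, which is the "ultraviolet catastrophe" that Planck's quantum hypothesis resolved.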

Movement of a free particle

For example, consider a free particle. In quantum mechanics, wave-particle duality holds, so the properties of a particle can be described as those of a wave. A quantum state can therefore be represented as a wave of arbitrary shape extending through space, the wave function. The position and momentum of the particle are physical observables. The uncertainty principle states that position and momentum cannot both be measured exactly at the same time. However, one can measure the position (without measuring the momentum) of a moving free particle by preparing a position eigenstate whose wave function (a Dirac delta function) takes a very large value at one position x and is zero everywhere else. A position measurement on such a wave function yields the result x with 100% probability (that is, with complete certainty, or complete precision). This is called a position eigenstate or, in mathematical terms, an eigendistribution of the generalized coordinate. If a particle is in a position eigenstate, its momentum is completely indeterminate. Conversely, if the particle is in a momentum eigenstate, its position is completely unknown. For a momentum eigenstate, whose eigenfunction has the form of a plane wave, it can be shown that the wavelength equals h/p, where h is Planck's constant and p is the momentum of the eigenstate.
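The relation wavelength = h/p (the de Broglie wavelength) is easy to evaluate numerically. A minimal sketch, using the standard values of Planck's constant and the electron mass, for an electron moving at 1% of the speed of light (slow enough to ignore relativity):

```python
h = 6.626e-34     # Planck constant, J*s
m_e = 9.109e-31   # electron mass, kg
c = 2.998e8       # speed of light, m/s

def de_broglie_wavelength(momentum):
    """Wavelength of a momentum eigenstate (plane wave): lambda = h / p."""
    return h / momentum

# Electron at 1% of the speed of light (non-relativistic regime).
p = m_e * 0.01 * c
print(de_broglie_wavelength(p))  # ~2.4e-10 m, comparable to the size of an atom
```

The result, a few tenths of a nanometer, is why electron wave behavior matters at atomic scales but is utterly invisible for macroscopic objects, whose de Broglie wavelengths are unimaginably small.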

Rectangular potential barrier

This is a model of the quantum tunneling effect, which plays an important role in the production of modern technological devices such as flash memory and scanning tunneling microscopes. Quantum tunneling is a central physical process occurring in superlattices.
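For a particle with energy E below the barrier height V0, the standard textbook result for the transmission probability through a rectangular barrier of width a is T = 1 / (1 + V0^2 sinh^2(kappa*a) / (4E(V0 - E))), with kappa = sqrt(2m(V0 - E)) / hbar. A minimal sketch for an electron (the 1 eV / 2 eV numbers are illustrative choices, not from the text):

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # joules per electronvolt

def transmission(E, V0, a):
    """Transmission probability through a rectangular barrier of
    height V0 and width a, for a particle of energy E < V0."""
    kappa = math.sqrt(2 * m_e * (V0 - E)) / hbar
    sh = math.sinh(kappa * a)
    return 1.0 / (1.0 + (V0**2 * sh**2) / (4 * E * (V0 - E)))

# A 1 eV electron hitting a 2 eV barrier: classically it is always
# reflected, but quantum mechanics gives a nonzero transmission that
# falls off steeply with barrier width.
for width in (0.5e-9, 1.0e-9):
    print(width, transmission(1 * eV, 2 * eV, width))
```

The near-exponential sensitivity of the tunneling probability to barrier width is exactly what a scanning tunneling microscope exploits: sub-angstrom changes in tip-sample distance produce easily measurable changes in current.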

Particle in a one-dimensional potential box

A particle in a one-dimensional potential box is the simplest mathematical example in which spatial constraints lead to quantization of energy levels. A box is defined as having zero potential energy everywhere inside a certain region and infinite potential energy everywhere outside that region.
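Solving the Schrödinger equation for this box of width L gives the well-known discrete spectrum E_n = n^2 h^2 / (8 m L^2). A minimal numerical sketch for an electron in a 1 nm box (the box size is an illustrative choice):

```python
h = 6.626e-34     # Planck constant, J*s
m_e = 9.109e-31   # electron mass, kg
eV = 1.602e-19    # joules per electronvolt

def box_energy(n, L, m=m_e):
    """Energy of level n for a particle in a 1-D infinite potential box:
    E_n = n^2 * h^2 / (8 * m * L^2)."""
    return n**2 * h**2 / (8 * m * L**2)

# Electron in a 1 nm box: the levels grow as n^2, and even the lowest
# level (roughly 0.38 eV here) is nonzero.
for n in (1, 2, 3):
    print(n, box_energy(n, 1e-9) / eV, "eV")
```

Two quantum features show up directly: the energy cannot take arbitrary values (only the discrete n^2 ladder), and the ground state energy is strictly positive, so a confined particle can never be completely at rest.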

Finite potential well

A finite potential well is a generalization of the infinite potential well problem to a well of finite depth.

The finite potential well is mathematically more complicated than the particle in an infinite potential box, since the wave function does not vanish at the walls of the well. Instead, it must satisfy more complicated boundary conditions, because it is nonzero in the region outside the well.
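Matching the wave function and its derivative at the walls turns the energy spectrum into a transcendental equation that must be solved numerically. In the standard dimensionless form for a symmetric well of half-width a, the even-parity bound states satisfy v*tan(v) = sqrt(u0^2 - v^2), where v = k*a and u0^2 = 2*m*V0*a^2 / hbar^2 encodes the well's depth and width. A minimal bisection sketch (the value u0 = 4 is an illustrative choice):

```python
import math

def even_state_mismatch(v, u0):
    """Even-parity matching condition for a finite square well, in
    dimensionless form: v*tan(v) - sqrt(u0^2 - v^2); a root means a
    bound state."""
    return v * math.tan(v) - math.sqrt(u0**2 - v**2)

def find_ground_state(u0):
    """Bisection on (0, min(pi/2, u0)): the mismatch goes from negative
    to positive exactly once on this interval, at the ground state."""
    lo, hi = 1e-9, min(math.pi / 2, u0) - 1e-9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if even_state_mismatch(mid, u0) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

v = find_ground_state(4.0)
print(v)  # dimensionless wavenumber of the ground state, between 0 and pi/2
```

Unlike the infinite box, only finitely many v satisfy the equation (deeper or wider wells admit more), and the corresponding wave functions decay exponentially, but do not vanish, outside the well.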



What else to read