By Dean Johnstone
In physics, it is common to expect the emergence of quantum effects at microscopic scales. Atoms and subatomic particles can behave both as classical particles and as quantum matter waves, in accordance with Wave-particle duality. As such, every particle has a corresponding de Broglie wavelength, which depends on its mass and velocity. This wavelength can effectively be viewed as the spatial extent of the particle's probabilistic, wavelike nature. The small masses of microscopic particles mean that their de Broglie wavelengths are comparatively large, which is why quantum properties appear at these scales. For comparison, the de Broglie wavelength of a snooker ball is many orders of magnitude smaller than that of an electron moving at the same speed, which explains why objects at and above the macroscopic scale (visible to the eye) do not behave in a quantum manner by interfering or diffracting with one another.
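To make this comparison concrete, the relation λ = h/(mv) can be evaluated directly. A minimal Python sketch, where the snooker-ball mass and the common speed are assumed example values:

```python
# de Broglie wavelength: lambda = h / (m * v)
H = 6.62607015e-34  # Planck constant (J s)

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Return the de Broglie wavelength in metres."""
    return H / (mass_kg * speed_m_s)

m_electron = 9.109e-31  # electron mass (kg)
m_ball = 0.14           # assumed snooker-ball mass (kg)
speed = 1.0             # m/s, the same for both objects

lam_e = de_broglie_wavelength(m_electron, speed)
lam_b = de_broglie_wavelength(m_ball, speed)

# At equal speeds the ratio depends only on the masses
print(lam_e / lam_b)  # roughly 1.5e29: the ball's wavelength is vastly smaller
```

Since the speeds cancel, the enormous ratio is simply the ratio of the two masses, which is why quantum wave behaviour is invisible for everyday objects.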
However, as it turns out, a gas of Ultracold Atoms can in fact behave as a quantum matter wave at macroscopic scales due to collective quantum effects. A gas of atoms near absolute zero can effectively act as a single, giant quantum atom. This macroscopic object can then be manipulated to emulate the structure of crystals with light, fabricate an ideal quantum simulator or even generate artificial black holes in laboratories.
In this article, we shall discuss some of the important differences between classical and quantum particles before turning our attention to a gas of cold atoms. From there, we will then talk about the history and important properties of Ultracold Gases, before concluding with some of the most significant applications they hold within Physics and Technology.
Classical & Quantum Particles: A Comparison
Before we begin our discussion on the subject of Ultracold Atoms, it is important to first outline the distinction between classical and quantum particles.
In the classical scenario, one may envisage particles as small billiard balls with a well-defined position in space. A particle's response to a set of applied forces can be described with the models of classical mechanics, hence we can map out its trajectory as an absolute, deterministic path. As it turns out, this description can still be deduced from Quantum Mechanics under the correct conditions and simplifications. However, at and below atomic length scales, the quantum description is necessary in order to capture fundamental properties of nature, such as the Photoelectric Effect and the stability of electron orbitals around the nucleus of an atom.
Contrary to the classical setup, quantum particles are inherently non-deterministic. Instead of a simple billiard ball, a quantum particle is represented by a wavefunction over space, which essentially assigns a probability to finding the particle at each spatial position.
The dynamics of this particle is no longer described by the laws of classical mechanics, but rather by Erwin Schrödinger's famous wave equation. In other words, the particle behaves as a probability wave. Just as the path of a classical particle is deterministic, so too is the wavefunction evolution of the quantum particle. The key difference in the quantum scenario is that the evolving wavefunction represents probabilities for the particle to take each possible path from start to finish, instead of the single absolute path of the classical case.
The concept of a quantum probability wave can seem rather peculiar and raises a natural question: how does one measure such a wave? In order to observe quantum effects, we require a particle with a sufficiently large de Broglie wavelength (that is, a microscopic object) and the absence of external interactions with the environment. In this context, interactions refer to influences such as magnetic fields, high temperatures or collective decoherence effects from classical objects. If these interactions are too strong, then wavefunction collapse is likely to occur, which is to say that the particle will behave as it would in the classical description and lose its quantum properties.
It is therefore typically necessary to confine particles within a vacuum chamber. After letting the particle evolve in some predefined manner, a measurement of the system, such as its position, can then be taken. The act of measurement implies some form of controlled interaction with the quantum particle, hence wavefunction collapse will occur and one obtains a classical result: the position of the particle is known. This process can then be repeated many times to obtain a distribution of measured positions over space. From this information, one can infer the probability wave prior to measurement that most likely produced the given distribution of points.
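This inference step can be mimicked numerically. A small sketch, assuming a Gaussian probability wave purely for illustration: simulating many independent "measurements" recovers the centre and width of the underlying distribution.

```python
# Sketch: inferring a probability wave from repeated position measurements.
# The Gaussian |psi(x)|^2 and its width are assumed for illustration only.
import random
import statistics

random.seed(0)
SIGMA = 1.0  # assumed width of |psi(x)|^2

# Each run: prepare the particle identically, measure once, record the position
measurements = [random.gauss(0.0, SIGMA) for _ in range(100_000)]

# From many collapsed outcomes, infer the statistics of the underlying wave
mean = statistics.fmean(measurements)
spread = statistics.pstdev(measurements)
print(f"inferred centre ~ {mean:.3f}, inferred width ~ {spread:.3f}")
```

With enough repetitions, the histogram of outcomes converges to the probability density of the wavefunction before collapse.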
With the notion of quantum particles as probability waves made clear, we can now focus our attention on how these effects are vital in understanding the properties of Ultracold Atoms and collective properties of a gas.
Atoms are composite particles that form the building blocks of all matter. Despite their composite nature, atoms can still collectively be described by probability waves similar to the single particle case.
Suppose we now have a gas of atoms in a box at some particular temperature. The atoms will be moving in many different directions with a range of speeds. As discussed earlier, each atomic velocity corresponds to a different de Broglie wavelength. It is then natural to define the Thermal de Broglie Wavelength of the gas, which is approximately the average of the individual de Broglie wavelengths of the atoms. As before, the thermal wavelength can effectively be viewed as the spatial range over which the probability wave of an atom extends. This range can be tuned by changing the temperature of the gas, which corresponds to a change in the thermal speed of its atoms: a hotter gas contains faster atoms, a colder gas slower ones.
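The standard expression for this quantity is λ_T = h/√(2πmk_BT). A short sketch of how the wavelength stretches as a gas of Rubidium-87 atoms is cooled (the temperatures are assumed example values):

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def thermal_wavelength(mass_kg, temperature_k):
    """Thermal de Broglie wavelength: lambda_T = h / sqrt(2*pi*m*kB*T)."""
    return H / math.sqrt(2 * math.pi * mass_kg * KB * temperature_k)

m_rb87 = 87 * 1.6605e-27  # mass of a rubidium-87 atom (kg)

# Cooling from room temperature to 100 nK stretches the wavelength enormously
for temp in (300.0, 1e-3, 100e-9):
    print(f"T = {temp:g} K  ->  lambda_T = {thermal_wavelength(m_rb87, temp):.3e} m")
```

At room temperature the wavelength is far smaller than an atom, while at 100 nK it reaches a fraction of a micrometre, comparable to the interatomic spacing in a dilute ultracold gas.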
If the temperature is high, the thermal wavelength is small, usually much smaller than the average separation between atoms, and so the atoms are said to be localised. The gas in this case behaves as it would classically: as a Maxwell-Boltzmann gas of distinguishable atoms.
When considering colder temperatures, the situation becomes more interesting: the thermal wavelength grows, meaning that atoms become delocalised across a much larger region of space. The atoms are now indistinguishable, in the sense that their wavefunction extent is comparable to the interatomic separation, and quantum interference effects become significant due to the overlap of wavefunctions. At this stage, it is vital to consider what types of atoms we have in order to understand the statistical properties of the gas. Since atoms are composed of several constituent particles (protons, neutrons and electrons), we may classify them as either Bosons (even total number of constituents) or Fermions (odd total number of constituents).
Bosons are particles that can occupy the same quantum state. They contrast with Fermions, which obey the Pauli Exclusion Principle: no two Fermions can occupy the same quantum state. In other words, no two Fermions can exist in the same position state, so they tend to avoid one another, whereas Bosons have no such limitation.
Depending on whether the atoms are Bosonic or Fermionic, we refer to the gas as either a Bose or Fermi gas respectively. By further reducing the temperature of the gas, we can open up the possibility for new phase transitions based on collective quantum effects.
Quantum Behaviour at Macroscopic Scales
In 1924, Satyendra Nath Bose was able to derive Planck's Law of electromagnetic radiation using a quantum statistical description of photons, the quanta of light. This remarkable outcome would inspire Albert Einstein to extend the idea to an ideal gas of Bosonic particles in 1925, leading to the concept of a Bose quantum gas and Bose-Einstein statistics. From this framework, Einstein would also propose that below a critical temperature near absolute zero (0 K or -273.15 °C), a large number of particles would condense into a single quantum mechanical ground state: the lowest possible energy state for the atoms. This has profound implications for the overall behaviour of the gas, since the wavefunction extent of the individual Bosons is now much larger than the average atomic separation. In this scenario, atoms no longer act independently, but rather as a single macroscopic entity. The entire gas can be described by a single, giant probability wave: the Bose-Einstein Condensate phase of matter.
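For a uniform ideal Bose gas, the critical temperature predicted by Bose-Einstein statistics is T_c = (2πħ²/mk_B)(n/ζ(3/2))^(2/3). A sketch for Rubidium-87, with an assumed atomic density typical of dilute-gas experiments:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant (J s)
KB = 1.380649e-23       # Boltzmann constant (J/K)
ZETA_3_2 = 2.612        # Riemann zeta(3/2), to three decimal places

def bec_critical_temperature(mass_kg, density_m3):
    """Ideal-gas BEC transition temperature for a uniform Bose gas."""
    return (2 * math.pi * HBAR**2 / (mass_kg * KB)) * (density_m3 / ZETA_3_2) ** (2 / 3)

m_rb87 = 87 * 1.6605e-27  # rubidium-87 mass (kg)
n = 1e20                  # assumed density (atoms per m^3), illustrative value

tc = bec_critical_temperature(m_rb87, n)
print(f"T_c ~ {tc * 1e9:.0f} nK")  # a few hundred nanokelvin
```

The sub-microkelvin result shows why condensation was out of experimental reach until the development of laser and evaporative cooling.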
Although not known at the time, the condensation of Bosons was closely related to the superfluid flow of Helium-4 discovered in 1937. At around 2.2 K, Pyotr Kapitsa, John Allen and Don Misener found that liquid Helium would behave as a fluid with no resistance to movement (zero viscosity): the superfluid. A superfluid can flow without losing kinetic energy, which means that it can flow indefinitely if set in motion. Not only that, but a superfluid may climb the sides of a container to find its level and form quantised vortices if stirred.
However, it was not until 1995 that the pioneering efforts of Eric Cornell, Carl Wieman and Wolfgang Ketterle would provide the first experimental verification of Bose-Einstein condensation, in dilute gases of Rubidium (Cornell and Wieman) and Sodium (Ketterle) atoms, leading to the 2001 Nobel Prize in Physics. By using lasers to exert radiation pressure on the gas of atoms, the average atomic speed is significantly reduced, allowing for the condensation transition near absolute zero. Their work would trigger great interest in the study of macroscopic quantum phenomena and ultracold atomic systems, including the formation of condensates with Fermions.
Unlike Bosons, a gas of Fermions cannot condense into the same quantum state at low temperature. Such a gas instead behaves as a Fermi Gas, in accordance with Fermi-Dirac statistics, first introduced in 1926 by Enrico Fermi and Paul Dirac. However, if the Fermions are able to interact with one another via some mutually attractive force, then this picture changes considerably.
Electrons (which are Fermions) in a solid can flow without resistance in the superconducting state, just as Bosons in a quantum gas can flow without resistance in a superfluid Bose-Einstein condensate. Following the discovery of superconductivity in mercury at 4.2 K by Heike Kamerlingh Onnes in 1911, various theories were developed to explain this unconventional property of matter. The most prevalent today is BCS theory, named after John Bardeen, Leon Cooper and John Schrieffer in 1957. This microscopic theory describes superconductivity in terms of the condensation of Cooper pairs.
In this context, Cooper pairs are paired states of Fermions arising from an attractive electron-phonon interaction in the superconductor. Despite being made up of two Fermions (electrons), a Cooper pair is in fact a composite Boson, hence Cooper pairs may collectively occupy the same quantum state.
Using the mechanism of Cooper pairs, Deborah Jin, Markus Greiner and Cindy Regal would create the first atomic Fermionic condensate with ultracold Potassium atoms in 2003. By using a magnetic field to tune the strength of interactions between atoms from repulsive to attractive, Cooper pairs could then be stabilised, allowing for the condensation transition at low temperature.
Applications of Ultracold Gases
So far, we have seen that near absolute zero a gas of atoms can effectively behave as a single, giant atom at macroscopic length scales due to the dominance of quantum effects. Depending on the exact properties of the atoms, the atomic condensate may display novel macroscopic features such as superfluidity and superconductivity, both characterised by a lack of resistance to particle flow. In this final section, we will provide a brief overview of some of the most significant applications of Ultracold Gases to both Physics and Technology.
In Physics, it may be difficult or simply impossible to directly measure the predictions of certain models. These difficulties may stem from, for example, limitations in fabrication processes, or from inherently inaccessible quantities such as Hawking radiation from a cosmological black hole. In either case, it may be useful to construct an analogue model: a system whose mathematical laws are equivalent to those of the original model, but realised in a different physical setting.
Returning to the previous example, an analogue black hole may be realised using a Bose-Einstein condensate accelerated to supersonic speeds with lasers, as was first verified in 2010 by Oren Lahav and colleagues. The accelerated condensate contains regions of both subsonic and supersonic flow, with the boundary acting as the analogue event horizon. Just as light cannot escape the event horizon of a cosmological black hole, sound waves (phonons) cannot escape the event horizon of the sonic black hole and are therefore trapped within the supersonic region. With an analogue model in place, a natural question is then whether we can still observe unique properties predicted by the original model, such as Hawking radiation.
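The defining feature of the sonic horizon, the point where the flow speed first exceeds the local speed of sound, can be illustrated with a toy calculation. The flow profile and sound speed below are assumed values for illustration, not data from a real condensate:

```python
# Toy sketch of locating a sonic horizon in an accelerating flow.
SOUND_SPEED = 1.0  # assumed local speed of sound (mm/s), taken constant

def find_horizon(flow_speeds):
    """Return the index where the flow first becomes supersonic, or None."""
    for i, v in enumerate(flow_speeds):
        if v > SOUND_SPEED:
            return i
    return None

# Flow speed sampled along the condensate, accelerating from 0 to 2 mm/s
profile = [0.25 * k for k in range(9)]
horizon = find_horizon(profile)
print(f"horizon at sample {horizon}: subsonic before, supersonic after")
```

Phonons emitted downstream of this index cannot travel back upstream, which is precisely the trapping behaviour of the analogue event horizon.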
In 1975, Stephen Hawking showed that, due to quantum fluctuations, virtual photon pairs appear continuously even in a vacuum. At the event horizon of a cosmological black hole, a pair may have one component within and one outwith the boundary. In this case, the pair can no longer immediately annihilate, and the photons become real particles: one falls into the black hole while the other escapes as blackbody emission, known as Hawking radiation.
The analogue model also allows for the observation of the elusive Hawking radiation, which is practically impossible to observe in the cosmological scenario due to the very weak intensity of the radiated photons. Using the ideas of sonic black holes, Jeff Steinhauer would be the first to observe artificial Hawking radiation from a Bose-Einstein condensate in 2015. Quantum fluctuations in the condensate produce virtual phonons, similar to the cosmological case, that separate at the sonic event horizon. By showing that phonons within and outwith the event horizon were entangled, Steinhauer demonstrated that they did indeed arise from the same fluctuation, thus revealing the presence of Hawking radiation from a condensate.
One of the most prevalent applications of ultracold gases is their use as an analogue to Solid-State Physics. If we have two laser beams travelling towards each other in an ultracold gas of atoms, the beams will interfere and form a standing wave pattern. Atoms will now effectively feel a confining force that traps them in the valleys of the standing wave, meaning that we have generated an Optical Lattice: an artificial crystal of light.
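The resulting standing-wave potential has the form V(x) = V0 sin²(2πx/λ), with a trapping site every half-wavelength. A minimal sketch, assuming an illustrative lattice depth and a common trapping-laser wavelength:

```python
import math

LASER_WAVELENGTH = 1064e-9  # m, a common trapping-laser wavelength (assumed)
V0 = 1.0                    # lattice depth, arbitrary units (assumed)

def lattice_potential(x):
    """Standing-wave potential V(x) = V0 * sin^2(2*pi*x / lambda)."""
    return V0 * math.sin(2 * math.pi * x / LASER_WAVELENGTH) ** 2

# Atoms sit at the potential minima; every half-wavelength is a lattice site
site_spacing = LASER_WAVELENGTH / 2
print(f"site spacing = {site_spacing * 1e9:.0f} nm")
print(lattice_potential(0.0), lattice_potential(site_spacing))  # both minima (~0)
```

Changing the laser wavelength rescales the site spacing directly, which is the tunability discussed below.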
As was shown in 1998 by Dieter Jaksch and colleagues, it is precisely this lattice system that mimics the structure of crystals in Solid-State Physics, hence we have an analogue model. Compared to its Solid-State counterpart, the optical lattice is inherently free of the defects that may arise during the growth of a real crystal. In addition, the lattice spacing and structure may be altered by simply changing the laser wavelength, polarisation and intensity, providing an ideal experimental system whose parameters can be tuned with ease.
With an efficient analogue model in place, it is then possible to explore a number of interesting physical properties that arise in these lattice systems such as quantum phase transitions, high temperature superconductivity and topological matter. Not only that, but the effective system can also act as a quantum simulator for the model of interest.
Many problems in physics are difficult or practically impossible to simulate even with the most powerful supercomputer. An example is the previously discussed ultracold optical lattice, where thousands of interacting atoms in a gas may be trapped on a lattice of possibly several hundred sites. While the governing Hubbard Hamiltonian, a mathematical representation of the total energy of the system, is known, its solution is computationally intractable for such a vast number of atoms and lattice sites due to the exponentially large Hilbert space of the quantum system. A solution is then to either simplify the treatment of the model with approximations or simulate it with some form of quantum computer.
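This growth is easy to quantify: for N Bosonic atoms distributed over M lattice sites, the number of basis states is the "stars and bars" count C(N + M - 1, N). A short sketch:

```python
from math import comb

# Dimension of the Bose-Hubbard Hilbert space: distributing N bosonic atoms
# over M lattice sites gives C(N + M - 1, N) basis states ("stars and bars").

def hilbert_dimension(n_atoms, n_sites):
    return comb(n_atoms + n_sites - 1, n_atoms)

# Even modest systems explode combinatorially
for n, m in [(5, 5), (10, 10), (20, 20), (50, 50)]:
    print(f"{n} atoms on {m} sites: {hilbert_dimension(n, m):,} states")
```

Already at fifty atoms on fifty sites the state count exceeds what any classical computer could store, let alone diagonalise, which motivates the quantum simulator described next.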
The initial ideas of a quantum simulator were proposed by Yuri Manin and Richard Feynman during the 1980s as a form of quantum computer that simulates a particular quantum system. Since the ultracold optical lattice system is well described by a Solid-State analogue model, it serves as an experimental quantum simulator. The synthesised model can be used to directly measure properties that would otherwise be impossible to determine even with the most powerful supercomputer.
Due to the high control and tunability of the optical lattice environment, it is also possible to construct quantum logic gates and entangle atoms for use in quantum information processing and quantum computing. The highly ordered atoms essentially act as ideal quantum bits, which may then be further manipulated using trapping lasers and internal degrees of freedom such as spin and site occupancy.
In conclusion, we have seen that collective quantum phenomena can emerge at macroscopic scales provided the temperature of a gas is near absolute zero. From this, the condensation of atoms can effectively be described as a single quantum entity, which may then be formulated as an analogue model for cosmological or solid-state systems. With further developments in the field of ultracold gases, our fundamental understanding of complex condensed-matter systems may be further enriched, including the technological advent of efficient quantum computational devices.
Black Hole Illustration – http://astronomy.com/-/media/Images/News%20and%20Observing/News/2018/10/BlackHole.jpg?mw=600
Fermionic Condensation – https://physicsworld.com/wp-content/uploads/2004/01/condensate.jpg