Behemoth Black Hole Found in an Unlikely Place

Astronomers have uncovered a near-record-breaking supermassive black hole, weighing 17 billion suns, in an unlikely place: the center of a galaxy in a sparsely populated area of the universe. The observations, made by NASA’s Hubble Space Telescope and the Gemini Telescope in Hawaii, may indicate that these monster objects are more common than once thought.

Until now, the biggest supermassive black holes – those roughly 10 billion times the mass of our sun – have been found at the cores of very large galaxies in regions of the universe packed with other large galaxies. In fact, the current record holder tips the scale at 21 billion suns and resides in the crowded Coma galaxy cluster that consists of over 1,000 galaxies.

“The newly discovered supersized black hole resides in the center of a massive elliptical galaxy, NGC 1600, located in a cosmic backwater, a small grouping of 20 or so galaxies,” said lead discoverer Chung-Pei Ma, a University of California-Berkeley astronomer and head of the MASSIVE Survey, a study of the most massive galaxies and supermassive black holes in the local universe. While finding a gigantic black hole in a massive galaxy in a crowded area of the universe is to be expected – like running across a skyscraper in Manhattan – it seemed less likely they could be found in the universe’s small towns.

“There are quite a few galaxies the size of NGC 1600 that reside in average-size galaxy groups,” Ma said. “We estimate that these smaller groups are about 50 times more abundant than spectacular galaxy clusters like the Coma cluster. So the question now is, ‘Is this the tip of an iceberg?’ Maybe there are more monster black holes out there that don’t live in a skyscraper in Manhattan, but in a tall building somewhere in the Midwestern plains.”

The researchers also were surprised to discover that the black hole is 10 times more massive than they had predicted for a galaxy of this mass. Based on previous Hubble surveys of black holes, astronomers had developed a correlation between a black hole’s mass and the mass of its host galaxy’s central bulge of stars – the larger the galaxy bulge, the proportionally more massive the black hole. But for galaxy NGC 1600, the giant black hole’s mass far overshadows the mass of its relatively sparse bulge. “It appears that that relation does not work very well with extremely massive black holes; they are a larger fraction of the host galaxy’s mass,” Ma said.
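To make the scale of that discrepancy concrete, here is a minimal sketch of the bulge-to-black-hole scaling the article describes. The ~0.2 percent ratio and the bulge mass used below are assumed ballpark values chosen purely for illustration; neither number comes from the article or the underlying paper.

```python
# Illustrative sketch of the black-hole-mass vs. bulge-mass scaling.
# All inputs are assumed ballpark values, not figures from the article.

bulge_mass_suns  = 8e11     # assumed stellar bulge mass, in solar masses
typical_ratio    = 0.002    # assumed M_BH / M_bulge ratio of ~0.2 percent
measured_bh_suns = 17e9     # black-hole mass reported in the article

predicted_bh_suns = typical_ratio * bulge_mass_suns
print(f"predicted from bulge: ~{predicted_bh_suns:.1e} suns")
print(f"measured            : ~{measured_bh_suns:.1e} suns")
print(f"overshoot           : ~{measured_bh_suns / predicted_bh_suns:.0f}x")
```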

Ma and her colleagues are reporting the discovery of the black hole, which is located about 200 million light years from Earth in the direction of the constellation Eridanus, in the April 6 issue of the journal Nature. Jens Thomas of the Max Planck-Institute for Extraterrestrial Physics, Garching, Germany is the paper’s lead author.

One idea to explain the black hole’s monster size is that it merged with another black hole long ago when galaxy interactions were more frequent. When two galaxies merge, their central black holes settle into the core of the new galaxy and orbit each other. Stars falling near the binary black hole, depending on their speed and trajectory, can actually rob momentum from the whirling pair and pick up enough velocity to escape from the galaxy’s core. This gravitational interaction causes the black holes to slowly move closer together, eventually merging to form an even larger black hole. The supermassive black hole then continues to grow by gobbling up gas funneled to the core by galaxy collisions. “To become this massive, the black hole would have had a very voracious phase during which it devoured lots of gas,” Ma said.

The frequent meals consumed by NGC 1600 may also be the reason why the galaxy resides in a small town, with few galactic neighbors. NGC 1600 is the most dominant galaxy in its galactic group, at least three times brighter than its neighbors. “Other groups like this rarely have such a large luminosity gap between the brightest and the second brightest galaxies,” Ma said.

Most of the galaxy’s gas was consumed long ago when the black hole blazed as a brilliant quasar from material streaming into it that was heated into a glowing plasma. “Now, the black hole is a sleeping giant,” Ma said. “The only way we found it was by measuring the velocities of stars near it, which are strongly influenced by the gravity of the black hole. The velocity measurements give us an estimate of the black hole’s mass.”
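As a rough illustration of how stellar velocities translate into a mass estimate, the order-of-magnitude relation is M ≈ v²r/G. The sketch below uses assumed round numbers (a stellar speed of ~300 km/s at a radius of ~3,000 light-years, neither taken from the article); the actual measurement relies on detailed modelling of the stellar orbits.

```python
# Back-of-the-envelope mass estimate from stellar motions: M ~ v^2 * r / G.
# The velocity and radius are assumed illustrative values.

G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # solar mass, kg
ly    = 9.461e15     # metres per light-year

v = 300e3            # assumed stellar speed, ~300 km/s
r = 3000 * ly        # assumed radius, ~3,000 light-years

mass = v**2 * r / G
print(f"~{mass / M_sun:.1e} solar masses")  # ~2e10, i.e. tens of billions of suns
```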

The velocity measurements were made by the Gemini Multi-Object Spectrograph (GMOS) on the Gemini North 8-meter telescope on Mauna Kea in Hawaii. GMOS spectroscopically dissected the light from the galaxy’s center, revealing stars within 3,000 light-years of the core. Some of these stars are circling around the black hole and avoiding close encounters. However, stars moving on a straighter path away from the core suggest that they had ventured closer to the center and had been slung away, most likely by the twin black holes.

Archival Hubble images, taken by the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), support the idea of twin black holes pushing stars away. The NICMOS images revealed that the galaxy’s core was unusually faint, indicating a lack of stars close to the galactic center. A star-depleted core distinguishes massive galaxies from standard elliptical galaxies, which are much brighter in their centers. Ma and her colleagues estimated that the amount of stars tossed out of the central region equals 40 billion suns, comparable to ejecting the entire disk of our Milky Way galaxy.

For more information, visit:

http://www.nasa.gov/hubble
http://hubblesite.org/newscenter/archive/releases/2016/12/

Ray Villard
Space Telescope Science Institute, Baltimore, Maryland
410-338-4488 / 410-338-4514
villard@stsci.edu

GM=tc^3 Adventures in Space/Time – Monday, September 03, 2007 – Not Dark Energy

The speculation called “dark energy” is the subject of more questions in NEW SCIENTIST: ‘Swiss cheese’ universe challenges dark energy.

“Dark energy may not be needed to explain why the expansion of space appears to be speeding up. If our universe is like Swiss cheese on large scales – with dense regions of matter and holes with little or no matter – it could at least partly mimic the effects of dark energy, suggests a controversial new model of the universe.”

As nige has noted, if the Universe is not homogeneous, different regions will appear to expand at different rates. If your telescope looked in a direction of lower expansion, the Universe would appear to be accelerating. This would add to the many anisotropies seen in the Cosmic Microwave Background. Though the model is very preliminary, physicists Sabino Matarrese and Rocky Kolb have published “On cosmological observables in a swiss cheese universe” online.

Back in March (2007), NEW SCIENTIST published Is dark energy an illusion?

“The quickening pace of our universe’s expansion may not be driven by a mysterious force called dark energy after all, but paradoxically, by the collapse of matter in small regions of space.”

Just last week A Hole In the Universe indicated that the Universe is not quite homogeneous, and a cosmology including “dark energy” may be all wrong. The proposed Supernova Acceleration Probe would survey only 15 square degrees of sky before ending its service life. The whole sky has an area of 41,253 square degrees! If SNAP looked at the wrong part of an inhomogeneous sky, it could give researchers the wrong value of cosmic acceleration. Then again, disciples of “dark energy” may already have the wrong idea.

Lawrence Krauss said that supernova data “naively implied that the Universe is accelerating.” The inference of cosmic acceleration relies on a daisy chain of assumptions, including homogeneity. It especially relies on the assumption of a constant speed of light. SNAP/JDEM is subject to a review whose results will be announced Wednesday. With all the outstanding questions about its basic science, it is hard to see how SNAP can supersede projects like Constellation-X.
L. Riofrio at 5:50 PM

DAILY NEWS – 31 August 2007 – ‘Swiss cheese’ universe challenges dark energy – By Anil Ananthaswamy

Dark energy may not be needed to explain why the expansion of space appears to be speeding up. If our universe is like Swiss cheese on large scales – with dense regions of matter and holes with little or no matter – it could at least partly mimic the effects of dark energy, suggests a controversial new model of the universe.

In 1998, astronomers found that distant supernovae were dimmer, and thus farther away, than expected. This suggested the expansion of the universe was accelerating as a result of a mysterious entity dubbed dark energy, which appears to make up 73% of the universe.

But trying to pin down the nature of dark energy has proven extremely difficult. Theories of particle physics suggest that space does have an inherent energy, but this energy is about 10^120 times greater than what is actually observed.

This has caused some cosmologists to look for alternative explanations. “I don’t have anything against dark energy, but we ought to make all possible efforts to see whether we can avoid this exotic component in the universe,” says Sabino Matarrese of the University of Padova in Italy.

So he and colleagues, including Edward Kolb of the Fermi National Accelerator Laboratory in Batavia, Illinois, US, decided to model the universe as having large-scale variations in density.

That contradicts the standard model of cosmology, which assumes that the universe is homogeneous on large scales. In the homogeneous model, known as the Friedmann-Robertson-Walker (FRW) universe, the effect of dark energy is to stretch space, thus increasing the wavelength of photons from the supernovae.

Testing assumptions

A similar effect was seen when the researchers added large-scale spherical holes to the FRW universe. They allowed the density of matter within each hole to vary with radius and found that in certain cases, photons travelling through under-dense voids had their wavelengths stretched, mimicking dark energy.

The extent of the effect depends on the exact location of the supernovae and how many under-dense regions the photons have to cross before reaching Earth. And Matarrese cautions that the deviations are not enough to explain away all of the observed dark energy. He says their model is still very preliminary: “We are very far away from getting the full solution.”

Cosmologist Sean Carroll at Caltech in Pasadena, US, says the Swiss-cheese model is interesting and useful as a test of more mainstream theories. “The overwhelming majority of cosmologists think that the completely smooth approximation is a very good one,” Carroll told New Scientist. “But if you want to have confidence that you are on the right track, you better not just make assumptions and cross your fingers, you better test it.”

Up for debate

Astrophysicist Niayesh Afshordi of Harvard University in Cambridge, Massachusetts, US, is less impressed. Astronomical observations suggest that the density of matter in the universe is relatively smooth – and not like Swiss cheese – at scales of about 100 million light years or larger, he says. The new research, however, suggests that space is holey on scales of 500 million light years.

“The model is very inhomogeneous on scales that we observe as homogeneous,” Afshordi told New Scientist. “What we can learn from the toy model is not really applicable to this universe, because the properties of this model are very different from what we see in our universe.”

However, team member Antonio Riotto of the University of Geneva in Switzerland argues that their Swiss-cheese model is realistic in the sense “that the universe is characterised by under-dense regions”.

“We know that the universe has voids, you can debate about their size,” he says. In fact, recent observations suggest that voids can extend across nearly a billion light years. Riotto says their model was worked out independently of that discovery, but “this observation is welcome by us”.

It’s bigger on the inside: Tardis regions in spacetime and the expanding universe – Brian Dodson – October 8th, 2013

Fans of Doctor Who will be very familiar with the stupefied phrase uttered by all new visitors to his Tardis: “It’s…bigger…on the inside.” As it turns out, this apparently irrational idea may have something to contribute to our understanding of the universe. A team of cosmologists in Finland and Poland propose that the observed acceleration of the expansion of the universe, usually explained by dark energy or modified laws of gravity, may actually be the result of regions of spacetime that are larger on the inside than they appear from the outside. The researchers have dubbed these “Tardis regions.”

Perhaps the most surprising cosmological observation of the past few decades was the 1998 discovery by Perlmutter, Schmidt and Riess, that the expansion of the universe has been accelerating for the past five billion years. This result, which won the 2011 Nobel Prize, was quickly corroborated by observation of independent phenomena such as the cosmic background radiation.

Why the acceleration is occurring is not currently understood, although it can be described. In terms of conventional cosmological theory, it calls for the existence of a “dark energy,” an energy field permeating the universe. However, because gravity attracts normal mass-energy, dark energy would have to exert a negative pressure, something unknown as yet in nature. In addition, roughly 75 percent of the contents of the universe have to be made up of dark energy to get the observed acceleration of expansion. Even though dark energy provides a reasonable description of the universal acceleration, its value as an explanation is still controversial. Many have the gut reaction that dark energy is too strange to be true.

Professors Rasanen and Szybka, of the University of Helsinki and the Jagiellonian University in Krakow, together with Rasanen’s graduate student Mikko Lavinto, decided to investigate another possibility.

The “standard cosmological model,” which is the framework within which accelerated expansion requires dark energy, was developed in the 1920s and 1930s. The FLRW metric (named for Friedmann, Lemaître, Robertson and Walker, the major contributors) is an exact solution to Einstein’s equations. It describes a strictly homogeneous, isotropic universe that can be expanding or contracting.

Strict homogeneity and strict isotropy means that the universe described by an FLRW metric looks the same at a given time from every point in space, at whatever distance or orientation you look. This is a universe in which galaxies, clusters of galaxies, sheets, walls, filaments, and voids do not exist. Not, then, very much like our own Universe, which appears to be rather homogeneous and isotropic when you look at distances greater than about a gigaparsec, but closer in it is nothing of the sort.

Rasanen’s research team decided to examine a model universe with a structure closer to ours, in an attempt to look for alternative explanations of the accelerating expansion we see. They took an FLRW metric filled with a uniform density of dust, and converted it into a Swiss cheese model by cutting random holes in it. This has the effect of making the model inhomogeneous and non-isotropic (except very far away), and hence the Swiss cheese model looks more like our own Universe, save for the fact that our Universe does not seem to be full of holes.

While Swiss cheese is delicious, a universe with holes is not. To rectify this, Rasanen’s team filled in the holes with plugs made from dust-filled exact solutions of Einstein’s equation. These plugs are a reasonable model of the region near a sizable body, such as a galaxy. By putting the plugs in the holes, and then smoothing the intersections between them, they obtained a rather uniform spacetime with a lot of smaller blobs of matter dispersed throughout it – a (very) simple analog to the structure of the universe in which we live.

Rasanen’s team made the plugs from a model in which the spatial parts essentially fold in on themselves as the spacetime evolves. Such folds increase the length of a path passing through the plug without changing the external dimensions of the plug. For some such plugs, the length of a path through the plug becomes longer throughout the life of the Universe.

The team calls such a plug a Tardis region, and a spacetime containing Tardis regions is called a Tardis spacetime. The proper diameter and volume of a properly configured plug start somewhat larger than the apparent quantities, but then grow to much larger sizes.

Let’s get to how Tardis regions relate to the expansion of the universe. Because gravity is universally attractive, in an inhomogeneous universe a denser region will tend to expand more slowly than a less dense region – there is less gravitational attraction slowing the expansion of the less dense region.

Although the Tardis regions expand faster than the surrounding dust space, this does not change their apparent size from outside, so at first glance it is difficult to see how this accelerates the expansion of the universe. The key is that when an observer looks at a distant object in a plugged Swiss cheese space, the light they see has passed through a number of plugs, and that number increases the farther away the object is. Because the length of a path through a Tardis region grows rapidly as time goes by, the total length of the path the light followed from an object increases faster than does the space outside the plugs. The result is that the expansion of the universe appears to be accelerating with time, without additional influences such as dark energy.

To sum up, a space with a large number of relatively small Tardis regions will appear initially to expand at roughly the same rate as does the dust space in which the Tardis regions are embedded. As time goes on, however, the Tardis regions expand faster than the Swiss cheese, and as they fill larger fractions of the photon paths between objects and observers, the expansion of the universe as measured by optical tests over large distances will appear to accelerate.

The effect can be made large enough to reproduce the observed acceleration, so the idea isn’t silly. But is this the explanation? It is too early to tell. The model is very artificial and simplistic, but it does suggest that there is at least one possible alternative to dark energy within the bounds of classical general relativity.

Black-hole computing – Might nature’s bottomless pits actually be ultra-efficient quantum computers? That could explain why data never dies – Sabine Hossenfelder

After you die, your body’s atoms will disperse and find new venues, making their way into oceans, trees and other bodies. But according to the laws of quantum mechanics, all of the information about your body’s build and function will prevail. The relations between the atoms, the uncountable particulars that made you you, will remain forever preserved, albeit in unrecognisably scrambled form – lost in practice, but immortal in principle.

There is only one apparent exception to this reassuring concept: according to our current physical understanding, information cannot survive an encounter with a black hole. Forty years ago, Stephen Hawking demonstrated that black holes destroy information for good. Whatever falls into a black hole disappears from the rest of the Universe. It eventually reemerges in a wind of particles – ‘Hawking radiation’ – that leaks away from the event horizon, the black hole’s outer physical boundary. In this way, black holes slowly evaporate, but the process erases all knowledge about the black hole’s formation. The radiation merely carries data for the total mass, charge and angular momentum of the matter that collapsed; every other detail about anything that fell into the black hole is irretrievably lost.

Hawking’s discovery of black-hole evaporation has presented theoretical physicists with a huge conundrum: general relativity says that black holes must destroy information; quantum mechanics says that cannot happen, because information must live on eternally. Both general relativity and quantum mechanics are extremely well-tested theories, and yet they refuse to combine. The clash reveals something much more fundamental than a seemingly exotic quirk about black holes: the information paradox makes it amply clear that physicists still do not understand the fundamental laws of nature.

But Gia Dvali, professor of physics at the Ludwig-Maximilians University of Munich, believes he’s found the solution. ‘Black holes are quantum computers,’ he says. ‘We have an explicit information-processing sequence.’ If he is correct, the paradox is no more, and information truly is immortal. Even more startling, perhaps, is that his concept has practical implications. In the future, we might be able to tap black-hole physics to construct quantum computers of our own.

The main reason why recovering information from black holes seems impossible is that they are almost featureless spheroids with essentially no physical attributes on their horizons; they have ‘no hair’, as the late US physicist John Wheeler put it. You cannot store information in something that has no features that could be used to encode it, the standard argument goes. And therein lies the error, Dvali says: ‘All these no-hair theorems are wrong.’ He and his collaborators argue that gravitons – the so-far undiscovered quanta that carry gravity and make up space-time – stretch throughout the black hole and give rise to ‘quantum hair’ which allows storing as well as releasing information.

The new research builds on a counter-intuitive feature of quantum theory: quantum effects are not necessarily microscopically small. True, those effects are fragile, and are destroyed quickly in warm and busy environments, such as those typically found on Earth. This is why we don’t normally witness them. This is also the main challenge in building quantum computers, which process information using the quantum states of particles instead of the on-off logic of traditional transistors. But in a cold and isolated place, quantum behaviour can persist over large distances – large enough to span the tens to billions of kilometres of a black-hole horizon.

You don’t even need to go to outer space to witness long-range quantum effects. The enormous distances and masses necessary to create black-hole quantum hair might be far beyond our experimental capabilities, but by cooling atoms down to less than one ten-thousandth of a Kelvin (that is, one ten-thousandth of a degree above absolute zero), researchers have condensed up to a billion atoms, spread out over several millimetres, into a single quantum state. That’s huge for collective quantum behaviour.

Such an atomic collective – known as a Bose-Einstein condensate, named after the Indian physicist Satyendra Bose and Albert Einstein – is currently one of the most promising tools for creating a workable quantum computer. Quantum effects within a Bose-Einstein condensate, like the ability to be in two places at the same time, can stretch through the whole condensate, giving rise to many interlocked states. Enormous information-processing power could become available if researchers succeed in stabilising the condensate and controlling these states. And, not coincidentally, Bose-Einstein condensates might also solve the decades-old puzzle of black-hole information loss.

Hawking’s information puzzle would find a natural solution, Dvali notes, if black holes consist of gravitons that have undergone Bose-Einstein condensation – puddles of condensed gravity, in essence. The idea might sound crazy, but for Dvali it’s a perfectly reasonable conclusion, drawn from what physicists have learned about black-hole information in the years since Hawking first posed his riddle. Theorists know how to calculate how much information the black hole must be able to store: the amount is quantified in the black hole’s entropy and proportional to the horizon surface area. They have also found that black holes can redistribute or ‘scramble’ information very quickly. And finally, they know the pace at which information must escape from the black hole in order to avoid conflicts with quantum mechanics.
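The statement that a black hole’s information capacity is proportional to its horizon area is the Bekenstein-Hawking entropy, S = k_B·A·c³/(4Għ). Here is a minimal sketch evaluating it for a one-solar-mass black hole, an illustrative choice rather than a figure from the article:

```python
import math

# Bekenstein-Hawking entropy: S / k_B = A * c^3 / (4 * G * hbar),
# where A is the area of the Schwarzschild horizon.

G     = 6.674e-11    # m^3 kg^-1 s^-2
c     = 2.998e8      # m/s
hbar  = 1.055e-34    # J s
M_sun = 1.989e30     # kg

def entropy_in_kB(mass_kg):
    r_s  = 2 * G * mass_kg / c**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2        # horizon area
    return area * c**3 / (4 * G * hbar)

# Illustrative choice: one solar mass
print(f"S/k_B ~ {entropy_in_kB(M_sun):.1e}")   # ~1e77, scaling with area (mass squared)
```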

Starting in 2012, Dvali explored these various attributes and discovered, to his surprise, that certain types of Bose-Einstein condensates share their essential properties with black holes. To act like a black hole, the condensate must linger at a transition point – its so-called quantum critical point – where extended fluctuations span through the fluid just before the quantum behaviour collapses. Such a quantum-critical condensate, Dvali calculated, has the same entropy, scrambling capacity and release time as a black hole: it has just the right quantum hair. ‘Somebody can say this is a coincidence, but I consider it extremely strong evidence – mathematical evidence that is – that black holes genuinely are Bose-Einstein condensates,’ he says.

Linking black holes with a form of matter that can be created in the lab means that some aspects of Dvali’s idea can be explored experimentally. Immanuel Bloch, professor of physics at the Max-Planck-Institute in Munich, has first-hand experience with Bose-Einstein condensates. He condenses atoms in ‘crystals of light’ – optical lattices created by intersecting multiple laser beams – and then takes snapshots of the condensate using a technique called fluorescence imaging. The resulting pictures beautifully reveal the atoms’ correlated quantum behaviour.

Bloch finds Dvali’s idea, which originated in a field entirely different from his, intriguing. ‘I am pretty excited about Gia’s proposal. I think that’s something really new,’ Bloch says. ‘People have seen collapse dynamics with interacting condensates, but nobody has so far investigated the quantum critical point and what happens there.

‘In the BEC [Bose-Einstein condensate] you have macroscopic quantum waves, and this means in the quantum numbers you have a lot of fluctuations. This is why the BEC normally looks like a Swiss cheese,’ he continues. But by applying a magnetic field, Bloch can change the strength by which the atoms interact, thereby coaxing them into an orderly lattice. ‘Now you make the atoms strongly interacting, then you go to the [very orderly] “Mott state”. This is a great state for quantum computing because you have this regular array. And you can address the atoms with lasers and rotate them around and change the spin [to encode and process information].’

According to Dvali, black-hole physics reveals a better way to store information in a Bose-Einstein condensate by using different quantum states. Black holes are the simplest, most compact, most efficient information storage devices that physicists know of. Using the black holes’ coding protocol therefore should be the best possible method to store information in condensate-based quantum computers.

Creating a black-hole-mimic condensate in the lab seems doable to Bloch: ‘[In a black hole,] the interaction strength adjusts itself. We can simulate something like that by tuning the interaction strength to where the condensate is just about to collapse. The fluctuations become bigger and bigger and bigger as you get closer to the quantum critical point. And that could simulate such a system. One could study all the quantum fluctuations and non-equilibrium situations – all that is now possible by observing these condensates in situ, with high spatial resolution.’

Just because realising Dvali’s idea is possible does not necessarily mean it is practical, however. ‘It’s competing with a lot of other stuff out on the market. Right now, I have more skepticism than faith,’ Bloch says. He also points out that efficient information storage is nice, but for quantum computers ‘information capacity is presently not the problem’. The biggest challenge he sees is finding a way to individually manipulate the quantum states that Dvali has identified – data processing, rather than data storage. There are other practical hurdles as well. ‘There are so many things we don’t know, like noise, is it resistant to noise? We don’t know,’ Bloch notes. ‘For me, the much more interesting aspect is the connection to gravitational physics.’ And here the implications go well beyond information storage.

Dvali’s is not the only recent research suggesting a connection between gravity and condensed-matter physics, a trend that has opened whole new realms to experimental investigation. In the tradition of Einstein, physicists generally think of curved space-time as the arena for matter and its interactions. But now several independent lines of research suggest that space-time might not be as insubstantial as we thought. Gravity, it seems, can emerge from non-gravitational physics.

In the past decades, numerous links between gravity and certain types of fluids have demonstrated that systems with collective quantum behaviour can mimic curved space-time, giving rise to much the same equations as one obtains in Einstein’s theory of general relativity. There is not yet any approach from which general relativity can be derived in full generality by positing that space-time is a condensate. For now, nobody knows whether it is possible at all. Still, the newfound relations allow physicists to study those gravitational systems that can be mimicked with atomic condensates.

Simulating gravity with condensates allows physicists to explore regions – such as black-hole horizons – that are not otherwise accessible to experiment. And so, although Hawking radiation has never been observed in real black holes, its analogue has been measured for black holes simulated through Bose-Einstein condensates. Of course, these condensates are not really black holes – they trap sound waves, not light – but they obey some of the same mathematical laws. The condensates do thus, in a sense, perform otherwise complicated, even intractable, physics calculations.

‘We like to speak of “quantum simulations” and try to use these systems to look for interesting phenomena that are hard to calculate on classical computers,’ says Bloch. ‘We are also trying to use this kind of system to test other systems like the black holes, or we looked at the [analogue of the] Higgs particle in two dimensions.’ In a 2012 Nature paper, Bloch and his collaborators reported that their quantum simulation revealed that Higgs-like particles can also exist in two dimensions. The same technique could in principle be used to study Bose-Einstein condensates behaving like black holes.

But using black-hole physics to develop new protocols for quantum computers is one thing. Finding out whether astrophysical black holes really are condensates of gravitons is another thing entirely. ‘I am not interested in the idea if one can’t test it,’ says Stefan Hofmann, a theoretical cosmologist and colleague of Dvali’s in Munich.

Hofmann therefore has dedicated significant time to exploring the observational consequences of the idea that black holes are graviton condensates. ‘The black hole [no hair] theorems are, sorry, crap,’ he agrees with Dvali. Hofmann thinks that the quantum hair near the black-hole horizon would subtly alter the predictions of general relativity (especially the emission of gravitational waves during the formation or collision of black holes), in ways that should be detectable. ‘The dream would be a binary [black hole] merger,’ Hofmann said in a 2015 seminar. His dream has just come true: the LIGO collaboration recently announced the first measurement of gravitational waves emitted from a merging pair of black holes.

Hofmann and his collaborators have yet to make quantitative predictions, but due to the macroscopic quantum effects, Dvali’s proposed solution to the information-loss problem might soon become experimentally testable. However, the idea that black holes are quantum-critical condensates of gravitons, truly equivalent to a Bose-Einstein condensate, leaves many questions open. To begin with, Dvali’s calculations cannot explain what actually happens to matter falling into a black hole. And Hofmann admits that it isn’t clear how the object is a ‘black hole’ in the conventional sense, since it can no longer be described within the familiar framework of general relativity.

Carlo Rovelli from the University of Marseille thinks that, even in incomplete form, Dvali’s idea of black holes as condensates might be scientifically useful. ‘They are using a brutal approximation which might fail to capture aspects, but it might work to some extent, especially in the long wavelength regime. For the low-frequency quantum fluctuations of [space-time] it may not be absurd,’ Rovelli says. He cautions, however, that the condensate model ‘cannot be a complete description of what happens in the black hole’.

What is clear, though, is that this research has revealed a previously unrecognised, and quite fruitful, relation. ‘We have a very interesting bridge between quantum information and black-hole physics that was not discussed before,’ Dvali says. If he is right, the implications are conceptually staggering. Information really does live on eternally. In that sense, we are all immortal. And the supermassive black hole at the centre of our galaxy? It’s actually a cosmic quantum computer.

Sabine Hossenfelder
is a research fellow at the Frankfurt Institute for Advanced Studies, with a special interest in the phenomenology of quantum gravity. Her writing has appeared in Forbes, Scientific American, and New Scientist, among others.

Ligo’s black holes that helped prove Einstein’s theory of gravitational waves could have been born inside a massive star – By Abigail Beall – For Mailonline 13:02 17 Feb 2016 (updated 16:38 17 Feb 2016)

Scientists detected the first warping of space-time caused by a collision of two black holes last week.
The historic signals were picked up by two advanced detectors.
At the same time Nasa’s Fermi telescope detected a gamma ray burst.
Gamma rays could mean two black holes lived inside a rotating star.

Last week, scientists made ‘the scientific breakthrough of the century’ with the detection of gravitational waves.

When the waves were detected, scientists knew they had been caused by the collision of two black holes, each roughly 30 times the mass of the sun.

But a second signal, seen by a telescope in space, suggests both black holes could have formed inside a gigantic star.

The discovery was the first time anyone had detected the warping of space-time caused by a collision of two massive black holes – something first predicted in Einstein’s Theory of General Relativity in 1915.

These gravitational waves, created 1.3 billion light-years from Earth, help confirm our universe was created by the Big Bang, and will give an unprecedented glimpse into its beginning.

The historic signals were picked up by two advanced Laser Interferometer Gravitational-wave Observatories (Ligo) in Louisiana and Washington.

Just 0.4 seconds later, Nasa’s Fermi telescope also detected a gamma ray burst, a flash of electromagnetic rays associated with high energy collisions.

In order to produce a gamma ray burst, a black hole needs to be fed at an enormous rate of somewhere between the mass of a planet and the mass of the sun every second.

Such a feeding rate is only possible near the centre of a massive star at the end of its life.

This gamma ray signal was a surprise for physicists, as they would not normally be associated with the merging of two black holes.

New algorithm developed at MIT for imaging black holes – By Jessica Hall – June 7, 2016 at 1:27 pm

Everybody knows you can’t see a black hole. Nothing gets out, not even light. Except that, as with most conventional wisdom, that isn’t the whole story. Leaving aside the can of worms labeled Hawking radiation, we still know that matter heats up as it falls into a black hole. In theory, we can pick that up with a good old radio telescope. But black holes are so far away that we need far better angular resolution than any telescope we currently have if we want to confirm these predictions with actual data.

“A black hole is very, very far away and very compact,” says Katie Bouman, a grad student at MIT working with an international collaboration called the Event Horizon Telescope. “It’s equivalent to taking an image of a grapefruit on the moon, but with a radio telescope. To image something this small means that we would need a telescope with a 10,000-kilometer diameter, which is not practical, because the diameter of the Earth is not even 13,000 kilometers.” This is where interferometry comes in. Bouman developed a new imaging algorithm called CHIRP, for Continuous High-resolution Image Reconstruction using Patch priors, and it uses interferometry, essentially, “to turn the entire planet into a large radio telescope dish.”
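The quoted 10,000-kilometre figure follows from the diffraction limit, θ ≈ λ/D. Here is a minimal back-of-the-envelope check, assuming an observing wavelength of 1.3 mm (a typical Event Horizon Telescope band, not stated in this article) and a grapefruit-sized target at the distance of the Moon:

```python
import math

# Diffraction-limit sanity check for the "grapefruit on the Moon" analogy.
# The 1.3 mm wavelength and grapefruit size are assumed illustrative values.

wavelength = 1.3e-3     # observing wavelength, metres
dish_d     = 1.0e7      # 10,000 km aperture, metres
grapefruit = 0.12       # grapefruit diameter, metres
moon_dist  = 3.84e8     # Earth-Moon distance, metres

def to_microarcsec(theta_rad):
    return math.degrees(theta_rad) * 3.6e9   # 1 degree = 3.6e9 microarcseconds

resolution = wavelength / dish_d      # what a 10,000 km dish can resolve
target     = grapefruit / moon_dist   # angular size of the grapefruit

print(f"10,000 km dish resolves ~{to_microarcsec(resolution):.0f} microarcseconds")
print(f"grapefruit on the Moon  ~{to_microarcsec(target):.0f} microarcseconds")
# The two agree to within a factor of a few, which is all the analogy claims.
```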

The Event Horizon Telescope is actually an array of radio telescopes working to image Sagittarius A*, the black hole at our galaxy’s center. We can’t image Sagittarius A* with optical means, because there’s just too much debris in the way. But the EHT uses interferometry to combine and compare the input from multiple telescopes, a Nobel prizewinning technique which confers much better angular resolution. With the angular resolution afforded by a radio telescope the effective size of the planet, we could use interferometry to find out whether or not our galaxy’s supermassive black hole actually looks like we think it does.

CHIRP works a little like an insect eye, in that it combines sections of the EHT array’s visual field into a coherent whole. Part of the method uses algebra to multiply measurements from three telescopes together, which triangulates away noise generated by the interference of Earth’s atmosphere. Six telescopes have already signed on to participate in the collaboration, but it can accommodate every telescope on Earth: Using CHIRP, Bouman’s project can stitch together what all the radio telescopes see.
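The “multiply measurements from three telescopes together” step the article mentions is, in standard radio interferometry, the closure-phase (bispectrum) trick: each station’s unknown atmospheric delay appears with opposite signs on the two baselines that include it, so the phases cancel in the triple product. A toy sketch with invented numbers:

```python
import numpy as np

# Toy demonstration of closure phase: station-based atmospheric phase errors
# cancel when visibilities from three baselines are multiplied together.
# All numbers are invented for illustration.

rng = np.random.default_rng(0)

# "True" complex visibilities on baselines (1-2), (2-3), (3-1)
V12, V23, V31 = np.exp(1j * np.array([0.3, -1.1, 0.5]))

# Unknown atmospheric phase delay picked up at each station
phi = rng.uniform(-np.pi, np.pi, size=3)

# Baseline i-j measures the true visibility corrupted by (phi_i - phi_j)
m12 = V12 * np.exp(1j * (phi[0] - phi[1]))
m23 = V23 * np.exp(1j * (phi[1] - phi[2]))
m31 = V31 * np.exp(1j * (phi[2] - phi[0]))

# In the triple product the station phases cancel in pairs, so the closure
# phase depends only on the source structure.
print(np.angle(m12 * m23 * m31))   # equals...
print(np.angle(V12 * V23 * V31))   # ...the uncorrupted value (-0.3)
```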

“Normal” interferometry uses an algorithm that treats an image from a radio telescope as a collection of individual points of different brightness on a plane. It tries to find the points whose brightness and location most closely match the data. Then the algorithm blurs together bright points near each other, to meld the astronomical images together. In the new model, instead of points on a 2D plane, there are cones whose heights give the total brightness at any spot — black, empty sky would be represented by a cone of zero height. This sharpens the image and filters out noise, using the same principles that make constructive and destructive interference work.

But the earth isn’t exactly peppered with interferometers. There are large areas on the ground that aren’t collecting any data. CHIRP fills in the gaps by mathematically stitching together different telescopes’ fields of view, wherever they overlap, to create a continuous whole. It’s like a brightness topo map of the sky; tall places are bright spots. “Translating the model into a visual image is like draping plastic wrap over it: The plastic will be pulled tight between nearby peaks, but it will slope down the sides of the cones adjacent to flat regions,” the team said in a statement. “The altitude of the plastic wrap corresponds to the brightness of the image.”

To verify CHIRP’s predictions, Bouman and team loosed machine learning on the imaging problem. They trained the learning algorithm on images of celestial bodies, earthly objects and black holes, and found that CHIRP frequently outperformed its predecessors. Since Bouman made her test data available online, other researchers can use and improve on it.

The mysterious boundary – The entrance to a black hole could reveal insights into the Big Bang, the formation of galaxies and even death by spaghettification – BY ANDREW GRANT – 2:40PM, MAY 16, 2014

A black hole’s event horizon is a one-way bridge to nowhere, a gateway to a netherworld cut off from the rest of the cosmos.

Understanding what happens at that pivotal boundary could reveal the hidden influences that have molded the universe from the instant of the Big Bang.

Today some of the best minds in physics are fixated on the event horizon, pondering what would happen to hypothetical astronauts and subatomic particles upon reaching the precipice of a black hole. At stake is the nearly 100-year quest to unify the well-tested theories of general relativity and quantum mechanics into a supertheory of quantum gravity.

But the event horizon is more than just a thought experiment or a tool to merge physics theories. It is a very real feature of the universe, a pivotal piece of cosmic architecture that has shaped the evolution of stars and galaxies. As soon as next year, a telescope the size of Earth may allow us to spot the edge of the shadowy abyss for the first time.

By studying the event horizon through both theory and observation, scientists could soon figure out how the universe began, how it evolved and even predict its ultimate fate. They’d also be able to answer a crucial question: Would a person falling into a black hole be stretched and flattened like a noodle, dying by spaghettification, or be incinerated?

Gravitational gusto

Scientists thought about the possibility of black holes and event horizons long before either term existed. In 1783, British geologist and astronomer John Michell considered Newton’s work on gravity and light and found that, in theory, a star with 125 million times the mass of the sun would have enough gravitational oomph to pull in any object trying to escape — even one traveling at light speed.

Although stars can never attain that much mass, Albert Einstein’s 1916 general theory of relativity put Michell’s hunch about supermassive objects onto solid theoretical ground. Later that year, German astronomer Karl Schwarzschild used general relativity to show that some stars could collapse under their own gravity and create a deep pit in the fabric of space-time. Anything, including light, that came within a certain distance of the collapsed star’s center of mass could never come out. That point of no return became known as the event horizon.

Confirmation for the existence of black holes came decades later. In 1974, scientists detected a heavy dose of radio waves emitted from the center of the Milky Way, about 26,000 light-years away. They eventually concluded that there must be a black hole there. Today, astronomers know that virtually every galaxy harbors a giant black hole at its center, shaping the formation of millions of stars and even neighboring galaxies with its immense gravitational influence. Galaxies also contain millions of small- and medium-sized black holes, each with an event horizon past which light is never seen again.

But the repercussions of black holes’ extreme gravity eventually led to conflicts with one of the keystones of 20th century physics: quantum mechanics. The trouble began in the mid-1970s, when University of Cambridge physicist Stephen Hawking proposed that black holes are not eternal. In the far, far future, when black holes have devoured almost all the matter in the universe, leaving little else to consume, energy should slowly leak out from their event horizons. That energy, now known as Hawking radiation, should continue seeping out until each black hole evaporates completely.

Hawking quickly realized the drastic consequences of his proposal. In a chaos-inducing 1976 paper, he explained that if a black hole eventually disappears, then so should all the information about all the particles that ever fell into it. That violates a central tenet of quantum mechanics: Information cannot be destroyed. Physicists could accept that all the properties of all the particles within a black hole were locked up, forever inaccessible to those outside the event horizon, like valuables in a safe. But they were not OK with that safe vanishing without a trace. “It violated everything I knew about quantum mechanics,” says Stanford theoretical physicist Leonard Susskind, who heard Hawking’s ideas at a conference in 1981. “It couldn’t be right.”

Violating theories

Susskind dug into this black hole information paradox, and by the turn of the century he thought he had resolved it with a proposal called complementarity. In essence, he argued that information can simultaneously cross the event horizon and never cross the event horizon, so long as no single observer can see it in both places.

If a particle were to fall into a black hole, an astronaut falling alongside it would see nothing special happen as both coasted across the event horizon and into the black hole’s interior. But another astronaut watching from outside would never see his friend or the particle pass the event horizon; from his point of view, the particle would get perilously close to the horizon but never quite cross it. Eventually, as the black hole evaporated perhaps a trillion trillion trillion trillion years later (astronauts in thought experiments have remarkable longevity), the astronaut outside the black hole would see the Hawking radiation associated with the infalling particle.
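For a sense of where such enormous timescales come from, the standard semiclassical estimate of a black hole’s evaporation time is t ≈ 5120πG²M³/(ħc⁴). A minimal sketch, evaluated for a one-solar-mass black hole as an illustrative choice (not a figure from this article):

```python
import math

# Semiclassical Hawking evaporation time: t = 5120 * pi * G^2 * M^3 / (hbar * c^4)

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI units
M_sun = 1.989e30                              # kg
year  = 3.156e7                               # seconds

def evaporation_time_years(mass_kg):
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)
    return t_seconds / year

print(f"{evaporation_time_years(M_sun):.1e} years")   # of order 1e67 years
```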

Susskind’s explanation is unintuitive, but at least it’s elegant. For both observers, information is preserved. Plus, the outside astronaut can potentially piece together everything that fell into the vast black hole interior just by monitoring the event horizon. This idea, proposed by Juan Maldacena at the Institute for Advanced Study in Princeton, N.J., is called the holographic principle: Just as a two-dimensional hologram can depict a three-dimensional object, the surface of a black hole theoretically reveals everything inside of it.

Pasta or Barbecue?

Since the 1970s, physicists have had trouble coming up with a proposal that describes the fate of something, or someone, falling into a black hole without violating well-tested theories. Until 2012, complementarity seemed to do the job. It said that an astronaut falling into a black hole won’t notice anything special as he crosses the event horizon, yet someone watching from outside will never see his friend reach the horizon; information is preserved for both observers. But complementarity breaks another rule of quantum mechanics, which has led some to argue that walls of radiation along event horizons incinerate incoming matter.

But in 2012, a quartet of physicists including Joseph Polchinski from the University of California, Santa Barbara reignited the black hole information paradox by demonstrating that in solving one problem, Susskind and Maldacena had created another. The issue centers on another facet of quantum mechanics called entanglement, which intertwines the properties of multiple particles regardless of the distance between them. Susskind and Maldacena’s complementarity relies on entanglement to preserve information. As the proposal goes, particles of Hawking radiation are linked to each other so that over time an observer could measure the radiation and piece together what’s inside the black hole.

In yet another thought experiment, Polchinski and his team pondered what would happen if just one of a pair of entangled particles near a black hole’s event horizon fell in, while the other escaped as Hawking radiation. According to complementarity, the escaping particle would also have to be entangled with another Hawking particle. But that’s a no-no in quantum mechanics: Particles entangled with each other outside a black hole cannot also be entangled with particles inside the black hole. Physicists call this forbidden arrangement entanglement polygamy.

To remedy this violation of quantum theory, Polchinski’s team took its thought experiment a step further and tried severing the entanglement spanning the event horizon. The result: An impenetrable wall of energy formed at the event horizon, incinerating and shutting out any object big or small that tried to pass. They called this unforgiving boundary a firewall.

Unfortunately, while the firewall would play by the rules of quantum mechanics, it would violate Einstein’s theory of general relativity. According to Einstein, an astronaut should not notice anything unusual as he crosses the event horizon; in fact, he shouldn’t even know he’s crossed it until later, when he begins getting spaghettified, or stretched like a noodle, from the extreme gravity of the black hole’s interior and realizes that even a light-speed escape attempt would do no good. A firewall, on the other hand, would provide a pretty noticeable hint that the astronaut had reached the event horizon: He would fry instantly. If firewalls exist, then general relativity requires tweaking.

This firewall problem once again pits general relativity against quantum mechanics, and it has sparked new interest in thinking about the strange physics taking place at the event horizon. “I don’t even see a good framework of an idea to solve the problem,” Polchinski says.

Astronomical stakes

These thought experiments may seem academic, but the implications go well beyond the fates of a handful of particles. Event horizons seem to be the best theoretical test bed for combining general relativity and quantum mechanics into a unified theory of quantum gravity. “The last frontier for fundamental physics is quantum gravity,” says Janna Levin, an astrophysicist at Columbia University’s Barnard College. “And this one puzzle is offering us a chance to see the key elements.”

Physicists have had trouble developing a theory of quantum gravity because compared with the universe’s other three forces — strong, weak and electromagnetism — gravity is pathetically feeble. It’s the only force that is negligible at the small scales dominated by quantum physics. The quest for a theory of quantum gravity gained added significance after the recent discovery of ripples in spacetime dating back to a mere 10^-36 seconds after the birth of the universe.

Understanding the universe so soon after the Big Bang is an amazing achievement, but a lot of interesting stuff happened in that trillionth of a trillionth of a trillionth of a second before those ripples cascaded through the infant cosmos.

If physicists are ever going to reach all the way back to the very beginning of the universe, Levin says, they will have to understand how the universe behaved when it was incredibly small and incredibly massive simultaneously. The best way to figure that out is to formulate a theory of quantum gravity by demystifying another such compact, massive environment: a black hole. “The event horizon is where gravity starts to come into its own,” says Sheperd Doeleman, an astronomer at MIT’s Haystack Observatory. “It rips off the Clark Kent business suit and starts to become as strong as the other forces.”

With so much at stake, many prominent physicists are stepping up and throwing some intriguing ideas into the mix.

The all-star roster includes Hawking. In a brief, cryptic January posting to the physics preprint server arXiv.org, he suggested that event horizons are not the points of no return proposed by Schwarzschild nearly a century ago. If event horizons occasionally allow stuff inside the black hole to escape, Hawking argued, then firewalls need not exist. While Hawking’s comments grabbed headlines — it didn’t hurt that his write-up included the misleading phrase “there are no black holes” — nobody is quite sure what the black hole savant has in mind. “People want to know what Hawking thinks,” says Sabine Hossenfelder, a cosmologist at the Nordic Institute for Theoretical Physics in Stockholm. “But practically, his paper has no use for me.” She wants Hawking to release a comprehensive paper explaining his argument and the reasoning behind it.

Patrick Hayden, a Stanford quantum physicist, has an idea similar to complementarity. He agrees with the arguments laid out by Polchinski’s team but suggests that it would be extremely difficult for a single observer to determine that a particle is engaged in entanglement polygamy. In fact, he says it would take a person so long to experimentally verify it that the black hole would have already evaporated. Once again, it may turn out that a black hole information paradox is allowed to exist for the simple reason that no one could ever detect it.

The most potentially paradigm-shifting idea comes from the dogged duo of Susskind and Maldacena. They address the firewall problem by combining entanglement, a mind-bending facet of quantum mechanics, with the sci-fi–sounding concept of wormholes. Wormholes are shortcuts through spacetime, the rough equivalent of crossing a mountain via tunnel rather than climbing over it. According to Susskind and Maldacena, every pair of entangled particles is connected by a wormhole, drastically shortening the distance between them.

Applying this to event horizons, they say that individual particles of Hawking radiation are linked via wormhole to the inside of the black hole. The proposal eliminates the need for firewalls by turning entanglement into a shortcut through spacetime rather than a mysterious long-distance link. In essence, the particles inside and outside the event horizon become one and the same.

Susskind and Maldacena’s proposal, while pretty wild, is stirring cautious optimism. “As physicists, we often rely on our sense of smell in judging scientific ideas,” Caltech theoretical physicist John Preskill wrote on his blog Quantum Frontiers. “At first whiff, [the wormhole proposal] may smell fresh and sweet, but it will have to ripen on the shelf for a while.” If Susskind and Maldacena are right, it would mean that quantum mechanics determines not only the behavior of particles at very small scales but also the large-scale structure of the universe. “Entanglement creates the hooks that hold space together,” Susskind says.

And in Susskind’s mind, that’s the beauty of the event horizon. A firewall proposal that he’s sure is wrong but can’t yet explain why may be the ticket to unraveling the great mysteries of the universe. Perhaps complementarity, wormholes or a mystery mechanism up Stephen Hawking’s sleeve will simultaneously rectify the black hole information paradox and deliver a theory of quantum gravity. “Once in a while, a conflict comes along and completely changes the way we think about things,” Susskind says. “This firewall story may be one of them.”

Picture perfect

With all the talk about hypothetical astronauts and entangled particles, it’s easy to forget that black holes are actual objects in the universe. It may be up for debate whether matter falling in gets stretched or burned, but there’s no doubt that throughout the cosmos incalculable amounts of gas and dust are flowing across the event horizons of black holes.

Astronomers know this because, despite the fact that no light can escape the event horizon, many black holes are fairly easy to detect. As a black hole’s extreme gravity reels in gas and dust, a traffic jam emerges near the event horizon. As matter bumps into other matter, it heats up and glows, emitting X-rays and other high-energy radiation. “Black holes are sitting in a luminous soup of billion-degree gas,” MIT’s Sheperd Doeleman says. Sometimes all that searing gas rockets away from the black hole in concentrated jets that can course more than a million light-years.

Astronomers aren’t sure why some galaxies’ black holes are voracious eaters, glowing brightly, while others seem dark and inactive, Doeleman says. The Milky Way’s central black hole, which weighs about 4 million times the mass of the sun, is relatively dormant. Astronomers are holding out hope that they’ll get to see the local black hole light up over the next year as a large gas cloud called G2 swings perilously close to its event horizon.

Doeleman has even greater ambitions. He leads a team that plans to directly image the event horizon of the Milky Way’s central black hole. That’s pretty hard to do: In fact, it requires a telescope the size of Earth.

So next year, Doeleman and his colleagues will unveil what amounts to an Earth-sized telescope.

The Event Horizon Telescope, the first instrument designed specifically for spying the structure of a black hole, combines radio telescopes scattered across the globe to achieve a resolution equivalent to that of a single dish as large as the distance between them.
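To get a rough sense of why an Earth-spanning array is needed, here is a minimal back-of-the-envelope sketch (mine, not from the article) using the diffraction limit θ ≈ λ/D, assuming an observing wavelength of about 1.3 mm and a baseline comparable to Earth’s diameter:

```python
# Sketch only: diffraction-limited angular resolution theta ~ lambda / D.
# Assumed numbers: 1.3 mm observing wavelength and a baseline roughly equal
# to Earth's diameter; neither figure is quoted in the article above.
import math

wavelength_m = 1.3e-3      # 1.3 mm radio waves
baseline_m = 1.2742e7      # ~ Earth's diameter in metres

theta_rad = wavelength_m / baseline_m
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6

print(f"Angular resolution ~ {theta_microarcsec:.0f} microarcseconds")
# ~20 microarcseconds, small enough in principle to resolve the few-tens-of-
# microarcsecond shadow expected for the Milky Way's central black hole.
```

A single dish would need to be thousands of kilometres across to reach this resolution at millimetre wavelengths, which is why data from widely separated observatories are combined instead.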

This year, Doeleman is heading to the Atacama Large Millimeter/submillimeter Array in Chile, the world’s most powerful radio telescope network, to install extraordinarily precise atomic clocks that will allow researchers to combine the Chilean telescopes’ data with those from observatories in Hawaii, Spain and eventually the South Pole.

If all goes well, as early as next year a virtual telescope with the resolving power of an Earth-sized radio dish will deliver images of a bright ring of hot gas surrounding a circular shadow: the heart of a black hole, bounded by the event horizon. “We’ve been working on this for a decade,” Doeleman says. “It’s exhilarating to be so close.”

Theorists aren’t as excited about the massive scope. After all, an Earth-sized telescope can’t zoom in on a single particle and resolve the information paradox. But perhaps a photograph will provide some inspiration. For the first time they’ll be able to take a good look at the mysterious boundary that has perplexed them for so long.
— Andrew Grant

CERN COURIER, Nov 20, 2007
Physics in the multiverse
The idea of multiple universes is more than a fantastic invention. It appears naturally within several theories, and deserves to be taken seriously, explains Aurélien Barrau.

Is our entire universe a tiny island within an infinitely vast and infinitely diversified meta-world? This could be either one of the most important revolutions in the history of cosmogonies or merely a misleading statement that reflects our lack of understanding of the most fundamental laws of physics.

The idea in itself is far from new: from Anaximander to David Lewis, philosophers have exhaustively considered this eventuality. What is especially interesting today is that it emerges, almost naturally, from some of our best – but often most speculative – physical theories. The multiverse is no longer a model; it is a consequence of our models. It offers an obvious understanding of the strangeness of the physical state of our universe. The proposal is attractive and credible, but it requires a profound rethinking of current physics.

At first glance, the multiverse seems to lie outside of science because it cannot be observed. How, following the prescription of Karl Popper, can a theory be falsifiable if we cannot observe its predictions? This way of thinking is not really correct for the multiverse for several reasons. First, predictions can be made in the multiverse: it leads only to statistical results, but this is also true for any physical theory within our universe, owing both to fundamental quantum fluctuations and to measurement uncertainties. Secondly, it has never been necessary to check all of the predictions of a theory to consider it as legitimate science. General relativity, for example, has been extensively tested in the visible world and this allows us to use it within black holes even though it is not possible to go there to check. Finally, the critical rationalism of Popper is not the final word in the philosophy of science. Sociologists, aestheticians and epistemologists have shown that there are other demarcation criteria to consider. History reminds us that the definition of science can only come from within and from the praxis: no active area of intellectual creation can be strictly delimited from outside. If scientists need to change the borders of their own field of research, it would be hard to justify a philosophical prescription preventing them from doing so. It is the same with art: nearly all artistic innovations of the 20th century have transgressed the definition of art as would have been given by a 19th-century aesthetician. Just as with science and scientists, art is internally defined by artists.

For all of these reasons, it is worth considering seriously the possibility that we live in a multiverse. This could allow us to understand the two problems of complexity and naturalness. The fact that the laws and couplings of physics appear to be fine-tuned to such an extent that life can exist and most fundamental quantities assume extremely “improbable” values would appear obvious if our entire universe were just a tiny part of a huge multiverse where different regions exhibit different laws. In this view, we are living in one of the “anthropically favoured” regions. This anthropic selection has strictly no teleological or theological dimension and absolutely no link with any kind of “intelligent design”. It is nothing other than the obvious generalization of the selection effect that already has to be taken into account within our own universe. When dealing with a sample, it is impossible to avoid wondering if it accurately represents the full set, and this question must of course be asked when considering our universe within the multiverse.

The multiverse is not a theory. It appears as a consequence of some theories, and these have other predictions that can be tested within our own universe. There are many different kinds of possible multiverses, depending on the particular theories, some of them even being possibly interwoven.

The most elementary multiverse is simply the infinite space predicted by general relativity – at least for flat and hyperbolic geometries. An infinite number of Hubble volumes should fill this meta-world. In such a situation, everything that is possible (i.e. compatible with the laws of physics as we know them) should occur. This is true because an event with a non-vanishing probability has to happen somewhere if space is infinite. The structure of the laws of physics and the values of fundamental parameters cannot be explained by this multiverse, but many specific circumstances can be understood by anthropic selections. Some places are, for example, less homogeneous than our Hubble volume, so we cannot live there because they are less life-friendly than our universe, where the primordial fluctuations are perfectly adapted as the seeds for structure formation.

General relativity also faces the multiverse issue when dealing with black holes. The maximal analytic extension of the Schwarzschild geometry, as exhibited by conformal Penrose–Carter diagrams, shows that another universe could be seen from within a black hole. This interesting feature is well known to disappear when the collapse is considered dynamically. The situation is, however, more interesting for charged or rotating black holes, where an infinite set of universes with attractive and repulsive gravity appear in the conformal diagram. The wormholes that possibly connect these universes are extremely unstable, but this does not alter the fact that this solution reveals other universes (or other parts of our own universe, depending on the topology), whether accessible or not. This multiverse is, however, extremely speculative as it could be just a mathematical ghost. Furthermore, nothing allows us to understand explicitly how it formed.

A much more interesting pluriverse is associated with the interior of black holes when quantum corrections to general relativity are taken into account. Bounces should replace singularities in most quantum gravity approaches, and this leads to an expanding region of space–time inside the black hole that can be considered as a universe. In this model, our own universe would have been created by such a process and should also have a large number of child universes, thanks to its numerous stellar and supermassive black holes. This could lead to a kind of cosmological natural selection in which the laws of physics tend to maximize the number of black holes (just because such universes generate more universes of the same kind). It also allows for several possible observational tests that could refute the theory and does not rely on the use of any anthropic argument. However, it is not clear how the constants of physics could be inherited from the parent universe by the child universe with small random variations, and the detailed model associated with this scenario does not yet exist.

One of the richest multiverses is associated with the fascinating meeting of inflationary cosmology and string theory. On the one hand, eternal inflation can be understood by considering a massive scalar field. The field will have quantum fluctuations, which will, in half of the regions, increase its value; in the other half, the fluctuations will decrease the value of the field. In the half where the field jumps up, the extra energy density will cause the universe to expand faster than in the half where the field jumps down. After some time, more than half of the regions will have the higher value of the field simply because they expand faster than the low-field regions. The volume-averaged value of the field will therefore rise and there will always be regions in which the field is high: the inflation becomes eternal. The regions in which the scalar field fluctuates downward will branch off from the eternally inflating tree and exit inflation.

On the other hand, string theory has recently faced a third change of paradigm. After the revolutions of supersymmetry and duality, we now have the “landscape”. This metaphoric word refers to the large number (maybe 10^500) of possible false vacua of the theory. The known laws of physics would just correspond to a specific island among many others. The huge number of possibilities arises from different choices of Calabi–Yau manifolds and different values of generalised magnetic fluxes over different homology cycles. Among other enigmas, the incredibly strange value of the cosmological constant (why are the first 119 decimals of the “natural” value exactly compensated by some mysterious phenomena, but not the 120th?) would simply appear as an anthropic selection effect within a multiverse where nearly every possible value is realised somewhere. At this stage, every bubble-universe is associated with one realisation of the laws of physics and itself contains an infinite space where all contingent phenomena take place somewhere. Because the bubbles are causally disconnected forever (owing to the fast “space creation” by inflation) it will not be possible to travel and discover new laws of physics.

This multiverse – if true – would force a profound change of our deep understanding of physics. The laws would reappear as kinds of phenomena; the ontological primacy of our universe would have to be abandoned. At other places in the multiverse, there would be other laws, other constants, other numbers of dimensions; our world would be just a tiny sample. It could be, following Copernicus, Darwin and Freud, the fourth narcissistic injury.

Quantum mechanics was probably among the first branches of physics leading to the idea of a multiverse. In some situations, it inevitably predicts superposition. To avoid the existence of macroscopic Schrödinger cats simultaneously living and dying, Bohr introduced a reduction postulate. This has two considerable drawbacks: first, it leads to an extremely intricate philosophical interpretation where the correspondence between the mathematics underlying the physical theory and the real world is no longer isomorphic (at least not at any time), and, second, it violates unitarity. No known physical phenomenon – not even the evaporation of black holes in its modern descriptions – does this.

These are good reasons for considering seriously the many-worlds interpretation of Hugh Everett. Every possible outcome to every event is allowed to define or exist in its own history or universe, via quantum decoherence instead of wave function collapse. In other words, there is a world where the cat is dead and another one where it is alive. This is simply a way of trusting strictly the fundamental equations of quantum mechanics. The worlds are not spatially separated, but exist more as kinds of “parallel” universes. This tantalising interpretation solves some paradoxes of quantum mechanics but remains vague about how to determine when splitting of universes happens. This multiverse is complex and, depending on the very quantum nature of phenomena leading to other kinds of multiverses, it could lead to higher or lower levels of diversity.

More speculative multiverses can also be imagined, associated with a kind of platonic mathematical democracy or with nominalist relativism. In any case, it is important to underline that the multiverse is not a hypothesis invented to answer a specific question. It is simply a consequence of a theory usually built for another purpose. Interestingly, this consequence also solves many complexity and naturalness problems. In most cases, it even seems that the existence of many worlds is closer to Ockham’s razor (the principle of simplicity) than the ad hoc assumptions that would have to be added to models to avoid the existence of other universes.

Given a model, for example the string-inflation paradigm, is it possible to make predictions in the multiverse? In principle, it is, at least in a Bayesian approach. The probability of observing vacuum i (and the associated laws of physics) is simply P_i = P_i^prior × f_i, where P_i^prior is determined by the geography of the landscape of string theory and the dynamics of eternal inflation, and the selection factor f_i characterizes the chances for an observer to evolve in vacuum i. This distribution gives the probability for a randomly selected observer to be in a given vacuum. Clearly, predictions can only be made probabilistically, but this is already true in standard physics. The fact that we can observe only one sample (our own universe) does not change the method qualitatively and still allows the refuting of models at given confidence levels. The key points here are the well-known peculiarities of cosmology, even with only one universe: the observer is embedded within the system described; the initial conditions are critical; the experiment is “locally” irreproducible; the energies involved have not been experimentally probed on Earth; and the arrow of time must be conceptually reversed.
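As a hedged illustration (my sketch, not part of the article), the same prescription can be written as a normalised distribution over vacua together with an ordinary confidence-level cut, here applied to the cosmological constant Λ as an example:

```latex
% Sketch only: observer-weighted probability over vacua, normalised, and a
% schematic rejection criterion. The 5% threshold is an arbitrary
% illustrative choice; \Lambda_{\mathrm{obs}} is the measured value.
\[
  P_i \;=\; \frac{P_i^{\mathrm{prior}} f_i}{\sum_j P_j^{\mathrm{prior}} f_j},
  \qquad
  \text{reject at the 95\% level if}\;
  \sum_{i \,:\, \Lambda_i \le \Lambda_{\mathrm{obs}}} P_i < 0.05 .
\]
```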

However, this statistical approach to testing the multiverse suffers from severe technical shortcomings. First, while it seems natural to identify the prior probability with the fraction of volume occupied by a given vacuum, the result depends sensitively on the choice of a space-like hypersurface on which the distribution is to be evaluated. This is the so-called “measure problem” in the multiverse. Second, it is impossible to give any sensible estimate of f_i. This would require an understanding of what life is – and even of what consciousness is – and that simply remains out of reach for the time being. Except in some favourable cases – for example when all the universes of the multiverse present a given characteristic that is incompatible with our universe – it is hard to refute explicitly a model in the multiverse. But difficult in practice does not mean intrinsically impossible. The multiverse remains within the realm of Popperian science. It is not qualitatively different from other proposals associated with usual ways of doing physics. Clearly, new mathematical tools and far more accurate predictions in the landscape (which is basically totally unknown) are needed for falsifiability to be more than an abstract principle in this context. Moreover, falsifiability is just one criterion among many possible ones, and it should probably not be given excessive weight.

When facing the question of the incredible fine-tuning required for the fundamental parameters of physics to allow the emergence of complexity, there are a few possible ways of thinking. If one does not want to invoke God or rely on an unbelievable stroke of luck that led to extremely specific initial conditions, there are mainly two remaining hypotheses. The first would be to consider that since complexity – and, in particular, life – is an adaptive process, it would have emerged in nearly any kind of universe. This is a tantalising answer, but our own universe shows that life requires extremely specific conditions to exist. It is hard to imagine life in a universe without chemistry, maybe without bound states or with other numbers of dimensions. The second idea is to accept the existence of many universes with different laws, where we naturally find ourselves in one of those compatible with complexity. The multiverse was not imagined to answer this specific question but appears “spontaneously” in serious physical theories, so it can be considered as the simplest explanation of the puzzling issue of naturalness. This of course does not prove the model to be correct, but it should be emphasised that there is absolutely no “pre-Copernican” anthropocentrism in this thought process.

It could well be that the whole idea of multiple universes is misleading. It could well be that the discovery of the most fundamental laws of physics will make those parallel worlds totally obsolete in a few years. It could well be that with the multiverse, science is just entering a “no through road”. Prudence is mandatory when physics tells us about invisible spaces. But it could also very well be that we are facing a deep change of paradigm that revolutionizes our understanding of nature and opens new fields of possible scientific thought. Because they lie on the border of science, these models are dangerous, but they offer the extraordinary possibility of constructive interference with other kinds of human knowledge. The multiverse is a risky thought – but, then again, let’s not forget that discovering new worlds has always been risky.

THE CYCLIC UNIVERSE: PAUL STEINHARDT
A Conversation with Paul J. Steinhardt [11.19.02]

…in the last year I’ve been involved in the development of an alternative theory that turns the cosmic history topsy-turvy. All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.

PAUL STEINHARDT is the Albert Einstein Professor in Science and on the faculty of both the Departments of Physics and Astrophysical Sciences at Princeton University.

He is one of the leading theorists responsible for inflationary theory. He constructed the first workable model of inflation and the theory of how inflation could produce seeds for galaxy formation. He was also among the first to show evidence for dark energy and cosmic acceleration, introducing the term “quintessence” to refer to dynamical forms of dark energy. With Neil Turok he has pioneered mathematical and computational techniques which decisively disproved rival theories of structure formation such as cosmic strings. He made leading contributions to inflationary theory and to our understanding of the origin of the matter-antimatter asymmetry in the universe. Hence, he not only witnessed firsthand but also helped lead the revolutionary developments in the standard cosmological model caused by the fusion of particle physics and cosmology over the last 20 years.

THE CYCLIC UNIVERSE: PAUL STEINHARDT

[PAUL STEINHARDT:] I am a theoretical cosmologist, so I am addressing the issue from that point of view. If you were to ask most cosmologists to give a summary of where we stand right now in the field, they would tell you that we live in a very special period in human history where, thanks to a whole host of advances in technology, we can suddenly view the very distant and very early universe in ways that we haven’t been able to do ever before. For example, we can get a snapshot of what the universe looked like in its infancy, when the first atoms were forming. We can get a snapshot of what the universe looked like in its adolescence, when the first stars and galaxies were forming. And we are now getting a fully detailed, three-dimensional image of what the local universe looks like today. When you put together this different information, which we’re getting for the first time in human history, you obtain a very tight series of constraints on any model of cosmic evolution. If you go back to the different theories of cosmic evolution in the early 1990s, the data we’ve gathered in the last decade has eliminated all of them—save one, a model that you might think of today as the consensus model. This model involves a combination of the Big Bang model as developed in the 1920s, ’30s, and ’40s; the Inflationary Theory, which Alan Guth proposed in the 1980s; and a recent amendment that I will discuss shortly. This consensus theory matches the observations we have of the universe today in exquisite detail. For this reason, many cosmologists conclude that we have finally determined the basic cosmic history of the universe.

But I have a rather different point of view, a view that has been stimulated by two events. The first is the recent amendment to which I referred earlier. I want to argue that the recent amendment is not simply an amendment, but a real shock to our whole notion of time and cosmic history. And secondly, in the last year I’ve been involved in the development of an alternative theory that turns the cosmic history topsy-turvy. All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.

The key difference between this picture and the consensus picture comes down to the nature of time. The standard model, or consensus model, assumes that time has a beginning that we normally refer to as the Big Bang. According to this model, for reasons we don’t quite understand, the universe sprang from nothingness into somethingness, full of matter and energy, and has been expanding and cooling for the past 15 billion years. In the alternative model the universe is endless. Time is endless in the sense that it goes on forever in the past and forever in the future, and, in some sense, space is endless. Indeed, our three spatial dimensions remain infinite throughout the evolution of the universe.

More specifically, this model proposes a universe in which the evolution of the universe is cyclic. That is to say, the universe goes through periods of evolution from hot to cold, from dense to under-dense, from hot radiation to the structure we see today, and eventually to an empty universe. Then, a sequence of events occurs that causes the cycle to begin again. The empty universe is reinjected with energy, creating a new period of expansion and cooling. This process repeats periodically forever. What we’re witnessing now is simply the latest cycle.

The notion of a cyclic universe is not new. People have considered this idea as far back as recorded history. The ancient Hindus, for example, had a very elaborate and detailed cosmology based on a cyclic universe. They predicted the duration of each cycle to be 8.64 billion years—a prediction with three-digit accuracy. This is very impressive, especially since they had no quantum mechanics and no string theory! It disagrees with the number that I’m going to suggest, which is trillions of years rather than billions.

The cyclic notion has also been a recurrent theme in Western thought. Edgar Allan Poe and Friedrich Nietzsche, for example, each had cyclic models of the universe, and in the early days of relativistic cosmology, Albert Einstein, Alexander Friedmann, Georges Lemaître, and Richard Tolman were interested in the cyclic idea. I think it is clear why so many have found the cyclic idea to be appealing: If you have a universe with a beginning, you have the challenge of explaining why it began and the conditions under which it began. If you have a universe that is cyclic, it is eternal, so you don’t have to explain the beginning.

During the attempts to bring cyclic ideas into modern cosmology, it was discovered in the ’20s and ’30s that there are various technical problems. The idea at that time was a cycle in which our three-dimensional universe goes through periods of expansion beginning from the Big Bang and then reverses to contraction and a big crunch. The universe bounces and expansion begins again. One problem is that, every time the universe contracts to a crunch, the density and temperature of the universe rise to an infinite value, and it is not clear if the usual laws of physics can be applied. Second, every cycle of expansion and contraction creates entropy through natural thermodynamic processes, which adds to the entropy from earlier cycles. So, at the beginning of a new cycle, there is a higher entropy density than in the cycle before. It turns out that the duration of a cycle is sensitive to the entropy density. If the entropy increases, the duration of the cycle increases as well. So, going forward in time, each cycle becomes longer than the one before. The problem is that, extrapolating back in time, the cycles become shorter until, after a finite time, they shrink to zero duration. The problem of avoiding a beginning has not been solved. It has simply been pushed back a finite number of cycles. If we’re going to reintroduce the idea of a truly cyclic universe, these two problems must be overcome. The cyclic model that I will describe uses new ideas to do just that.
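As a toy illustration of the second problem (my sketch, not Steinhardt’s), suppose each cycle lasts a fixed factor r > 1 longer than the one before. Going backwards, the durations form a geometric series whose sum converges, so the total time before any given cycle is finite:

```latex
% Toy model only: constant growth factor r > 1 per cycle.
\[
  T_{n-k} = \frac{T_n}{r^{k}}, \qquad
  \sum_{k=1}^{\infty} T_{n-k} = \frac{T_n}{r-1} < \infty ,
\]
```

so even an “eternal” chain of old-style cycles still begins a finite time in the past.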

To appreciate why an alternative model is worth pursuing, it’s important to get a more detailed impression of what the consensus picture is like. Certainly some aspects are appealing. But what I want to argue is that, overall, the consensus model is not so simple. In particular, recent observations have forced us to amend the consensus model and make it more complicated. So, let me begin with an overview of the consensus model.

The consensus theory begins with the Big Bang: the universe has a beginning. It’s a standard assumption that people have made over the last 50 years, but it’s not something we can prove at present from any fundamental laws of physics. Furthermore, you have to assume that the universe began with an energy density less than the critical value. Otherwise, the universe would stop expanding and recollapse before the next stage of evolution, the inflationary epoch. In addition, to reach this inflationary stage, there must be some sort of energy to drive the inflation. Typically this is assumed to be due to an inflation field. You have to assume that in those patches of the universe that began at less than the critical density, a significant fraction of the energy is stored in inflation energy so that it can eventually overtake the universe and start the period of accelerated expansion. All of these are reasonable assumptions, but assumptions nevertheless. It’s important to count these assumptions and ingredients, because they are helpful in comparing the consensus model to the challenger.

Assuming these conditions are met, the inflation energy overtakes the matter and radiation after a few instants. The inflationary epoch commences and the expansion of the universe accelerates at a furious pace. The inflation does a number of miraculous things: it makes the universe homogeneous, it makes the universe flat, and it leaves behind certain inhomogeneities, which are supposed to be the seeds for the formation of galaxies. Now the universe is prepared to enter the next stage of evolution with the right conditions. According to the inflationary model, the inflation energy decays into a hot gas of matter and radiation. After a second or so, the first light nuclei form. After a few tens of thousands of years, the slowly moving matter dominates the universe. It’s during these stages that the first atoms form, the universe becomes transparent, and the structure in the universe begins to form—the first stars and galaxies. Up to this point the story is relatively simple.

But there is the recent discovery that we’ve entered a new stage in the evolution of the universe. After the stars and galaxies had formed, something strange happened to cause the expansion of the universe to speed up again. During the 15 billion years when matter and radiation dominated the universe and structure was forming, the expansion of the universe was slowing down, because the matter and radiation within it are gravitationally self-attractive and resist the expansion of the universe. Until very recently, it had been presumed that matter would continue to be the dominant form of energy in the universe, and this deceleration would continue forever.

But we’ve discovered instead, thanks to recent observations, that the expansion of the universe is speeding up. This means that most of the energy of the universe is neither matter nor radiation. Rather, another form of energy has overtaken the matter and radiation. For lack of a better term, this new energy form is called “dark energy.” Dark energy, unlike the matter and radiation that we’re familiar with, is gravitationally self-repulsive. That’s why it causes the expansion to speed up rather than slow down. In Newton’s theory of gravity, all mass is gravitationally attractive, but Einstein’s theory allows the possibility of forms of energy that are gravitationally self-repulsive.

I don’t think either the physics or cosmology communities, or even the general public, have fully absorbed the full implications of this discovery. This is a revolution in the grand historic sense—in the Copernican sense. In fact, if you think about Copernicus—from whom we derive the word revolution—his importance was that he changed our notion of space and of our position in the universe. By showing that the earth revolves around the sun, he triggered a chain of ideas that led us to the notion that we live in no particular place in the universe; there’s nothing special about where we are. Now we’ve discovered something very strange about the nature of time: that we may live in no special place, but we do live at a special time, a time of recent transition from deceleration to acceleration; from one in which matter and radiation dominate the universe to one in which they are rapidly becoming insignificant components; from one in which structure is forming on ever-larger scales to one in which now, because of this accelerated expansion, structure formation stops. We are in the midst of the transition between these two stages of evolution. And just as Copernicus’s proposal that the earth is no longer the center of the universe led to a chain of ideas that changed our whole outlook on the structure of the solar system and eventually to the structure of the universe, it shouldn’t be too surprising that perhaps this new discovery of cosmic acceleration could lead to a whole change in our view of cosmic history. That’s a big part of the motivation for thinking about our alternative proposal.

With these thoughts about the consensus model in mind, let me turn to the cyclic proposal. Since it’s cyclic, I’m allowed to begin the discussion of the cycle at any point I choose. To make the discussion parallel, I’ll begin at a point analogous to the Big Bang; I’ll call it The Bang. This is a point in the cycle where the universe reaches its highest temperature and density. In this scenario, though, unlike the Big Bang model, the temperature and density don’t diverge. There is a maximal, finite temperature. It’s a very high temperature, around 10^20 degrees Kelvin—hot enough to evaporate atoms and nuclei into their fundamental constituents—but it’s not infinite. In fact, it’s well below the so-called Planck energy scale, where quantum gravity effects dominate. The theory begins with a bang and then proceeds directly to a phase dominated by radiation. In this scenario you do not have the inflation one has in the standard scenario. You still have to explain why the universe is flat, you still have to explain why the universe is homogeneous, and you still have to explain where the fluctuations came from that led to the formation of galaxies, but that’s not going to be explained by an early stage of inflation. It’s going to be explained by yet a different stage in the cyclic universe, which I’ll get to.

In this new model, you go directly to a radiation-dominated universe and form the usual nuclear abundances; then go directly to a matter-dominated universe in which the atoms and galaxies and larger-scale structure form; and then proceed to a phase of the universe dominated by dark energy. In the standard case, the dark energy comes as a surprise, since it is something you have to add into the theory to make it consistent with what we observe. In the cyclic model, the dark energy moves to center stage as the key ingredient that is going to drive the universe, and in fact drives the universe into the cyclic evolution. The first thing the dark energy does when it dominates the universe is what we observe today: it causes the expansion of the universe to begin to accelerate. Why is that important? Although this acceleration rate is a hundred orders of magnitude smaller than the acceleration one gets in inflation, if you give the universe enough time, it actually accomplishes the same feat that inflation does. Over time it thins out the distribution of matter and radiation in the universe, making the universe more and more homogeneous and isotropic—in fact, making it perfectly so—driving it into what is essentially a vacuum state.
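To get a rough feel for why such slow acceleration can do the same job given enough time, here is a back-of-the-envelope sketch (my numbers, not Steinhardt’s) counting e-folds of expansion at today’s measured expansion rate over a trillion years:

```python
# Sketch only: number of e-folds N = H * t for a roughly constant Hubble rate.
# Assumed inputs: present-day H0 ~ 70 km/s/Mpc and a trillion-year stretch;
# neither number is quoted in the talk above.
H0_km_s_Mpc = 70.0
km_per_Mpc = 3.086e19
seconds_per_year = 3.156e7

H0_per_year = H0_km_s_Mpc / km_per_Mpc * seconds_per_year   # ~7e-11 per year
t_years = 1e12                                               # one trillion years

e_folds = H0_per_year * t_years
print(f"~{e_folds:.0f} e-folds of expansion")
# ~70 e-folds, comparable to the ~60 e-folds usually invoked for inflation,
# so slow acceleration sustained for long enough can also flatten and smooth
# the universe.
```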

Seth Lloyd said there were 10^80 or 10^90 bits inside the horizon, but if you were to look around the universe in a trillion years, you would find on average no bits inside your horizon, or less than one bit inside your horizon. In fact, when you count these bits, it’s important to realize that now that the universe is accelerating, our computer is actually losing bits from inside our horizon. This is something that we observe.

At the same time that the universe is made homogeneous and isotropic, it is also being made flat. If the universe had any warp or curvature to it, the slow stretching over this long period of time makes the space extremely flat. If it continued forever, of course, that would be the end of the story. But in this scenario, just like inflation, the dark energy only survives for a finite period, and triggers a series of events that eventually lead to a transformation of energy from gravity into new energy and radiation that will then start a new period of expansion of the universe. From a local observer’s point of view, it looks like the universe goes through exact cycles; that is to say, it looks like the universe empties out each round, and new matter and radiation are created, leading to a new period of expansion. In this sense it’s a cyclic universe. If you were a global observer and could see the entire universe, you’d discover that our three dimensions are forever infinite in this story. What’s happened is that at each stage, when we create matter and radiation, it gets thinned out. It’s out there somewhere, but it’s getting thinned out. Locally, it looks like the universe is cyclic, but globally the universe has a steady evolution, a well-defined era in which, over time and throughout our three dimensions, entropy increases from cycle to cycle.

Exactly how this works in detail can be described in various ways. I will choose to present a very nice geometrical picture that is motivated by superstring theory. We use only a few basic elements from superstring theory, so you don’t really have to know anything about superstring theory to understand what I’m going to talk about, except to understand that some of the strange things that I’m going to introduce I am not introducing for the first time. They are already sitting there in superstring theory waiting to be put to good purpose.

One of the ideas in superstring theory is that there are extra dimensions; it’s an essential element of that theory that is necessary to make it mathematically consistent. In one particular formulation of that theory the universe has a total of 11 dimensions. Six of them are curled up into a little ball so tiny that, for my purposes, I’m just going to pretend that they’re not there. However, there are three spatial dimensions, one time dimension, and one additional dimension that I do want to consider. In this picture, the three dimensions with which we’re familiar and through which we move lie along a hypersurface, or membrane. This membrane is one boundary of the extra dimension. There is another boundary, or membrane, on the other side. In between, there’s an extra dimension that, if you like, only exists over a certain interval. It’s as if we are one face of a sandwich, in between which there is a so-called bulk volume of space. These surfaces are referred to as orbifolds or branes—the latter referring to the word membrane. The branes have physical properties. They have energy and momentum, and when you excite them you can produce things like quarks and electrons. We are composed of the quarks and electrons on one of these branes. And, since quarks and leptons can only move along branes, we are restricted to moving along and seeing only the three dimensions of our brane. We cannot see directly into the bulk or any matter on the other brane.

In the cyclic universe, at regular intervals of trillions of years, these two branes smash together. This creates all kinds of excitations—particles and radiation. The collision thereby heats up the branes, and then they bounce apart again. The branes are attracted to each other through a force that acts just like a spring, causing the branes to come together at regular intervals. To describe it more completely, what’s happening is that the universe goes through two kinds of stages of motion. When the universe has matter and radiation in it, or when the branes are far enough apart, the main motion is the branes stretching, or, equivalently, our three dimensions expanding. During this period, the branes more or less remain a fixed distance apart. That’s what’s been happening, for example, over the last 15 billion years. During these stages, our three dimensions are stretching just as they normally would. At a microscopic distance away, there is another brane sitting and expanding, but since we can’t touch, feel, or see across the bulk, we can’t sense it directly. If there is a clump of matter over there, we can feel its gravitational effect, but we can’t see any light or anything else that it emits, because anything it emits is going to move along that brane. We only see things that move along our own brane.

Next, the energy associated with the force between these branes takes over the universe. From our vantage point on one of the branes, this acts just like the dark energy we observe today. It causes the branes to accelerate in their stretching to the point where all the matter and radiation produced since the last collision is spread out, and the branes become essentially smooth, flat, empty surfaces. If you like, you can think of them as being wrinkled and full of matter up to this point, and then stretching by a fantastic amount over the next trillion years. The stretching causes the mass and energy on the brane to thin out and the wrinkles to be smoothed out. After trillions of years, the branes are, for all intents and purposes, smooth, flat, parallel and empty.

Then, the force between these two branes slowly brings them together. As it brings them together, the force grows stronger and the branes speed towards one another. When they collide, there’s a walloping impact—enough to create a high density of matter and radiation with a very high, albeit finite, temperature. The two branes go flying apart, more or less back to where they were, and then the new matter and radiation (through the action of gravity) causes the branes to begin a new period of stretching.

In this picture it’s clear that the universe is going through periods of expansion, and a funny kind of contraction. Where the two branes come together, it’s not a contraction of our dimensions, but a contraction of the extra dimension. Before the contraction, all matter and radiation has been spread out, but, unlike the old cyclic models of the ’20s and ’30s, it doesn’t come back together again during the contraction because our three dimensions—that is, the branes—remain stretched out. Only the extra dimension contracts. This process repeats itself cycle after cycle.

If you compare the cyclic model to the consensus picture, two of the functions of inflation—namely, flattening and homogenizing the universe—are accomplished by the period of accelerated expansion that we’ve now just begun. Of course, I really mean the analogous expansion that occurred one cycle ago before the most recent Bang. The third function of inflation—producing fluctuations in the density—occurs as these two branes come together. As they approach, quantum fluctuations cause the branes to begin to wrinkle. And because they are wrinkled, they do not collide everywhere at the same time. Rather, some regions collide a bit earlier than others. This means that some regions reheat to a finite temperature and begin to cool a little bit before other regions. When the branes come apart again, the temperature of the universe is not perfectly homogeneous but has spatial variations left over from the quantum wrinkles.

Remarkably, although the physical processes are completely different, and the time scale is completely different—this is taking billions of years, instead of 10^-30 seconds—it turns out that the spectrum of fluctuations you get in the distribution of energy and temperature is essentially the same as what you get in inflation. Hence, the cyclic model is also in exquisite agreement with all of the measurements of the temperature and mass distribution of the universe that we have today.

Because the physics in these two models is quite different, there is an important distinction in what we would observe if one or the other were actually true—although this effect has not been detected yet. In inflation, when you create fluctuations, you don’t just create fluctuations in energy and temperature; you also create fluctuations in spacetime itself, so-called gravitational waves. That’s a feature that we hope to look for in experiments in the coming decades as a verification of the consensus model. In our model you don’t get those gravitational waves. The essential difference is that inflationary fluctuations are created in a hyperrapid, violent process that is strong enough to create gravitational waves, whereas cyclic fluctuations are created in an ultraslow, gentle process that is too weak to produce gravitational waves. That’s an example where the two models give an observational prediction that is dramatically different. It’s just difficult to observe at the present time.

What’s fascinating at the moment is that we have two paradigms available to us. On the one hand, they are poles apart in terms of what they tell us about the nature of time, about our cosmic history, about the order in which events occur, and about the time scale on which they occur. On the other hand, they are remarkably similar in terms of what they predict about the universe today. Ultimately, what will decide between the two will be a combination of observations—for example, the search for cosmic gravitational waves—and theory—because a key aspect of this scenario entails assumptions about what happens at the collision between branes that might be checked or refuted in superstring theory. In the meantime, for the next few years, we can all have great fun speculating about the implications of each of these ideas, which we prefer, and how we can best distinguish between them.