

Black-hole computing – Might nature’s bottomless pits actually be ultra-efficient quantum computers? That could explain why data never dies – Sabine Hossenfelder

After you die, your body’s atoms will disperse and find new venues, making their way into oceans, trees and other bodies. But according to the laws of quantum mechanics, all of the information about your body’s build and function will prevail. The relations between the atoms, the uncountable particulars that made you you, will remain forever preserved, albeit in unrecognisably scrambled form – lost in practice, but immortal in principle.

There is only one apparent exception to this reassuring concept: according to our current physical understanding, information cannot survive an encounter with a black hole. Forty years ago, Stephen Hawking demonstrated that black holes destroy information for good. Whatever falls into a black hole disappears from the rest of the Universe. It eventually reemerges in a wind of particles – ‘Hawking radiation’ – that leaks away from the event horizon, the black hole’s outer physical boundary. In this way, black holes slowly evaporate, but the process erases all knowledge about the black hole’s formation. The radiation merely carries data for the total mass, charge and angular momentum of the matter that collapsed; every other detail about anything that fell into the black hole is irretrievably lost.

Hawking’s discovery of black-hole evaporation has presented theoretical physicists with a huge conundrum: general relativity says that black holes must destroy information; quantum mechanics says this cannot happen, because information must live on eternally. Both general relativity and quantum mechanics are extremely well-tested theories, and yet they refuse to combine. The clash reveals something much more fundamental than a seemingly exotic quirk about black holes: the information paradox makes it abundantly clear that physicists still do not understand the fundamental laws of nature.

But Gia Dvali, professor of physics at the Ludwig-Maximilians University of Munich, believes he’s found the solution. ‘Black holes are quantum computers,’ he says. ‘We have an explicit information-processing sequence.’ If he is correct, the paradox is no more, and information truly is immortal. Even more startling, perhaps, is that his concept has practical implications. In the future, we might be able to tap black-hole physics to construct quantum computers of our own.

The main reason why recovering information from black holes seems impossible is that they are almost featureless spheroids with essentially no physical attributes on their horizons; they have ‘no hair’, as the late US physicist John Wheeler put it. You cannot store information in something that has no features that could be used to encode it, the standard argument goes. And therein lies the error, Dvali says: ‘All these no-hair theorems are wrong.’ He and his collaborators argue that gravitons – the so-far undiscovered quanta that carry gravity and make up space-time – stretch throughout the black hole and give rise to ‘quantum hair’ which allows storing as well as releasing information.

The new research builds on a counter-intuitive feature of quantum theory: quantum effects are not necessarily microscopically small. True, those effects are fragile, and are destroyed quickly in warm and busy environments, such as those typically found on Earth. This is why we don’t normally witness them. This is also the main challenge in building quantum computers, which process information using the quantum states of particles instead of the on-off logic of traditional transistors. But in a cold and isolated place, quantum behaviour can persist over large distances – large enough to span a black-hole horizon, which can measure anywhere from tens of kilometres to billions of kilometres across.

You don’t even need to go to outer space to witness long-range quantum effects. The enormous distances and masses necessary to create black-hole quantum hair might be far beyond our experimental capabilities, but by cooling atoms down to less than one ten-thousandth of a Kelvin (that is, one ten-thousandth of a degree above absolute zero), researchers have condensed up to a billion atoms, spread out over several millimetres, into a single quantum state. That’s huge for collective quantum behaviour.
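The transition temperature for such a condensate follows from the ideal-gas formula T_c = (2πħ²/mk_B)(n/ζ(3/2))^(2/3). A sketch with illustrative numbers (rubidium-87 at a typical trap density; the species and density are assumptions, not from the article):

```python
# Critical temperature for Bose-Einstein condensation of an ideal gas:
#   T_c = (2*pi*hbar^2 / (m*k_B)) * (n / zeta(3/2))^(2/3)
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
K_B = 1.380649e-23       # Boltzmann constant, J/K
ZETA_3_2 = 2.6124        # Riemann zeta(3/2)

def bec_critical_temperature(mass_kg: float, density_m3: float) -> float:
    """Ideal-gas BEC transition temperature in kelvin."""
    return (2 * math.pi * HBAR**2 / (mass_kg * K_B)) * (density_m3 / ZETA_3_2)**(2 / 3)

m_rb87 = 86.909 * 1.66054e-27                  # kg, mass of a rubidium-87 atom
t_c = bec_critical_temperature(m_rb87, 1e20)   # ~10^20 atoms per cubic metre (assumed)
print(f"T_c ~ {t_c * 1e9:.0f} nK")             # a few hundred nanokelvin
```

The answer, a few hundred nanokelvin, is why BEC experiments need such extreme cooling: only below this threshold do the atoms pile into a single quantum state.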


Such an atomic collective – known as a Bose-Einstein condensate, named after the Indian physicist Satyendra Nath Bose and Albert Einstein – is currently one of the most promising tools for creating a workable quantum computer. Quantum effects within a Bose-Einstein condensate, like the ability to be in two places at the same time, can stretch through the whole condensate, giving rise to many interlocked states. Enormous information-processing power could become available if researchers succeed in stabilising the condensate and controlling these states. And, not coincidentally, Bose-Einstein condensates might also solve the decades-old puzzle of black-hole information loss.

Hawking’s information puzzle would find a natural solution, Dvali notes, if black holes consist of gravitons that have undergone Bose-Einstein condensation – puddles of condensed gravity, in essence. The idea might sound crazy, but for Dvali it’s a perfectly reasonable conclusion, drawn from what physicists have learned about black-hole information in the years since Hawking first posed his riddle. Theorists know how to calculate how much information the black hole must be able to store: the amount is quantified in the black hole’s entropy and proportional to the horizon surface area. They have also found that black holes can redistribute or ‘scramble’ information very quickly. And finally, they know the pace at which information must escape from the black hole in order to avoid conflicts with quantum mechanics.
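The "entropy proportional to horizon area" statement is the Bekenstein-Hawking formula, S = k_B c³A/(4Għ). A back-of-the-envelope sketch of the storage capacity it implies (standard formula, not from the article):

```python
# Bekenstein-Hawking entropy: S = k_B * c^3 * A / (4 * G * hbar),
# with horizon area A = 4*pi*r_s^2 and Schwarzschild radius r_s = 2*G*M/c^2.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.98892e30       # solar mass, kg

def horizon_entropy_bits(mass_kg: float) -> float:
    """Information capacity of a black hole's horizon, in bits."""
    r_s = 2 * G * mass_kg / C**2               # Schwarzschild radius, metres
    area = 4 * math.pi * r_s**2                # horizon area, square metres
    s_over_kb = C**3 * area / (4 * G * HBAR)   # entropy in units of k_B (nats)
    return s_over_kb / math.log(2)             # convert nats to bits

# A solar-mass black hole can hold roughly 10^77 bits -- the benchmark any
# microscopic model, condensate or otherwise, must reproduce.
print(f"{horizon_entropy_bits(M_SUN):.1e} bits")
```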

Starting in 2012, Dvali explored these various attributes and discovered, to his surprise, that certain types of Bose-Einstein condensates share their essential properties with black holes. To act like a black hole, the condensate must linger at a transition point – its so-called quantum critical point – where extended fluctuations span through the fluid just before the quantum behaviour collapses. Such a quantum-critical condensate, Dvali calculated, has the same entropy, scrambling capacity and release time as a black hole: it has just the right quantum hair. ‘Somebody can say this is a coincidence, but I consider it extremely strong evidence – mathematical evidence that is – that black holes genuinely are Bose-Einstein condensates,’ he says.

Linking black holes with a form of matter that can be created in the lab means that some aspects of Dvali’s idea can be explored experimentally. Immanuel Bloch, professor of physics at the Ludwig-Maximilians University of Munich and director at the Max Planck Institute of Quantum Optics, has first-hand experience with Bose-Einstein condensates. He condenses atoms in ‘crystals of light’ – optical lattices created by intersecting multiple laser beams – and then takes snapshots of the condensate using a technique called fluorescence imaging. The resulting pictures beautifully reveal the atoms’ correlated quantum behaviour.

Bloch finds Dvali’s idea, which originated in a field entirely different from his, intriguing. ‘I am pretty excited about Gia’s proposal. I think that’s something really new,’ Bloch says. ‘People have seen collapse dynamics with interacting condensates, but nobody has so far investigated the quantum critical point and what happens there.

‘In the BEC [Bose-Einstein condensate] you have macroscopic quantum waves, and this means in the quantum numbers you have a lot of fluctuations. This is why the BEC normally looks like a Swiss cheese,’ he continues. But by applying a magnetic field, Bloch can change the strength by which the atoms interact, thereby coaxing them into an orderly lattice. ‘Now you make the atoms strongly interacting, then you go to the [very orderly] “Mott state”. This is a great state for quantum computing because you have this regular array. And you can address the atoms with lasers and rotate them around and change the spin [to encode and process information].’


According to Dvali, black-hole physics reveals a better way to store information in a Bose-Einstein condensate by using different quantum states. Black holes are the simplest, most compact, most efficient information storage devices that physicists know of. Using the black holes’ coding protocol therefore should be the best possible method to store information in condensate-based quantum computers.

Creating a black-hole-mimic condensate in the lab seems doable to Bloch: ‘[In a black hole,] the interaction strength adjusts itself. We can simulate something like that by tuning the interaction strength to where the condensate is just about to collapse. The fluctuations become bigger and bigger and bigger as you get closer to the quantum critical point. And that could simulate such a system. One could study all the quantum fluctuations and non-equilibrium situations – all that is now possible by observing these condensates in situ, with high spatial resolution.’

Just because realising Dvali’s idea is possible does not necessarily mean it is practical, however. ‘It’s competing with a lot of other stuff out on the market. Right now, I have more skepticism than faith,’ Bloch says. He also points out that efficient information storage is nice, but for quantum computers ‘information capacity is presently not the problem’. The biggest challenge he sees is finding a way to individually manipulate the quantum states that Dvali has identified – data processing, rather than data storage. There are other practical hurdles as well. ‘There are so many things we don’t know, like noise, is it resistant to noise? We don’t know,’ Bloch notes. ‘For me, the much more interesting aspect is the connection to gravitational physics.’ And here the implications go well beyond information storage.

Dvali’s is not the only recent research suggesting a connection between gravity and condensed-matter physics, a trend that has opened whole new realms to experimental investigation. In the tradition of Einstein, physicists generally think of curved space-time as the arena for matter and its interactions. But now several independent lines of research suggest that space-time might not be as insubstantial as we thought. Gravity, it seems, can emerge from non-gravitational physics.

In the past decades, numerous links between gravity and certain types of fluids have demonstrated that systems with collective quantum behaviour can mimic curved space-time, giving rise to much the same equations as one obtains in Einstein’s theory of general relativity. There is not yet any approach from which general relativity can be derived in full generality by positing that space-time is a condensate. For now, nobody knows whether it is possible at all. Still, the newfound relations allow physicists to study those gravitational systems that can be mimicked with atomic condensates.

Simulating gravity with condensates allows physicists to explore regions – such as black-hole horizons – that are not otherwise accessible to experiment. And so, although Hawking radiation has never been observed in real black holes, its analogue has been measured for black holes simulated through Bose-Einstein condensates. Of course, these condensates are not really black holes – they trap sound waves, not light – but they obey some of the same mathematical laws. The condensates do thus, in a sense, perform otherwise complicated, even intractable, physics calculations.

‘We like to speak of “quantum simulations” and try to use these systems to look for interesting phenomena that are hard to calculate on classical computers,’ says Bloch. ‘We are also trying to use this kind of system to test other systems like the black holes, or we looked at the [analogue of the] Higgs particle in two dimensions.’ In a 2012 Nature paper, Bloch and his collaborators reported that their quantum simulation revealed that Higgs-like particles can also exist in two dimensions. The same technique could in principle be used to study Bose-Einstein condensates behaving like black holes.


But using black-hole physics to develop new protocols for quantum computers is one thing. Finding out whether astrophysical black holes really are condensates of gravitons is another thing entirely. ‘I am not interested in the idea if one can’t test it,’ says Stefan Hofmann, a theoretical cosmologist and colleague of Dvali’s in Munich.

Hofmann therefore has dedicated significant time to exploring the observational consequences of the idea that black holes are graviton condensates. ‘The black hole [no hair] theorems are, sorry, crap,’ he agrees with Dvali. Hofmann thinks that the quantum hair near the black-hole horizon would subtly alter the predictions of general relativity (especially the emission of gravitational waves during the formation or collision of black holes), in ways that should be detectable. ‘The dream would be a binary [black hole] merger,’ Hofmann said in a 2015 seminar. His dream has just come true: the LIGO collaboration recently announced the first measurement of gravitational waves emitted from a merging pair of black holes.

Hofmann and his collaborators have yet to make quantitative predictions, but due to the macroscopic quantum effects, Dvali’s proposed solution to the information-loss problem might soon become experimentally testable. However, the idea that black holes are quantum-critical condensates of gravitons, truly equivalent to a Bose-Einstein condensate, leaves many questions open. To begin with, Dvali’s calculations cannot explain what actually happens to matter falling into a black hole. And Hofmann admits that it isn’t clear how the object is a ‘black hole’ in the conventional sense, since it can no longer be described within the familiar framework of general relativity.

Carlo Rovelli from the University of Marseille thinks that, even in incomplete form, Dvali’s idea of black holes as condensates might be scientifically useful. ‘They are using a brutal approximation which might fail to capture aspects, but it might work to some extent, especially in the long wavelength regime. For the low-frequency quantum fluctuations of [space-time] it may not be absurd,’ Rovelli says. He cautions, however, that the condensate model ‘cannot be a complete description of what happens in the black hole’.

What is clear, though, is that this research has revealed a previously unrecognised, and quite fruitful, relation. ‘We have a very interesting bridge between quantum information and black-hole physics that was not discussed before,’ Dvali says. If he is right, the implications are conceptually staggering. Information really does live on eternally. In that sense, we are all immortal. And the supermassive black hole at the centre of our galaxy? It’s actually a cosmic quantum computer.


Sabine Hossenfelder
is a research fellow at the Frankfurt Institute for Advanced Studies, with a special interest in the phenomenology of quantum gravity. Her writing has appeared in Forbes, Scientific American, and New Scientist, among others.

Ligo’s black holes that helped prove Einstein’s theory of gravitational waves could have been born inside a massive star – By Abigail Beall – For Mailonline 13:02 17 Feb 2016 (updated 16:38 17 Feb 2016)

Scientists detected the first warping of space-time caused by a collision of two black holes last week.
The historic signals were picked up by two advanced detectors.
At the same time Nasa’s Fermi telescope detected a gamma ray burst.
Gamma rays could mean two black holes lived inside a rotating star.

Last week, scientists made ‘the scientific breakthrough of the century’ with the detection of gravitational waves.

When the waves were detected, they knew they had been caused by the collision of two black holes, each around 30 times the mass of the sun.

But a second signal, seen by a telescope in space, suggests both black holes could have formed inside a single gigantic star.

The discovery was the first time anyone had detected the warping of space-time caused by a collision of two massive black holes – something first predicted in Einstein’s Theory of General Relativity in 1915.

These gravitational waves, created 1.3 billion light-years from Earth, confirm a key prediction of Einstein’s general relativity and will give an unprecedented glimpse into the universe’s beginning.

The historic signals were picked up by two advanced Laser Interferometer Gravitational-wave Observatories (Ligo) in Louisiana and Washington.

Just 0.4 seconds later, Nasa’s Fermi telescope also detected a gamma ray burst, a flash of electromagnetic rays associated with high energy collisions.

In order to produce a gamma ray burst, a black hole needs to be fed at an enormous rate of somewhere between the mass of a planet and the mass of the sun every second.

Feeding at that rate is only possible near the centre of a massive star at the end of its life.

This gamma ray signal was a surprise for physicists, as gamma rays are not normally associated with the merger of two black holes.

New algorithm developed at MIT for imaging black holes By Jessica Hall – June 7, 2016 at 1:27 pm

Everybody knows you can’t see a black hole. Nothing gets out, not even light. Except that, as with most conventional wisdom, that isn’t the whole story. Leaving aside the can of worms labeled Hawking radiation, we still know that the matter falling into a black hole heats up as it falls in. In theory, we can pick that up with a good old radio telescope. But black holes are so far away that we need way better angular resolution than any telescope we currently have, if we want to confirm these predictions with actual data.

“A black hole is very, very far away and very compact,” says Katie Bouman, a grad student at MIT working with an international collaboration called the Event Horizon Telescope. “It’s equivalent to taking an image of a grapefruit on the moon, but with a radio telescope. To image something this small means that we would need a telescope with a 10,000-kilometer diameter, which is not practical, because the diameter of the Earth is not even 13,000 kilometers.” This is where interferometry comes in. Bouman developed a new imaging algorithm called CHIRP, for Continuous High-resolution Image Reconstruction using Patch priors, and it uses interferometry, essentially, “to turn the entire planet into a large radio telescope dish.”
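Bouman's 10,000-kilometre figure follows directly from the diffraction limit, θ ≈ λ/D. A quick sketch, assuming the EHT's ~1.3 mm observing wavelength and a target resolution of a few tens of microarcseconds (both illustrative values, not stated in this article):

```python
# Diffraction limit of a telescope: theta ~ wavelength / diameter (radians).
# Invert it to get the 'dish' size needed for a given angular resolution.
MICROARCSEC = 4.8481e-12   # radians per microarcsecond

def baseline_needed(wavelength_m: float, resolution_uas: float) -> float:
    """Telescope diameter (or interferometer baseline) in metres needed
    to resolve a feature of the given angular size in microarcseconds."""
    return wavelength_m / (resolution_uas * MICROARCSEC)

d = baseline_needed(1.3e-3, 25.0)    # 1.3 mm radio waves, 25 microarcsecond target
print(f"~{d / 1e3:.0f} km baseline") # on the order of 10,000 km: an Earth-sized dish
```

The answer comes out around 10,000 km, which is why only a planet-spanning interferometer, rather than any single dish, can resolve the black hole's shadow.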

The Event Horizon Telescope is actually an array of radio telescopes working to image Sagittarius A*, the black hole at our galaxy’s center. We can’t image Sagittarius A* with optical means, because there’s just too much debris in the way. But the EHT uses interferometry to combine and compare the input from multiple telescopes, a Nobel prizewinning technique which confers much better angular resolution. With the angular resolution afforded by a radio telescope the effective size of the planet, we could use interferometry to find out whether or not our galaxy’s supermassive black hole actually looks like we think it does.

CHIRP works a little like an insect eye, in that it combines sections of the EHT array’s visual field into a coherent whole. Part of the method uses algebra to multiply measurements from three telescopes together, which triangulates away noise generated by the interference of Earth’s atmosphere. Six telescopes have already signed on to participate in the collaboration, but it can accommodate every telescope on Earth: Using CHIRP, Bouman’s project can stitch together what all the radio telescopes see.
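The three-telescope multiplication works because each telescope's atmospheric phase error appears once with a plus sign and once with a minus sign around the triangle, so the errors cancel in the product. A toy numpy sketch of this closure-phase idea (an illustration of the principle, not the CHIRP algorithm itself; all values are made up):

```python
# Each station adds an unknown atmospheric phase error, but in the triple
# product V12 * V23 * V31 the per-station errors cancel in pairs.
import numpy as np

rng = np.random.default_rng(0)
true_phase = {"12": 0.7, "23": -1.1, "31": 0.4}   # radians; these sum to zero
station_err = rng.uniform(-np.pi, np.pi, size=3)  # atmospheric delay per telescope

def measured(ij: str, phi: float) -> complex:
    """Visibility on baseline i-j, corrupted by per-station phase errors."""
    i, j = int(ij[0]) - 1, int(ij[1]) - 1
    return np.exp(1j * (phi + station_err[i] - station_err[j]))

triple = (measured("12", true_phase["12"])
          * measured("23", true_phase["23"])
          * measured("31", true_phase["31"]))

# The station errors cancel: the closure phase equals the sum of the true phases.
closure = np.angle(triple)
print(round(closure, 6), round(sum(true_phase.values()), 6))
```

However large the random station errors are, the closure phase recovers the atmosphere-free information, which is the quantity CHIRP can trust.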

“Normal” interferometry uses an algorithm that treats an image from a radio telescope as a collection of individual points of different brightness on a plane. It tries to find the points whose brightness and location most closely match the data. Then the algorithm blurs together bright points near each other, to meld the astronomical images together. In the new model, instead of points on a 2D plane, there are cones whose heights give the total brightness at any spot — black, empty sky would be represented by a cone of zero height. This sharpens the image and filters out noise, using the same principles that make constructive and destructive interference work.

But the earth isn’t exactly peppered with interferometers. There are large areas on the ground that aren’t collecting any data. CHIRP fills in the gaps by mathematically stitching together different telescopes’ fields of view, wherever they overlap, to create a continuous whole. It’s like a brightness topo map of the sky; tall places are bright spots. “Translating the model into a visual image is like draping plastic wrap over it: The plastic will be pulled tight between nearby peaks, but it will slope down the sides of the cones adjacent to flat regions,” the team said in a statement. “The altitude of the plastic wrap corresponds to the brightness of the image.”

To verify CHIRP’s predictions, Bouman and team loosed machine learning on the imaging problem. They trained the learning algorithm on images of celestial bodies, earthly objects and black holes, and found that CHIRP frequently outperformed its predecessors. Since Bouman made her test data available online, other researchers can use and improve on it.

The mysterious boundary – The entrance to a black hole could reveal insights into the Big Bang, the formation of galaxies and even death by spaghettification BY ANDREW GRANT 2:40PM, MAY 16, 2014

A black hole’s event horizon is a one-way bridge to nowhere, a gateway to a netherworld cut off from the rest of the cosmos.

Understanding what happens at that pivotal boundary could reveal the hidden influences that have molded the universe from the instant of the Big Bang.

Today some of the best minds in physics are fixated on the event horizon, pondering what would happen to hypothetical astronauts and subatomic particles upon reaching the precipice of a black hole. At stake is the nearly 100-year quest to unify the well-tested theories of general relativity and quantum mechanics into a supertheory of quantum gravity.

But the event horizon is more than just a thought experiment or a tool to merge physics theories. It is a very real feature of the universe, a pivotal piece of cosmic architecture that has shaped the evolution of stars and galaxies. As soon as next year, a telescope the size of Earth may allow us to spot the edge of the shadowy abyss for the first time.

By studying the event horizon through both theory and observation, scientists could soon figure out how the universe began, how it evolved and even predict its ultimate fate. They’d also be able to answer a crucial question: Would a person falling into a black hole be stretched and flattened like a noodle, dying by spaghettification, or be incinerated?

Gravitational gusto
Scientists thought about the possibility of black holes and event horizons long before either term existed. In 1783, British geologist and astronomer John Michell considered Newton’s work on gravity and light and found that, in theory, a star with 125 million times the mass of the sun would have enough gravitational oomph to pull in any object trying to escape — even one traveling at light speed.

Although stars can never attain that much mass, Albert Einstein’s 1916 general theory of relativity put Michell’s hunch about supermassive objects onto solid theoretical ground. Later that year, German astronomer Karl Schwarzschild used general relativity to show that some stars could collapse under their own gravity and create a deep pit in the fabric of space-time. Anything, including light, that came within a certain distance of the collapsed star’s center of mass could never come out. That point of no return became known as the event horizon.
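Schwarzschild's "certain distance" is r_s = 2GM/c², now called the Schwarzschild radius. A quick sketch of how compact a collapsed object must be (standard formula and constants, not from the article):

```python
# Schwarzschild's point of no return: r_s = 2*G*M / c^2.
C = 2.99792458e8     # speed of light, m/s
G = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.98892e30   # solar mass, kg
M_EARTH = 5.9722e24  # Earth mass, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Radius (metres) below which nothing, not even light, escapes."""
    return 2 * G * mass_kg / C**2

print(f"Sun:   {schwarzschild_radius(M_SUN) / 1e3:.2f} km")    # about 3 km
print(f"Earth: {schwarzschild_radius(M_EARTH) * 1e3:.1f} mm")  # about 9 mm
```

The sun would have to be crushed to a ball about three kilometres in radius, and the Earth to under a centimetre, before an event horizon formed.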

Confirmation for the existence of black holes came decades later. In 1974, scientists detected a heavy dose of radio waves emitted from the center of the Milky Way, about 26,000 light-years away. They eventually concluded that there must be a black hole there. Today, astronomers know that virtually every galaxy harbors a giant black hole at its center, shaping the formation of millions of stars and even neighboring galaxies with its immense gravitational influence. Galaxies also contain millions of small- and medium-sized black holes, each with an event horizon past which light is never seen again.

But the repercussions of black holes’ extreme gravity eventually led to conflicts with one of the keystones of 20th century physics: quantum mechanics. The trouble began in the mid-1970s, when University of Cambridge physicist Stephen Hawking proposed that black holes are not eternal. In the far, far future, when black holes have devoured almost all the matter in the universe, leaving little else to consume, energy should slowly leak out from their event horizons. That energy, now known as Hawking radiation, should continue seeping out until each black hole evaporates completely.
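"Far, far future" can be estimated: the standard evaporation-time formula is t ≈ 5120πG²M³/(ħc⁴). A rough sketch (textbook result, not from the article):

```python
# Evaporation time via Hawking radiation: t ~ 5120*pi*G^2*M^3 / (hbar*c^4).
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.98892e30       # solar mass, kg
YEAR = 3.156e7           # seconds per year

def evaporation_time_years(mass_kg: float) -> float:
    """Approximate time (years) for a black hole to evaporate completely."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4) / YEAR

# A solar-mass black hole takes roughly 10^67 years -- vastly longer than the
# ~10^10-year current age of the universe, hence Hawking's 'far, far future'.
print(f"{evaporation_time_years(M_SUN):.1e} years")
```

Because the time scales as M³, the supermassive black holes at galactic centres last even longer, by tens of orders of magnitude.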

Hawking quickly realized the drastic consequences of his proposal. In a chaos-inducing 1976 paper, he explained that if a black hole eventually disappears, then so should all the information about all the particles that ever fell into it. That violates a central tenet of quantum mechanics: Information cannot be destroyed. Physicists could accept that all the properties of all the particles within a black hole were locked up, forever inaccessible to those outside a black hole’s event horizon. But they were not OK with that safe vanishing without a trace. “It violated everything I knew about quantum mechanics,” says Stanford theoretical physicist Leonard Susskind, who heard Hawking’s ideas at a conference in 1981. “It couldn’t be right.”

Violating theories
Susskind dug into this black hole information paradox, and by the turn of the century he thought he had resolved it with a proposal called complementarity. In essence, he argued that information can simultaneously cross the event horizon and never cross the event horizon, so long as no single observer can see it in both places.

If a particle were to fall into a black hole, an astronaut falling alongside it would see nothing special happen as both coasted across the event horizon and into the black hole’s interior. But another astronaut watching from outside would never see his friend or the particle pass the event horizon; from his point of view, the particle would get perilously close to the horizon but never quite cross it. Eventually, as the black hole evaporated perhaps a trillion trillion trillion trillion years later (astronauts in thought experiments have remarkable longevity), the astronaut outside the black hole would see the Hawking radiation associated with the infalling particle.

Susskind’s explanation is unintuitive, but at least it’s elegant. For both observers, information is preserved. Plus, the outside astronaut can potentially piece together everything that fell into the vast black hole interior just by monitoring the event horizon. This idea, pioneered by Gerard ’t Hooft and Susskind and made concrete by Juan Maldacena at the Institute for Advanced Study in Princeton, N.J., is called the holographic principle: Just as a two-dimensional hologram can depict a three-dimensional object, the surface of a black hole theoretically reveals everything inside of it.

Pasta or Barbecue?
Since the 1970s, physicists have had trouble coming up with a proposal that describes the fate of something, or someone, falling into a black hole that doesn’t violate well-tested theories. Until 2012, complementarity seemed to do the job. It said that an astronaut falling into a black hole won’t notice anything special as he crosses the event horizon. Yet someone outside will never see his friend reach the horizon. Information is preserved for both observers. But complementarity breaks another rule of quantum mechanics: the monogamy of entanglement. Some argue that walls of radiation along event horizons incinerate incoming matter.

But in 2012, a quartet of physicists including Joseph Polchinski from the University of California, Santa Barbara reignited the black hole information paradox by demonstrating that in solving one problem, Susskind and Maldacena had created another. The issue centers on another facet of quantum mechanics called entanglement, which intertwines the properties of multiple particles regardless of the distance between them. Susskind and Maldacena’s complementarity relies on entanglement to preserve information. As the proposal goes, particles of Hawking radiation are linked to each other so that over time an observer could measure the radiation and piece together what’s inside the black hole.

In yet another thought experiment, Polchinski and his team pondered what would happen if just one of a pair of entangled particles near a black hole’s event horizon fell in, while the other escaped as Hawking radiation. According to complementarity, the escaping particle would also have to be entangled with another Hawking particle. But that’s a no-no in quantum mechanics: Particles entangled with each other outside a black hole cannot also be entangled with particles inside the black hole. Physicists call this forbidden arrangement entanglement polygamy.

To remedy this violation of quantum theory, Polchinski’s team took its thought experiment a step further and tried severing the entanglement spanning the event horizon. The result: An impenetrable wall of energy formed at the event horizon, incinerating and shutting out any object big or small that tried to pass. They called this unforgiving boundary a firewall.

Unfortunately, while the firewall would play by the rules of quantum mechanics, it would violate Einstein’s theory of general relativity. According to Einstein, an astronaut should not notice anything unusual as he crosses the event horizon; in fact, he shouldn’t even know he’s crossed it until later, when he begins getting spaghettified, or stretched like a noodle, from the extreme gravity of the black hole’s interior and realizes that even a light-speed escape attempt would do no good. A firewall, on the other hand, would provide a pretty noticeable hint that the astronaut had reached the event horizon: He would fry instantly. If firewalls exist, then general relativity requires tweaking.

This firewall problem once again pits general relativity against quantum mechanics, and it has sparked new interest in thinking about the strange physics taking place at the event horizon. “I don’t even see a good framework of an idea to solve the problem,” Polchinski says.

Astronomical stakes
These thought experiments may seem academic, but the implications go well beyond the fates of a handful of particles. Event horizons seem to be the best theoretical test bed for combining general relativity and quantum mechanics into a unified theory of quantum gravity. “The last frontier for fundamental physics is quantum gravity,” says Janna Levin, an astrophysicist at Columbia University’s Barnard College. “And this one puzzle is offering us a chance to see the key elements.”

Physicists have had trouble developing a theory of quantum gravity because compared with the universe’s other three forces — strong, weak and electromagnetism — gravity is pathetically feeble. It’s the only force that is negligible at the small scales dominated by quantum physics. The quest for a theory of quantum gravity gained added significance after the recent discovery of ripples in spacetime dating back to a mere 10⁻³⁶ seconds after the birth of the universe.

Understanding the universe so soon after the Big Bang is an amazing achievement, but a lot of interesting stuff happened in that trillionth of a trillionth of a trillionth of a second before those ripples cascaded through the infant cosmos.

If physicists are ever going to reach all the way back to the very beginning of the universe, Levin says, they will have to understand how the universe behaved when it was incredibly small and incredibly massive simultaneously. The best way to figure that out is to formulate a theory of quantum gravity by demystifying another such compact, massive environment: a black hole. “The event horizon is where gravity starts to come into its own,” says Sheperd Doeleman, an astronomer at MIT’s Haystack Observatory. “It rips off the Clark Kent business suit and starts to become as strong as the other forces.”

With so much at stake, many prominent physicists are stepping up and throwing some intriguing ideas into the mix.

The all-star roster includes Hawking. In a brief, cryptic January posting to the physics preprint server arXiv.org, he suggested that event horizons are not the points of no return proposed by Schwarzschild nearly a century ago. If event horizons occasionally allow stuff inside the black hole to escape, Hawking argued, then firewalls need not exist. While Hawking’s comments grabbed headlines — it didn’t hurt that his write-up included the misleading phrase “there are no black holes” — nobody is quite sure what the black hole savant has in mind. “People want to know what Hawking thinks,” says Sabine Hossenfelder, a cosmologist at the Nordic Institute for Theoretical Physics in Stockholm. “But practically, his paper has no use for me.” She wants Hawking to release a comprehensive paper explaining his argument and the reasoning behind it.

Patrick Hayden, a Stanford quantum physicist, has an idea similar to complementarity. He agrees with the arguments laid out by Polchinski’s team but suggests that it would be extremely difficult for a single observer to determine that a particle is engaged in entanglement polygamy. In fact, he says it would take a person so long to experimentally verify it that the black hole would have already evaporated. Once again, it may turn out that a black hole information paradox is allowed to exist for the simple reason that no one could ever detect it.

The most potentially paradigm-shifting idea comes from the dogged duo of Susskind and Maldacena. They address the firewall problem by combining entanglement, a mind-bending facet of quantum mechanics, with the sci-fi–sounding concept of wormholes. Wormholes are shortcuts through spacetime, the rough equivalent of crossing a mountain via tunnel rather than climbing over it. According to Susskind and Maldacena, every pair of entangled particles is connected by a wormhole, drastically shortening the distance between them.

Applying this to event horizons, they say that individual particles of Hawking radiation are linked via wormhole to the inside of the black hole. The proposal eliminates the need for firewalls by turning entanglement into a shortcut through spacetime rather than a mysterious long-distance link. In essence, the particles inside and outside the event horizon become one and the same.

Susskind and Maldacena’s proposal, while pretty wild, is stirring cautious optimism. “As physicists, we often rely on our sense of smell in judging scientific ideas,” Caltech theoretical physicist John Preskill wrote on his blog Quantum Frontiers. “At first whiff, [the wormhole proposal] may smell fresh and sweet, but it will have to ripen on the shelf for a while.” If Susskind and Maldacena are right, it would mean that quantum mechanics determines not only the behavior of particles at very small scales but also the large-scale structure of the universe. “Entanglement creates the hooks that hold space together,” Susskind says.

And in Susskind’s mind, that’s the beauty of the event horizon. A firewall proposal that he’s sure is wrong but can’t yet explain why may be the ticket to unraveling the great mysteries of the universe. Perhaps complementarity, wormholes or a mystery mechanism up Stephen Hawking’s sleeve will simultaneously rectify the black hole information paradox and deliver a theory of quantum gravity. “Once in a while, a conflict comes along and completely changes the way we think about things,” Susskind says. “This firewall story may be one of them.”

Picture perfect
With all the talk about hypothetical astronauts and entangled particles, it’s easy to forget that black holes are actual objects in the universe. It may be up for debate whether matter falling in gets stretched or burned, but there’s no doubt that throughout the cosmos incalculable amounts of gas and dust are flowing across the event horizons of black holes.

Astronomers know this because, despite the fact that no light can escape the event horizon, many black holes are fairly easy to detect. As the immense gravity of a black hole reels in gas and dust, a traffic jam emerges near the event horizon. As matter bumps into other matter, it heats up and glows, emitting X-rays and other high-energy radiation. “Black holes are sitting in a luminous soup of billion-degree gas,” MIT’s Sheperd Doeleman says. Sometimes all that searing gas rockets away from the black hole in concentrated jets that can extend more than a million light-years.

Astronomers aren’t sure why some galaxies’ black holes are voracious eaters, glowing brightly, while others seem dark and inactive, Doeleman says. The Milky Way’s central black hole, which weighs about 4 million times the mass of the sun, is relatively dormant. Astronomers are holding out hope that they’ll get to see the local black hole light up over the next year as a large gas cloud called G2 swings perilously close to its event horizon.

Doeleman has even greater ambitions. He leads a team that plans to directly image the event horizon of the Milky Way’s central black hole. That’s pretty hard to do: In fact, it requires a telescope the size of Earth.

So next year, Doeleman and his colleagues will unveil what amounts to an Earth-sized telescope.

The Event Horizon Telescope, the first instrument designed specifically for spying the structure of a black hole, combines multiple radio telescopes to achieve a resolution equivalent to that of a single one that is much larger.

This year, Doeleman is heading to the Atacama Large Millimeter/submillimeter Array in Chile, the world’s most powerful radio telescope network, to install extraordinarily precise atomic clocks that will allow researchers to combine the Chilean telescopes’ data with those from observatories in Hawaii, Spain and eventually the South Pole.

If all goes well, as early as next year a virtual telescope with the sensitivity of an Earth-sized radio dish will deliver images of a bright ring of hot gas surrounding a circular shadow: the heart of a black hole, bounded by the event horizon. “We’ve been working on this for a decade,” Doeleman says. “It’s exhilarating to be so close.”
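The claimed resolution is easy to sanity-check with the standard diffraction estimate, resolution ≈ wavelength / baseline (a rough sketch; the 1.3 mm observing wavelength and an Earth-diameter baseline are assumed round figures, not the array’s exact specifications):

```python
import math

# Back-of-envelope estimate of an Earth-sized interferometer's resolution.
# Assumed values: ~1.3 mm observing wavelength, baseline ~ Earth's diameter.
wavelength = 1.3e-3   # metres (assumed millimetre-wave observing band)
baseline = 1.27e7     # metres (approximate diameter of Earth)

resolution_rad = wavelength / baseline

# Convert radians to microarcseconds: 1 rad = (180/pi) * 3600 * 1e6 uas.
resolution_uas = resolution_rad * (180 / math.pi) * 3600 * 1e6

print(f"angular resolution: ~{resolution_uas:.0f} microarcseconds")
```

That works out to a few tens of microarcseconds, comparable to the apparent size of the shadow cast by the Milky Way’s central black hole, which is why nothing short of an Earth-sized baseline will do.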

Theorists aren’t as excited about the massive scope. After all, an Earth-sized telescope can’t zoom in on a single particle and resolve the information paradox. But perhaps a photograph will provide some inspiration. For the first time they’ll be able to take a good look at the mysterious boundary that has perplexed them for so long.
— Andrew Grant

CERN COURIER Nov 20, 2007 Physics in the multiverse The idea of multiple universes is more than a fantastic invention. It appears naturally within several theories, and deserves to be taken seriously, explains Aurélien Barrau.

Is our entire universe a tiny island within an infinitely vast and infinitely diversified meta-world? This could be either one of the most important revolutions in the history of cosmogonies or merely a misleading statement that reflects our lack of understanding of the most fundamental laws of physics.

The idea in itself is far from new: from Anaximander to David Lewis, philosophers have exhaustively considered this eventuality. What is especially interesting today is that it emerges, almost naturally, from some of our best – but often most speculative – physical theories. The multiverse is no longer a model; it is a consequence of our models. It offers an obvious understanding of the strangeness of the physical state of our universe. The proposal is attractive and credible, but it requires a profound rethinking of current physics.

At first glance, the multiverse seems to lie outside of science because it cannot be observed. How, following the prescription of Karl Popper, can a theory be falsifiable if we cannot observe its predictions? This way of thinking is not really correct for the multiverse for several reasons. First, predictions can be made in the multiverse: it leads only to statistical results, but this is also true for any physical theory within our universe, owing both to fundamental quantum fluctuations and to measurement uncertainties. Secondly, it has never been necessary to check all of the predictions of a theory to consider it as legitimate science. General relativity, for example, has been extensively tested in the visible world and this allows us to use it within black holes even though it is not possible to go there to check. Finally, the critical rationalism of Popper is not the final word in the philosophy of science. Sociologists, aestheticians and epistemologists have shown that there are other demarcation criteria to consider. History reminds us that the definition of science can only come from within and from the praxis: no active area of intellectual creation can be strictly delimited from outside. If scientists need to change the borders of their own field of research, it would be hard to justify a philosophical prescription preventing them from doing so. It is the same with art: nearly all artistic innovations of the 20th century have transgressed the definition of art as would have been given by a 19th-century aesthetician. Just as with science and scientists, art is internally defined by artists.

For all of these reasons, it is worth considering seriously the possibility that we live in a multiverse. This could allow understanding of the two problems of complexity and naturalness. The fact that the laws and couplings of physics appear to be fine-tuned to such an extent that life can exist and most fundamental quantities assume extremely “improbable” values would appear obvious if our entire universe were just a tiny part of a huge multiverse where different regions exhibit different laws. In this view, we are living in one of the “anthropically favoured” regions. This anthropic selection has strictly no teleological dimension and absolutely no link with any kind of “intelligent design”. It is nothing other than the obvious generalization of the selection effect that already has to be taken into account within our own universe. When dealing with a sample, it is impossible to avoid wondering if it accurately represents the full set, and this question must of course be asked when considering our universe within the multiverse.

The multiverse is not a theory. It appears as a consequence of some theories, and these have other predictions that can be tested within our own universe. There are many different kinds of possible multiverses, depending on the particular theories, some of them even being possibly interwoven.

The most elementary multiverse is simply the infinite space predicted by general relativity – at least for flat and hyperbolic geometries. An infinite number of Hubble volumes should fill this meta-world. In such a situation, everything that is possible (i.e. compatible with the laws of physics as we know them) should occur. This is true because an event with a non-vanishing probability has to happen somewhere if space is infinite. The structure of the laws of physics and the values of fundamental parameters cannot be explained by this multiverse, but many specific circumstances can be understood by anthropic selections. Some places are, for example, less homogeneous than our Hubble volume, so we cannot live there because they are less life-friendly than our universe, where the primordial fluctuations are perfectly adapted as the seeds for structure formation.

General relativity also faces the multiverse issue when dealing with black holes. The maximal analytic extension of the Schwarzschild geometry, as exhibited by conformal Penrose–Carter diagrams, shows that another universe could be seen from within a black hole. This interesting feature is well known to disappear when the collapse is considered dynamically. The situation is, however, more interesting for charged or rotating black holes, where an infinite set of universes with attractive and repulsive gravity appear in the conformal diagram. The wormholes that possibly connect these universes are extremely unstable, but this does not alter the fact that this solution reveals other universes (or other parts of our own universe, depending on the topology), whether accessible or not. This multiverse is, however, extremely speculative as it could be just a mathematical ghost. Furthermore, nothing allows us to understand explicitly how it formed.

A much more interesting pluriverse is associated with the interior of black holes when quantum corrections to general relativity are taken into account. Bounces should replace singularities in most quantum gravity approaches, and this leads to an expanding region of space–time inside the black hole that can be considered as a universe. In this model, our own universe would have been created by such a process and should also have a large number of child universes, thanks to its numerous stellar and supermassive black holes. This could lead to a kind of cosmological natural selection in which the laws of physics tend to maximize the number of black holes (just because such universes generate more universes of the same kind). It also allows for several possible observational tests that could refute the theory and does not rely on the use of any anthropic argument. However, it is not clear how the constants of physics could be inherited from the parent universe by the child universe with small random variations and the detailed model associated with this scenario does not yet exist.

One of the richest multiverses is associated with the fascinating meeting of inflationary cosmology and string theory. On the one hand, eternal inflation can be understood by considering a massive scalar field. The field will have quantum fluctuations, which will, in half of the regions, increase its value; in the other half, the fluctuations will decrease the value of the field. In the half where the field jumps up, the extra energy density will cause the universe to expand faster than in the half where the field jumps down. After some time, more than half of the regions will have the higher value of the field simply because they expand faster than the low-field regions. The volume-averaged value of the field will therefore rise and there will always be regions in which the field is high: the inflation becomes eternal. The regions in which the scalar field fluctuates downward will branch off from the eternally inflating tree and exit inflation.
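The volume-weighting argument above can be illustrated with a toy simulation (every number here is invented for illustration; this is a caricature of the statistics, not the actual field dynamics):

```python
import random

# Toy model of eternal inflation's volume weighting (illustrative numbers).
# Each region carries a field value; per step the field randomly jumps up
# or down, and high-field regions inflate their volume faster.
random.seed(0)
regions = [{"field": 1.0, "volume": 1.0} for _ in range(1000)]

for step in range(20):
    for r in regions:
        if random.random() < 0.5:
            r["field"] += 0.1   # field fluctuates up...
            r["volume"] *= 8.0  # ...and that region expands faster
        else:
            r["field"] -= 0.1   # field fluctuates down...
            r["volume"] *= 2.0  # ...and that region expands slower

# The volume-averaged field rises even though up and down jumps are
# equally likely, because the up-jumping regions dominate the volume.
total_volume = sum(r["volume"] for r in regions)
avg = sum(r["field"] * r["volume"] for r in regions) / total_volume

print(f"volume-averaged field after 20 steps: {avg:.2f}")
```

An unweighted average of the field would hover near its starting value; only the volume-weighted average climbs, which is precisely the sense in which the inflation never ends.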

On the other hand, string theory has recently faced a third change of paradigm. After the revolutions of supersymmetry and duality, we now have the “landscape”. This metaphoric word refers to the large number (maybe 10^500) of possible false vacua of the theory. The known laws of physics would just correspond to a specific island among many others. The huge number of possibilities arises from different choices of Calabi–Yau manifolds and different values of generalised magnetic fluxes over different homology cycles. Among other enigmas, the incredibly strange value of the cosmological constant (why are the 119 first decimals of the “natural” value exactly compensated by some mysterious phenomena, but not the 120th?) would simply appear as an anthropic selection effect within a multiverse where nearly every possible value is realised somewhere. At this stage, every bubble-universe is associated with one realisation of the laws of physics and contains itself an infinite space where all contingent phenomena take place somewhere. Because the bubbles are causally disconnected forever (owing to the fast “space creation” by inflation) it will not be possible to travel and discover new laws of physics.

This multiverse – if true – would force a profound change of our deep understanding of physics. The laws reappear as kinds of phenomena; the ontological primer of our universe would have to be abandoned. At other places in the multiverse, there would be other laws, other constants, other numbers of dimensions; our world would be just a tiny sample. It could be, following Copernicus, Darwin and Freud, the fourth narcissistic injury.

Quantum mechanics was probably among the first branches of physics leading to the idea of a multiverse. In some situations, it inevitably predicts superposition. To avoid the existence of macroscopic Schrödinger cats simultaneously living and dying, Bohr introduced a reduction postulate. This has two considerable drawbacks: first, it leads to an extremely intricate philosophical interpretation where the correspondence between the mathematics underlying the physical theory and the real world is no longer isomorphic (at least not at any time), and, second, it violates unitarity. No known physical phenomenon – not even the evaporation of black holes in its modern descriptions – does this.

These are good reasons for considering seriously the many-worlds interpretation of Hugh Everett. Every possible outcome to every event is allowed to define or exist in its own history or universe, via quantum decoherence instead of wave function collapse. In other words, there is a world where the cat is dead and another one where it is alive. This is simply a way of trusting strictly the fundamental equations of quantum mechanics. The worlds are not spatially separated, but exist more as kinds of “parallel” universes. This tantalising interpretation solves some paradoxes of quantum mechanics but remains vague about how to determine when splitting of universes happens. This multiverse is complex and, depending on the very quantum nature of phenomena leading to other kinds of multiverses, it could lead to higher or lower levels of diversity.

More speculative multiverses can also be imagined, associated with a kind of platonic mathematical democracy or with nominalist relativism. In any case, it is important to underline that the multiverse is not a hypothesis invented to answer a specific question. It is simply a consequence of a theory usually built for another purpose. Interestingly, this consequence also solves many complexity and naturalness problems. In most cases, it even seems that the existence of many worlds is closer to Ockham’s razor (the principle of simplicity) than the ad hoc assumptions that would have to be added to models to avoid the existence of other universes.

Given a model, for example the string-inflation paradigm, is it possible to make predictions in the multiverse? In principle, it is, at least in a Bayesian approach. The probability of observing vacuum i (and the associated laws of physics) is simply P_i = P_i^prior × f_i, where P_i^prior is determined by the geography of the landscape of string theory and the dynamics of eternal inflation, and the selection factor f_i characterizes the chances for an observer to evolve in vacuum i. This distribution gives the probability for a randomly selected observer to be in a given vacuum. Clearly, predictions can only be made probabilistically, but this is already true in standard physics. The fact that we can observe only one sample (our own universe) does not change the method qualitatively and still allows the refuting of models at given confidence levels. The key points here are the well known peculiarities of cosmology, even with only one universe: the observer is embedded within the system described; the initial conditions are critical; the experiment is “locally” irreproducible; the energies involved have not been experimentally probed on Earth; and the arrow of time must be conceptually reversed.
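As a sketch of how such a prediction would work in practice, here is the prior-times-selection-factor prescription applied to three hypothetical vacua with entirely made-up priors and selection factors (the real values are, as discussed below, unknown):

```python
# Illustrative sketch of P_i = P_i^prior * f_i for three invented vacua.
# Priors: fraction of the multiverse occupied by each vacuum (made up).
# Selection: chance for observers to evolve in that vacuum (made up).
priors = {"vacuum A": 0.70, "vacuum B": 0.25, "vacuum C": 0.05}
selection = {"vacuum A": 1e-6, "vacuum B": 1e-2, "vacuum C": 1e-4}

# Multiply prior by selection factor, then normalise to a distribution.
unnormalised = {v: priors[v] * selection[v] for v in priors}
total = sum(unnormalised.values())
probs = {v: p / total for v, p in unnormalised.items()}

for v, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{v}: {p:.4f}")
```

Note how a vacuum with a modest prior but a far larger selection factor dominates the distribution, which is exactly why estimating the selection factor matters so much.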

However, this statistical approach to testing the multiverse suffers from severe technical shortcomings. First, while it seems natural to identify the prior probability with the fraction of volume occupied by a given vacuum, the result depends sensitively on the choice of a space-like hypersurface on which the distribution is to be evaluated. This is the so-called “measure problem” in the multiverse. Second, it is impossible to give any sensible estimate of f_i. This would require an understanding of what life is – and even of what consciousness is – and that simply remains out of reach for the time being. Except in some favourable cases – for example when all the universes of the multiverse present a given characteristic that is incompatible with our universe – it is hard to refute explicitly a model in the multiverse. But difficult in practice does not mean intrinsically impossible. The multiverse remains within the realm of Popperian science. It is not qualitatively different from other proposals associated with usual ways of doing physics. Clearly, new mathematical tools and far more accurate predictions in the landscape (which is basically totally unknown) are needed for falsifiability to be more than an abstract principle in this context. Moreover, falsifiability is just one criterion among many possible ones and it should probably not be over-determined.

When facing the question of the incredible fine-tuning required for the fundamental parameters of physics to allow the emergence of complexity, there are few possible ways of thinking. If one does not want to use God or rely on an unbelievable luck that led to extremely specific initial conditions, there are mainly two remaining possible hypotheses. The first would be to consider that since complexity – and in particular, life – is an adaptive process, it would have emerged in nearly any kind of universe. This is a tantalising answer, but our own universe shows that life requires extremely specific conditions to exist. It is hard to imagine life in a universe without chemistry, maybe without bound states or with other numbers of dimensions. The second idea is to accept the existence of many universes with different laws where we naturally find ourselves in one of those compatible with complexity. The multiverse was not imagined to answer this specific question but appears “spontaneously” in serious physical theories, so it can be considered as the simplest explanation to the puzzling issue of naturalness. This of course does not prove the model to be correct, but it should be emphasised that there is absolutely no “pre-Copernican” anthropocentrism in this thought process.

It could well be that the whole idea of multiple universes is misleading. It could well be that the discovery of the most fundamental laws of physics will make those parallel worlds totally obsolete in a few years. It could well be that with the multiverse, science is just entering a “no through road”. Prudence is mandatory when physics tells us about invisible spaces. But it could also very well be that we are facing a deep change of paradigm that revolutionizes our understanding of nature and opens new fields of possible scientific thought. Because they lie on the border of science, these models are dangerous, but they offer the extraordinary possibility of constructive interference with other kinds of human knowledge. The multiverse is a risky thought – but, then again, let’s not forget that discovering new worlds has always been risky.

THE CYCLIC UNIVERSE: PAUL STEINHARDT A Conversation with Paul J. Steinhardt [11.19.02]

…in the last year I’ve been involved in the development of an alternative theory that turns the cosmic history topsy-turvy. All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.

PAUL STEINHARDT is the Albert Einstein Professor in Science and on the faculty of both the Departments of Physics and Astrophysical Sciences at Princeton University.

He is one of the leading theorists responsible for inflationary theory. He constructed the first workable model of inflation and the theory of how inflation could produce seeds for galaxy formation. He was also among the first to show evidence for dark energy and cosmic acceleration, introducing the term “quintessence” to refer to dynamical forms of dark energy. With Neil Turok he has pioneered mathematical and computational techniques which decisively disproved rival theories of structure formation such as cosmic strings. He made leading contributions to inflationary theory and to our understanding of the origin of the matter-antimatter asymmetry in the Universe. Hence, he not only witnessed firsthand but also helped lead the revolutionary developments in the standard cosmological model caused by the fusion of particle physics and cosmology in the last 20 years.

THE CYCLIC UNIVERSE: PAUL STEINHARDT

[PAUL STEINHARDT:] I am a theoretical cosmologist, so I am addressing the issue from that point of view. If you were to ask most cosmologists to give a summary of where we stand right now in the field, they would tell you that we live in a very special period in human history where, thanks to a whole host of advances in technology, we can suddenly view the very distant and very early universe in ways that we haven’t been able to do ever before. For example, we can get a snapshot of what the universe looked like in its infancy, when the first atoms were forming. We can get a snapshot of what the universe looked like in its adolescence, when the first stars and galaxies were forming. And we are now getting a fully detailed, three-dimensional image of what the local universe looks like today. When you put together these different pieces of information, which we’re getting for the first time in human history, you obtain a very tight series of constraints on any model of cosmic evolution. If you go back to the different theories of cosmic evolution in the early 1990s, the data we’ve gathered in the last decade has eliminated all of them—save one, a model that you might think of today as the consensus model. This model involves a combination of the Big Bang model as developed in the 1920s, ’30s, and ’40s; the Inflationary Theory, which Alan Guth proposed in the 1980s; and a recent amendment that I will discuss shortly. This consensus theory matches the observations we have of the universe today in exquisite detail. For this reason, many cosmologists conclude that we have finally determined the basic cosmic history of the universe.

But I have a rather different point of view, a view that has been stimulated by two events. The first is the recent amendment to which I referred earlier. I want to argue that the recent amendment is not simply an amendment, but a real shock to our whole notion of time and cosmic history. And secondly, in the last year I’ve been involved in the development of an alternative theory that turns the cosmic history topsy-turvy. All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.

The key difference between this picture and the consensus picture comes down to the nature of time. The standard model, or consensus model, assumes that time has a beginning that we normally refer to as the Big Bang. According to this model, for reasons we don’t quite understand, the universe sprang from nothingness into somethingness, full of matter and energy, and has been expanding and cooling for the past 15 billion years. In the alternative model the universe is endless. Time is endless in the sense that it goes on forever in the past and forever in the future, and, in some sense, space is endless. Indeed, our three spatial dimensions remain infinite throughout the evolution of the universe.

More specifically, this model proposes a universe in which the evolution of the universe is cyclic. That is to say, the universe goes through periods of evolution from hot to cold, from dense to under-dense, from hot radiation to the structure we see today, and eventually to an empty universe. Then, a sequence of events occurs that causes the cycle to begin again. The empty universe is reinjected with energy, creating a new period of expansion and cooling. This process repeats periodically forever. What we’re witnessing now is simply the latest cycle.

The notion of a cyclic universe is not new. People have considered this idea as far back as recorded history. The ancient Hindus, for example, had a very elaborate and detailed cosmology based on a cyclic universe. They predicted the duration of each cycle to be 8.64 billion years—a prediction with three-digit accuracy. This is very impressive, especially since they had no quantum mechanics and no string theory! It disagrees with the number that I’m going to suggest, which is trillions of years rather than billions.

The cyclic notion has also been a recurrent theme in Western thought. Edgar Allan Poe and Friedrich Nietzsche, for example, each had cyclic models of the universe, and in the early days of relativistic cosmology, Albert Einstein, Alexander Friedmann, Georges Lemaître, and Richard Tolman were interested in the cyclic idea. I think it is clear why so many have found the cyclic idea to be appealing: If you have a universe with a beginning, you have the challenge of explaining why it began and the conditions under which it began. If you have a universe that is cyclic, it is eternal, so you don’t have to explain the beginning.

During the attempts to bring cyclic ideas into modern cosmology, it was discovered in the ’20s and ’30s that there are various technical problems. The idea at that time was a cycle in which our three-dimensional universe goes through periods of expansion beginning from the Big Bang and then reversal to contraction and a big crunch. The universe bounces and expansion begins again. One problem is that, every time the universe contracts to a crunch, the density and temperature of the universe rises to an infinite value, and it is not clear if the usual laws of physics can be applied. Second, every cycle of expansion and contraction creates entropy through natural thermodynamic processes, which adds to the entropy from earlier cycles. So, at the beginning of a new cycle, there is higher entropy density than the cycle before. It turns out that the duration of a cycle is sensitive to the entropy density. If the entropy increases, the duration of the cycle increases as well. So, going forward in time, each cycle becomes longer than the one before. The problem is that, extrapolating back in time, the cycles become shorter until, after a finite time, they shrink to zero duration. The problem of avoiding a beginning has not been solved. It has simply been pushed back a finite number of cycles. If we’re going to reintroduce the idea of a truly cyclic universe, these two problems must be overcome. The cyclic model that I will describe uses new ideas to do just that.
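The backward extrapolation is simple arithmetic, sketched here with an assumed constant growth factor (the factor 1.5 is purely illustrative; the real growth depends on the entropy produced each cycle):

```python
# Toy arithmetic for the entropy problem in the old cyclic models.
# Assumption: entropy production makes each cycle a fixed factor longer
# than the one before, so past durations form a shrinking geometric series.
growth = 1.5    # assumed ratio of each cycle's duration to the previous one
current = 1.0   # duration of the present cycle, in arbitrary units

# Total time elapsed over past cycles: current/growth + current/growth^2 + ...
past_total = sum(current / growth**n for n in range(1, 200))
limit = current / (growth - 1.0)  # closed form of the geometric series

print(f"total past time: {past_total:.6f} (series limit {limit:.6f})")
```

Even summing over arbitrarily many past cycles, the total elapsed time converges to a finite value, which is the sense in which these models fail to avoid a beginning.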

To appreciate why an alternative model is worth pursuing, it’s important to get a more detailed impression of what the consensus picture is like. Certainly some aspects are appealing. But, what I want to argue is that, overall, the consensus model is not so simple. In particular, recent observations have forced us to amend the consensus model and make it more complicated. So, let me begin with an overview of the consensus model.

The consensus theory begins with the Big Bang: the universe has a beginning. It’s a standard assumption that people have made over the last 50 years, but it’s not something we can prove at present from any fundamental laws of physics. Furthermore, you have to assume that the universe began with an energy density less than the critical value. Otherwise, the universe would stop expanding and recollapse before the next stage of evolution, the inflationary epoch. In addition, to reach this inflationary stage, there must be some sort of energy to drive the inflation. Typically this is assumed to be due to an inflaton field. You have to assume that in those patches of the universe that began at less than the critical density, a significant fraction of the energy is stored in inflation energy so that it can eventually overtake the universe and start the period of accelerated expansion. All of these are reasonable assumptions, but assumptions nevertheless. It’s important to keep count of these assumptions and ingredients, because they are helpful in comparing the consensus model to the challenger.

Assuming these conditions are met, the inflation energy overtakes the matter and radiation after a few instants. The inflationary epoch commences and the expansion of the universe accelerates at a furious pace. The inflation does a number of miraculous things: it makes the universe homogeneous, it makes the universe flat, and it leaves behind certain inhomogeneities, which are supposed to be the seeds for the formation of galaxies. Now the universe is prepared to enter the next stage of evolution with the right conditions. According to the inflationary model, the inflation energy decays into a hot gas of matter and radiation. After a second or so, the first light nuclei form. After a few tens of thousands of years, the slowly moving matter dominates the universe. It’s during these stages that the first atoms form, the universe becomes transparent, and the structure in the universe begins to form—the first stars and galaxies. Up to this point the story is relatively simple.

But, there is the recent discovery that we’ve entered a new stage in the evolution of the universe. After the stars and galaxies have formed, something strange has happened to cause the expansion of the universe to speed up again. During the 15 billion years when matter and radiation dominated the universe and structure was forming, the expansion of the universe was slowing down because the matter and radiation within it is gravitationally self-attractive and resists the expansion of the universe. Until very recently, it had been presumed that matter would continue to be the dominant form of energy in the universe, and this deceleration would continue forever.

But we’ve discovered instead, due to recent observations, that the expansion of the universe is speeding up. This means that most of the energy of the universe is neither matter nor radiation. Rather, another form of energy has overtaken the matter and radiation. For lack of a better term, this new energy form is called “dark energy.” Dark energy, unlike the matter and radiation that we’re familiar with, is gravitationally self-repulsive. That’s why it causes the expansion to speed up rather than slow down. In Newton’s theory of gravity, all mass is gravitationally attractive, but Einstein’s theory allows the possibility of forms of energy that are gravitationally self-repulsive.

I don’t think either the physics or cosmology communities, or even the general public, have fully absorbed the full implications of this discovery. This is a revolution in the grand historic sense—in the Copernican sense. In fact, if you think about Copernicus—from whom we derive the word revolution—his importance was that he changed our notion of space and of our position in the universe. By showing that the earth revolves around the sun, he triggered a chain of ideas that led us to the notion that we live in no particular place in the universe; there’s nothing special about where we are. Now we’ve discovered something very strange about the nature of time: that we may live in no special place, but we do live at a special time, a time of recent transition from deceleration to acceleration; from one in which matter and radiation dominate the universe to one in which they are rapidly becoming insignificant components; from one in which structure is forming in ever-larger scales to one in which now, because of this accelerated expansion, structure formation stops. We are in the midst of the transition between these two stages of evolution. And just as Copernicus’s proposal that the earth is no longer the center of the universe led to a chain of ideas that changed our whole outlook on the structure of the solar system and eventually to the structure of the universe, it shouldn’t be too surprising that perhaps this new discovery of cosmic acceleration could lead to a whole change in our view of cosmic history. That’s a big part of the motivation for thinking about our alternative proposal.

With these thoughts about the consensus model in mind, let me turn to the cyclic proposal. Since it’s cyclic, I’m allowed to begin the discussion of the cycle at any point I choose. To make the discussion parallel, I’ll begin at a point analogous to the Big Bang; I’ll call it the Bang. This is a point in the cycle where the universe reaches its highest temperature and density. In this scenario, though, unlike the Big Bang model, the temperature and density don’t diverge. There is a maximal, finite temperature. It’s a very high temperature, around 10^20 degrees Kelvin—hot enough to evaporate atoms and nuclei into their fundamental constituents—but it’s not infinite. In fact, it’s well below the so-called Planck energy scale, where quantum gravity effects dominate. The theory begins with a bang and then proceeds directly to a phase dominated by radiation. In this scenario you do not have the inflation one has in the standard scenario. You still have to explain why the universe is flat, you still have to explain why the universe is homogeneous, and you still have to explain where the fluctuations came from that led to the formation of galaxies, but that’s not going to be explained by an early stage of inflation. It’s going to be explained by yet a different stage in the cyclic universe, which I’ll get to.
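As a rough back-of-envelope check (standard constants, not numbers from the talk), converting that maximal temperature to an energy via E = k_B T shows just how far below the Planck scale the Bang sits:

```python
# Convert the quoted maximal temperature, ~10^20 K, to an energy via
# E = k_B * T and compare it with the Planck energy, where quantum
# gravity takes over. Constants are standard textbook values.
k_B = 8.617e-5            # Boltzmann constant, eV per kelvin
T_max = 1e20              # quoted maximal temperature, kelvin

E_max_GeV = k_B * T_max / 1e9     # thermal energy k_B*T, in GeV
E_planck_GeV = 1.22e19            # Planck energy, ~1.22e19 GeV

print(E_max_GeV)                  # roughly 8.6e6 GeV
print(E_planck_GeV / E_max_GeV)   # ~12 orders of magnitude below Planck
```

So even at its hottest moment, this picture keeps the universe some twelve orders of magnitude in energy below the regime where quantum gravity effects would dominate.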

In this new model, you go directly to a radiation-dominated universe and form the usual nuclear abundances; then go directly to a matter-dominated universe in which the atoms and galaxies and larger-scale structure form; and then proceed to a phase of the universe dominated by dark energy. In the standard case, the dark energy comes as a surprise, since it is something you have to add into the theory to make it consistent with what we observe. In the cyclic model, the dark energy moves to center stage as the key ingredient that is going to drive the universe, and in fact drives the universe, into the cyclic evolution. The first thing the dark energy does when it dominates the universe is what we observe today: it causes the expansion of the universe to begin to accelerate. Why is that important? Although this acceleration rate is a hundred orders of magnitude smaller than the acceleration one gets in inflation, if you give the universe enough time, it actually accomplishes the same feat that inflation does. Over time it thins out the distribution of matter and radiation in the universe, making the universe more and more homogeneous and isotropic—in fact, making it perfectly so—driving it into what is essentially a vacuum state.
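The claim that slow acceleration eventually empties the universe can be made concrete with a hedged estimate: assuming near-exponential expansion at roughly today’s Hubble rate (an assumption of this sketch, not a figure from the talk), matter density dilutes as exp(-3Ht):

```python
import math

# Hedged estimate: if dark energy drives near-exponential expansion at
# roughly today's Hubble rate H0 (assumed), matter density dilutes as
# exp(-3*H0*t). Over a trillion years that is enough to empty the
# universe to an effective vacuum.
H0 = 2.2e-18                      # approximate Hubble rate today, 1/s
t = 1e12 * 3.156e7                # one trillion years, in seconds

e_folds = H0 * t                              # e-foldings of expansion
dilution_orders = 3 * e_folds / math.log(10)  # orders of magnitude the density drops

print(e_folds)             # ~69 e-foldings
print(dilution_orders)     # density falls by roughly 90 orders of magnitude
```

A density drop of ninety-odd orders of magnitude is why a trillion years of gentle acceleration can do the same homogenizing and emptying job that inflation does in a split second.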

Seth Lloyd said there were 10^80 or 10^90 bits inside the horizon, but if you were to look around the universe in a trillion years, you would find on average no bits inside your horizon, or less than one bit inside your horizon. In fact, when you count these bits, it’s important to realize that now that the universe is accelerating, our computer is actually losing bits from inside our horizon. This is something that we observe.

At the same time that the universe is made homogeneous and isotropic, it is also being made flat. If the universe had any warp or curvature to it, the stretching over this long period of time, although it’s a slow process, makes the space extremely flat. If it continued forever, of course, that would be the end of the story. But in this scenario, just like inflation, the dark energy survives only for a finite period and triggers a series of events that eventually lead to a transformation of energy from gravity into new energy and radiation that will then start a new period of expansion of the universe. From a local observer’s point of view, it looks like the universe goes through exact cycles; that is to say, it looks like the universe empties out each round, and new matter and radiation are created, leading to a new period of expansion. In this sense it’s a cyclic universe. If you were a global observer and could see the entire universe, you’d discover that our three dimensions are forever infinite in this story. What’s happened is that at each stage, when we create matter and radiation, it gets thinned out. It’s out there somewhere, but it’s getting thinned out. Locally, it looks like the universe is cyclic, but globally the universe has a steady evolution, a well-defined era in which, over time and throughout our three dimensions, entropy increases from cycle to cycle.

Exactly how this works in detail can be described in various ways. I will choose to present a very nice geometrical picture that is motivated by superstring theory. We use only a few basic elements from superstring theory, so you don’t really have to know anything about superstring theory to understand what I’m going to talk about, except to understand that some of the strange things that I’m going to introduce I am not introducing for the first time. They are already sitting there in superstring theory waiting to be put to good purpose.

One of the ideas in superstring theory is that there are extra dimensions; it’s an essential element of that theory, necessary to make it mathematically consistent. In one particular formulation of that theory, the universe has a total of 11 dimensions. Six of them are curled up into a little ball so tiny that, for my purposes, I’m just going to pretend that they’re not there. However, there are three spatial dimensions, one time dimension, and one additional dimension that I do want to consider. In this picture, the three dimensions with which we’re familiar, and through which we move, lie along a hypersurface or membrane. This membrane is a boundary of the extra dimension. There is another boundary or membrane on the other side. In between, there’s an extra dimension that, if you like, only exists over a certain interval. It’s like we are one end of a sandwich, in between which there is a so-called bulk volume of space. These surfaces are referred to as orbifolds or branes—the latter referring to the word membrane. The branes have physical properties. They have energy and momentum, and when you excite them you can produce things like quarks and electrons. We are composed of the quarks and electrons on one of these branes. And, since quarks and leptons can only move along branes, we are restricted to moving along and seeing only the three dimensions of our brane. We cannot directly see the bulk or any matter on the other brane.

In the cyclic universe, at regular intervals of trillions of years, these two branes smash together. This creates all kinds of excitations—particles and radiation. The collision thereby heats up the branes, and then they bounce apart again. The branes are attracted to each other through a force that acts just like a spring, causing the branes to come together at regular intervals. To describe it more completely, what’s happening is that the universe goes through two kinds of stages of motion. When the universe has matter and radiation in it, or when the branes are far enough apart, the main motion is the branes stretching, or, equivalently, our three dimensions expanding. During this period, the branes more or less remain a fixed distance apart. That’s what’s been happening, for example, in the last 15 billion years. During these stages, our three dimensions are stretching just as they normally would. At a microscopic distance away, there is another brane sitting and expanding, but since we can’t touch, feel, or see across the bulk, we can’t sense it directly. If there is a clump of matter over there, we can feel the gravitational effect, but we can’t see any light or anything else that it emits, because anything it emits is going to move along that brane. We only see things that move along our own brane.
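The spring analogy can be sketched as a toy simulation (purely illustrative; the real interbrane force is not literally a harmonic spring): a separation coordinate pulled back by a spring force, bouncing elastically at zero, produces collisions at perfectly regular intervals.

```python
import math

# Toy illustration only: model the brane separation x as a unit mass on a
# spring of frequency omega (an assumed, arbitrary value), with an elastic
# bounce at x = 0 standing in for the collision. Collisions then recur at
# regular intervals of half the spring period, pi/omega.
omega = 2.0 * math.pi     # assumed spring frequency (arbitrary units)
dt = 1e-5                 # integration step
x, v = 1.0, 0.0           # branes start apart and at rest
t, collisions = 0.0, []

while len(collisions) < 3:
    v -= omega**2 * x * dt    # semi-implicit Euler: spring pulls branes together
    x += v * dt
    t += dt
    if x <= 0.0:              # the branes collide...
        collisions.append(t)
        x, v = 0.0, -v        # ...and bounce elastically apart

intervals = [b - a for a, b in zip(collisions, collisions[1:])]
print(intervals)              # regular spacing close to pi/omega = 0.5
```

The point of the sketch is only the regularity: any attractive, spring-like potential with an elastic bounce yields collisions at fixed intervals, which is the clockwork the cyclic model relies on.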

Next, the energy associated with the force between these branes takes over the universe. From our vantage point on one of the branes, this acts just like the dark energy we observe today. It causes the branes to accelerate in their stretching to the point where all the matter and radiation produced since the last collision is spread out, and the branes become essentially smooth, flat, empty surfaces. If you like, you can think of them as being wrinkled and full of matter up to this point, and then stretching by a fantastic amount over the next trillion years. The stretching causes the mass and energy on the brane to thin out and the wrinkles to be smoothed out. After trillions of years, the branes are, for all intents and purposes, smooth, flat, parallel and empty.

Then, the force between these two branes slowly brings the branes together. As it brings them together, the force grows stronger and the branes speed towards one another. When they collide, there’s a walloping impact—enough to create a high density of matter and radiation with a very high, albeit finite, temperature. The two branes go flying apart, more or less back to where they were, and then the new matter and radiation (through the action of gravity) causes the branes to begin a new period of stretching.

In this picture it’s clear that the universe is going through periods of expansion, and a funny kind of contraction. Where the two branes come together, it’s not a contraction of our dimensions, but a contraction of the extra dimension. Before the contraction, all matter and radiation has been spread out, but, unlike the old cyclic models of the 1920s and ’30s, it doesn’t come back together again during the contraction because our three dimensions—that is, the branes—remain stretched out. Only the extra dimension contracts. This process repeats itself cycle after cycle.

If you compare the cyclic model to the consensus picture, two of the functions of inflation—namely, flattening and homogenizing the universe—are accomplished by the period of accelerated expansion that we’ve now just begun. Of course, I really mean the analogous expansion that occurred one cycle ago before the most recent Bang. The third function of inflation—producing fluctuations in the density—occurs as these two branes come together. As they approach, quantum fluctuations cause the branes to begin to wrinkle. And because they are wrinkled, they do not collide everywhere at the same time. Rather, some regions collide a bit earlier than others. This means that some regions reheat to a finite temperature and begin to cool a little bit before other regions. When the branes come apart again, the temperature of the universe is not perfectly homogeneous but has spatial variations left over from the quantum wrinkles.

Remarkably, although the physical processes are completely different, and the time scale is completely different—this is taking billions of years, instead of 10^-30 seconds—it turns out that the spectrum of fluctuations you get in the distribution of energy and temperature is essentially the same as what you get in inflation. Hence, the cyclic model is also in exquisite agreement with all of the measurements of the temperature and mass distribution of the universe that we have today.

Because the physics in these two models is quite different, there is an important distinction in what we would observe if one or the other were actually true—although this effect has not been detected yet. In inflation when you create fluctuations, you don’t just create fluctuations in energy and temperature, but you also create fluctuations in spacetime itself, so-called gravitational waves. That’s a feature that we hope to look for in experiments in the coming decades as a verification of the consensus model. In our model you don’t get those gravitational waves. The essential difference is that inflationary fluctuations are created in a hyperrapid, violent process that is strong enough to create gravitational waves, whereas cyclic fluctuations are created in an ultraslow, gentle process that is too weak to produce gravitational waves. That’s an example where the two models give an observational prediction that is dramatically different. It’s just difficult to observe at the present time.

What’s fascinating at the moment is that we have two paradigms that are now available to us. On the one hand they are poles apart, in terms of what they tell us about the nature of time, about our cosmic history, about the order in which events occur, and about the time scale on which they occur. On the other hand they are remarkably similar in terms of what they predict about the universe today. Ultimately what will decide between the two will be a combination of observations—for example, the search for cosmic gravitational waves—and theory—because a key aspect to this scenario entails assumptions about what happens at the collision between branes that might be checked or refuted in superstring theory. In the meantime, for the next few years, we can all have great fun speculating about the implications of each of these ideas, which we prefer, and how we can best distinguish between them.

SUNDAY, JUL 13, 2014 04:00 PM BST The universe according to Nietzsche: Modern cosmology and the theory of eternal recurrence – PAUL STEINHARDT

Excerpted from “The Universe: Leading Scientists Explore the Origin, Mysteries, and Future of the Cosmos.” It originally appeared as a speech given by Steinhardt at an event in 2002.

If you were to ask most cosmologists to give a summary of where we stand right now in the field, they would tell you that we live in a very special period in human history where, thanks to a whole host of advances in technology, we can suddenly view the very distant and very early universe in ways we haven’t been able to before. For example, we can get a snapshot of what the universe looked like in its infancy, when the first atoms were forming. We can get a snapshot of what the universe looked like in its adolescence, when the first stars and galaxies were forming. And we are now getting a fully detailed, three-dimensional image of what the local universe looks like today. When you put together this different information, which we’re getting for the first time in human history, you obtain a very tight series of constraints on any model of cosmic evolution.

If you go back to the different theories of cosmic evolution in the early 1990s, the data we’ve gathered in the last decade has eliminated all of them save one, a model that you might think of today as the consensus model. This model involves a combination of the Big Bang model as developed in the 1920s, ’30s, and ’40s; the inflationary theory, which Alan Guth proposed in the 1980s; and a recent amendment that I will discuss shortly. This consensus theory matches the observations we have of the universe today in exquisite detail. For this reason, many cosmologists conclude that we have finally determined the basic cosmic history of the universe.

But I have a rather different point of view, a view that has been stimulated by two events. The first is the recent amendment to which I referred earlier. I want to argue that the recent amendment is not simply an amendment but a real shock to our whole notion of time and cosmic history. And secondly, in the last year I’ve been involved in the development of an alternative theory that turns the cosmic history topsy-turvy: All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales. And yet this model seems capable of reproducing all the successful predictions of the consensus picture with the same exquisite detail.

The key difference between this picture and the consensus picture comes down to the nature of time. The standard model, or consensus model, assumes that time has a beginning that we normally refer to as the Big Bang. According to that model, for reasons we don’t quite understand, the universe sprang from nothingness into somethingness, full of matter and energy, and has been expanding and cooling for the past 15 billion years. In the alternative model, the universe is endless. Time is endless, in the sense that it goes on forever in the past and forever in the future, and in some sense space is endless. Indeed, our three spatial dimensions remain infinite throughout the evolution of the universe.

More specifically, this model proposes a universe in which the evolution of the universe is cyclic. That is to say, the universe goes through periods of evolution from hot to cold, from dense to under-dense, from hot radiation to the structure we see today, and eventually to an empty universe. Then a sequence of events occurs that causes the cycle to begin again. The empty universe is reinjected with energy, creating a new period of expansion and cooling. This process repeats periodically forever. What we’re witnessing now is simply the latest cycle.

The notion of a cyclic universe is not new. People have considered this idea as far back as recorded history. The ancient Hindus, for example, had a very elaborate and detailed cosmology based on a cyclic universe. They predicted the duration of each cycle to be 8.64 billion years—a prediction with three-digit accuracy. This is very impressive, especially since they had no quantum mechanics and no string theory! It disagrees with the number I’m going to suggest, which is trillions of years rather than billions.

The cyclic notion has also been a recurrent theme in Western thought. Edgar Allan Poe and Friedrich Nietzsche, for example, each had cyclic models of the universe, and in the early days of relativistic cosmology Albert Einstein, Alexander Friedmann, Georges Lemaître, and Richard Tolman were interested in the cyclic idea. I think it’s clear why so many have found the cyclic idea to be appealing: If you have a universe with a beginning, you have the challenge of explaining why it began and the conditions under which it began. If you have a universe that’s cyclic, it’s eternal, so you don’t have to explain the beginning.

During the attempts to try to bring cyclic ideas into modern cosmology, it was discovered in the 1920s and ’30s that there are various technical problems. The idea at that time was a cycle in which our three-dimensional universe goes through periods of expansion beginning from the Big Bang and then reversal to contraction and a Big Crunch. The universe bounces, and expansion begins again. One problem is that every time the universe contracts to a crunch, the density and temperature of the universe rises to an infinite value, and it is not clear if the usual laws of physics can be applied.

Second, every cycle of expansion and contraction creates entropy through natural thermodynamic processes, which adds to the entropy from earlier cycles. So at the beginning of a new cycle, there is higher entropy density than the cycle before. It turns out that the duration of a cycle is sensitive to the entropy density. If the entropy increases, the duration of the cycle increases as well. So, going forward in time, each cycle becomes longer than the one before. The problem is that, extrapolating back in time, the cycles become shorter until, after a finite time, they shrink to zero duration. The problem of avoiding a beginning has not been solved; it has simply been pushed back a finite number of cycles. If we’re going to reintroduce the idea of a truly cyclic universe, these two problems must be overcome. The cyclic model I will describe uses new ideas to do just that.

To appreciate why an alternative model is worth pursuing, it’s important to get a more detailed impression of what the consensus picture is like. Certainly some aspects are appealing. But what I want to argue is that, overall, the consensus model is not so simple. In particular, recent observations have forced us to amend the consensus model and make it more complicated. So, let me begin with an overview of the consensus model.

The consensus theory begins with the Big Bang: The universe has a beginning. It’s a standard assumption that people have made over the last fifty years, but it’s not something we can prove at present from any fundamental laws of physics. Furthermore, you have to assume that the universe began with an energy density less than the critical value. Otherwise, the universe would stop expanding and recollapse before the next stage of evolution, the inflationary epoch. In addition, to reach this inflationary stage, there must be some sort of energy to drive the inflation. Typically this is assumed to be due to an inflation field. You have to assume that in those patches of the universe that began at less than the critical density, a significant fraction of the energy is stored in inflation energy so that it can eventually overtake the universe and start the period of accelerated expansion. All of these are reasonable assumption, but assumptions nevertheless. It’s important to take into account these assumptions and ingredients, because they’re helpful in comparing the consensus model to the challenger.

Assuming these conditions are met, the inflation energy overtakes the matter and radiation after a few instants. The inflationary epoch commences, and the expansion of the universe accelerates at a furious pace. The inflation does a number of miraculous things: It makes the universe homogeneous, it makes the universe flat, and it leaves behind certain inhomogeneities, which are supposed to be the seeds for the formation of galaxies. Now the universe is prepared to enter the next stage of evolution with the right conditions. According to the inflationary model, the inflation energy decays into a hot gas of matter and radiation. After a second or so, there form the first light nuclei. After a few tens of thousands of years, the slowly moving matter dominates the universe. It’s during these stages that the first atoms form, the universe becomes transparent, and the structure in the universe begins to form—the first stars and galaxies. Up to this point, the story is relatively simple.

But there is the recent discovery that we’ve entered a new stage in the evolution of the universe. After the stars and galaxies have formed, something strange has happened to cause the expansion of the universe to speed up again. During the 15 billion years when matter and radiation dominated the universe and structure was forming, the expansion of the universe was slowing down, because the matter and radiation within it is gravitationally self-attractive and resists the expansion of the universe. Until very recently, it had been presumed that matter would continue to be the dominant form of energy in the universe and this deceleration would continue forever.

But we’ve discovered instead, due to recent observations, that the expansion of the universe is speeding up. This means that most of the energy of the universe is neither matter nor radiation. Rather, another form of energy has overtaken the matter and radiation. For lack of a better term, this new energy form is called dark energy. Dark energy, unlike the matter and radiation we’re familiar with, is gravitationally self-repulsive. That’s why it causes the expansion to speed up rather than slow down. In Newton’s theory of gravity, all mass is gravitationally attractive, but Einstein’s theory allows the possibility of forms of energy that are gravitationally self-repulsive.

I don’t think either the physics or cosmology communities, or even the general public, have fully absorbed the full implications of this discovery. This is a revolution in the grand historic sense—in the Copernican sense. In fact, if you think about Copernicus—from whom we derive the word “revolution”—his importance was that he changed our notion of space and of our position in the universe. By showing that the Earth revolves around the sun, he triggered a chain of ideas that led us to the notion that we live in no particular place in the universe; there’s nothing special about where we are. Now we’ve discovered something very strange about the nature of time: that we may live in no special place, but we do live at a special time, a time of recent transition from deceleration to acceleration; from one in which matter and radiation dominate the universe to one in which they are rapidly becoming insignificant components; from one in which structure is forming in ever larger scales to one in which now, because of this accelerated expansion, structure formation stops. We are in the midst of the transition between these two stages of evolution. And just as Copernicus’ proposal that the Earth is no longer the center of the universe led to a chain of ideas that changed our whole outlook on the structure of the solar system and eventually to the structure of the universe, it shouldn’t be too surprising that perhaps this new discovery of cosmic acceleration could lead to a whole change in our view of cosmic history. That’s a big part of the motivation for thinking about our alternative proposal.

With these thoughts about the consensus model in mind, let me turn to the cyclic proposal. Since it’s cyclic, I’m allowed to begin the discussion of the cycle at any point I choose. To make the discussion parallel, I’ll begin at a point analogous to the Big Bang; I’ll call it the Bang. This is a point in the cycle where the universe reaches its highest temperature and density. In this scenario, though, unlike the Big Bang model, the temperature and density don’t diverge. There is a maximal, finite temperature. It’s a very high temperature, around 1020 degrees Kelvin—hot enough to evaporate atoms and nuclei into their fundamental constituents—but it’s not infinite. In fact, it’s well below the so-called Planck energy scale, where quantum gravity effects dominate. The theory begins with a bang and then proceeds directly to a phase dominated by radiation. In this scenario you do not have the inflation one has in the standard scenario. You still have to explain why the universe is flat, you still have to explain why the universe is homogeneous, and you still have to explain where the fluctuations came from that led to the formation of galaxies, but that’s not going to be explained by an early stage of inflation. It’s going to be explained by yet a different stage in the cyclic universe, which I’ll get to.

In this new model, you go directly to a radiation-dominated universe and form the usual nuclear abundances; then go directly to a matter-dominated universe in which the atoms and galaxies and larger-scale structure form; and then proceed to a phase of the universe dominated by dark energy. In the standard case, the dark energy comes as a surprise, since it’s something you have to add into the theory to make it consistent with what we observe. In the cyclic model, the dark energy moves to center stage as the key ingredient that is going to drive the universe, and in fact drives the universe, into the cyclic evolution. The first thing the dark energy does when it dominates the universe is what we observe today: It causes the expansion of the universe to begin to accelerate. Why is that important? Although this acceleration rate is 100 orders of magnitude smaller than the acceleration that one gets in inflation, if you give the universe enough time it actually accomplishes the same feat that inflation does. Over time, it thins out the distribution of matter and radiation in the universe, making the universe more and more homogeneous and isotropic—in fact, making it perfectly so—driving it into what is essentially a vacuum state.

Seth Lloyd said there were 10^80 or 10^90 bits inside the horizon, but if you were to look around the universe in a trillion years, you would find on average no bits inside your horizon, or less than one bit inside your horizon. In fact, when you count these bits, it’s important to realize that now that the universe is accelerating, our computer is actually losing bits from inside our horizon. This is something that we observe.

At the same time that the universe is made homogeneous and isotropic, it is also being made flat. If the universe had any warp or curvature to it, this stretching over a long period of time, although a slow process, makes the space extremely flat. If it continued forever, of course, that would be the end of the story. But in this scenario, just like inflation, the dark energy survives only for a finite period and triggers a series of events that eventually lead to a transformation of energy from gravity into new energy and radiation that will then start a new period of expansion of the universe. From a local observer’s point of view, it looks like the universe goes through exact cycles; that is to say, it looks like the universe empties out each round and new matter and radiation are created, leading to a new period of expansion. In this sense it’s a cyclic universe. If you were a global observer and could see the entire universe, you’d discover that our three dimensions are forever infinite in this story. What’s happened is that at each stage when we create matter and radiation, it gets thinned out. It’s out there somewhere, but it’s getting thinned out. Locally, it looks like the universe is cyclic, but globally the universe has a steady evolution, a well-defined era in which, over time and throughout our three dimensions, entropy increases from cycle to cycle.

Exactly how this works in detail can be described in various ways. I will choose to present a very nice geometrical picture that’s motivated by superstring theory. We use only a few basic elements from superstring theory, so you don’t really have to know anything about superstring theory to understand what I’m going to talk about, except to understand that some of the strange things I’m going to introduce I am not introducing for the first time. They’re already sitting there in superstring theory waiting to be put to good purpose.

One of the ideas in superstring theory is that there are extra dimensions; it’s an essential element to that theory, which is necessary to make it mathematically consistent. In one particular formulation of that theory, the universe has a total of eleven dimensions. Six of them are curled up into a little ball so tiny that, for my purposes, I’m just going to pretend they’re not there. However, there are three spatial dimensions, one time dimension, and one additional dimension that I do want to consider. In this picture, our three dimensions with which we’re familiar and through which we move lie along a hypersurface, or membrane. This membrane is a boundary of the extra dimension. There is another boundary, or membrane, on the other side. In between, there’s an extra dimension that, if you like, only exists over a certain interval. It’s like we are one end of a sandwich, in between which there is a so-called bulk volume of space. These surfaces are referred to as orbifolds or branes—the latter referring to the word “membrane.” The branes have physical properties. They have energy and momentum, and when you excite them you can produce things like quarks and electrons. We are composed of the quarks and electrons on one of these branes. And, since quarks and leptons can only move along branes, we are restricted to moving along and seeing only the three dimensions of our brane. We cannot see directly the bulk or any matter on the other brane.

In the cyclic universe, at regular intervals of trillions of years, these two branes smash together. This creates all kinds of excitations—particles and radiation. The collision thereby heats up the branes, and then they bounce apart again. The branes are attracted to each other through a force that acts just like a spring, causing the branes to come together at regular intervals. To describe it more completely, what’s happening is that the universe goes through two kinds of stages of motion. When the universe has matter and radiation in it, or when the branes are far enough apart, the main motion is the branes stretching, or, equivalently, our three dimensions expanding. During this period, the branes more or less remain a fixed distance apart. That’s what’s been happening, for example, in the last 15 billion years. During these stages, our three dimensions are stretching just as they normally would. At a microscopic distance away, there is another brane sitting and expanding, but since we can’t touch, feel, or see across the bulk, we can’t sense it directly. If there is a clump of matter over there, we can feel the gravitational effect, but we can’t see any light or anything else it emits, because anything it emits is going to move along that brane. We only see things that move along our own brane.
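The spring analogy can be made concrete with a toy oscillator. This is my own sketch, not the actual brane equations of motion: a separation pulled back by a spring-like force crosses zero at evenly spaced times, which is what makes the collisions periodic.

```python
# Toy analogy (mine, not the actual brane dynamics): a spring-like
# attraction makes the inter-brane separation oscillate harmonically,
# so "collisions" (separation passing through zero) recur at regular
# intervals.
omega = 1.0                          # spring frequency, arbitrary units
x, v, dt, t = 1.0, 0.0, 1e-4, 0.0    # branes start apart and at rest
collisions = []
while len(collisions) < 3:
    x_new = x + v * dt               # Euler step for x'' = -omega**2 * x
    v -= omega**2 * x * dt
    if x > 0 >= x_new or x < 0 <= x_new:
        collisions.append(t)         # separation crossed zero: a collision
    x, t = x_new, t + dt

gaps = [collisions[i + 1] - collisions[i] for i in range(2)]
print(abs(gaps[0] - gaps[1]) < 1e-2) # collisions are evenly spaced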

Next, the energy associated with the force between these branes takes over the universe. From our vantage point on one of the branes, this acts just like the dark energy we observe today. It causes the branes to accelerate in their stretching, to the point where all the matter and radiation produced since the last collision is spread out and the branes become essentially smooth, flat, empty surfaces. If you like, you can think of them as being wrinkled and full of matter up to this point, and then stretching by a fantastic amount over the next trillion years. The stretching causes the mass and energy on the brane to thin out and the wrinkles to be smoothed out. After trillions of years, the branes are, for all intents and purposes, smooth, flat, parallel, and empty.

Then the force between these two branes slowly brings the branes together. As it brings them together, the force grows stronger and the branes speed toward one another. When they collide, there’s a walloping impact—enough to create a high density of matter and radiation with a very high, albeit finite, temperature. The two branes go flying apart, more or less back to where they were, and then the new matter and radiation, through the action of gravity, cause the branes to begin a new period of stretching.

In this picture, it’s clear that the universe is going through periods of expansion and a funny kind of contraction. Where the two branes come together, it’s not a contraction of our dimensions but a contraction of the extra dimension. Before the contraction, all matter and radiation has been spread out, but, unlike the old cyclic models of the 1920s and ’30s, it doesn’t come back together again during the contraction, because our three dimensions—that is, the branes—remain stretched out. Only the extra dimension contracts. This process repeats itself cycle after cycle.

If you compare the cyclic model to the consensus picture, two of the functions of inflation—namely, flattening and homogenizing the universe—are accomplished by the period of accelerated expansion that we’ve now just begun. Of course, I really mean the analogous expansion that occurred one cycle ago, before the most recent Bang. The third function of inflation—producing fluctuations in the density—occurs as these two branes come together. As they approach, quantum fluctuations cause the branes to begin to wrinkle. And because they’re wrinkled, they don’t collide everywhere at the same time. Rather, some regions collide a bit earlier than others. This means that some regions reheat to a finite temperature and begin to cool a little bit before other regions. When the branes come apart again, the temperature of the universe is not perfectly homogeneous but has spatial variations left over from the quantum wrinkles.

Remarkably, although the physical processes are completely different and the time scale is completely different—this is taking billions of years, instead of 10^-30 seconds—it turns out that the spectrum of fluctuations you get in the distribution of energy and temperature is essentially the same as what you get in inflation. Hence, the cyclic model is also in exquisite agreement with all of the measurements of the temperature and mass distribution of the universe that we have today.

Because the physics in these two models is quite different, there is an important distinction in what we would observe if one or the other were actually true—although this effect has not been detected yet. In inflation when you create fluctuations, you don’t just create fluctuations in energy and temperature but you also create fluctuations in spacetime itself, so-called gravitational waves. That’s a feature we hope to look for in experiments in the coming decades as a verification of the consensus model. In our model, you don’t get those gravitational waves. The essential difference is that inflationary fluctuations are created in a hyperrapid, violent process that is strong enough to create gravitational waves, whereas cyclic fluctuations are created in an ultraslow, gentle process that is too weak to produce gravitational waves. That’s an example where the two models give an observational prediction that is dramatically different. It’s just difficult to observe at the present time.

What’s fascinating at the moment is that we have two paradigms now available to us. On the one hand, they are poles apart in terms of what they tell us about the nature of time, about our cosmic history, about the order in which events occur, and about the time scale on which they occur. On the other hand, they are remarkably similar in terms of what they predict about the universe today. Ultimately what will decide between the two is a combination of observations—for example, the search for cosmic gravitational waves—and theory, because a key aspect to this scenario entails assumptions about what happens at the collision between branes that might be checked or refuted in superstring theory. In the meantime, for the next few years, we can all have great fun speculating about the implications of each of these ideas and how we can best distinguish between them.

Paul Steinhardt is a theoretical physicist, an Albert Einstein Professor of Science at Princeton University and coauthor (with Neil Turok) of “Endless Universe: Beyond the Big Bang.” This piece originally appeared as a speech by Steinhardt at an event in 2002. It has been excerpted here as it appears in “The Universe: Leading Scientists Explore the Origin, Mysteries, and Future of the Cosmos.” Copyright © 2014 by Edge Foundation Inc. Published by Harper Perennial

What Happened Before The Big Bang? July 3, 2007 Penn State

Summary

New discoveries have been made about another universe whose collapse appears to have given birth to the one we have today. The research introduces a new mathematical model that gives new details about the beginning of our universe, which now appears to have been a Big Bounce, according to a new theory of quantum gravity, and not a Big Bang, as described by Einstein’s Theory of General Relativity.

New discoveries have been made about another universe whose collapse appears to have given birth to the one we live in today. They will be announced in the early on-line edition of the journal Nature Physics on 1 July 2007 and will be published in the August 2007 issue of the journal’s print edition. “My paper introduces a new mathematical model that we can use to derive new details about the properties of a quantum state as it travels through the Big Bounce, which replaces the classical idea of a Big Bang as the beginning of our universe,” said Martin Bojowald, assistant professor of physics at Penn State. Bojowald’s research also suggests that, although it is possible to learn about many properties of the earlier universe, we always will be uncertain about some of these properties because his calculations reveal a “cosmic forgetfulness” that results from the extreme quantum forces during the Big Bounce.

The idea that the universe erupted with a Big Bang explosion has been a big barrier in scientific attempts to understand the origin of our expanding universe, although the Big Bang long has been considered by physicists to be the best model. As described by Einstein’s Theory of General Relativity, the origin of the Big Bang is a mathematically nonsensical state — a “singularity” of zero volume that nevertheless contained infinite density and infinitely large energy.

Now, however, Bojowald and other physicists at Penn State are exploring territory unknown even to Einstein — the time before the Big Bang — using a mathematical time machine called Loop Quantum Gravity. This theory, which combines Einstein’s Theory of General Relativity with equations of quantum physics that did not exist in Einstein’s day, is the first mathematical description to systematically establish the existence of the Big Bounce and to deduce properties of the earlier universe from which our own may have sprung. For scientists, the Big Bounce opens a crack in the barrier that was the Big Bang.

“Einstein’s Theory of General Relativity does not include the quantum physics that you must have in order to describe the extremely high energies that dominated our universe during its very early evolution,” Bojowald explained, “but we now have Loop Quantum Gravity, a theory that does include the necessary quantum physics.” Loop Quantum Gravity was pioneered and is being developed in the Penn State Institute for Gravitational Physics and Geometry, and is now a leading approach to the goal of unifying general relativity with quantum physics. Scientists using this theory to trace our universe backward in time have found that its beginning point had a minimum volume that is not zero and a maximum energy that is not infinite. As a result of these limits, the theory’s equations continue to produce valid mathematical results past the point of the classical Big Bang, giving scientists a window into the time before the Big Bounce.

Quantum-gravity theory indicates that the fabric of space-time has an “atomic” geometry that is woven with one-dimensional quantum threads. This fabric tears violently under the extreme conditions dominated by quantum physics near the Big Bounce, causing gravity to become strongly repulsive so that, instead of vanishing into infinity as predicted by Einstein’s Theory of General Relativity, the universe rebounded in the Big Bounce that gave birth to our expanding universe. The theory reveals a contracting universe before the Big Bounce, with space-time geometry that otherwise was similar to that of our universe today.

Bojowald found he had to create a new mathematical model to use with the theory of Loop Quantum Gravity in order to explore the universe before the Big Bounce with more precision. “A more precise model was needed within Loop Quantum Gravity than the existing numerical methods, which require successive approximations of the solutions and yield results that are not as general and complete as one would like,” Bojowald explained. He developed a mathematical model that produces precise analytical solutions by solving a set of mathematical equations.

In addition to being more precise, Bojowald’s new model also is much shorter. He reformulated the quantum-gravity models using a different mathematical description, which he says made it possible to solve the equations explicitly and also turned out to be a strong simplification. “The earlier numerical model looked much more complicated, but its solutions looked very clean, which was a clue that such a mathematical simplification might exist,” he said. Bojowald reformulated quantum gravity’s differential equations — which require many calculations of numerous consecutive small changes in time — into an integrable system — in which a cumulative length of time can be specified for adding up all the small incremental changes.
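The distinction the press release draws can be illustrated with a deliberately simple toy equation. This is my example, not Bojowald’s actual equations: a stepwise solver must accumulate many small increments, while an integrable, closed-form solution can be evaluated directly for any cumulative elapsed time.

```python
import math

# Toy contrast (not Bojowald's equations): exponential decay dy/dt = -k*y.
k, y0, t_end = 0.5, 1.0, 10.0

# Successive approximation: many consecutive small steps.
y, dt = y0, 1e-4
for _ in range(int(t_end / dt)):
    y -= k * y * dt

# Integrable form: one evaluation for the whole elapsed time.
y_exact = y0 * math.exp(-k * t_end)

print(abs(y - y_exact) < 1e-3)  # the two approaches agree closely
```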

The model’s equations require parameters that describe the state of our current universe accurately so that scientists then can use the model to travel backward in time, mathematically “un-evolving” the universe to reveal its state at earlier times. The model’s equations also contain some “free” parameters that are not yet known precisely but are nevertheless necessary to describe certain properties. Bojowald discovered that two of these free parameters are complementary: one is relevant almost exclusively after the Big Bounce and the other is relevant almost exclusively before the Big Bounce. Because one of these free parameters has essentially no influence on calculations of our current universe, Bojowald concludes that it cannot be used as a tool for back-calculating its value in the earlier universe before the Big Bounce.

The two free parameters, which Bojowald found were complementary, represent the quantum uncertainty in the total volume of the universe before and after the Big Bang. “These uncertainties are additional parameters that apply when you put a system into a quantum context such as a theory of quantum gravity,” Bojowald said. “It is similar to the uncertainty relations in quantum physics, where there is complementarity between the position of an object and its velocity — if you measure one you cannot simultaneously measure the other.”

Similarly, Bojowald’s study indicates that there is complementarity between the uncertainty factors for the volume of the universe before the Big Bounce and the universe after the Big Bounce. “For all practical purposes, the precise uncertainty factor for the volume of the previous universe never will be determined by a procedure of calculating backwards from conditions in our present universe, even with the most accurate measurements we will ever be able to make,” Bojowald explained. This discovery implies further limitations for discovering whether the matter in the universe before the Big Bang was dominated more strongly by quantum or classical properties.

“A problem with the earlier numerical model is you don’t see so clearly what the free parameters really are and what their influence is,” Bojowald said. “This mathematical model gives you an improved expression that contains all the free parameters and you can immediately see the influence of each one,” he explained. “After the equations were solved, it was rather immediate to reach conclusions from the results.”

Bojowald reached an additional conclusion after finding that at least one of the parameters of the previous universe did not survive its trip through the Big Bounce — that successive universes likely will not be perfect replicas of each other. He said, “the eternal recurrence of absolutely identical universes would seem to be prevented by the apparent existence of an intrinsic cosmic forgetfulness.”

The research was sponsored, in part, by the National Science Foundation.

The above post is reprinted from materials provided by Penn State.

What if the Big Bang was really the “Big Bounce”? – Primordial inflation data may provide a clue to a unified quantum gravity. – by Chris Lee – Jul 9, 2014 5:30pm BST

Not so long ago, our very own Matthew Francis attended the press conference in which results were announced from Antarctic observatory BICEP 2. Researchers claimed that the instruments there had located the unmistakable signature of gravitational waves during primordial inflation—a period of time during which the Universe expanded at a furious rate.

But our initial article also hinted at trouble to come.

The BICEP 2 experiment measures the ratio between light scattered by gravitational waves and light scattered by everything else, which shows up in the polarization of the cosmic microwave background (CMB) radiation. BICEP 2, however, is not the only instrument that can measure the properties of the CMB. Scientists have used the Planck satellite to measure the same ratio of light scatters—and guess what? The value obtained from BICEP 2 data doesn’t agree with the value obtained from the Planck data.

Under these circumstances, we’re faced with two possibilities: either one set of experimental data has not been interpreted properly or the Universe plays by unexpected rules. These possibilities are not mutually exclusive, providing lots of room for an interesting range of explanations. Under these circumstances, theoretical physicists tend to get a bit wild around the eyes and start stocking up on food, water, paper, and pencils. Once they are in their safe place, they let their imaginations run wild…

Bounce house

A group of Chinese and Canadian physicists asked themselves if a bouncing Universe might explain both the BICEP 2 and Planck results. A bouncing Universe is a consequence of loop quantum gravity, an attempt to unify quantum mechanics and relativity. One neat feature of loop quantum gravity is that, when the Universe is dense, gravity becomes repulsive. This means that inflation occurs naturally and doesn’t require additional physics.

A secondary consequence of gravity becoming repulsive is that the Universe can’t collapse to a singularity. This implies that the Universe may not have started at the Big Bang, which (under this model) just represents the point at which the Universe was at its minimum size. At times before the Big Bang, the Universe was still collapsing from a previous expansion. In this view, the Universe is a bit like a jackalope: bounding and rebounding.

Could a bouncy Universe explain the discrepancy between BICEP 2 and Planck? As with all theoretical physics papers, the details are rather murky, but the core of the argument has to do with the fact that the Universe’s singularity is no longer a singularity.

If the Big Bang was truly a singularity, any trace of the Universe that existed before the Big Bang would have been erased in it; the singularity destroys all. However, in loop quantum gravity, the beginning of the Universe is not a singularity, and so some of the CMB has its origin in the contracting Universe that existed prior to the Big Bang.

Even this, by itself, cannot explain the discrepancy between BICEP 2 and Planck. But, according to this paper, the contraction of the Universe just before the Big Bang was slower than its expansion after the Big Bang. The contraction and expansion of the Universe modifies the spectrum of the CMB. As a result, the contribution to the CMB from before the Big Bang is different from the CMB generated by the Big Bang. Meaning that, to understand what we’re seeing, we need to separate these two contributions.

The researchers showed that the contribution from before the Big Bang suppresses the amount of power pushed into some features of the CMB (the lowest order multipoles) and, simultaneously, increases the degree to which these same modes are polarized by gravitational interactions.

You might be asking what a multipole is. If I understand it correctly, this is a way of describing the spatial distribution of the CMB. Basically, the CMB is very slightly irregular in space. Any irregular shape can be described by a series of wave-like shapes with different amplitudes and a regular frequency spacing. These amplitudes tell us how much power was radiated into a particular shape early in the Universe, which tells us a lot about what was happening at the earliest moments of the Big Bang.
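The decomposition idea can be sketched in one dimension. This is a toy analogy of my own, using Fourier modes in place of the spherical harmonics actually used for the CMB:

```python
import numpy as np

# 1-D toy analogy (mine, not the real spherical-harmonic analysis of
# the CMB): an irregular pattern around a circle is a sum of wave-like
# modes, and each mode's amplitude says how much power it carries.
rng = np.random.default_rng(0)
n = 256
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)

# A hypothetical "sky": two known modes plus a little noise.
sky = 1.0 * np.cos(3 * theta) + 0.5 * np.cos(7 * theta) \
      + 0.05 * rng.standard_normal(n)

amps = np.abs(np.fft.rfft(sky)) / (n / 2)        # per-mode amplitudes
top = np.sort(np.argsort(amps)[-2:]).tolist()    # two strongest modes
print(top)                                       # -> [3, 7]
```

The two planted modes dominate the recovered amplitude spectrum, just as the CMB multipole amplitudes record how much power went into each angular scale.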

The BICEP 2 experiment obtained data for the very high-order multipoles, but no data at all for the low-order multipoles. Planck, on the other hand, has great data for the high-order multipoles and very noisy data for the low-order multipoles.

The researchers claim that their model fits the two datasets better than some common standard models (known as lambda-cold dark matter models). But the lambda-CDM models are constrained by lots of data from other sources, and they simply cannot be twisted to fit the new information. This means that, if both the BICEP 2 and Planck results hold up, some of the lambda-CDM models are in trouble. Loop quantum gravity, by contrast, isn’t as well developed and has two completely unknown parameters. This gives the model the freedom to be tweaked to fit both data sets.

Unfortunately, this also means that loop quantum gravity requires some other source of data to constrain these two free parameters. At the moment, the researchers are in the position of stating that loop quantum gravity fits the existing data better and of simultaneously using that same data to determine the values of their free parameters. Once data from other sources comes in, the true test for loop quantum gravity will begin. In the meantime, I still love the idea of a bouncy cosmos.

Since its announcement, the BICEP 2 analysis has been called into question, leading Physical Review Letters to publish a warning alongside the letter reporting the BICEP 2 results. Why publish those initial results if they might be wrong? Because the work is a great first attempt, and everybody’s future results will only build on this. Science is mostly about putting down layers of bricks and mortar, rather than dropping pre-constructed buildings. The next set of results from BICEP 2, along with improved analysis of Planck data, should clarify things substantially.

Physical Review Letters, 2014, DOI: 10.1103/PhysRevLett.112.251301

Chris Lee / Chris writes for Ars Technica’s science section. A physicist by day and science writer by night, he specializes in quantum physics and optics. He lives and works in Eindhoven, the Netherlands.

Dr Param Singh – Quantum Mechanics – Bouncing Universe – Perimeter Institute


Dr Param Singh is working on a theory that he hopes will shorten the odds. He’s trying to overcome the same problem as everyone else, namely the rather inconvenient idea of everything emerging from nothing, one Thursday afternoon 13.7 billion years ago. But Dr Param Singh’s ideas strike at the fundamental principles that caused all the problems in the first place.

Dr Param Singh “So, if you believe the universe is expanding and if you look at its history then the universe must have expanded from something. And if you look backward and backward, what big bang theory tells you is that the universe starts expanding from nothing.”

Retracing Hubble is impossible – mathematically

The principal mathematical objection is that, as the clock is wound back, and Hubble’s zero hour is approached, all the stuff of the universe is crammed into a smaller and smaller space. Eventually, that space will become infinitely small. And in mathematics, invoking infinity is the same as giving up. Or cheating.

Dr Param Singh “Even if the mathematical laws would not have broken down at this point, even then it’s philosophically very incomplete, like, how can something just originate from nothing? And that is what the theory has to explain.”

It’s Param’s job to understand how the unimaginably large emerged from the infinitesimally small.

But it’s not just philosophy and infinity that stand in his way.

Param Singh “If you look at our universe which is at large scales, the mathematics that we know from Einstein’s theory very well describes most of the phenomena – like all phenomena. Like this ball which I throw up – it comes back.”

Quantum Mechanics

Dr Param Singh “But if I want to describe what is inside this ball, the atomic structure of the ball, or how the molecules are made and how atoms are made, what are their fundamental constituents, then I don’t use classical gravity, I use a completely different physics called quantum mechanics. If I look at the universe, and I ask the question, I want to describe how it came from nothing, what was its nature when it was very small, then I have to use both the classical gravity and quantum mechanics and they don’t talk to each other. What we need is a new theory, and new mathematics. And that is the biggest problem to find.”

Dr Param Singh has been working on a new way to combine the two systems. A scheme that works in the very big AND the very small. What he’s found is that the maths predicts a very peculiar phenomenon.

Dr Param Singh “What we find is, that gravitational force, which is attractive, becomes repulsive when the universe is very small. That is predicted by the mathematics, the new mathematics which we obtain by the marriage of quantum mechanics and Einstein’s gravity. It is a completely different paradigm now.”

The problematic infinities of the Big Bang are swept away by the new “repulsive” gravity. The point of “everything is nothing” is never reached.

Equation Detail
Dr Param Singh “The maths is here, so this is one of the equations which took a couple of years to derive, and the part in orange is the one that is predicted by Einstein’s theory and the part in the white is the corrections which come from quantum gravity. So if you look at this orange part, this orange part tells you that if you look at the universe, which is becoming smaller and smaller as you approach a big bang, the left-hand side and the right-hand side both become infinity. And we know that whenever we encounter infinity in mathematics, something has gone terribly wrong. So what quantum gravity gives us is this expression, which ensures that as we approach the big bang, when the universe is becoming smaller and smaller, both sides become zero, and after that, the universe starts expanding again in the other direction and the same laws remain valid.”
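The behavior Singh describes matches the published effective Friedmann equation of loop quantum cosmology, H^2 = (8*pi*G/3) * rho * (1 - rho/rho_c): the correction factor drives both sides to zero as the density rho approaches a critical value rho_c. Here is a short numerical check of that equation, my own sketch with illustrative units, using the known exact solution for pressureless matter; it is not necessarily the exact equation shown on screen.

```python
import numpy as np

# Effective loop-quantum-cosmology Friedmann equation (illustrative
# units; rho_c is the critical density at which the bounce happens):
#     H^2 = (8*pi*G/3) * rho * (1 - rho/rho_c)
# For pressureless matter, rho(t) = rho_c / (1 + 6*pi*G*rho_c*t**2)
# solves it exactly, with the scale factor a growing as rho**(-1/3).
G, rho_c = 1.0, 1.0
t = np.linspace(-5, 5, 2001)
rho = rho_c / (1 + 6 * np.pi * G * rho_c * t**2)
a = rho ** (-1 / 3)

H = np.gradient(a, t) / a                  # numerical H = a'/a
lhs = H**2
rhs = (8 * np.pi * G / 3) * rho * (1 - rho / rho_c)

print(np.allclose(lhs, rhs, atol=1e-3))    # both sides agree
print(a.min())                             # smallest scale factor: 1.0
```

The scale factor bottoms out at a finite value when rho reaches rho_c and then grows again on the other side: the bounce.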

Bouncing Universe

In Param Singh’s scheme, instead of emerging from nothing, our universe owes its existence to a previous one that had the misfortune to collapse in on itself, then, thanks to some clever maths, rebounded to become what we see today. So the big bang was not a bang at all. It was, rather, the big bounce of a bouncing universe.

Dr Param Singh “It’s a surprising thing, a bouncing universe, but in nature, if you look around us, there are lots of cycles always happening, like we have seasons, we have even the motion of planets around the sun. In fact, nature seems to prefer things that are cyclic, in a way. But if we look at the whole lifespan of the universe, which is billions of years, then maybe these cycles, or the bounces, may not be at all surprising, and these are just the cycles of weather, in a way, for the universe, of going through contraction and expansion and contraction and expansion and so on.”

Of course, it might all be nothing more than a fantasy world of mathematics and little else. And there’s always the nagging question of what started the infinite bouncing in the first place.

Dr Param Singh “Well, that’s the most important question and I don’t know the answer to that. Maybe very soon we will find an answer to how it all started.”