Dark Matter: Hot Or Not?

Illustris simulation, showing the distribution of dark matter in a slice of the Universe 350 million by 300,000 light years in size. Galaxies are shown as high-density white dots (left) and as normal, baryonic matter (right). Credit: Markus Haider/Illustris

For almost a century, astronomers and cosmologists have postulated that space is filled with an invisible mass known as “dark matter”. Accounting for 27% of the mass and energy in the observable universe, the existence of this matter was invoked to account for the “missing” mass in cosmological models – gravitational effects that visible, baryonic matter alone cannot explain. Unfortunately, the concept of dark matter has solved one cosmological problem, only to create another.

If this matter does exist, what is it made of? So far, theories have ranged from saying that it is made up of cold, warm or hot matter, with the most widely-accepted theory being the Lambda Cold Dark Matter (Lambda-CDM) model. However, a new study produced by a team of European astronomers suggests that the Warm Dark Matter (WDM) model may be able to explain the latest observations made of the early Universe.

But first, some explanations are in order. The different theories on dark matter (cold, warm, hot) refer not to the temperatures of the matter itself, but to the velocity of the particles relative to the size of a protogalaxy – an early-Universe formation from which dwarf galaxies would later form.

The mass of these particles determines how fast they can travel, which determines their thermodynamic properties and indicates how far they could have traveled – aka their “free streaming length” (FSL) – before being slowed by cosmic expansion. Whereas hot dark matter would be made up of very light particles with high FSLs, cold dark matter is believed to be made up of massive particles whose FSL is much smaller than a protogalaxy.

Cold dark matter has been speculated to take the form of Massive Compact Halo Objects (MACHOs) like black holes; Robust Associations of Massive Baryonic Objects (RAMBOs) like clusters of brown dwarfs; or undiscovered particles such as Weakly-Interacting Massive Particles (WIMPs) and axions.

The widely-accepted Lambda-CDM model is based in part on the theory that dark matter is “cold”. As cosmological explanations go, it is the simplest, and it can account for the formation of galaxies and galaxy clusters. However, there remain some holes in this theory, the biggest of which is that it predicts there should be many more small, dwarf galaxies in the early Universe than we can account for.

In short, if dark matter consists of massive particles with low FSLs, small fluctuations in the density of matter in the early Universe would have led to large numbers of low-mass galaxies forming as satellites of galactic halos, with large concentrations of dark matter in their centers.

Naturally, the absence of these galaxies might lead one to speculate that we simply haven’t spotted these galaxies yet, and that IR surveys like the Two-Micron All Sky Survey (2MASS) and the Wide-field Infrared Survey Explorer (WISE) missions might find them in time.

But according to the international research team – which includes astronomers from the Astronomical Observatory of Rome (INAF), the Italian Space Agency Science Data Center and the Paris Observatory – another possibility is that dark matter is neither hot nor cold, but “warm” – i.e. consisting of middle-mass particles (also undiscovered) with FSLs roughly comparable to the size of a protogalaxy.

As Dr. Nicola Menci – a researcher with the INAF and the lead author of the study – told Universe Today via email:

“The Cold Dark Matter particles are characterized by low root mean square velocities, due to their large masses (usually assumed of the order of >~ 100 GeV, a hundred times the mass of a proton). Such low thermal velocities allow for the clumping of CDM even on very small scales. Conversely, lighter dark matter particles with masses of the order of keV (around 1/500 the mass of the electron) would be characterized by larger thermal velocities, inhibiting the clumping of DM on mass scales of dwarf galaxies. This would suppress the abundance of dwarf galaxies (and of satellite galaxies) and produce shallow inner density profiles in such objects, naturally matching the observations without the need for a strong feedback from stellar populations.”
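The mass scales Menci quotes can be checked with quick arithmetic. A minimal sketch (the proton and electron masses below are standard approximate values; the CDM and WDM masses are the order-of-magnitude figures he cites):

```python
# Sanity check of the mass scales quoted above.
proton_mass_gev = 0.938    # proton rest mass, ~0.938 GeV
electron_mass_kev = 511.0  # electron rest mass, ~511 keV

cdm_mass_gev = 100.0       # typical assumed CDM particle mass, ~100 GeV
wdm_mass_kev = 1.0         # keV-scale WDM particle mass

# "a hundred times the mass of a proton":
print(cdm_mass_gev / proton_mass_gev)    # ~107

# "around 1/500 the mass of the electron":
print(electron_mass_kev / wdm_mass_kev)  # ~511, i.e. roughly 1/500 of m_e
```

The heavier the particle, the lower its thermal velocity, which is why CDM clumps on small scales while keV-mass WDM does not.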

In other words, they found that WDM could better account for the early Universe as we are seeing it today. Whereas the Lambda-CDM model would produce pronounced perturbations in the density of the early Universe, the longer FSL of warm dark matter particles would smooth these perturbations out, matching what we see when we look deep into the cosmos, back to the epoch of galaxy formation.

For the sake of their study, which appeared recently in the July 1st issue of The Astrophysical Journal Letters, the research team relied on data obtained from the Hubble Frontier Fields (HFF) program. Taking advantage of improvements made in recent years, they were able to examine the magnitude of particularly faint and distant galaxies.

As Menci explained, this is a measurement that the Hubble Space Telescope would not have been capable of just a few years ago:

“Since galaxy formation is deeply affected by the nature of DM on the scale of dwarf galaxies, a powerful tool to constraint DM models is to measure the abundance of low-mass galaxies at early cosmic times (high redshifts z=6-8), the epoch of their formation. This is a challenging task since it implies finding extremely faint objects (absolute magnitudes M_UV=-12 to -13) at very large distances (12-13 billion light years) even for the Hubble Space Telescope.

“However, the Hubble Frontier Field programme exploits the gravitational lensing produced by foreground galaxy clusters to amplify the light from distant galaxies. Since the formation of dwarf galaxies is suppressed in WDM models – and the strength of the suppression is larger for lighter DM particles – the high measured abundance of high-redshift dwarf galaxies (~ 3 galaxies per cubic Mpc) can provide a lower limit for the WDM particle mass, which is completely independent of the stellar properties of galaxies.”

The results they obtained placed strict constraints on dark matter and early galaxy formation, and were consistent with what the HFF has been seeing. These results could indicate that our failure to detect dark matter so far may be the result of looking for the wrong kind of particles. But of course, these results are just one step in a larger effort, and will require further testing and confirmation.

Looking ahead, Menci and his colleagues hope to obtain further information from the HFF program, and hope that future missions will allow them to see if their findings hold up. As already noted, these include infrared astronomy missions, which are expected to “see” more of the early Universe by looking beyond the visible spectrum.

“Our results are based on the abundance of high-redshift dwarfs measured in only two fields,” he said. “However, the HFF program aims at measuring such abundances in six independent fields. The operation of the James Webb Space Telescope in the near future – with a lensing program analogous to the HFF – will allow us to pin down the possible mechanisms for the production of WDM particles, or to rule out WDM models as alternatives to CDM.”

For almost a century, dark matter has been a pervasive and elusive mystery, always receding the moment we think we are about to figure it out. But the deeper we look into the known Universe (and the farther back in time), the more we are able to learn about its evolution, and thus test whether our observations accord with our theories.

Further Reading: The Astrophysical Journal Letters, AAS Nova

The post Dark Matter: Hot Or Not? appeared first on Universe Today.

Physicists Maybe, Just Maybe, Confirm the Possible Discovery of 5th Force of Nature

The discovery of a possible fifth fundamental force could change our understanding of the universe. Credit: ESA/Hubble/NASA/Judy Schmidt

For some time, physicists have understood that all known phenomena in the Universe are governed by four fundamental forces: the weak nuclear force, the strong nuclear force, electromagnetism and gravity. Whereas the first three are all part of the Standard Model of particle physics and can be explained through quantum mechanics, our understanding of gravity depends on Einstein’s Theory of Relativity.

Understanding how these four forces fit together has been the aim of theoretical physics for decades, which in turn has led to the development of multiple theories that attempt to reconcile them (i.e. Superstring Theory, Quantum Gravity, Grand Unified Theories, etc.). However, these efforts may be complicated (or helped) by new research that suggests there might just be a fifth force at work.

In a paper recently published in the journal Physical Review Letters, a research team from the University of California, Irvine explains how recent particle physics experiments may have yielded evidence of a new type of boson. This boson apparently does not behave as other bosons do, and may be an indication that there is yet another force of nature governing fundamental interactions.

As Jonathan Feng, a professor of physics & astronomy at UCI and one of the lead authors on the paper, said:

“If true, it’s revolutionary. For decades, we’ve known of four fundamental forces: gravitation, electromagnetism, and the strong and weak nuclear forces. If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the universe, with consequences for the unification of forces and dark matter.”

The efforts that led to this potential discovery began back in 2015, when the UCI team came across a study from a group of experimental nuclear physicists at the Hungarian Academy of Sciences Institute for Nuclear Research. At the time, these physicists were looking into a radioactive decay anomaly that hinted at the existence of a light particle roughly 30 times the mass of an electron.
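For scale: a particle of about 17 MeV (the mass that appears in the title of the later UCI paper) is indeed roughly 30 electron masses. A quick back-of-the-envelope check:

```python
# Comparing the proposed ~17 MeV boson to the electron mass.
electron_mass_mev = 0.511  # electron rest mass in MeV
boson_mass_mev = 17.0      # mass of the proposed X boson in MeV

ratio = boson_mass_mev / electron_mass_mev
print(round(ratio))  # 33, i.e. roughly 30 electron masses
```

That is light by particle-physics standards (the Higgs boson, for comparison, weighs in at over 100 GeV), which is why the article later notes that laboratories have had the energies required to produce it for decades.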

In a paper describing their research, lead researcher Attila Krasznahorkay and his colleagues claimed that what they were observing might be the creation of “dark photons”. In short, they believed they might have at last found evidence of dark matter, the mysterious, invisible mass that makes up about 85% of the Universe’s mass.

This report was largely overlooked at the time, but gained widespread attention earlier this year when Prof. Feng and his research team found it and began assessing its conclusions. But after studying the Hungarian team’s results and comparing them to previous experiments, they concluded that the experimental evidence did not support the existence of dark photons.

Instead, they proposed that the discovery could indicate the possible presence of a fifth fundamental force of nature. These findings were posted to arXiv in April, and followed up by a paper titled “Particle Physics Models for the 17 MeV Anomaly in Beryllium Nuclear Decays”, which was published in PRL this past Friday.

Essentially, the UCI team argue that instead of a dark photon, what the Hungarian research team might have witnessed was the creation of a previously undiscovered boson – which they have named the “protophobic X boson”. Whereas other bosons interact with electrons and protons, this hypothetical boson interacts with only electrons and neutrons, and only at an extremely limited range.

This limited interaction is believed to be the reason why the particle has remained unknown until now, and why the adjectives “protophobic” and “X” are part of the name. “There’s no other boson that we’ve observed that has this same characteristic,” said Timothy Tait, a professor of physics & astronomy at UCI and a co-author of the paper. “Sometimes we also just call it the ‘X boson,’ where ‘X’ means unknown.”

If such a particle does exist, the possibilities for research breakthroughs could be endless. Feng hopes it could be joined with the three other forces governing particle interactions (electromagnetic, strong and weak nuclear forces) as a larger, more fundamental force. Feng also speculated that this possible discovery could point to the existence of a “dark sector” of our universe, which is governed by its own matter and forces.

“It’s possible that these two sectors talk to each other and interact with one another through somewhat veiled but fundamental interactions,” he said. “This dark sector force may manifest itself as this protophobic force we’re seeing as a result of the Hungarian experiment. In a broader sense, it fits in with our original research to understand the nature of dark matter.”

If this should prove to be the case, then physicists may be closer to figuring out the existence of dark matter (and maybe even dark energy), two of the greatest mysteries in modern astrophysics. What’s more, it could aid researchers in the search for physics beyond the Standard Model – something the researchers at CERN have been preoccupied with since the discovery of the Higgs Boson in 2012.

But as Feng notes, we need to confirm the existence of this particle through further experiments before we get all excited by its implications:

“The particle is not very heavy, and laboratories have had the energies required to make it since the ’50s and ’60s. But the reason it’s been hard to find is that its interactions are very feeble. That said, because the new particle is so light, there are many experimental groups working in small labs around the world that can follow up the initial claims, now that they know where to look.”

As the recent case involving CERN – where LHC teams were forced to announce that they had not discovered two new particles – demonstrates, it is important not to count our chickens before they hatch. As always, cautious optimism is the best approach to potential new findings.

Further Reading: University of California, Irvine


The Hubble Constant Just Got Constantier

A team of astronomers using the Hubble Space Telescope have found that the current rate of expansion of the Universe could be almost 10 percent faster than previously thought. Image: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)

Just when we think we understand the Universe pretty well, along come some astronomers to upend everything. In this case, something essential to everything we know and see has been turned on its head: the expansion rate of the Universe itself, aka the Hubble Constant.

A team of astronomers using the Hubble telescope has determined that the rate of expansion is between five and nine percent faster than previously measured. The Hubble Constant is not some curiosity that can be shelved until the next advances in measurement. It is part and parcel of the very nature of everything in existence.

“This surprising finding may be an important clue to understanding those mysterious parts of the universe that make up 95 percent of everything and don’t emit light, such as dark energy, dark matter, and dark radiation,” said study leader and Nobel Laureate Adam Riess of the Space Telescope Science Institute and The Johns Hopkins University, both in Baltimore, Maryland.

But before we get into the consequences of this study, let’s back up a bit and look at how the Hubble Constant is measured.

Measuring the expansion rate of the Universe is a tricky business. Using the image at the top, it works like this:

  1. Within the Milky Way, the Hubble telescope is used to measure the distance to Cepheid variables, a type of pulsating star. This is done using parallax, a basic tool of geometry that is also used in surveying. Astronomers know what the true brightness of Cepheids is, so comparing that to their apparent brightness from Earth gives an accurate measurement of the distance between the star and us. Their rate of pulsation also fine-tunes the distance calculation. Cepheid variables are sometimes called “cosmic yardsticks” for this reason.
  2. Then astronomers turn their sights on other nearby galaxies which contain not only Cepheid variables, but also Type Ia supernovae. These supernovae, which are of course exploding stars, are another reliable, well-understood yardstick for astronomers. The distance to these galaxies is obtained by using the Cepheids to calibrate the true brightness of the supernovae.
  3. Next, astronomers point the Hubble at galaxies that are even further away. These ones are so distant that any Cepheids in them cannot be seen. But Type Ia supernovae are so bright that they can be seen even at these enormous distances, so astronomers compare their true and apparent brightnesses to measure distances far enough out that the expansion of the Universe can be seen. The light from the distant supernovae is “red-shifted”, or stretched, by the expansion of space. When the measured distance is compared with the red-shift of the light, it yields a measurement of the rate of the expansion of the Universe.
  4. Take a deep breath and read all that again.
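The outer rungs of the ladder above can be sketched numerically. This is only an illustrative toy calculation – the supernova's apparent magnitude and redshift below are made-up example values, not data from the study:

```python
# Toy sketch of steps 3: from a Type Ia supernova's brightness and
# redshift to an expansion rate. All input numbers are illustrative.

M_ABS = -19.3          # typical peak absolute magnitude of a Type Ia supernova
C_KM_S = 299_792.458   # speed of light in km/s

def luminosity_distance_mpc(apparent_mag, absolute_mag):
    """Distance modulus relation: m - M = 5 * log10(d / 10 pc)."""
    d_pc = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_pc / 1e6  # parsecs -> megaparsecs

def hubble_constant(apparent_mag, redshift):
    """H0 = v / d, using the low-redshift approximation v ~ c * z."""
    d = luminosity_distance_mpc(apparent_mag, M_ABS)
    v = C_KM_S * redshift
    return v / d  # km/s per Mpc

# A hypothetical supernova observed at m = 15.85 with redshift z = 0.025:
print(hubble_constant(15.85, 0.025))  # ~70 km/s/Mpc with these example inputs
```

Real analyses fit many supernovae at once and correct for light-curve shape, dust, and peculiar velocities, but the core logic is this comparison of true brightness, apparent brightness, and redshift.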

The great part of all of this is that we now have an even more accurate measurement of the rate of expansion of the Universe. The uncertainty in the measurement is down to 2.4%. The challenging part is that this rate of expansion of the modern Universe doesn’t jibe with the measurement from the early Universe.

The rate of expansion of the early Universe is obtained from the leftover radiation from the Big Bang. When that cosmic afterglow is measured by NASA’s Wilkinson Microwave Anisotropy Probe (WMAP) and the ESA’s Planck satellite, it yields a smaller rate of expansion. So the two don’t line up. It’s like building a bridge, where construction starts at both ends and should line up by the time you get to the middle. (Caveat: I have no idea if bridges are built like that.)

“You start at two ends, and you expect to meet in the middle if all of your drawings are right and your measurements are right,” Riess said. “But now the ends are not quite meeting in the middle and we want to know why.”

“If we know the initial amounts of stuff in the universe, such as dark energy and dark matter, and we have the physics correct, then you can go from a measurement at the time shortly after the big bang and use that understanding to predict how fast the universe should be expanding today,” said Riess. “However, if this discrepancy holds up, it appears we may not have the right understanding, and it changes how big the Hubble constant should be today.”
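To put rough numbers on the mismatch Riess describes: using approximate published values for the two measurements (roughly 73 km/s/Mpc from the local distance ladder and 68 km/s/Mpc inferred from the cosmic microwave background; exact figures vary by analysis), the discrepancy works out to around eight percent:

```python
# Rough size of the local-vs-early-Universe discrepancy, using
# approximate published values (illustrative, not exact).
h0_local = 73.2  # km/s/Mpc, distance-ladder measurement
h0_cmb = 67.8    # km/s/Mpc, inferred from the cosmic microwave background

discrepancy = (h0_local - h0_cmb) / h0_cmb
print(f"{discrepancy:.1%}")  # 8.0% with these values
```

That gap is larger than the 2.4% measurement uncertainty quoted above, which is why it cannot simply be waved away as noise.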

Why it doesn’t all add up is the fun, and maybe maddening, part of this.

What we call Dark Energy is the force that drives the expansion of the Universe. Is Dark Energy growing stronger? Or how about Dark Matter, which comprises most of the mass in the Universe? We know we don’t know much about it. Maybe we know even less than that, and its nature is changing over time.

“We know so little about the dark parts of the universe, it’s important to measure how they push and pull on space over cosmic history,” said Lucas Macri of Texas A&M University in College Station, a key collaborator on the study.

The team is still working with the Hubble to reduce the uncertainty in measurements of the rate of expansion. Instruments like the James Webb Space Telescope and the European Extremely Large Telescope might help to refine the measurement even more, and help address this compelling issue.
