What on earth and off earth is dark energy?

TL;DR – A reprint of a feature article of mine on Dark Energy that was published in StarDate magazine in July 2007.


Type Ia Supernova (Credit: NASA/Swift/S. Immler)

Forget the Large Hadron Collider (LHC), with its alleged ability to create earth-sucking microscopic black holes and its forthcoming efforts to simulate conditions a trillionth of a second after the Big Bang, 100 metres beneath the Swiss countryside. There is a far bigger puzzle facing science that the LHC cannot answer: what is the mysterious energy that seems to be accelerating ancient supernovae away from us at the farthest reaches of the universe?

In the late 1990s, the universe changed. The sums suddenly did not add up. Observations of the remnants of stars that exploded billions of years ago, Type Ia supernovae, showed that not only are they getting further away as the universe expands, but they are receding faster and faster. It is as if a mysterious invisible force that works against gravity pervades the cosmos, accelerating the expansion of the universe. This force has become known as dark energy, and although it apparently fills the universe, scientists have absolutely no idea what it is or where it comes from. Several big research teams around the globe are working with astronomical technology that could help them find an answer.

Until Type Ia supernovae appeared on the cosmological scene, scientists thought that the expansion of the universe following the Big Bang was slowing down. Type Ia supernovae are very distant objects, which means their light has taken billions of years to reach us. But their intrinsic brightness can be determined to such a high degree of accuracy that they provide astronomers with a standard beacon with which the vast emptiness of space can be illuminated, figuratively speaking.

The supernovae data, obtained by the High-Z Supernova Search Team and the Supernova Cosmology Project, the latter rooted in Lawrence Berkeley National Laboratory, suggested that not only is the universe expanding, but that this expansion is accelerating. On the basis of the Type Ia supernovae, the rate of acceleration of expansion suggests that dark energy comprises around 73% of the total energy of the universe, with dark matter representing around 23% and all the planets, stars, galaxies, black holes and the rest containing a mere 4%.
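Cosmologists do this bookkeeping with density parameters, fractions of the critical density of a flat universe. In that standard notation (a convention I am adding here, not something from the original article), the figures above read:

```latex
% Present-day energy budget of a flat universe, as density parameters
\Omega_{\Lambda} \approx 0.73, \quad
\Omega_{\mathrm{DM}} \approx 0.23, \quad
\Omega_{\mathrm{b}} \approx 0.04, \qquad
\Omega_{\Lambda} + \Omega_{\mathrm{DM}} + \Omega_{\mathrm{b}} \approx 1
```

The fractions sum to one precisely because the universe's geometry is flat, the microwave-background result discussed later in this piece.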

HETDEX, TEX STYLE

Professor Karl Gebhardt and Senior Research Scientists Dr Gary Hill and Dr Phillip McQueen and their colleagues running the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), based at the McDonald Observatory in Texas, are among the pioneers hoping to reveal the source and nature of dark energy. Those ancient supernovae sit at a “look-back time” of 9 billion years, two-thirds of the universe’s age. HETDEX will look back much further, to 10-12 billion years.

HETDEX Dome

HETDEX will not be looking for dark energy itself but for its effects on how matter is distributed. “In the very early Universe, matter was spread out in peaks and troughs, like ripples on a pond; galaxies that later formed inherited that pattern,” Gebhardt explains. A detailed 3D map of the galaxies should reveal the pattern. “HETDEX uses the characteristic pattern of ripples as a fixed ruler that expands with the universe,” explains Hill. Measuring the distribution of galaxies uses this ruler to map out the positions of the galaxies, but this needs a lot of telescope time and a powerful new instrument. “Essentially we are just making a very big map [across some 15 billion cubic light years] of where the galaxies are and then analyzing that map to reveal the characteristic patterns,” Hill adds.
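The “fixed ruler” logic fits in a single relation. The ripples have a known physical size (the sound horizon, roughly 500 million light years today), so the angle they subtend on the sky gives the distance to any slice of galaxies, and stacking those distances across cosmic time traces the expansion history that dark energy modifies. In standard notation (mine, not HETDEX’s own formalism):

```latex
% Standard-ruler relation: a feature of known comoving size r_s
% subtends an angle \theta at angular-diameter distance d_A(z)
\theta(z) \approx \frac{r_s}{d_A(z)}
```

Measure the angle for galaxies at many redshifts z and you have mapped d_A(z), and with it the influence of dark energy.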

“We’ve designed an upgrade that allows the HET to observe 30 times more sky at a time than it is currently able to do,” he says. HETDEX will produce much clearer images and work much better than previous instruments, says McQueen. Such a large field of view needs technology that can analyze the light from those distant galaxies very precisely: 145 detectors, known as spectrographs, will simultaneously gather the light from tens of thousands of optical fibers. “When light from a galaxy falls on one of the fibers its position and distance are measured very accurately,” adds Hill.

The team has dubbed the suite of spectrographs VIRUS, the Visible Integral-field Replicable Unit Spectrograph. “It is a very powerful and efficient instrument for this work,” adds Hill, “but is simplified by making many copies of the simple spectrograph. This replication greatly reduces costs and risk as well.”

After designing VIRUS, McQueen adds, the team built a prototype of one of the 145 unit spectrographs. VIRUS-P is now operational on the observatory’s Harlan J. Smith 2.7-meter telescope. “We’re delighted with its performance, and it’s given us real confidence in this part of our experiment,” he told us.

VIRUS will make observations of 10,000 galaxies every night. So, after just 100 nights VIRUS will have mapped a million galaxies. “We need a powerful telescope to undertake the DEX survey as quickly as possible,” adds McQueen. Such a map will constrain the expansion of the universe very precisely. “Since dark energy only manifests itself in the expansion of the universe, HETDEX will measure the effect of dark energy to within one percent,” Gebhardt says. The map will allow the team to determine whether the presence of dark energy across the universe has had a constant effect or whether dark energy itself evolves over time.

“If dark energy’s contribution to the expansion of the universe has changed over time, we expect HETDEX to see the change [in its observations],” adds Gebhardt, “Such a result will have profound implications for the nature of dark energy, since it will be something significantly different than what Einstein proposed.”

SLOAN RANGER

Scientific scrutiny of the original results has been so intense that most cosmologists are convinced dark energy exists. “There was a big change in our understanding around 2003-2004 as a triangle of evidence emerged,” says Bob Nichol of the University of Portsmouth, England, who is working on several projects investigating dark energy.

SDSS M51

First, the microwave background, the so-called afterglow of creation, showed that the geometry of the universe has a mathematically “flat” structure. Second, the data from the Type Ia supernovae measurements show that the expansion is accelerating. Third, results from the Anglo-Australian 2dF redshift survey and then the Sloan Digital Sky Survey (SDSS) showed that on the large scale, the universe is lumpy, with huge clusters of galaxies spread across the cosmos.

The SDSS carried out the biggest galaxy survey to date and confirmed gravity’s role in growing the structures in the universe by looking at the ripples of the Big Bang across the cosmic ocean. “We are now seeing the corresponding cosmic ripples in the SDSS galaxy maps,” Daniel Eisenstein of the University of Arizona has said. “Seeing the same ripples in the early universe and the relatively nearby galaxies is smoking-gun evidence that the distribution of galaxies today grew via gravity.”

But why did an initially smooth universe become our lumpy cosmos of galaxies and galaxy clusters? An explanation of how this lumpiness arose might not only help explain the evolution of the early universe, but could shed new light on its continued evolution and its ultimate fate. The SDSS will also provide new insights into the nature of dark energy’s materialistic counterpart, dark matter.

As with dark energy, dark matter is a mystery. Scientists believe it exists because without it the theories that explain our observations of how galaxies behave would not stack up. Dark matter is so important to these calculations that a mass roughly five times bigger than the sum of all the ordinary matter has to be added to the equations to make them work. While dark energy could explain the accelerating expansion of our universe, the existence of dark matter could provide an explanation for how the lumpiness arose.

“In the early universe, the interaction between gravity and pressure caused a region of space with more ordinary matter than average to oscillate, sending out waves very much like the ripples in a pond when you throw in a pebble,” Nichol, who is part of the SDSS team, explains. “These ripples in matter grew for a million years until the universe cooled enough to freeze them in place. What we now see in the SDSS galaxy data is the imprint of these ripples billions of years later.”

Colleague Idit Zehavi, now at Case Western Reserve University, strikes a different tone. Gravity’s signature could be likened to the resonance of a bell, she suggests: “The last ring gets forever quieter and deeper in tone as the universe expands. It is now so faint as to be detectable only by the most sensitive surveys. The SDSS has measured the tone of this last ring very accurately.”

“Comparing the measured value with that predicted by theory allows us to determine how fast the Universe is expanding,” explains Zehavi. This, as we have seen, depends on the amount of both dark matter and dark energy.

The triangle of evidence – microwave background, Type Ia supernovae, and galactic large-scale structure – leads to only one possible conclusion: there is not enough ordinary matter in the universe to make it behave in the way we observe, and there is not enough normal energy to make it accelerate as it does. “The observations have forced us, unwillingly, into a corner,” says Nichol, “dark energy has to exist, but we do not yet know what it is.”

The next phase of SDSS research will be carried out by an international collaboration and, together with the HETDEX results, will sharpen the triangle still further. “HETDEX adds greatly to the triangle of evidence for dark energy,” adds Hill, “because it measures large-scale structure at much greater look-back times, between local measurements and the much older cosmic microwave background.” As the results emerge, scientists might face the possibility that dark energy has changed over time, or the data may present evidence that requires modifications to the theory of gravity instead.

WIGGLEZ

The Anglo-Australian team is also undertaking its own cosmic ripple experiment, WiggleZ. “This program is measuring the size of ripples in the Universe when the Universe was about 7 billion years old,” says Brian Schmidt of the Australian National University. Schmidt was leader of the High-Z Supernova Search Team that found the first evidence of the acceleration. SDSS and 2dF covered look-back times of 1-2 billion years, and HETDEX will measure ripples at 10 billion years. “Together they provide the best possible measure of what the Universe has been doing over the past several [billion] years,” Schmidt muses.

INTERNATIONAL SURVEY

The Dark Energy Survey (DES), another international collaboration, will make any photographer green with envy, but thankful they don’t have to carry it with them. The Fermilab team plans to build an extremely sensitive 500-megapixel camera, with a 1-meter diameter and a 2.2-degree field of view, that can grab those millions of pixels within seconds.

The camera itself will be mounted in a cage at the prime focus of the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, a southern hemisphere telescope owned and operated by the National Optical Astronomy Observatory (NOAO). This instrument, while being available to the wider astronomical community, will provide the team with the necessary power to conduct a large scale sky survey.

Over five years, DES will use almost a third of the available telescope time to carry out its wide survey. The team hopes to achieve exceptional precision in measuring the properties of dark energy using counts of galaxy clusters, supernovae, large-scale galaxy clustering, and measurements of how light from distant objects is bent by the gravity of closer objects between it and the earth. By probing dark energy using four different methods, the Dark Energy Survey will also double check for errors, according to team member Joshua Frieman.

WFMOS

Subaru 51

According to Nichol, “The discovery of dark energy is very exciting because it has rocked the whole of science to its foundations.” Nichol is part of the WFMOS (wide-field multi-object spectrograph) team hoping to build an array of spectrographs for the Subaru telescope. These spectrographs will make observations of millions of galaxies across an enormous volume of space, at distances equivalent to a look-back time of almost two-thirds the age of the universe. “Our results will sit between the very accurate HETDEX measurements and the next generation SDSS results coming in the next five years,” he explains. “All the techniques are complementary to one another, and will ultimately help us understand dark energy.”

DESTINY’S CHILD

If earth-based studies have begun to reveal the secrets of dark energy, then three projects vying for attention could take the experiments off-planet for a slightly closer look. All three hope to observe both supernovae and the large-scale spread of matter; combining the two techniques makes the missions less error-prone than any single approach and so should provide more definitive results.

SNAP, SuperNova/Acceleration Probe, is led by Saul Perlmutter of Lawrence Berkeley National Laboratory in Berkeley, California, one of the original supernova explorers. SNAP will observe light from thousands of Type Ia supernovae in the visible and infra-red regions of the spectrum as well as look at how that light is distorted by massive objects in between the supernovae and the earth.

Adept, Advanced Dark Energy Physics Telescope, is led by Charles Bennett of Johns Hopkins University in Baltimore, Maryland. This mission will also look at near-infrared light from 100 million galaxies and a thousand Type Ia supernovae. It will look for those cosmic ripples and so map out the positions of millions of galaxies. This information will allow scientists to track how the universe has changed over billions of years and the role played by dark energy.

Destiny, Dark Energy Space Telescope, led by Tod Lauer of the National Optical Astronomy Observatory, based in Tucson, Arizona, will detect and observe more than 3000 supernovae over a two-year mission and then survey a vast region of space looking at the lumpiness of the universe.

LIGHTS OUT ON DARK ENERGY

So, what is dark energy? “At this point it is pure speculation,” answers Hill. “The observations are currently too poor, so we are focusing on making the most accurate measurements possible.” Many scientists are rather embarrassed but equally excited by the thought that we understand only a tiny fraction of the universe. Understanding dark matter and dark energy is one of the most exciting quests in science. “Right now, we have no idea where it will lead,” adds Hill.

Supernovae (NASA collage)

“Despite some lingering doubts, it looks like we are stuck with the accelerating universe,” says Schmidt. “The observations from supernovae, large-scale structure, and the cosmic microwave background look watertight,” he says. He too concedes that science is left guessing. The simplest solution is that dark energy was formed along with the universe. The heretical solution would mean modifying Einstein’s theory of General Relativity, which has so far been a perfect predictor of nature. “Theories abound,” Schmidt adds, “whatever the solution, it is exciting, but a very, very hard problem to solve.”

This David Bradley special feature article originally appeared on Sciencebase last summer, having been published in print in StarDate magazine in July 2007.

Night at the Web Museum

Years ago, when BioMedNet’s HMSBeagle was still sailing the high seas, I wrote a feature for the Adapt or Die careers column on scientific jobs in museums. The feature, which is available on Sciencebase, is still relatively valid, but one big aspect of museums has changed significantly since the Beagle was abandoned in dry dock: museums the world over have virtualised themselves.

It has reached the point now that a museum without a web presence – moreover, without a beautifully designed, powerful, comprehensive, informative and interactive web presence – is no museum at all. I’m sure that curators the world over reading this will by now be gnashing their teeth, and if any of them have no teeth to grit, teeth will be provided.

Anyway, that’s not to say that a purely offline museum would not be attractive to passing trade. However, Thomas Fotakis and Anastasios Economides of the University of Macedonia in Thessaloniki, Greece – a country of multiple museums, to say the least – suggest that museums cannot afford to remain offline. In a rapidly evolving and changing society where the internet has become an almost ubiquitous tool and entertainment device, museums – whether art, history, or science and technology – have to get online if they are to stay current. Moreover, they suggest that the creation of a virtual meta museum is the next logical step.

“Museums should collaborate in order to develop a unified huge multimedia database that will contain all of their artefacts (either exhibited or in repository), information about art, culture, science, history, artists, exhibits, related events,” the researchers say.

Virtual and offline visitors would be able to search such a meta museum to find things of interest, participate in discussions with other visitors, and from the research point of view, unearth relationships between disparate artefacts in different parts of the world that may not be apparent to those merely browsing dusty shelves and cabinets in the offline world.

Through their websites, museums would create an environment in which visitors are not only able to explore the exhibited objects and art works, but also to encounter new and original experiences and ideas. Visitors can therefore have a more meaningful and rewarding visit.

But before that happens, various concerns must be addressed about the web presence of almost all museums. For instance, many museum sites have been built without firm foundations and with no particular aim other than to create a website. Further, many have never been assessed as to how well they actually meet their visitors’ needs, and with no clear aim for the site, this is perhaps not surprising. Most worrying, though, is that much of the material on museum websites simply duplicates in digital form the materials found in the bricks-and-mortar museums, rather than rethinking and developing the possibilities offered by the web. There are exceptions, of course.

Now, Fotakis and Economides have reviewed more than 200 museums with an online presence, setting a range of criteria to reveal successes and to uncover problems. Their system, MuseumQual, provides a way to evaluate museum websites based on quality, quantity and, most of all, user experience. “It is not enough to present a lot of information on the website,” they affirm. “It is also important that a visitor easily explores and utilises it.”

They tested the sites of 70 art, 70 science/technology, and 70 history museums using their framework, looking at layout, multimedia, interactivity, feedback, and technical aspects such as whether or not a site force-fed the user browser cookies, had any “under construction” pages, privacy or security issues, and good navigation.
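To make the scoring concrete, here is a minimal sketch of how a MuseumQual-style weighted scorecard might combine those dimensions. The dimension names follow the article, but the weights, the 0-5 scale, and the example ratings are hypothetical illustrations, not the authors’ actual rubric.

```python
# Minimal sketch of a MuseumQual-style weighted scorecard.
# Dimension names follow the article; weights and ratings are hypothetical.

# Hypothetical weights for each evaluation dimension (sum to 1.0).
WEIGHTS = {
    "layout": 0.20,
    "multimedia": 0.20,
    "interactivity": 0.25,
    "feedback": 0.15,
    "technical": 0.20,  # cookies, "under construction" pages, privacy, navigation
}

def museumqual_score(ratings):
    """Combine per-dimension ratings (each 0-5) into one weighted score out of 5."""
    return sum(WEIGHTS[dimension] * ratings[dimension] for dimension in WEIGHTS)

# Illustrative ratings only, not data from the study.
example_site = {"layout": 4.8, "multimedia": 4.5, "interactivity": 4.6,
                "feedback": 4.2, "technical": 4.6}
print(f"Example museum site: {museumqual_score(example_site):.2f}/5.00")
```

In practice the researchers score each criterion by hand; a weighted sum like this simply shows how per-dimension marks could roll up into the out-of-five figures reported below.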

“Our evaluation showed that museums present websites that stay at a satisfying, yet not exceptional, level,” the researchers say, “Science museums’ sites lead the list, with art museums’ sites following closely and history museums’ sites coming in next. Almost all sites outperformed with respect to technical characteristics. However, many sites present inefficiencies regarding Interactivity and Feedback.”

They also emphasise that virtual museums have a role to play in providing access for people with special needs. Unfortunately, only a few of the 210 sites tested provided access to culture, art and knowledge equally, the rest having failed to account for people with alternative accessibility requirements. One aspect of accessibility that could benefit anyone is the creation of virtual tours in the form of audio podcasts, video files, or newsfeeds.

Nevertheless, having tested sites around the world, they specifically highlight the Miami Science Museum (www.miamisci.org), Fort Lauderdale Museum of Discovery and Science (www.mods.org), and Science Museum of Minnesota (www.smm.org) as offering a huge amount of practical and valuable information. Most scitech sites were easy to navigate and aesthetically pleasing, with Scitech Science Museum (www.scitech.org.au), Pusat Sains Negara Science Museum (www.psn.gov.my/en/) and Science Museum in London (www.sciencemuseum.org.uk) being particularly strong on multimedia. The best sites for interactivity and e-services were the Science Museum of Minnesota (www.smm.org) and Infoage Science Center (www.infoage.org).

The overall winners on all aspects of the MuseumQual analysis were:

  1. Museum of Fine Arts of Boston (www.mfa.org) (4.55/5.00)
  2. Centre Pompidou (www.centrepompidou.fr) (4.53/5.00)
  3. The Science Museum in London (www.sciencemuseum.org.uk) (4.48/5.00)

Some museums have been on the web for a decade or more but most, even the old hands, are failing to exploit the full potential of the modern multimedia internet. No one wants a museum that exists only in the past; it’s time to bring them together into the twenty-first century to enrich our lives, whether in science, art or history. By doing so, the virtual meta museum suggested by the University of Macedonia team will bring the past to life, in the most un-clichéd sense.

I asked Economides about the possibility of using the test system for other types of website – science blogs, for instance – as it is open for anyone to use. “With some modifications (e.g. regarding e-services) MuseumQual could be used to also test other types of site,” he says. “However, other dimensions (e.g. usability, technical) are almost common for any type of site.” He points out that the tool is not automatic. “It is a quality framework that includes many criteria,” he explains. “A user would use it to evaluate a website and consider the proposed criteria.”

Thomas Fotakis and Anastasios A. Economides (2008). “Art, science/technology and history museums on the web”, International Journal of Digital Culture and Electronic Tourism, 1(1). DOI: 10.1504/IJDCET.2008.020134

Science Blogging 2008

First off, just to say thanks to everyone who made sciblog2008 possible; I’m already looking forward to its successor. It was fun to put faces to the names of many of my fellow science bloggers and others out there who were at the Ri on Saturday. Quite amazing how so many look as young as their avatars! The conference, the breakouts and the unconference were fun and informative, albeit a certain keynote speaker was wont to use rather too many expletives (is that how conferences are these days?).

Anyway, speaking of how conferences are “these days” it was interesting to see just how much of the interaction at the conference went on online – through liveblogging – even between delegates sitting in the same room. Funny to see someone type and another raise an eyebrow in response. It was a bit like passing secret notes around the classroom, except the whole science blogosphere beyond the hallowed halls of the Royal Institution [why cerise seatcovers, why?] was reading those notes.

In particular, several bloggers were commentating on happenings via the sciblog08 FriendFeed room. Me, I didn’t even have my mobile phone switched on the whole day (it was Saturday, after all). I’ve no intention of duplicating the efforts of those Feedfriends nor of the various Nature bloggers and staff who have reported, blogged, and podcast the event and will soon be vidcasting it. Great logo, by the way, Euan – been there, done that, got the teeshirt.

However, I do want to raise the possibility of a mashup that occurred to me during the unconference thread on tracking conversations through the blogosphere. Cameron Neylon mentioned researchblogging.org (previously known as BPR3.org), and their DOI-citation capture system. Egon Willighagen and others mentioned Chemical Blogspace (Cb) and Adie’s Postgenomic (Pg) together with the Zemanta plugin and Mozilla Ubiquity. Willighagen has already written a post-conference script to act as a handler for DOIs, I see.

But it was the unconference discussion that got me thinking that a science-specialised version of Zemanta and/or Ubiquity could monitor your latest blog post and, on the basis of the names and keywords it sees as you type, suggest likely literature references. It would be a straightforward matter to display the titles of all relevant papers, and as you blog you could add a star to the main paper about which you’re writing and tick any others that might be worth citing in the post.

A blog plugin could then fetch the researchblogging.org formatted reference and paste it into the foot of the blog post automatically, perhaps flagging your post in Connotea too once you hit “publish”. This way you would not have to remember to visit researchblogging.org, nor have to locate the DOI for pasting into their citation wizard; the science-savvy Zemanta-style tool would do the job for you. Something like the sketch below.
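Purely to illustrate the plumbing such a plugin would need, here is a minimal sketch in Python. It assumes CrossRef’s public REST API (api.crossref.org) as the literature-search backend – my assumption, not part of Zemanta, Ubiquity, or researchblogging.org – and the citation-footer format is entirely hypothetical.

```python
# Minimal sketch: suggest DOIs for a draft blog post via CrossRef's
# public REST API (an assumed backend, not part of the tools above).
import requests

def suggest_references(draft_text, max_hits=5):
    """Query CrossRef for papers matching the draft's names and keywords."""
    response = requests.get(
        "https://api.crossref.org/works",
        params={"query": draft_text, "rows": max_hits},
        timeout=10,
    )
    response.raise_for_status()
    items = response.json()["message"]["items"]
    # Return (title, DOI) pairs for the blogger to star or tick.
    return [((item.get("title") or ["untitled"])[0], item["DOI"])
            for item in items]

def format_citation(title, doi):
    """Build a citation footer; the format here is purely hypothetical."""
    return f"{title}. DOI: {doi}"

if __name__ == "__main__":
    draft = "chaotic behaviour in the magnetic spin of frozen xenon atoms"
    for title, doi in suggest_references(draft):
        print(format_citation(title, doi))
```

A real plugin would run something like this against the post body on each save and render the hits as a starred/ticked checklist, rather than printing them to a console.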

Invisible Fishnets and Baby Boomer Pain

It’s that time of the month again, so here’s the latest round-up from my column over on SpectroscopyNOW, covering a whole range of science and medical news with a spectral twist, from magnetic resonance to Raman by way of fishnets and infra-red.

Fishnet invisibility cloak – It is what fans of science fiction and technologists have been waiting for since H.G. Wells’ Invisible Man first came into view – or not, as the case may be. Scientists at the University of California, Berkeley, have engineered three-dimensional metamaterials that can reverse the natural direction of visible and near-infrared light, which could one day lead to an invisibility device.

Bending MRI to diagnose joint disease – Osteoarthritis has turned out to be the bane of the Baby Boom generation, causing joint pain and disability for millions of people, more than half of those over the age of 65 in fact. Unfortunately, current approaches to diagnosing the disease cannot provide definitive results until the disease is in the advanced stages. This is often when symptoms have become severe and irreversible joint damage may already have occurred. Magnetic resonance imaging could provide an early diagnosis of osteoarthritis (OA), the most common form of arthritis, according to scientists speaking at the recent ACS meeting.

Universal detector – A team in Japan has used UV spectroscopy and microscopy to study the interaction between liposome clusters and endocrine-disrupting chemicals (EDCs) as a model of how living cell plasma membranes might be affected. The work could lead to the development of a universal detector for EDCs.

Dance of the xenons – An NMR study of xenon atoms has demonstrated a fundamental new property – what appears to be chaotic behaviour in a quantum system – in the magnetic spin of these frozen atoms. The work could lead to improvements in our understanding of matter as well as in magnetic resonance imaging.

Handling chirality with X-rays – X-rays are rather useful in determining the structure of materials and biomolecules, but are relatively insensitive to chirality. Now, a team of scientists in Japan has shown that circularly polarized X-rays at an appropriate wavelength can distinguish ‘left’ from ‘right’ in alpha-quartz. The work could have implications for studies of other inorganic and organometallic materials, including industrial catalysts, liquid crystals, biomolecules, and pharmaceutical products.

Hybrid technology – Surface-Enhanced Raman Scattering (SERS) was first used in 1977 and has since proven itself an extremely sensitive analytical technique, requiring only small volumes of sample and with wide application. Researchers have suggested that it is so sensitive that it could be used as a new tool in single-molecule detection to augment or even displace techniques such as laser-induced fluorescence, frequency-modulated optical absorption at low temperature, and electrochemical detection of redox-active species. SERS of silicon nanostructures coated with a gold-silver substrate can be used to detect DNA hybridisation for taxonomic, biomedical and medical diagnostics purposes, according to a new study by researchers in Singapore.

Oh, and speaking of fishnets…anyone been thinking about the modelling career of John McCain’s running mate Sarah Palin?