Top Trumps for Science Competition

Something a little different today. A tale of family playtime, a poll, and a competition to win prizes from the RSC and the CentreoftheCell.org.

Card Competition

Okay, here’s the competition bit. To be in with a chance of winning, drop me a line giving me a good reason (in not more than 15 words) for your choice of chemical trumps or cell trumps, and you could win the pack of your choice – trumpcomp-AT-sciencebase.com is the address to use.

I’ll pick the best ideas from the comments and emails and announce the winner in the next few weeks. Judge’s decision will be final and if no entries come up to scratch then I reserve the right to throw my rattle out of the pram.

And, now on with the story…

This term, both my kids are learning about the elements at school. My daughter, who is still in primary school, is learning about the ancient elements – earth, air, fire and water – while my son, who is halfway through high school, returned home with tales of electron shells and the elements of the periodic table.

It was, therefore, quite timely that Royal Society of Chemistry press officer and Satrianialike, Jon Edwards, should send me a pack of Visual Elements Trumps. The cards follow in the classic tradition of the Top Trumps game my friends and I collected and played when we were at school – trains, planes, automobiles and a few more sciencey ones, including dinosaurs, were around at that time. There have been others since, including a spinoff from the defunct BBC TV show and magazine Tomorrow’s World; Star Wars and Harry Potter have also fallen under the trumping spell. And, of course, Pokemon and Digimon cards, which swept through playgrounds a few years ago, are also based on the trump theme, albeit with a few more bizarre properties than top speed and height.

Anyway, the kids and I had a quick round of Elemental trumps. My daughter won, having quickly latched on to the notions of atomic radius and ionisation energy. She was also rather intrigued by the idea of hydrogen gas having a boiling point. We all enjoyed the game, but obviously it’s the educational and promotional value it may have for kids studying science and chemistry that underpins its production by the RSC. I have to admit the writing in the element description bubble is too small for me to see in dim light. With a magnifying glass, however, I can see that they have packed the main elemental essentials onto each card, together with the RSC’s well-known artistic images associated with each.

A related product on the science educational stuff market is Elementeo, which is a hybrid of sword & sorcery game and science, with a Sodium Dragon and Oxygen the Lifegiver. The cards are very tongue in cheek but carry less chemical information, so unless you’re a science fantasy addict, I’d opt for the RSC game. Another variation on the theme is available from the WebElements shop and was developed by the University of Brighton. This version has more facts and also comes with approval (for what that’s worth) from the Top Trumps people.

We then moved on to a game of Cell Trumps, produced by the Centre of the Cell at Queen Mary University of London, which arrived at roughly the same time as the RSC cards. However, a quick fan of the deck reveals them to be slightly simpler, substituting number in the body for 1st ionisation energy, and the number of their scientists working on the particular type of cell for atomic radius. But the kids coped, although my daughter favoured the slightly more esoteric Element Trumps over the cell cards. She was quite taken by the adipocytes, having spotted the connection with the name of the fatty, alien Adipose characters from a recent Doctor Who episode.

Scientific Trumps seemed just right for introducing some scientific concepts in a fun way to kids at the higher end of primary school or even heading towards high school exams. They might even be inspirational to money-free undergraduates lacking beer towards the end of term, who knows? And, if you arrived here looking for science education materials, check out the Learn with Sciencebase page, which has links to various science project resources.

Sex and Social Networking

Ultimately, the only truly safe sex is that practised alone or not practised at all, oh, and perhaps cybersex. However, that said, even these have issues associated with eyesight compromise (allegedly), repetitive strain injury (RSI) and even electrocution in extreme cases of online interaction (you could spill your Mountain Dew on your laptop, after all). And, of course, there are popups, Trojans, packet sniffers, viruses and worms to consider…

No matter how realistic the graphics become in Second Life or how good the 3rd party applications in Facebook, however, unless you indulge in direct human to human contact in the offline world, you are not going to catch a sexually transmitted disease (STD). Real-world social networking is, of course, a very real risk factor for STD transmission, according to a new research report in the International Journal of Functional Informatics and Personalised Medicine. This could be especially so given the concept of six degrees of separation, through which individuals are linked by ever-shorter chains of person-to-person contacts.

According to Courtney Corley and Armin Mikler of the Computational Epidemiology Research Laboratory, at the University of North Texas, computer scientist Diane Cook of Washington State University, in Pullman, and biostatistician Karan Singh of the University of North Texas Health Science Center, in Fort Worth, sexually transmitted diseases and infections are, by definition, transferred among intimate social networks.

They point out that although the way in which various social settings are formed varies considerably between different groups in different places, crucial to the emergence of sexual relationships is obviously a high level of intimacy. They explain that for this reason, modelling the spread of STDs so that medical workers and researchers can better understand, treat and prevent them must be underpinned by social network simulation.

Sexually transmitted diseases and infections are a significant and increasing threat among both developed and developing countries around the world, causing varying degrees of mortality and morbidity in all populations.

Other research has revealed that approximately one in four teens in the United States will contract a sexually transmitted disease (STD) because they fail to use condoms consistently and routinely. The reasons why are well known, it seems: partner disapproval and concerns about reduced sexual pleasure.

As such, professionals within the public health industry must take responsibility for allocating resources properly and effectively, based on predictive models, so that STDs can be tamed. If they are not, Corley and colleagues suggest, preventable and curable STDs will ultimately become endemic within the general population.

The team has now developed the Dynamic Social Network of Intimate Contacts (DynSNIC). This program is a simulator that embodies the dynamic, evolving social networks of intimate contacts through which STDs are transmitted. They suggest that health professionals will be able to use DynSNIC to develop public health policies and strategies for limiting the spread of STDs, through educational and awareness campaigns.
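
The paper’s own DynSNIC code isn’t reproduced here, but the basic idea it builds on – an infection percolating through a network of intimate contacts – can be sketched in a few lines of Python. The following is purely my own toy illustration, not the authors’ simulator: the population size, contact probability and transmission probability are invented numbers, and the contact network here is static, whereas DynSNIC’s whole point is that the network evolves over time.

    # Toy susceptible-infected model on a random contact network.
    # Illustrative only; not the DynSNIC simulator described above.
    import random

    random.seed(1)

    N = 200                 # hypothetical population size
    CONTACT_PROB = 0.02     # chance that any two people share an intimate contact
    TRANSMISSION = 0.1      # per-step chance of passing infection along a contact
    STEPS = 50

    # Build a static random contact network; the real simulator evolves its
    # network over time, which this sketch deliberately ignores.
    contacts = {person: set() for person in range(N)}
    for i in range(N):
        for j in range(i + 1, N):
            if random.random() < CONTACT_PROB:
                contacts[i].add(j)
                contacts[j].add(i)

    infected = {0}  # seed a single infection
    for _ in range(STEPS):
        newly_infected = set()
        for person in infected:
            for partner in contacts[person]:
                if partner not in infected and random.random() < TRANSMISSION:
                    newly_infected.add(partner)
        infected |= newly_infected

    print(f"Infected after {STEPS} steps: {len(infected)} of {N}")

Even this crude version shows why network structure matters: reduce the contact probability and the infection stalls in isolated clusters rather than sweeping through the whole population.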

As a footnote to this research, it occurred to me that researchers must spend an awful lot of time contriving acronyms and abbreviations for their research projects. Take Atlas, one of the experimental setups at the Large Hadron Collider at CERN in Geneva, Switzerland. Atlas stands for “A Toroidal LHC ApparatuS”. So they used an abbreviation within their acronym, as well as a noise word – “A” – and the last letter of one of the terms. Ludicrous.

But, Atlas is not nearly as silly as the DynSNIC acronym used in Corley’s paper, I’m afraid. Dynamic Social Network of Intimate Contacts, indeed! I thought the whole idea of abbreviating a long research project title was to make it easier to remember and say out loud. DynSNIC is hardly memorable (is it a y or an i, snic or snick or sink, or what?). Students will forever struggle with such contrivances. They could’ve just as easily used something like Sexually Transmitted Infections Contact Social Intimate Networks – STICSIN. This would be a double-edged sword that would appeal both to the religious right and to the scabrous-minded, depending on where you put the break (after Contact or after Social).

Courtney D. Corley, Armin R. Mikler, Diane J. Cook, Karan P. Singh (2008). Dynamic intimate contact social networks and epidemic interventions. International Journal of Functional Informatics and Personalised Medicine, 1(2), 171-188.

What on earth and off earth is dark energy?

TL;DR – A reprint of a feature article of mine on Dark Energy that was published in StarDate magazine in July 2007.


Type Ia Supernova (Credit: NASA/Swift/S. Immler)

Forget the Large Hadron Collider (LHC), with its alleged ability to create earth-sucking microscopic black holes and its forthcoming efforts to simulate conditions a trillionth of a second after the Big Bang 100 metres beneath the Swiss countryside. There is a far bigger puzzle facing science that the LHC cannot answer: what is the mysterious energy that seems to be accelerating ancient supernovae at the farthest reaches of the universe?

In the late 1990s, the universe changed. The sums suddenly did not add up. Observations of the remnants of stars that exploded billions of years ago, Type Ia supernovae, showed that not only are they getting further away as the universe expands but they are moving faster and faster. It is as if a mysterious invisible force works against gravity and pervades the cosmos, accelerating the expansion of the universe. This force has become known as dark energy, and although it apparently fills the universe, scientists have absolutely no idea what it is or where it comes from. Several big research teams around the globe are working with astronomical technology that could help them find an answer.

Until type Ia supernovae appeared on the cosmological scene, scientists thought that the expansion of the universe following the Big Bang was slowing down. Type Ia supernovae are very distant objects, which means their light has taken billions of years to reach us. But their brightness can be measured to such a high degree of accuracy that they provide astronomers with a standard beacon with which the vast emptiness of space can be illuminated, figuratively speaking.
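
To make the standard-beacon idea concrete, here is a back-of-the-envelope sketch of my own (not from the original article): if every Type Ia supernova peaks at roughly the same absolute magnitude, assumed here to be about -19.3, then measuring how bright one appears gives its distance directly through the distance modulus, m - M = 5 log10(d / 10 parsecs).

    # Illustrative standard-candle calculation; the peak magnitude and the
    # example apparent magnitude are assumed typical values, not article data.
    M_PEAK = -19.3   # assumed peak absolute magnitude of a Type Ia supernova

    def luminosity_distance_parsecs(apparent_mag, absolute_mag=M_PEAK):
        """Distance in parsecs implied by the distance modulus m - M = 5*log10(d/10 pc)."""
        return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

    # A supernova seen at apparent magnitude 22 would then lie at roughly:
    d_pc = luminosity_distance_parsecs(22.0)
    print(f"{d_pc:.2e} parsecs, or about {d_pc * 3.26 / 1e9:.1f} billion light years")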

The supernovae data, obtained by the High-Z SN Search team and the Supernova Cosmology Project, rooted in Lawrence Berkeley National Laboratory, suggested that not only is the universe expanding, but that this expansion is accelerating. On the basis of the Type Ia supernovae, the rate of acceleration of expansion suggests that dark energy comprises around 73% of the total energy of the universe, with dark matter representing 24% of the energy and all the planets, stars, galaxies, black holes, etc., containing a mere 4%.

HETDEX, TEX STYLE

Professor Karl Gebhardt and Senior Research Scientists Dr Gary Hill and Dr Phillip McQueen and their colleagues running the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), based at the McDonald Observatory in Texas, are among the pioneers hoping to reveal the source and nature of dark energy. Those ancient supernovae are at a “look-back time” of 9 billion years, just two-thirds the universe’s age. HETDEX will look back much further, to 10–12 billion years.

HETDEX will not be looking for dark energy itself but for its effects on how matter is distributed. “In the very early Universe, matter was spread out in peaks and troughs, like ripples on a pond; galaxies that later formed inherited that pattern,” Gebhardt explains. A detailed 3D map of the galaxies should reveal the pattern. “HETDEX uses the characteristic pattern of ripples as a fixed ruler that expands with the universe,” explains Senior Research Scientist Gary Hill. Measuring the distribution of galaxies uses this ruler to map out the positions of the galaxies, but doing so needs a lot of telescope time and a powerful new instrument. “Essentially we are just making a very big map [across some 15 billion cubic light years] of where the galaxies are and then analyzing that map to reveal the characteristic patterns,” Hill adds.
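
As a rough illustration of my own of how a “fixed ruler” yields distances (the 150 megaparsec ripple scale and the example distance below are assumed round numbers, not HETDEX figures): because the true size of the ripple pattern is known, the angle it appears to span on the sky reveals how far away the surveyed galaxies are, and hence how much the universe has expanded since their light set out.

    # Illustrative standard-ruler calculation with assumed round numbers.
    import math

    RIPPLE_SCALE_MPC = 150.0   # assumed characteristic ripple (ruler) size, comoving

    def apparent_angle_degrees(comoving_distance_mpc):
        """Angle the ruler subtends on the sky at a given comoving distance."""
        return math.degrees(RIPPLE_SCALE_MPC / comoving_distance_mpc)

    # If a galaxy survey places the ruler at an assumed comoving distance of
    # 1,500 megaparsecs, the ripple pattern shows up at separations of roughly:
    print(f"{apparent_angle_degrees(1500.0):.1f} degrees on the sky")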

“We’ve designed an upgrade that allows the HET to observe 30 times more sky at a time than it is currently able to do,” he says. HETDEX will produce much clearer images and work much better than previous instruments, says McQueen. Such a large field of view needs instruments that can analyze the light from those distant galaxies very precisely. There will be 145 such instruments, known as spectrographs, which will simultaneously gather the light from tens of thousands of fibers. “When light from a galaxy falls on one of the fibers its position and distance are measured very accurately,” adds Hill.

The team has dubbed the suite of spectrographs VIRUS. “It is a very powerful and efficient instrument for this work,” adds Hill, “but is simplified by making many copies of the simple spectrograph. This replication greatly reduces costs and risk as well.”

McQueen adds that, after designing VIRUS, the team built a prototype of one of the 145 unit spectrographs. VIRUS-P is now operational on the Observatory’s Harlan J. Smith 2.7 m telescope, he told us: “We’re delighted with its performance, and it’s given us real confidence in this part of our experiment.”

VIRUS will make observations of 10,000 galaxies every night. So, after just 100 nights VIRUS will have mapped a million galaxies. “We need a powerful telescope to undertake the DEX survey as quickly as possible,” adds McQueen. Such a map will constrain the expansion of the universe very precisely. “Since dark energy only manifests itself in the expansion of the universe, HETDEX will measure the effect of dark energy to within one percent,” Gebhardt says. The map will allow the team to determine whether the presence of dark energy across the universe has had a constant effect or whether dark energy itself evolves over time.

“If dark energy’s contribution to the expansion of the universe has changed over time, we expect HETDEX to see the change [in its observations],” adds Gebhardt, “Such a result will have profound implications for the nature of dark energy, since it will be something significantly different than what Einstein proposed.”

SLOAN RANGER

Scientific scrutiny of the original results has been so intense that most cosmologists are convinced dark energy exists. “There was a big change in our understanding around 2003-2004 as a triangle of evidence emerged,” says Bob Nichol of the University of Portsmouth, England, who is working on several projects investigating dark energy.

SDSS M51

First, the microwave background, the so-called afterglow of creation, showed that the geometry of the universe has a mathematically “flat” structure. Second, the data from the Type Ia supernovae measurements show that the expansion is accelerating. Third, results from the Anglo-Australian 2dF redshift survey and then the Sloan Digital Sky Survey (SDSS) showed that on the large scale, the universe is lumpy, with huge clusters of galaxies spread throughout.

The SDSS carried out the biggest galaxy survey to date and confirmed gravity’s role in building up structure in the expanding universe by looking at the ripples of the Big Bang across the cosmic ocean. “We are now seeing the corresponding cosmic ripples in the SDSS galaxy maps,” Daniel Eisenstein of the University of Arizona has said. “Seeing the same ripples in the early universe and the relatively nearby galaxies is smoking-gun evidence that the distribution of galaxies today grew via gravity.”

But why did an initially smooth universe become our lumpy cosmos of galaxies and galaxy clusters? An explanation of how this lumpiness arose might not only help explain the evolution of the early universe, but could shed new light on its continued evolution and its ultimate fate. The SDSS project will provide new insights into the nature of dark energy’s materialistic counterpart, dark matter.

As with dark energy, dark matter is a mystery. Scientists believe it exists because without it the theories that explain our observations of how galaxies behave would not stack up. Dark matter is so important to these calculations that a mass roughly five times the sum of all the ordinary matter in the universe has to be added to the equations to make them work. While dark energy could explain the accelerating expansion of our universe, the existence of dark matter could provide an explanation for how the lumpiness arose.

“In the early universe, the interaction between gravity and pressure caused a region of space with more ordinary matter than average to oscillate, sending out waves very much like the ripples in a pond when you throw in a pebble,” Nichol, who is part of the SDSS team, explains. “These ripples in matter grew for a million years until the universe cooled enough to freeze them in place. What we now see in the SDSS galaxy data is the imprint of these ripples billions of years later.”

Colleague Idit Zehavi, now at Case Western Reserve University, adds a different tone. Gravity’s signature could be likened to the resonance of a bell, she suggests: “The last ring gets forever quieter and deeper in tone as the universe expands. It is now so faint as to be detectable only by the most sensitive surveys. The SDSS has measured the tone of this last ring very accurately.”

“Comparing the measured value with that predicted by theory allows us to determine how fast the Universe is expanding,” explains Zehavi. This, as we have seen, depends on the amount of both dark matter and dark energy.

The triangle of evidence – microwave background, type Ia supernovae, and galactic large-scale structure – leads to only one possible conclusion: that there is not enough ordinary matter in the universe to make it behave in the way we observe and there is not enough normal energy to make it accelerate as it does. “The observations have forced us, unwillingly, into a corner,” says Nichol, “dark energy has to exist, but we do not yet know what it is.”

The next phase of SDSS research will be carried out by an international collaboration and will sharpen the triangle still further, along with the HETDEX results. “HETDEX adds greatly to the triangle of evidence for dark energy,” adds Hill, “because it measures large-scale structure at much greater look-back times, between local measurements and the much older cosmic microwave background.” As the results emerge, scientists might face the possibility that dark energy has changed over time, or the data may present evidence that requires modifications to the theory of gravity instead.

Wiggle-Z

The Anglo-Australian team is also undertaking its own cosmic ripple experiment, Wiggle-Z. “This program is measuring the size of ripples in the Universe when the Universe was about 7 billion years old,” Brian Schmidt at Australian National University says. Schmidt was leader of the High-Z supernovae team that found the first evidence of the acceleration. SDSS and 2dF covered 1-2 billion years ago and HETDEX will measure ripples at 10 billion years. “Together they provide the best possible measure of what the Universe has been doing over the past several billion years,” Schmidt muses.

INTERNATIONAL SURVEY

The Dark Energy Survey, another international collaboration, will make any photographer green with envy, but thankful they don’t have to carry its camera with them. The Fermilab team plans to build an extremely sensitive 500-megapixel camera, with a 1 meter diameter and a 2.2 degree field of view, that can grab those millions of pixels within seconds.

The camera itself will be mounted in a cage at the prime focus of the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, a southern hemisphere telescope owned and operated by the National Optical Astronomy Observatory (NOAO). This instrument, while being available to the wider astronomical community, will provide the team with the necessary power to conduct a large scale sky survey.

Over five years, DES will use almost a third of the available telescope time to carry out its wide survey. The team hopes to achieve exceptional precision in measuring the properties of dark energy using counts of galaxy clusters, supernovae, large-scale galaxy clustering, and measurements of how light from distant objects is bent by the gravity of closer objects between it and the earth. By probing dark energy using four different methods, the Dark Energy Survey will also double check for errors, according to team member Joshua Frieman.

WFMOS

Subaru 51

According to Nichol, “The discovery of dark energy is very exciting because it has rocked the whole of science to its foundations.” Nichol is part of the WFMOS (wide field multi-object spectrograph) team hoping to build an array of spectrographs for the Subaru telescope. These spectrographs will make observations of millions of galaxies across an enormous volume of space, at distances equivalent to almost two-thirds the age of the universe. “Our results will sit between the very accurate HETDEX measurements and the next generation SDSS results coming in the next five years,” he explains. “All the techniques are complementary to one another, and will ultimately help us understand dark energy.”

DESTINY’S CHILD

If earth-based studies have begun to reveal the secrets of dark energy, then three projects vying for attention could take the experiments off-planet for a slightly closer look. The projects all hope to look at supernovae and the large-scale spread of matter. Combining techniques in this way should be less error prone than any single approach and so provide more definitive results.

SNAP, SuperNova/Acceleration Probe, is led by Saul Perlmutter of Lawrence Berkeley National Laboratory in Berkeley, California, one of the original supernova explorers. SNAP will observe light from thousands of Type Ia supernovae in the visible and infra-red regions of the spectrum as well as look at how that light is distorted by massive objects in between the supernovae and the earth.

Adept, Advanced Dark Energy Physics Telescope, is led by Charles Bennett of Johns Hopkins University in Baltimore, Maryland. This mission will also look at near-infrared light from 100 million galaxies and a thousand Type Ia supernovae. It will look for those cosmic ripples and so map out the positions of millions of galaxies. This information will allow scientists to track how the universe has changed over billions of years and the role played by dark energy.

Destiny, Dark Energy Space Telescope, led by Tod Lauer of the National Optical Astronomy Observatory, based in Tucson, Arizona, will detect and observe more than 3000 supernovae over a two-year mission and then survey a vast region of space looking at the lumpiness of the universe.

LIGHTS OUT ON DARK ENERGY

So, what is dark energy? “At this point it is pure speculation,” answers Hill. “The observations are currently too poor, so we are focusing on making the most accurate measurements possible.” Many scientists are rather embarrassed but equally excited by the thought that we understand only a tiny fraction of the universe. Understanding dark matter and dark energy is one of the most exciting quests in science. “Right now, we have no idea where it will lead,” adds Hill.

Supernovae (NASA collage)

“Despite some lingering doubts, it looks like we are stuck with the accelerating universe,” says Schmidt. “The observations from supernovae, large-scale structure, and the cosmic microwave background look watertight,” he says. He too concedes that science is left guessing. The simplest solution is that dark energy was formed along with the universe. The heretical solution would mean modifying Einstein’s theory of General Relativity, which has so far been a perfect predictor of nature. “Theories abound,” Schmidt adds, “whatever the solution, it is exciting, but a very, very hard problem to solve.”

This David Bradley special feature article originally appeared on Sciencebase last summer, having been published in print in StarDate magazine in July 2007.

Night at the Web Museum

Years ago, when BioMedNet’s HMSBeagle was still sailing the high seas, I wrote a feature for the Adapt or Die careers column on scientific jobs in museums. The feature, which is available on Sciencebase, is still relatively valid, but one big aspect of museums that has changed significantly since the Beagle was abandoned in dry dock is that museums the world over have virtualised themselves.

It has reached the point now that a museum without a web presence, and moreover without a beautifully designed, powerful, comprehensive, informative and interactive web presence, is no museum at all. I’m sure that curators the world over reading this will by now be gnashing their teeth, and if any of them have no teeth to grit, teeth will be provided.

Anyway, that’s not to say that a purely offline museum would not be attractive to passing trade. However, Thomas Fotakis and Anastasios Economides of the University of Macedonia, in Thessaloniki, Greece, a country of multiple museums to say the least, suggest that museums cannot afford to remain offline. In a rapidly evolving and changing society, where the internet has become an almost ubiquitous tool and entertainment device, museums – whether art, history, or science and technology – have to get online if they are to stay current. Moreover, they suggest that the creation of a virtual meta museum is the next logical step.

“Museums should collaborate in order to develop a unified huge multimedia database that will contain all of their artefacts (either exhibited or in repository), information about art, culture, science, history, artists, exhibits, related events,” the researchers say.

Virtual and offline visitors would be able to search such a meta museum to find things of interest, participate in discussions with other visitors, and from the research point of view, unearth relationships between disparate artefacts in different parts of the world that may not be apparent to those merely browsing dusty shelves and cabinets in the offline world.

Through their websites, museums would create an environment in which visitors are not only able to explore the exhibited objects and art works, but also encounter new and original experiences and ideas, giving them a more meaningful and important experience.

But before that happens there are various concerns that have been raised about the web presence of almost all museums. For instance, many museum sites have been built without firm foundations and with no particular aim other than to create a website. Further, many have never been assessed as to how well they actually meet their visitors’ needs and, with no clear aim for the site, this is perhaps not surprising. Most worrying, though, is that much of the material on museum websites simply duplicates in digital form the materials found in the bricks-and-mortar museums, rather than rethinking and developing the possibilities offered by the web. There are exceptions, of course.

Now, Fotakis and Economides have reviewed more than 200 museums with an online presence, setting a range of criteria to reveal successes and to uncover problems. Their system, MuseumQual, provides a way to evaluate museum websites based on quality, quantity and, most of all, user experience. “It is not enough to present a lot of information on the website,” they affirm, “It is also important that a visitor easily explores and utilises it.”

They tested 70 art, 70 science/technology, and 70 history museums using their framework, looking at layout, multimedia, interactivity, feedback, and technical aspects such as whether or not a site force-fed the user browser cookies, had any “under construction” pages or privacy and security issues, and whether navigation was good.
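
The paper’s actual criteria and weightings aren’t reproduced in this post, so the snippet below is no more than a hypothetical sketch of how such a multi-criteria evaluation might be rolled up into a single mark out of five; the dimensions, weights and ratings are all invented for illustration.

    # Hypothetical sketch of a MuseumQual-style weighted score; the
    # dimensions, weights and example ratings are invented, not the paper's.
    WEIGHTS = {
        "layout": 0.20,
        "multimedia": 0.20,
        "interactivity": 0.20,
        "feedback": 0.15,
        "technical": 0.25,
    }

    def museum_score(ratings):
        """Weighted average of per-dimension ratings on a 0-5 scale."""
        return sum(WEIGHTS[dimension] * ratings[dimension] for dimension in WEIGHTS)

    example_site = {
        "layout": 4.5,
        "multimedia": 4.0,
        "interactivity": 3.5,
        "feedback": 3.0,
        "technical": 4.8,
    }
    print(f"Overall score: {museum_score(example_site):.2f}/5.00")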

“Our evaluation showed that museums present websites that stay at a satisfying, yet not exceptional, level,” the researchers say, “Science museums’ sites lead the list, with art museums’ sites following closely and history museums’ sites coming in next. Almost all sites outperformed with respect to technical characteristics. However, many sites present inefficiencies regarding Interactivity and Feedback.”

They also emphasise that virtual museums have a role to play in providing access to people with special needs. Unfortunately, only a few of the 210 sites tested provided access to culture, art and knowledge equally; most failed to account for people with alternative accessibility requirements. One aspect of accessibility that could benefit anyone is the creation of virtual tours in the form of audio podcasts, video files or newsfeeds.

Nevertheless, having tested sites around the world, they specifically highlight the Miami Science Museum (www.miamisci.org), Fort Lauderdale Museum of Discovery and Science (www.mods.org), and Science Museum of Minnesota (www.smm.org) as offering a huge amount of practical and valuable information. Most scitech sites were easy to navigate and aesthetically pleasing, with Scitech Science Museum (www.scitech.org.au), Pusat Sains Negara Science Museum (www.psn.gov.my/en/) and Science Museum in London (www.sciencemuseum.org.uk) being particularly strong on multimedia. The best sites for interactivity and e-services were the Science Museum of Minnesota (www.smm.org) and Infoage Science Center (www.infoage.org).

The overall winners on all aspects of the MuseumQual analysis were:

  1. Museum of Fine Arts of Boston (www.mfa.org) (4.55/5.00)
  2. Centre Pompidou (www.centrepompidou.fr) (4.53/5.00)
  3. The Science Museum in London (www.sciencemuseum.org.uk) (4.48/5.00)

Some museums have been on the web for a decade or more but most, even the old hands, are failing to exploit the full potential of the modern multimedia internet. No one wants a museum that exists only in the past; it’s time to bring them together into the twenty-first century to enrich our lives, whether in science, art or history. By doing so, the virtual meta museum suggested by the University of Macedonia team will bring the past to life, in the most un-cliched sense.

I asked Economides about the possibility of using the test system for other types of website, science blogs for instance, as it is open for anyone to use. “With some modifications (e.g. regarding e-services) MuseumQual could be used to also test other types of site,” he says. “However, other dimensions (e.g. usability, technical) are almost common for any type of site.” He points out that the tool is not automatic. “It is a quality framework that includes many criteria,” he explains. “A user would use it to evaluate a website and consider the proposed criteria.”

Thomas Fotakis, Anastasios A. Economides (2008). Art, science/technology and history museums on the web. International Journal of Digital Culture and Electronic Tourism, 1(1). DOI: 10.1504/IJDCET.2008.020134

Science Blogging 2008

First off, just to say thanks to everyone who made sciblog2008 possible; I’m already looking forward to its successor. It was fun to put faces to the names of many of my fellow science bloggers and others out there who were at the Ri on Saturday. Quite amazing how so many look as young as their avatars! The conference, the breakouts and the unconference were fun and informative, albeit a certain keynote speaker was wont to use rather too many expletives (is that how conferences are these days?)

Anyway, speaking of how conferences are “these days” it was interesting to see just how much of the interaction at the conference went on online – through liveblogging – even between delegates sitting in the same room. Funny to see someone type and another raise an eyebrow in response. It was a bit like passing secret notes around the classroom, except the whole science blogosphere beyond the hallowed halls of the Royal Institution [why cerise seatcovers, why?] was reading those notes.

In particular, several bloggers were commentating on happenings via the sciblog08 FriendFeed room. Me, I didn’t even have my mobile phone switched on the whole day (it was Saturday, after all). I’ve no intention of duplicating the efforts of those Feedfriends nor of the various Nature bloggers and staff who have reported, blogged, and podcast the event and will soon be vidcasting it. Great logo, by the way, Euan: been there, done that, got the teeshirt.

However, I do want to raise the possibility of a mashup that occurred to me during the unconference thread on tracking conversations through the blogosphere. Cameron Neylon mentioned researchblogging.org (previously known as BPR3.org), and their DOI-citation capture system. Egon Willighagen and others mentioned Chemical Blogspace (Cb) and Adie’s Postgenomic (Pg) together with the Zemanta plugin and Mozilla Ubiquity. Willighagen has already written a post-conference script to act as a handler for DOIs, I see.

But, it was the unconference discussion that got me thinking that a science-specialised version of Zemanta and/or Ubiquity could monitor your latest blog post and, on the basis of the names and keywords it sees as you type, suggest likely literature references. It would be a straightforward matter to display the titles of all relevant papers, and as you blog you could add a star to the main paper about which you’re writing and tick any others that might be worth citing in the post.

A blog plugin could then fetch the researchblogging.org formatted reference and paste it into the foot of the blog post automatically, perhaps flagging your post in Connotea too once you hit “publish”. This way you would not have to remember to visit researchblogging.org, nor have to locate the DOI for pasting into their citation wizard; the science-based Zemanta tool would do the job for you.
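
For what it’s worth, here is a rough sketch of the keyword-to-reference lookup step such a tool would need. It is emphatically not Zemanta, Ubiquity or researchblogging.org code; it simply queries the public CrossRef REST API (a service that post-dates this post), so treat the endpoint and response fields as assumptions to verify rather than a working recipe.

    # Sketch only: look up candidate papers for a draft post's keywords via
    # the CrossRef REST API; endpoint and fields are assumptions to check.
    import requests

    def suggest_references(keywords, rows=5):
        """Return (title, DOI) candidates for a space-separated keyword string."""
        response = requests.get(
            "https://api.crossref.org/works",
            params={"query": keywords, "rows": rows},
            timeout=10,
        )
        response.raise_for_status()
        items = response.json()["message"]["items"]
        return [((item.get("title") or ["(untitled)"])[0], item.get("DOI"))
                for item in items]

    if __name__ == "__main__":
        for title, doi in suggest_references("dark energy galaxy survey"):
            print(f"{doi}  {title}")

The missing pieces, of course, are the bit that watches the post as you type and the bit that writes the chosen citation back into the blog editor, which is where a browser extension in the Zemanta or Ubiquity mould would come in.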

Invisible Fishnets and Baby Boomer Pain

It’s that time of the month again, so here’s the latest round-up from my column over on SpectroscopyNOW, covering a whole range of science and medical news with a spectral twist, from magnetic resonance to Raman by way of fishnets and infra-red.

Fishnet invisibility cloak – It is what fans of science fiction and technologists have been waiting for since HG Wells’ Invisible Man first came into view – or not, as the case may be. Scientists at the University of California, Berkeley, have engineered three-dimensional metamaterials that can reverse the natural direction of visible and near-infrared light, which could one day lead to an invisibility device.

Bending MRI to diagnose joint disease – Osteoarthritis has turned out to be the bane of the Baby Boom generation, causing joint pain and disability for millions of people, more than half of those over the age of 65 in fact. Unfortunately, current approaches to diagnosing the disease cannot provide definitive results until the disease is in the advanced stages. This is often when symptoms have become severe and irreversible joint damage may already have occurred. Magnetic resonance imaging could provide an early diagnosis of osteoarthritis (OA), the most common form of arthritis, according to scientists speaking at the recent ACS meeting.

Universal detector – A team in Japan has used UV spectroscopy and microscopy to study the interaction between liposome clusters and endocrine-disrupting chemicals (EDCs) as a model of how living cell plasma membranes might be affected. The work could lead to the development of a universal detector for EDCs.

Dance of the xenons – An NMR study of xenon atoms has demonstrated a fundamental new property – what appears to be chaotic behaviour in a quantum system – in the magnetic spin of these frozen atoms. The work could lead to improvements in our understanding of matter as well as in magnetic resonance imaging.

Handling chirality with X-rays – X-rays are rather useful in determining the structure of materials and biomolecules, but are relatively insensitive to chirality. Now, a team of scientists in Japan has shown that circularly polarized X-rays at an appropriate wavelength can distinguish ‘left’ from ‘right’ in alpha-quartz. The work could have implications for studies of other inorganic and organometallic materials, including industrial catalysts, liquid crystals, biomolecules, and pharmaceutical products.

Hybrid technology – Surface-Enhanced Raman Scattering (SERS) was first used in 1977 and has since proven itself as an extremely sensitive analytical technique requiring only small volumes of sample and with wide application. Researchers have suggested that it is so sensitive that it could be used as a new tool in single molecule detection to augment or even displace techniques such as laser-induced fluorescence, frequency-modulated optical absorption at low temperature, and electrochemical detection of redox-active species. SERS of silicon nanostructures coated with a gold-silver substrate can be used to detect DNA hybridisation for taxonomic, biomedical and medical diagnostics purposes, according to a new study by researchers in Singapore.

Oh, and speaking of fishnets…anyone been thinking about the modelling career of John McCain’s running mate Sarah Palin?

Zen and the Art of Global Maintenance

A discussion a while back, over a few beers, with a Buddhist friend about life, the universe, and everything (what else?) got around to the subject of null physics and the notion that the universe may always have existed and may exist for eternity to come.

Sciencebase regulars will know that this concept is covered in a rather bizarre book I mentioned a few posts back entitled How to Discover Our Universe. While there is certainly room for improvement in current cosmological models, this notion of a universe that has always existed is not to everyone’s taste, at least in terms of conventional Western ideals. Indeed, it positively reeks of pseudoscience in the eyes of many of us raised on the conventional cyclic observation-explanation-prediction rote of modern science.

Anyway, it was almost inevitable that a paper with a Zen, or should I say Daoist, inclination would land in my inbox. And so, completing the circle, in drops a paper from philosopher of science Anthony Alexander. Alexander is currently Director for Studies and Research at a structural engineering, conservation and urban design consultancy that is apparently pioneering sustainability in the built environment. But, that is not the focus of his paper.

He notes that the physics of the 18th century Western world was fundamental in establishing the basic concepts for the study of economics and our understanding of the fledgling Industrial Revolution. However, industry and physics have moved on, not least as a product of the almost exponentially increasing pace of technological change. It is perhaps because of this seeming progress, and our need to consider a passage through time, that many people cannot contemplate a universe without a beginning.

But, before you run to the hills or roll into a potential energy well, this post is not about to go all mystical and misty eyed. There are no implications or allusions to an Ayurvedic notion of quantum mechanics. There is no incense burner on my desk. And while there might be a yoga teacher working on my accounts as I type, there is certainly no ambient crystal and phoenix rising yoga therapy session planned for this evening in a padded room with all-natural oxygen bubbling through gently illuminated vials of dihydrogen monoxide.

Anyway, back to Alexander’s thesis… He suggests that 18th century physics has been “comprehensively displaced by progress within Western science”. The new, larger field of understanding encompasses the complex, the chaotic, the unpredictable and the fluid aspects of the real world. Unfortunately, the institutions of the modern world, the industries, the money movers, the pen pushers, remain firmly entrenched in a clockwork Newtonian world view, whereas science is now all about non-linear systems, sub-atomic probabilities, and the duality of energy and matter. This staid view considers the world to be stable and ordered, and human activity to be somehow fundamentally distinct from nature.

While environmentalism and green economics have the grand aim of redressing the balance, it is actually globalisation, according to Alexander, that has raised our awareness of other cultures and their disparate world views, and that could provide us with the means to reconcile the Newtonian industries with modern physics and systems theory.

Alexander turns to one of his leanings – the martial arts – for inspiration as to how this might happen. The martial arts, kung fu, karate, judo, and their Daoist counterparts, invert the logic of Western combat. Training in the kicks, punches and locks of these various martial arts is aimed not at causing pain or injuring one’s training partner but at providing health benefits to both. A Western perspective might see an arm lock as a route to pain, whereas a practitioner of a particular martial art will see it as a way to build muscular stretch, for instance. Alexander sees parallels between this inverted logic of the martial arts and not only the concepts of modern physics but also green economics.

The status quo of 20th century Western economics [which persists even now] can be challenged by green economics, [which] does not seek harm to anyone or anyone’s interests. It seeks to promote harmony and longevity – values that are at the heart of common sense, sustainable development and [martial arts] culture, which all parties stand to benefit from.

There really is no mysticism here; we are plunging head-first into global environmental crises. Physics underwent a paradigm shift to shake off Newton’s clockwork universe; perhaps, as Alexander suggests, we should work through his analogy and see green economics as the new paradigm for industry across the globe.

Alexander, A. (2008). Different paths, same mountain: Daoism, ecology and the new paradigm of science. International Journal of Green Economics, 2(2), 153. DOI: 10.1504/IJGE.2008.019997

Intelligent Molecular Design

First up in The Alchemist this week is a tale of reactions where size really does matter! News of why “non-smokers cough” emerges from the American Chemical Society meeting this month, and a new physical process has been revealed by NMR spectroscopy of frozen xenon atoms that could provide a chaotic link from quantum mechanics back to Newton’s era. Biotech news hints at a novel way to flavour your food, and Japanese chemists have made a gel that undulates like intestinal muscle. Finally, this week’s award goes to my good friend AP de Silva of Queen’s University Belfast for his highly intelligent work in the development of market-leading sensor technology and intelligent molecules.

You can grab the complete headlines and abstracts in the latest issue of The Alchemist on ChemWeb.

Overton Overturned

A century-old rule used throughout the pharma industry may have been overturned by new research in the UK. Researchers at the University of Warwick have demonstrated that drug transport rates across cell membranes may be hundreds of times slower than predicted by Overton’s Rule, which could have serious implications for developing and testing new drugs.

Put simply, Overton’s Rule says that the more lipophilic a compound is, the faster it will enter a cell. The Rule was first outlined in the 1890s by Ernst Overton of the University of Zürich. He quantified the rule to allow biochemists and others to predict how fast the membrane crossing would take place. One of the key parameters in his equation is K, lipophilicity. Bigger K, faster transport.
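
In its usual solubility-diffusion form, the relationship behind Overton’s Rule can be written as P = K × D / d, where P is the membrane permeability, K the membrane/water partition coefficient (the lipophilicity), D the diffusion coefficient within the membrane and d the membrane thickness. The snippet below is my own illustration with assumed round numbers, not data from the Warwick study; it simply shows the “bigger K, faster transport” trend the rule predicts.

    # Illustrative solubility-diffusion calculation; all numbers are assumed.
    def permeability(partition_coefficient, diffusion_cm2_per_s, thickness_cm):
        """Predicted membrane permeability (cm/s): P = K * D / d."""
        return partition_coefficient * diffusion_cm2_per_s / thickness_cm

    MEMBRANE_THICKNESS_CM = 5e-7   # roughly a 5 nm lipid bilayer (assumed)
    DIFFUSION_CM2_S = 1e-6         # assumed diffusion coefficient within the membrane

    # Tenfold increases in K give tenfold increases in predicted permeability,
    # the very trend the Warwick measurements contradict for longer-chain acids.
    for K in (0.001, 0.01, 0.1):
        P = permeability(K, DIFFUSION_CM2_S, MEMBRANE_THICKNESS_CM)
        print(f"K = {K:<5}  ->  P = {P:.1e} cm/s")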

For over a century, medicinal chemists have used this relationship to shape their studies and clinical trials.

Now, a confocal microscopy study used in conjunction with an ultramicroelectrode, in work led by Patrick Unwin, has allowed the team to follow every step of the membrane-crossing process. The results are stunning. While the test compounds (acids) did diffuse across a lipid membrane, they did so at rates that were diametrically opposed to those predicted by Overton’s Rule. The researchers studied four acids (acetic, butanoic, valeric, and hexanoic) with increasingly longer acyl (carbon) chains. The longer the carbon chain, the more easily the chemical dissolves in lipids and, therefore, according to Overton, the faster it should diffuse across a lipid membrane.

The Warwick work showed instead that the most lipophilic molecules were actually transported the slowest.

REFERENCE: Proc Natl Acad Sci: Quantitative visualization of passive transport across bilayer lipid membranes

Boris Johnson, Fop or Geneticist?

For Scousers, Londoners, fans of the BBC’s Have I Got News for You satirical news quiz, and especially everyone who watched the Beijing to London Olympic handover this week, the name Boris Johnson likely conjures up an image of some blonde, floppy-haired, bedraggled and totally confused Tory toff, who just happens to be Mayor of London.

Well, it turns out that he has quite an interesting ancestry of which he was almost totally unaware until another BBC TV show (Who Do You Think You Are?, which is all about the family history and genealogy of the rich and famous) helped him dig deep into the roots of his family tree. First off, not only was his great grandfather, Ali Kemal, an outspoken journalist turned politician (like Johnson) who was apparently lynched by the state in the founding years of modern Turkey, but his great-great-great-great-great-great-great-great grandfather was King George II of England (illegitimately, due to a wrong-side-of-the-sheets liaison between Johnson’s great grandmother and a descendant of George II). Such ancestry means Johnson is related to all the royal families of Europe.

How’s that for a bit of name dropping? Of course, there are probably tens of thousands of people who have illegitimate links to European royals, but it’s an interesting find nevertheless.

However, for those who think Johnson is nothing more than a blithering fop, it was his final words in this episode of “Who Do You Think You Are?” that were most pertinent to lineage, heredity and, most of all, genetics, which is why I thought they warranted a holiday mention. I just hope they were spontaneous and unscripted.

We’re all just great, our genes just pulse down the lines. We’re not the ultimate expression of our genes. We’re the temporary custodians of these things. We don’t really know where they’ve come from, where they’re going, and the whole process is incredibly democratic.

You can view a segment from the show here; unfortunately, the closing quote is not included in this YouTube segment.

UPDATE: Following on from Mr Johnson’s genetic insights, I see there’s a paper in this week’s Nature from Cornell researchers suggesting that one day soon you may be able to pinpoint the geographic origins of your ancestors based on analysis of your DNA. The researchers describe the use of DNA to predict the geographic origins of individuals from a sample of Europeans, often to within a few hundred kilometres of where they were born.

Novembre, J. et al. (2008). Genes mirror geography within Europe. Nature.