Finding Experts

One of the main tasks in my day-to-day work as a science writer is tracking down experts. The web makes this much easier than it ever was for journalists in decades past. There are times when a contact in a highly specialist area does not surface quickly, but there are also times when I know for a fact that I’ve already been in touch with an expert in a particular area but for whatever reason cannot bring their name to mind. Google Desktop Search, with its ability to trawl my Thunderbird email archives for any given keyword, is a boon in finally “remembering” the contact.

However, finding just a handful of contacts from web searches, email archives and the good old-fashioned address book pales into insignificance when compared to the kind of industrial data mining that companies and organisations require of their “knowledge workers”.

According to Sharman Lichtenstein of the School of Information Systems at Deakin University, in Burwood, Australia, and Sara Tedmori and Thomas Jackson of Loughborough University, Leicestershire, UK: “In today’s highly competitive globalised business environment, knowledge workers frequently lack sufficient expertise to perform their work effectively.” The same concern might be applied to those working in any organisation handling vast amounts of data. “Corporate trends such as regular restructures, retirement of the baby boomer generation and high employee mobility have contributed to the displacement and obfuscation of internal expertise,” the researchers explain.

The team explains how knowledge is increasingly distributed across firms, so that when staff lack particular expertise they often seek out an internal expert to fill the gap. Indeed, previous studies have shown that employees prefer to ask other people for advice rather than searching documents or databases. Finding an expert quickly can boost company performance, and as such locating experts has become part of the formal Knowledge Management strategy of many organisations.

Such strategies do not necessarily help knowledge workers themselves, however, who may lack the search expertise and time required to find the right person for the job. So, Jackson developed an initial expertise locator system, later further developed with Tedmori, to address this issue in an automated way. The researchers discuss an automated key-phrase search system that can identify experts from the archives of the organisation’s email system.

Immediately on hearing such an intention, the civil liberties radar pings! There are sociological and ethical issues associated with such easy access and searchability of an email system, surely? More than that, an expert system for finding experts could become wide open to misuse – finding the wrong expert – and abuse – employees and employers unearthing the peculiar personal interests of colleagues for instance.

The first generation of systems designed to find experts used helpdesks as the formal sources of knowledge, and consisted simply of knowledge directories and expert databases. Microsoft’s SPUD project, Hewlett-Packard’s CONNEX KM system, and the SAGE expert finder are key examples of the genre, the researchers point out. Such systems are akin to the Yellow Pages, essentially electronic directories of experts that must be maintained on a continual basis. They allow anyone with access to tap into expertise, but unless the experts keep their profiles up to date, they quickly lose relevancy and accuracy.

Overall, when large numbers of employees are registered and profiles are inaccurate, such systems rapidly lose credibility and are increasingly ignored by knowledge seekers.

Second generation expertise locators were based on organisations offering their staff a personal web space within which they could advertise their expertise internally or externally. This was convenient for those searching, but again relied on the experts in question keeping their web pages up to date. Moreover, simple keyword matching would not necessarily find the best expert, because the search results depended on how well the expert had set up their web pages and whether, and how well, they had included keywords in those pages. In addition, keyword searching can produce lots of hits that must then be scanned manually, which takes time.

The third generation of expert searching relies on secondary sources, such as tracking the browsing patterns and activities of employees to identify individual experts. Such an approach raises massive privacy concerns, even for companies with a strict web access policy. Activity on forums, bulletin boards, and social networks falls into this third generation approach.

The fourth generation approach mashes the first three and perhaps adds natural language searching again with various efficiency and privacy concerns. Again, it does not necessarily find the best expert, but often just the person whose data, profile, and web pages are optimised (deliberately or by chance) to reach the top slot in the search results.

An approach based on key-phrase identification in e-mail messages could, however, address all requirements but throws up a new wave of privacy concerns, which Lichtenstein and colleagues discuss.

There are several features of email that make it popular and valuable for organisational knowledge work, and relevant to finding an expert:

  • It attracts worker attention
  • It is integrated with everyday work
  • It provides a context for sense-making about ideas, projects and other types of business knowledge
  • It enables the referencing of work objects (such as digital documents), and provides a history via quoted messages
  • It has high levels of personalised messages which are appealing, meaningful and easily understood
  • It encourages commitment and accountability by automatically documenting exchanges
  • It can be archived, so providing valuable individual, collective and organisational memories that may be mined
  • It facilitates the resolution of multiple conflicting perspectives which can stimulate an idea for a new or improved process, product or service.

All these factors mean that email could become a very useful tool for finding experts. Already many people use their personal email archives to seek out knowledge and experts, but widen that to the organisational level and the possibilities become enormous.

The researchers have developed an Email Knowledge Extraction (EKE) system that uses the Natural Language ToolKit (NLTK) to build a key-phrase extraction “engine”. The system is applied in two stages, the first of which “teaches” the system how to tag the parts of speech in an email, so that headers and other extraneous information become non-searched “stop words” within the email repository. The second stage extracts key-phrases from the searchable sections of an email once it is sent. This extraction process is transparent to the sender and takes just milliseconds per email. A final stage involves the sender being asked to rank each identified key-phrase to indicate their level of expertise in that area. A database of experts and their areas of expertise gradually develops from this approach. Later, employees searching for experts can simply consult this database.
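The paper does not give implementation detail beyond those two NLTK stages, but the gist of key-phrase extraction can be sketched in a few lines of Python. This toy version is my own illustration, not the EKE code: it skips part-of-speech tagging entirely and simply filters stop words, then ranks recurring adjacent word pairs by frequency.

```python
import re
from collections import Counter

# Words too common to signal expertise. The real EKE stop-list is far
# larger and also screens out email headers, signatures and quoted text.
STOP_WORDS = {
    "the", "a", "an", "and", "or", "of", "to", "in", "for", "on", "is",
    "are", "we", "you", "i", "it", "this", "that", "with", "be", "as",
    "but",
}

def extract_key_phrases(email_body, top_n=3):
    """Return the top_n candidate expertise phrases from an email body.

    A crude stand-in for EKE's two-stage NLTK pipeline: lowercase the
    text, drop stop words, then rank the remaining adjacent word pairs
    (bigrams) by how often they recur.
    """
    words = re.findall(r"[a-z']+", email_body.lower())
    content = [w for w in words if w not in STOP_WORDS]
    bigrams = Counter(zip(content, content[1:]))
    return [" ".join(pair) for pair, _ in bigrams.most_common(top_n)]

body = (
    "The mass spectrometry results look good. Mass spectrometry "
    "calibration needs rechecking, but the protein folding data are fine."
)
print(extract_key_phrases(body, top_n=2))  # "mass spectrometry" ranks first
```

A sender of emails like this would then be asked to rate their expertise in “mass spectrometry”, and that self-ranking, not the raw frequency, is what populates the expert database.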

The EKE system has been trialled at Loughborough University and at AstraZeneca, where it was found to capture employees’ knowledge of their own expertise and to allow knowledge workers to correctly identify suitable experts for specific requirements. The researchers, however, highlight the social and ethical issues that arise with the use of such a system:

  • Employee justice and rights, and how these might conflict with employer rights
  • Privacy and monitoring, as there is more than a small element of “Big Brother” inherent in such a system
  • Motivational issues for sharing knowledge, as not all those with expertise may wish to be data mined in this way, having enough work of their own to fill their 9-to-5, for instance
  • Relationships, as not everyone will be able to work well together regardless of expertise
  • Ethical implications of expert or non-expert classification, as the system could ultimately flag as experts those employees with little or no expertise
  • Deliberate misclassification of experts, as all systems are open to abuse and malpractice
  • Expert database disclosure, as such a comprehensive database, if accessed illicitly by an organisation’s rivals, could wreak havoc in terms of stealing competitive advantage, headhunting or other related activities.

Lichtenstein, S., Tedmori, S., Jackson, T. (2008). Socio-ethical issues for expertise location from electronic mail. International Journal of Knowledge and Learning, 4(1), 58. DOI: 10.1504/IJKL.2008.019737

Sexy Worms

Some people would pay anything for a quick-fix pill for their sex lives or to slow the inevitable ageing process. Now, US scientists have found a new class of small molecule in the molecular biologist’s favourite nematode worm, Caenorhabditis elegans, a blend of which apparently not only attracts mates but also slows the development of larvae for months.

The soil-dwelling nematode is used as a model organism for lots of human diseases and in aging research because despite the apparent differences between ourselves and the nematode, we share much of our underlying biology with the worm.

Writing in Nature, Frank Schroeder, of Cornell University, Jagan Srinivasan, of California Institute of Technology, and colleagues describe the new compounds, ascarosides, and reveal data that show how they extend lifespan in C. elegans as well as acting as sex pheromones for the wriggly critter. The work essentially ties together at the molecular level two superficially disparate life processes – sex and death.

So, are we likely to see a human version of the ascarosides for attracting a sex partner and warding off old age? The short answer is no. Although we share some of the molecular biology of this nematode, there are a few too many differences to make such a pill even remotely possible… at any price.

Srinivasan, J., Kaplan, F., Ajredini, R., Zachariah, C., Alborn, H.T., Teal, P.E., Malik, R.U., Edison, A.S., Sternberg, P.W., Schroeder, F.C. (2008). A blend of small molecules regulates both mating and development in Caenorhabditis elegans. Nature DOI: 10.1038/nature07168

Therapeutic Alchemist

Over on ChemWeb and wearing my Alchemist hat, I hear of a discovery that could lead to a new therapeutic target for a whole range of diseases in which the inflammatory response is involved, almost a medical panacea. An out-of-this-world approach to liquid telescopes could overcome the big obstacle to making such a device useful for astronomy. The protein spike on the surface of the Ebola virus has been laid bare by X-ray crystallography and could lead to new treatments for slowing outbreaks. Birds of prey could be the new environmental “canary” when it comes to toxic heavy metals, according to Spanish researchers. Japanese researchers have taken individual rotaxanes for a spin and obtained some dynamic snapshots. Finally, the Michael J Fox Foundation has announced its annual round of funding for Parkinson’s therapeutic lead research.

Get the full skinny and the links here

Mass v Gas and the Biomass Buzz

There are two main schools of thought when it comes to oil supply. There are those who believe that oil supplies are strictly limited, that we have passed the peak, and that we will run out of oil with which to power our vehicles within 40 to 60 years. Then there are those who believe supplies could last much longer than current predictions suggest, either because there are reserves that are simply too expensive to extract at today’s oil prices but that will ultimately be tapped, or because new sources will be found as the pressure rises. There is actually a third school of thought: those who believe oil is not a fossil fuel at all, but a continuously renewed material that will never run out, but that’s a different story.

During the 20th century and now into the 2000s, petroleum has predominated in fuelling transport, hinging on the enormous growth of chemical engineering and chemical technology since the beginning of the industrial revolution. Currently, fuels from crude oil fulfil 96 to 98% of the worldwide energy demand for cars, ships and planes.

Whichever school makes the grade in the end shouldn’t really matter, as chemical engineers Maria Sudiro and Alberto Bertucco of the University of Padova, Italy, and many others have pointed out:

The currently known reserves of methane and of coal exceed those of crude oil by factors of about 1.5 and 25, respectively.

They reckon any analysis of fuel supply and the potential for using biomass as a so-called renewable source is not quite as clearcut as one might imagine, and that the picture for producing synthetic liquid fuels by routes alternative to conventional petrochemical means is mixed. The industrial processes of Gas To Liquid (GTL), Coal To Liquid (CTL), and Biomass To Liquid (BTL) use natural gas, coal, and biomass as feedstocks, respectively, all with varying efficiencies and resulting energy densities.

Now, writing in the International Journal of Alternative Propulsion (2008, 2, 13-25), the researchers have modelled each process on a weight basis per unit of feedstock (natural gas, coal and biomass/wood). For hypothetical plants running at a production rate of 100 tonnes per hour, they found that yields are about 70%, almost 33% and just under 17%, respectively. Moreover, the carbon dioxide emitted per unit mass of liquid fuel is relatively low at 0.9 kg for GTL, almost 5 kg for CTL, and over 6 kg for BTL processes. So, on the face of it, it would seem that gas-to-liquid fuel beats coal and biomass hands down in terms of efficiency and carbon footprint.
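To put those percentages in context, a back-of-the-envelope calculation (my own, using the rounded figures above rather than anything from the paper itself) shows what each hypothetical 100 tonne-per-hour plant would actually deliver:

```python
# Rough numbers for the hypothetical plants: mass yield of liquid fuel
# per unit of feedstock, and kg of CO2 emitted per kg of fuel produced.
processes = {
    "GTL (natural gas)": {"yield_pct": 70.0, "co2_kg_per_kg_fuel": 0.9},
    "CTL (coal)":        {"yield_pct": 33.0, "co2_kg_per_kg_fuel": 5.0},
    "BTL (biomass)":     {"yield_pct": 17.0, "co2_kg_per_kg_fuel": 6.0},
}

feed_rate_t_per_h = 100.0  # feedstock input for each hypothetical plant

for name, p in processes.items():
    fuel_t_per_h = feed_rate_t_per_h * p["yield_pct"] / 100.0
    co2_t_per_h = fuel_t_per_h * p["co2_kg_per_kg_fuel"]
    print(f"{name}: {fuel_t_per_h:.0f} t/h fuel, {co2_t_per_h:.0f} t/h CO2")
```

On these numbers, GTL turns out four times as much fuel as BTL from the same mass of feedstock while emitting roughly a seventh of the CO2 per kilogram of fuel, which is the “hands down” win described above.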

Of course, in some sense, the use of biomass can be thought of as carbon neutral as it is purportedly a renewable resource. However, such fudging of any analysis always seems to ignore the environmental impact of sourcing the biomass, whether that is the assimilation of waste for conversion, the planting of fuel crops, and the energy use and waste products of the conversion process. So, truly no solution is entirely clearcut when one takes into account the complete lifecycle of the fuel production process, well-to-wheel, as it were.

At first sight, gas to liquid may not necessarily seem to be the perfect option. This is especially so if one takes into account costs and political issues, such as access to a ready supply for any region hoping to exploit GTL. As such, the Padova team has evaluated the production costs of synthetic fuel in a GTL process, considering two different scenarios: a production plant close to a natural gas supply, and a GTL plant remote from the country with the gas supply.

Their financial analysis reveals that the return on investment for a GTL plant with a local supply occurs in less than two and a half years, whereas it is almost seven years if the supply is in a country remote to the manufacturer. They conclude that given our reliance on oil, its derivatives, and putatively petroleum substitutes (and despite hybrid and hydrogen):

The economical and financial analysis has shown that it is extremely convenient to invest in a GTL plant located in countries where natural gas is available at a low price, thanks to the favourable return of investment.

It could be that as biomass becomes more accessible, possibly to the detriment of food and water supply, that the BTL approach to fuel becomes more viable. However, given the abundance of natural gas and the potential to release that from locked in sources, such as frozen methane hydrates, GTL could be the way forward for some regions of the world faced with dwindling oil supplies, especially given the lower carbon footprint compared with liquid fuels derived from coal or biomass.

Sudiro, M., Bertucco, A. (2008). Production of synthetic gasoline and diesel fuels by alternative processes using natural gas, coal and biomass: process simulation and economic analysis. International Journal of Alternative Propulsion, 2(1), 13-25.

Muddled Environmental Meddling

The idea of using carbon sequestration to reduce atmospheric CO2 levels has been bandied about for years; I vaguely recall writing about it when I first freelanced for New Scientist in 1990. It struck me then as a ludicrous approach to tackling climate change, akin to sweeping the problem under the carpet. Now, a press release from another journal for which I once wrote on a regular basis, Chemistry & Industry (published for the UK’s Society of Chemical Industry), is suggesting yet another madcap approach to the climate.

The “open source” concept being put forward by cquestrate.com and reported in C&I suggests that we could reduce atmospheric CO2 levels, and so ameliorate anthropogenic global warming, by heating millions of tonnes of limestone in the world’s deserts to release its locked-in CO2, shipping the resulting lime to the seaside and dumping it into the oceans, where it will apparently absorb twice as much dissolved CO2 as was released. The scheme is backed by multinational petrochemical giant Shell.
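That “twice as much” claim comes straight from the carbonate chemistry, as a simplified back-of-the-envelope check shows. This is my own sketch, not cquestrate’s arithmetic, and it deliberately ignores ocean equilibria and, crucially, all the process and transport emissions:

```python
# Approximate molar masses in g/mol
M_CACO3, M_CO2 = 100.09, 44.01

# Calcination: CaCO3 -> CaO + CO2 releases one mole of CO2 per mole of
# limestone heated.
co2_released_per_t_limestone = M_CO2 / M_CACO3  # tonnes CO2 per tonne

# In seawater (simplified): the lime slakes, CaO + H2O -> Ca(OH)2, and
# then Ca(OH)2 + 2CO2 -> Ca(HCO3)2, taking up two moles of dissolved
# CO2 per mole of original limestone -- hence "twice as much".
co2_absorbed_per_t_limestone = 2 * M_CO2 / M_CACO3

net = co2_absorbed_per_t_limestone - co2_released_per_t_limestone
print(f"released: {co2_released_per_t_limestone:.2f} t CO2 / t limestone")
print(f"absorbed: {co2_absorbed_per_t_limestone:.2f} t CO2 / t limestone")
print(f"net uptake, best case: {net:.2f} t CO2 / t limestone")
```

Even in this best case, the net uptake is less than half a tonne of CO2 per tonne of limestone processed, and only if every molecule released during calcination is captured at source.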

Now forgive me for being uber-skeptical, but isn’t there something just a little hypocritical about an oil company looking to macro-scale chemical engineering to manipulate the environment? The C&I article quotes Tim Kruger, formerly of Shell and now consulting on the project for Corven, as mentioning Australia’s Nullarbor Plain as a prime location for the process: lots of energy from sunlight to calcine the abundant limestone, and, presumably, excess energy to sequester the huge volumes of CO2 released at source.

The mention of Australia reminds me of another particularly crass attempt by humanity to control the environment that went badly wrong – the cane toad. The cane toad was introduced into Queensland, Australia, en masse, in the 1930s in an effort to control the cane beetle that was ravaging the sugar cane industry. Of course, cane toads are now one of the most widespread and biggest pests in the region, with no obvious way of controlling their numbers, other than introducing an exogenous, but unidentified predator species.

I suspect that, overall, dumping lime into the oceans will be as successful as dumping cane toads into sugar cane plantations. There will be unknown after shocks that will cause more harm to the environment and global ecosystems overall than anyone could predict.

First off, there’s the problem of what to do with the CO2 released from the limestone mined in the deserts that serves as the raw material for the process. Secondly, the huge tonnages involved make it unlikely the project will ever work, especially as shipping all that lime from the desert to the oceans will require energy and release its own huge quantities of CO2 before dumping even begins. But, more than that, there will be enormous, unforeseen environmental effects of dumping this material into the oceans on such a scale.

The idea of using even “stranded” energy to release CO2 from limestone, then shipping the lime to the oceans, where it will apparently absorb dissolved CO2, has to be fundamentally flawed. There are issues of pH, absorbency, equilibria, and marine ecosystems to consider. Surely, it would be simpler and more efficient to find a way to tap the stranded energy and supply it to population centres directly, thus cutting our dependency on fossil fuels without attempting to tamper with the oceans. Several macroscale engineering ideas have been bandied about and some, such as iron seeding and nitrogen control, have even been trialled, with little success and evidence of detrimental environmental impact. Let’s not add lime to the list.

I interviewed Kruger for the August issue of Intute Spotlight in which I will cover this topic in more detail. You might also be interested in checking out the Sciencebase endsjobs.co.uk/environmental page.

Regulatory Placebo

My recent article on the subject of an unnatural approach to diabetes led to some quite intriguing comments from readers, especially as I suggested that it would make sense from a safety perspective to bring herbal remedies under the same regulatory umbrella as regular pharmaceutical products.

Eric W, a chemistry teacher from Minnesota who goes by the online moniker of “Chemgeek”, pointed out that the US Food & Drug Administration (FDA) attempted to do this several years ago (it’s something that has been mooted in the European Union too), but in the US at least all that happened is that products now carry a warning such as: ‘These statements have not been evaluated by the FDA…’. Which is obviously a rather weak stand to take on products that can be potentially lethal in combination with the wrong disease or other medication.

The problem, as I see it, is that if any given herbal product has potent physiological activity, then it is, to all intents and purposes, a medicinal drug, and should be tested and labelled as such so that consumers can be warned of contraindications. If the herbal product has no physiological activity, other than perhaps to provide some spurious antioxidants for which our needs are not known, then it is little more than a placebo.

Stef Levolger, aka Slevi, is a Dutch medical student with an interesting observation on the issue. “It’s not like [herbal is] all that different from our own basic medicine. Take for example willow bark, sold as a herbal treatment and advertised with [various] uses by the Chinese, by [several] companies and others since the ancient Greeks. And, how do you get it home? Right, you get a bottle filled with pills. I wouldn’t be surprised if it was even produced in the same pharmaceutical factory as where regular aspirin comes rolling down the line, just got a price tag twice as high slammed on it,” he says.

I think Slevi is on to something, certainly a huge number of our so-called modern pharmaceuticals have natural product origins while many so-called natural herbal products are manufactured on neighbouring production lines by a division of the pharma companies.

Canadian blogger Mina Isabella Murray of Weird Science strongly disagrees with the idea of bringing all herbal remedies under the pharma umbrella. “That’s not to say I don’t support the notion of thorough research, safety, monitoring and accountability, but to compare the vast majority of herbals to pharmaceutical preparations is misguided and excessive,” she says, “I vote stricter regulations but ones that are separate from current pharma protocol/guidelines.”

She suggests that the issue is not black-and-white. “The vast majority of herbal preparations aren’t viewed to treat disease in the way that pharmaceuticals are presented, nor are they presented as ‘cure-alls’,” she says. “Sure, we all know the stories of dangerous herbs and dubious advertising, but I don’t think that warrants all mainstream herbal preparations being subject to the same testing required for pharmaceuticals.” She cites cranberries as an important preventative for urinary tract infection that, of course, cannot displace antibiotics should someone contract a full-blown infection.

But I’d argue that the efficacy of such a “natural” remedy is dubious at best, while other products may have genuine effects and so can interfere with the metabolism of certain medications. St John’s Wort is probably a case in point. This product, widely used to treat mild depression, is contraindicated for anyone with thyroid problems. But buying St John’s Wort over the counter, off prescription as it were, does not provide the necessary warnings.

I asked chemical consultant Hamish Taylor of Shinergise Partners Ltd how he felt about regulation of yet another area of his broader industry. “I think the answer should be YES,” he affirms, “but perhaps a sensible step is part-way i.e. strong warnings on possible side-effects and a cautionary ‘there is no clinical trial which demonstrates absolute efficacy’.”

Such an approach would allow the placebo effect of “it’s natural, I believe in it, it does me good” to shine through, which is probably no bad thing. “Feeling good is pretty much proven to make people healthier and certainly happier,” Taylor adds. He says that some degree of labelling would however prevent some of the nastier side-effects that can occur.

“By going to this quasi-interim step, it may even encourage manufacturers to undertake proper clinical trials to demonstrate effectiveness and therefore encourage greater use,” Taylor says. Of course, it is not as if the manufacturers have not investigated the potential. “The problem is that if the more popular herbal remedies were indeed 100% effective the drug component would have been isolated and purified by now!” adds Taylor.

Steve Bannister, Scientific Director & Principal Consultant at Xcelience, LLC, a drug development company based in Tampa, Florida, suggests that really the question we should be asking is what level of regulatory resources can we afford?

“Natural products have long provided leads for drug discovery,” Bannister says. “In modern natural-product drug discovery, activity-guided fractionation (often using an isolated receptor) identifies an active molecule, and a single molecular entity results from semisynthetic improvement of the phytochemical’s drug properties. The drug-regulatory process includes guidelines for determining safety, efficacy, and quality, as well as for setting acceptance criteria for each.”

However, some herbal products may have efficacy as a result of a combination of components; additionally, adverse reactions to such a product may be due to a different combination. “To pass through the current drug regulatory process, a product specification, including identification of the specific active components and their requisite levels, along with identification and limits for the unsafe impurities, is needed,” adds Bannister. He further explains that this specification must be defined in terms of safety and efficacy in humans. “This is an extraordinarily difficult undertaking, and if required, would prevent herbal products from reaching approval,” he says. “This, combined with the fact that some herbals really do work, is the principal reason for a different set of regulations.”

One has to consider what an acceptable level of risk is and how it should be managed. What risks are associated with adulterated products, what are the adverse reactions, and to what extent does lack of efficacy lead consumers away from other, effective, treatments?

As it stands, the fully implemented US Good Manufacturing Practices for herbals are meant to control adulteration but can only do so if there is adequate compliance inspection. Of course, evidence of a significant incidence of adverse reactions is sufficient to lead to a product being banned but again this requires significant surveillance resources. Also, disease-treatment claims can be made only for approved drugs, so consumers may have been led to a particular herbal for their symptoms through hearsay or lifestyle magazine “evidence”. Monitoring of such claims for herbals, such as the neurochemical effects of St John’s Wort also requires surveillance resources. It all costs.

Delano Freeberg, Chief Technical Officer at API Purifications Inc, suspects that the devil is in the details. “In special cases, increased regulation may be beneficial,” he says, “I believe public health is not served by increased regulation. In fact, I believe we will benefit from the application of well-understood scientific and medical criteria for the reduction of regulation. The benefit of this approach to the public is the timely availability of lower-cost drugs. I believe this can be accomplished with no effect on safety.”

He points out that regulation is not required for extracts. “Many herbal remedies are known to have efficacy, even in a crude extract. The composition of this extract often does not differ much from that of the starting biomass (natural herb). If the herb being used is ‘GRAS’ (generally regarded as safe), there is good scientific foundation for believing the extract will likewise be safe,” adds Freeberg. “An important caveat: ‘natural’ does not imply ‘safe’. Hemlock extract (Conium maculatum) is 100% natural but also 100% deadly.”

He also suggests that regulation should depend on potency. “Traditional over the counter pharmaceuticals contain a single active component. Natural remedies usually contain several compounds of known efficacy. Green tea, for instance, contains high concentrations of around a dozen different catechin antioxidants. The traditional pharmaceutical OTC preparation must be held to a higher standard of regulation because of the higher purity and potential potency of the active ingredient,” he says.

Regulation should also depend on drug type. “I feel the FDA over-regulates and does not take into consideration scientific findings that provide a basis for reduced regulation,” says Freeberg, “I provide one example. Synthetic statins have moderate to severe side effects in 5% of patients. Cholesterol can be equally reduced by the intake of policosanol (One recent study found no efficacy; however, dosage levels may not have been appropriate), a group of long-chain alcohols from beeswax. Side effects of policosanol were mild – no worse than placebo.”

There is certainly a scientific basis to demonstrate that many natural compounds have little adverse physiological effect, even in high-purity forms. This can be determined from structure-activity relationships and historical clinical and toxicological data.

Florence Leong, an Investment Director at ATP Capital Pte Ltd, believes the answer lies not in regulation but in consumer education. “Consumers need to be educated on the difference between OTC herbal remedies and pharmaceutical products,” she says. “Not all natural products are safe; the reason why OTC products have fewer listed side effects compared with pharmaceuticals is that OTC products are not as thoroughly evaluated. Often the long list of side effects in pharmaceutical drugs frightens the consumer, and the cursory listing of side effects gives consumers a false perception of safety.” Unfortunately, she adds, consumer education is a long and tedious process and regulation is considered the efficient ‘quick fix’.

She echoes my own earlier sentiment regarding efficacy claims where OTC products are not as robustly proven as pharmaceutical products. “This is an area where regulation should be tightened as many manufacturers of OTC products do try to push the limits by making specific efficacy claims in non-print advertisements. And often they do get away with it,” she says.

The active constituents of pharmaceutical products are consistent, which ensures reliable efficacy and, for established products, predictable side-effects. In contrast, the efficacy of OTC herbal medications can be variable due to natural variations of the chemicals in different batches of raw materials, contamination, and unscrupulous manufacturing. This means herbal products can swing between those with no effects and no side-effects, via those that work and have their own side-effects, to the wildly hazardous batch of mercury-laden desiccated ordure.

Atmospheric, Spectroscopic, Arsenic

Remote arsenic assessment – A topic I’ve come back to again and again since I first covered the breaking news of arsenic-contaminated tubewells on the Indian sub-continent for The Guardian in 1995. Now, an informatics approach to surface data could allow geologists and environmental scientists to identify regions of the world where people are at risk of exposure to arsenic in their drinking water without the need for widespread sampling to be undertaken. More…

Listening to tomographic tales – Researchers in the USA and The Netherlands have pieced together a picture of the most exquisite of molecular machines using electron-microscopic tomography. The team has for the first time obtained a three-dimensional structure of the gossamer-like filament of proteins found within the inner ear that gives us our sense of hearing and balance. More…

Atmospheric NMR – Nuclear magnetic resonance spectroscopy has been used to study the kinetics of atmospheric pollutants in the gas phase for the first time. The method provides an empirical correlation between the atmospheric lifetimes of pollutants and their relative reaction rates with chlorine radicals at ambient temperatures. Read on…

Ebola spiked – An X-ray structure of the surface spike of the Ebola virus could explain how this lethal pathogen infects human cells and may help researchers devise preventative measures to stop the virus spreading during an outbreak. Full story…

Fair Use Rights

Intellectual property, copyright, creative commons, copyleft, open access… These are all terms high on the scientific and other agendas these days. For example, public-funded scientists the world over are calling for research results to be made freely available to them and their peers, for the public good and for the good of scientific advancement itself. Librarians likewise are interested in the fullest dissemination and sharing of knowledge and information, while user-creators and the new breed of citizen journalists spawned by the Internet Age are also more liberal in their outlook regarding the proprietary nature of creative works.

On the other hand, traditional publishers, database disseminators, and the commercial creative industry consider the investment they put into the creation and distribution of works as a basis for the right to charge readers and users and for profit-making. Meanwhile, adventurous organisations that are not necessarily beholden to shareholders, to other commercial concerns, and to learned society memberships, are experimenting with alternative business models with varying degrees of success.

One aspect of copyright that arises repeatedly in any discussion is what is considered fair use and what kind of usage warrants a cease-and-desist order from the copyright owner.

Now, Warren Chik, an Assistant Professor of Law at Singapore Management University, is calling for a reinvention of the general and flexible fair use doctrine through the simple powerful elevation of its legal status from a legal exception to that of a legal right.

Writing in the International Journal of Private Law, 2008, 1, 157-210, Chik explains that it is the relatively recent emergence of information technology and its impact on the duplication and dissemination of creative works – whether it is a photograph, music file, digitised book, or other creative work – that has led to a strengthening of the copyright regime to the extent that it has introduced “a state of disequilibrium into the delicate equation of balance that underlies the international copyright regime”.

Copyright holders have lobbied for their interests and sought legal extension of the protection over “their” creative works. But the law in several countries has reacted in knee-jerk fashion, not necessarily to the benefit of the actual creator of the copyright work or of the user. Chik summarises the impact this has had quite succinctly:

The speedy, overzealous and untested manner in which the legal response has taken has resulted in overcompensation such that the interests of individuals and society have been compromised to an unacceptable degree.

For some forms of creative works, such as music and videos, there has emerged a protectionist climate that has led to the creation of double protection in law, in the form of digital rights management (DRM) systems and anti-circumvention laws that allow copyright owners to prosecute those who attempt to get around such restrictive devices. This, Chik affirms, has “inadvertently caused the displacement of the important fair use exemptions that many consider the last bastion for the protection of civil rights to works.”

Chik points out that this tightening of the laws runs counter to the increasing penetration of electronic forms of storage and communication, the borderless nature of the Internet and the invention of enabling technologies such as the so-called “Web 2.0”. This in turn is apparently leading to a general social shift towards more open collaborative creativity, whether in the arts or the sciences, and what he describes as “the rise of a new global consciousness of sharing and participation across national, physical and jurisdictional borders.”

Whether that view is strictly true or not is a different matter. At what scale do those who share a few snapshots among strangers, or run a small collaboration between laboratories, realise the need for a more robust approach to their images and data? For example, if you are sharing a few dozen photos you may not see any point in protecting them beyond a creative commons licence, but what happens when you realise you have tens of thousands of saleable photos in storage? Similarly, a nifty chemical reagent that saves a few minutes in a small laboratory each week could take on global significance if it turns out to be relevant to shortening a synthesis in the pharmaceutical industry. Who would not wish to receive full credit and monetary compensation for their creative works in such cases?

Chik proposes not to destroy or even radically overhaul the present copyright regime; instead, he endorses a no less significant reinvention of the general and flexible fair use doctrine through the simple but powerful elevation of its legal status from a legal exception to that of a legal right, with all the benefits that a legal right entails. This change, he suggests, could be widely and rapidly adopted.

Currently, he says, fair use exists formally only as a defence to an action of copyright infringement. But, DRM and other copyright protection threaten this defence and skew the playing field once more in favour of copyright holders. “Fair use should exist in the law as something that one should be able to assert and be protected from being sued for doing,” Chik says.

Such a change would render copyright law more accurately reflective of an electronically interconnected global society and would also acknowledge the importance and benefits of enabling technologies and their role in human integration, progress and development.

Chik, W. (2008). Better a sword than a shield: the case for a statutory fair use right in place of a defence. International Journal of Private Law, 1(1/2), 157-210. DOI: 10.1504/IJPL.2008.019438

Young Scientists

Children are so bright these days. Allegedly. It seems that not a week goes by without some juvenile prodigy, whizzkid, or child genius solving an engineering problem, revising some scientific data, or inventing some world-saving gadget.

You probably saw the recent news about a 13-year-old boy from Germany, taking part in a science festival, who corrected NASA calculations on the likely impact of asteroid Apophis, shortening the odds of a killer collision with Earth. Agence France-Presse reported it as follows:

NASA had previously estimated the chances at only 1 in 45,000 but told its sister organisation, the European Space Agency (ESA), that the young whizzkid had got it right.
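For a sense of what "shortening the odds" means numerically, the sketch below converts quoted "1 in N" impact odds into probabilities. Only the 1-in-45,000 figure comes from the report above; the revised estimate is not given in the text, so the second value here is purely hypothetical for illustration.

```python
def odds_to_probability(n):
    """Convert '1 in n' odds into a probability between 0 and 1."""
    return 1.0 / n

# Figure quoted in the AFP report:
nasa_estimate = odds_to_probability(45_000)

# Hypothetical shorter odds, for illustration only (not from the article):
hypothetical_revised = odds_to_probability(450)

# Shortening the odds from 1 in 45,000 to 1 in 450 would mean a
# 100-fold increase in the estimated probability of impact.
factor = hypothetical_revised / nasa_estimate
print(f"Quoted estimate: {nasa_estimate:.6f}")
print(f"Hypothetical revision is {factor:.0f} times more likely")
```

Either way, both figures remain small probabilities; the newsworthiness lay in the size of the claimed correction, not the absolute risk.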

Prior to that there was the case of the 17-year-old Ottawa high school student who, while contributing to a science fair, apparently invented a way to identify and cure influenza. A team of ten students won second prize for identifying genes that help plants thrive in salty soil, while third prize went to a 15-year-old in the same competition for her discovery of a human gene variation that may help in dealing with bipolar disorder. (More on the science competition here).

Then there was the Manitoba student who received the first-ever Triple Crown for high school science with a project on alternatives to chemotherapy in the fight against cancer, taking first place in a biotech-industry-sponsored national challenge. There’s also the teen inventor who came up with a new way to make biodegradable plastic bags.

And, of course, who could forget the 11-year-old who invented a diabetic dress for her sister. Then there’s the six-year-old who discovered rare pterosaur bones on the Isle of Wight. One student even won a major science fair prize for a project entitled: “Identification, Characterization, and DNA Sequencing of the Homo Sapiens and Mus Musculus COL20A1 Gene (Type XX Collagen) with Bioinformatics and Polymerase Chain Reaction (PCR)”, which makes squeezing an egg into a bottle seem a little trivial to say the least.

Schoolboy boffin beating cancer, cracking computer codes, finding life on Mars? You got it!

The list goes on. There are so many teenage geniuses out there.

Now, is it just me being a cynical old hack with an insider’s view of the media and public relations industries, or does this strike anyone else as odd? All these child prodigies in science have seemingly achieved what international research teams with industry and academic funding cannot, all for the sake of a science fair or with some direct link to another organisation seeking publicity. Indeed, many of these junior scientific discoveries get reported widely across all media, and internationally too. And therein lies the rub.

While I don’t wish to disrespect the efforts and achievements of any of the students taking part in science fairs and working hard to put together a great project, I wonder just how many of these so-called breakthroughs ever come to anything more than a framed certificate and a nice fat payout from the industry sponsor. Of course, one might level the same criticism at countless science projects being undertaken by adults on a professional basis the world over.

Meanwhile, Jessica Shepherd writing in The Guardian reports that science exams really are more difficult than the arts and humanities exams, and researchers at the UK’s Durham University have the data to prove it. Apparently, there were “substantial differences in the average grades achieved by the same or comparable candidates.”

But, according to the Royal Society of Chemistry, “There is now clear evidence of an alarming gap between the high-quality teaching and curriculum material presently available for science lessons in schools in the UK, and the corresponding simplistic questions being set in national examinations for 14-year-olds.” RSC spokesman Brian Emsley gave me several examples of just how tough the science questions can be at this age, for example:

  • Why is copper used for wires in a circuit?
    (a) Copper does not stick to a magnet
    (b) Copper is a brown metal
    (c) Copper is a good conductor of electricity
    (d) Copper is a good conductor of heat

  • Some stars are bigger than the Sun but they look smaller. Why do they
    look smaller than the Sun?

    (a) They are brighter than the Sun
    (b) They are further away than the Sun
    (c) They are the same colour as the Sun
    (d) They are nearer the Sun

  • What is hot melted rock called when it is UNDERGROUND?
    (a) Sand
    (b) Magma
    (c) Lava
    (d) Mud

  • In very cold weather, why is a mixture of salt and sand spread on roads? (Choose one option from a-c and one from i-iii.)
    (a) Salt makes the roads white
    (b) Salt makes water freeze
    (c) Salt makes ice melt
    (i) Sand dissolves in water
    (ii) Sand increases friction between car tyres and the road
    (iii) Sand makes water freeze

One final question Emsley showed me reminded me of a silly anecdote about vermin and compost heaps, and asked: What powers a solar-powered mole-scarer?

“Vast sums of money are being spent by government, industry and other educational bodies to enhance the excitement and delivery of STEM subjects, covering science, technology, engineering and mathematics,” according to the RSC, “but this is being negated by examiners who are reinforcing low expectations, and setting the standards that the weaker schools teach to.”

So, on the one hand, we have a global public relations system marketing high school students as geniuses because they use some big words in a science fair project, and yet there is a major disparity between what these children are learning at school and how they are being tested. So, who are the geniuses here? Answer (a), (b), or (c).

Dirty Dozen Chemicals

We live in an age of chemophobia, an insidious disease that threatens our way of life, precludes R&D that might solve many of the environmental issues we face, and prevents disease-stopping compounds being deployed where they are most needed in the developing world. Chemophobia is an irrational fear of all things chemical and is usually contracted by those already suffering from naturophilia, the irrational love of all things natural.

It usually starts with a dose of nostalgia, pangs for a time when the world was simpler, and an aching for a natural world that we have long since lost. Unfortunately for sufferers, there never was a time of simplicity and natural living. In those halcyon days of yore, infectious disease was rife, infant mortality rates were high, and life expectancy was very low.

Natural, at that time, meant inept remedies for lethal diseases such as polio, tuberculosis, bacterial infections, and plague. It meant poor harvests and widespread famine, and if disease didn’t catch you young, only those who kept their heads very low were safe from interminable wars, rock-breaking on distant sun-bleached shores, or the hangman’s noose, guilty or otherwise. Today, we may have more obesity and diabetes and certainly far more incidence of the diseases of old age, but that’s because we have more food to eat (in the developed world, at least) and live longer.

Certainly, natural does not equate to good for you – think snake venom, belladonna, and deadly toadstools – whereas most synthetic chemicals have a strong pedigree and have been tested for safety and toxicity. But throw in the fact that most chemophobics also suffer from risk-assessment blindness as well as dystatistica, and we see pronouncements on all things chemical and synthetic as being bad.

It is from this that the UNEP Dirty Dozen chemicals list emerges. Not only does it have a far too conveniently tabloid name to be believed, but several of the entries are not single chemicals but whole families of compounds.

Needless to say, several of these, while appearing to be the harbingers of doom that media hyperbole would have us believe, are not necessarily as dangerous to us or the environment as you might think, and others, such as DDT, could be used to help eradicate one of the biggest global killers. Indeed, the WHO now allows for the use of DDT to fight malaria-bearing mosquitoes.

  • Aldrin (pesticide)

  • Chlordane (pesticide)

  • DDT (pesticide, highly effective against malaria-carrying mosquitoes)

  • Dieldrin

  • Heptachlor

  • Mirex

  • Toxaphene

  • Polychlorinated biphenyls (PCBs, a whole group of diverse compounds, each with
    its own properties)

  • Hexachlorobenzene

  • Dioxins (a whole diverse group of compounds)

  • Furans (a whole diverse group of compounds, each with its own properties)

These compounds are now banned under the UNEP-brokered convention on persistent organic pollutants, although many were in widespread use before the list was drawn up.

There are other lists, such as the RoHS list of prohibited substances, which includes lead, cadmium, mercury, hexavalent chromium, and the polybrominated biphenyl (PBB) and polybrominated diphenyl ether (PBDE) flame retardants, which is fair enough. And there are industry-specific lists, such as the Volvo manufacturing black list, which covers all the compounds that may not be used on its production lines, including CFC cooling agents, the paint hardener methylenedianiline, and the previously discussed carbon tetrachloride.

In a forthcoming issue of the International Journal of Sustainable Manufacturing (2008, 1, 41-57), Jack Jeswiet, of the Department of Mechanical Engineering at Queen’s University, Kingston, Ontario, Canada, and Michael Hauschild of the Danish Technical University, Denmark, argue the case that market forces need to inform environmental design. One can only assume that this should be one of the drivers, rather than media scare stories, chemophobia and the simplistic blanket precautions of lists.

Greenhouse gas emissions, environmental impact, and toxic substances to be avoided must all be addressed by the ecodesigner in any design situation, they say. The ecodesigner cannot control market forces, but must be aware of them, and rules should be followed to reduce the eco footprint.

At the time of writing, a news release from the UK Royal Society’s summer science exhibition presented findings from consumer tests being carried out during the event, which is open to the public. The researchers involved, from the National Physical Laboratory, are working towards producing the world’s first model that will predict how we perceive “naturalness”. They claim that the results could help manufacturers produce synthetic products that are so good they seem “natural” to our senses and are fully equivalent to the “real thing”, but with the benefits of reduced environmental impact and increased durability.

Meanwhile, a new study shows that companies are hijacking the language of environmentalists for their own marketing ends, presumably hoping to leverage the movement’s cachet in selling their products.