Genetic Research Hits Pay Dirt

The budget for the Human Genome Project and all that post-genomic, proteomic, metabonomic, immunomic… research was almost on a par with defense spending; it was almost c-omical, really. Well, maybe not quite, but the total stretches out to a lot of zeros nevertheless. At the time the grants were written and the funding given, we, as a society, were promised all kinds of medical miracles, from gene therapies to new treatments to cure all those nasties – cystic fibrosis, sickle cell, thalassemia, cancer, heart disease, and more.

We were promised personal medicine courtesy of pharmacogenomics. This would allow your doctor to profile your genome and tailor your medication to the particular set of enzymes running in your liver, and to whether you were likely to respond positively, suffer adverse effects, or simply not respond at all. We have even seen, these last few days, the sequencing of James Watson’s genome, an effort that cost less than $1m and took under four months. But do any of these promises add up to very much beyond myriad PhD theses and thousands of biotech startups, many of which have already crashed?

Hopefully, the answer is yes. In the next few years, gene science will hit pay dirt as genes finally give up their real secrets and the true meaning of so-called junk DNA becomes clear. Our understanding of a wide range of diseases, from breast cancer and obesity to hypertension and bipolar disorder, will come of age, and those diseases may finally succumb to all this genetic scrutiny and manipulation.

Nature, Science, and the Wellcome Trust provided a useful summary of the genetic state of the art for a recent Times report by Mark Henderson on our genetic future. In the summary, Henderson highlighted the latest “in press” results, most of which are now online, so I am providing the hyperlinked executive summary here:

Breast cancer – Three papers published in Nature and Nature Genetics at the end of May reported four new genes and one genomic region associated with increased risk (10.1038/nature05887, 10.1038/ng2075, 10.1038/ng2064).

Obesity – The discovery of an obesity gene, the FTO gene, was published in Science in April and reported in Sciencebase at the time.

Diabetes – Again in Science (10.1126/science.1142382 and 10.1126/science.1142358), three common genes for increased type 2 diabetes risk were reported, bringing the total number of known genes associated with the disease to nine.

Alzheimer’s disease – New results published this week in Neuron discuss an Alzheimer’s gene.

Data that were still under press embargo when Henderson’s feature appeared in The Times, meaning he could only hint at the true potential of human genome results, were revealed today.

The largest-ever study of the genetics of common diseases, in which almost 10 billion pieces of genetic information were analysed, was published just one minute ago, so I can now outline the findings in a little detail. The new study compared 2000 cases of each of seven common diseases with 3000 shared controls, and it reveals new genetic associations with these disorders. A pair of related papers in Nature Genetics (a and b) offer further insights into two of the seven diseases investigated.

In the Nature article, scientists from the Wellcome Trust Case Control Consortium report genetic variants associated with the development of bipolar disorder, Crohn’s disease, coronary heart disease, type 1 and type 2 diabetes, rheumatoid arthritis, and hypertension. This is the first study of such scope, and in it the scientists found one genetic region newly associated with bipolar disorder and another with coronary artery disease. A separate group of three markers was found to be associated with rheumatoid arthritis. The researchers also identify nine new genetic associations for Crohn’s disease and ten chromosome regions that contain genes related to diabetes.
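For readers who want a feel for the statistics behind such case-control scans, here is a minimal sketch in Python of the kind of single-marker test involved. The allele counts below are invented purely for illustration; they are not taken from the consortium’s data.

```python
# A toy single-marker association test of the sort used in case-control
# genome scans; the allele counts here are invented, NOT the WTCCC's data.
from scipy.stats import chi2_contingency

# 2x2 table of allele counts: rows = cases/controls, columns = risk/other.
# 2000 cases contribute 4000 alleles; 3000 controls contribute 6000.
table = [[1300, 2700],
         [1700, 4300]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.2e}")
# In a real scan this test is repeated across hundreds of thousands of
# markers, so only p-values far below ~5e-7 are treated as genuine hits.
```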

These new results suggest that a medical revolution is at hand and that the Human Genome Project and its spinoff -omics really are about to hit pay dirt. But are we really on the verge of a new era in medicine, or are the various genetic revelations simply more grant-baiting hyperbole?

Testing your rotten organics

Will your molecules rot? Is biodegradability an intrinsic property of those chemicals you handle on a daily basis? A study published today in the journal Molecular Systems Biology reveals whether or not thousands of chemicals will be biodegradable. The work could help in the environmental risk assessment of the production, transportation, and disposal of organic compounds.

Biodegradability is determined primarily by whether or not there are microbes in the environment that can digest any given compound. Victor de Lorenzo and colleagues at the National Biotechnology Centre in Madrid, Spain, used a database of all known microbial metabolic reactions to train a computer algorithm to distinguish biodegradable from recalcitrant compounds. With this in silico test kit they looked at almost 10,000 chemicals.
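As a rough illustration only – the Madrid team trained on microbial metabolic reaction data, and I am guessing at the general shape of such an approach rather than reproducing their method – a classifier of this kind might look something like the following, with randomly generated stand-in descriptors and labels:

```python
# A minimal sketch, emphatically NOT the Madrid group's actual method:
# this toy version fits a generic classifier to random stand-in descriptors
# just to show the shape of an in silico biodegradability test.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((10_000, 32))                # stand-in molecular descriptors
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # stand-in labels: 1 = biodegradable

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```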

This automatic predictive approach to assessing biodegradability could help researchers evaluate the potential of new compounds to pollute the environment and help in the implementation of international regulations on the use of new chemicals.

Perhaps not surprisingly, the press release associated with this work focused on those compounds, including herbicides, that are most resistant to biodegradation, but it failed to mention the even larger group of compounds that are intrinsically biodegradable. News write-ups about toxic chemicals and the environment will, 9999 times out of 10,000, inevitably highlight the nastiest.

The huge benefits of the thousands of organic compounds used in the pharma, biotech, plastics, and other industries, as well as in medicine and agriculture, will simply be ignored, regardless of whether those compounds accumulate in the environment. Biodegradation is only one route by which thousands of compounds are destroyed naturally in the environment (heat, light, and interaction with other non-living materials are others). The predictive system will be useful, certainly, but its wider application should take these other routes into account, along with the risk factors and toxicity associated with any particular chemical, rather than tarring every entry in the database simply on the basis of whether or not a microbial enzyme exists to digest it.

The original research paper can be read here.

Down to Earth Spectroscopy

Cheminformatics could help save forests from the damage caused by runaway wildfires. As long as there have been forests, there have been forest fires, from the bushfires of the Australian outback, across Africa, Asia, and Europe, to the Americas. Such fires are often thought of as having a regenerative effect on old woodland, but predictions of increasing wildfire frequency and intensity because of climate change could lead to loss of forest and soil erosion rather than dendritic rebirth. Spanish researchers have now used near-infrared spectroscopy, a simple calibration technique, and cheminformatics analysis of their spectra to determine a key parameter of soil damage – the MTR, or maximum temperature reached. You can read the full story in the latest issue of SpectroscopyNOW.
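For the curious, a calibration of this sort usually boils down to regressing a known property against measured spectra. Here is a minimal sketch, assuming hypothetical NIR spectra of soil samples heated to known temperatures, with scikit-learn’s PLSRegression standing in for whatever software the researchers actually used:

```python
# A minimal calibration sketch on invented data; scikit-learn's PLSRegression
# stands in for the researchers' actual chemometrics software.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
spectra = rng.random((60, 700))    # 60 soil samples x 700 NIR wavelengths (invented)
mtr = 100 + 400 * rng.random(60)   # known maximum temperature reached, degC (invented)

pls = PLSRegression(n_components=5).fit(spectra, mtr)
print(pls.predict(spectra[:3]).ravel())  # predicted MTR for three "unknown" samples
```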

Spiked innovations

Some time ago, the editors at Sp!ked-Online asked me to suggest what I thought was the greatest innovation of all time. I tried to be a bit esoteric and opted for the inorganic chemistry of ammonia and sulfuric acid, certainly not the most exciting sounding of entries in the sp!ked innovation survey, but I hoped the chemists among their readership would appreciate it amid all the more electrical and technological suggestions and the tools of molecular biology.

It seems I was among some eminent participants, “key thinkers in science, technology and medicine”, allegedly with some half a dozen Nobel laureates in their number. The survey aimed to identify the greatest innovations, and a live discussion is scheduled to take place in London on June 6.

Surveying the responses, Mick Hume, sp!ked’s editor-at-large, says the survey “provides some illuminating insight both into the important developments of the recent and more distant past, and into the way those involved at the cutting edge see the issue of innovation today.” My colleague Philip Ball, a fellow freelance science writer with a chemical bent, also stuck up for chemistry in his submission, opting for innovations in analytical chemistry, including NMR spectroscopy and X-ray crystallography.

Among the other innovations suggested were the Internet, the alphabet, the discovery of nuclear fusion, X-rays, the brick, rockets, and the eraser. I surely must posit that without sulfuric acid and ammonia not one of those innovations would ever have reached its full potential. Maybe I should also add an upside-down exclamation mark, just to emphasize my point!

Among the other contributors to the event are Anjana Ahuja, science columnist at The Times; Ken Arnold, Head of Public Programmes at the Wellcome Trust; Peter Cochrane, co-founder of ConceptLabs and former chief technologist at BT; Marcus du Sautoy, professor of mathematics at Wadham College, Oxford; Sir Tim Hunt FRS, principal scientist at Cancer Research UK; and David Roblin, VP of Clinical R&D at Pfizer Global Research & Development.

Digg for Chemists

Berkeley chemist Mitch Garcia, who runs ChemicalForums.com, has come up with a “novel” way to evaluate the chemical literature that will complement current ways of evaluating a particular paper. At the moment, the only ways to determine whether a particular experiment is valid are to trust the quality of the peer review process at the journal in question, attempt to repeat the experiment yourself, check how many other chemists are citing the paper, or somehow try to relate the quality of the paper to its author’s h-index, an altogether more ephemeral and perhaps elitist measure.
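For anyone unfamiliar with that last metric: a researcher has an h-index of h if h of their papers have each been cited at least h times. A quick sketch of the calculation, with invented citation counts:

```python
# The h-index in brief: a researcher has index h if h of their papers have
# each been cited at least h times. The citation counts here are invented.
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3: the 4th-ranked paper has only
                                     # 3 citations, so h cannot reach 4
```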

Garcia has now launched ChemRank, which will augment this unwieldy process by allowing individuals to post a reference and then letting others vote on whether or not the paper in question is any good. Papers with the most votes will rise to the top, while the less worthy will essentially sink. The system will rely on building a big enough userbase and somehow ensuring that chemists don’t simply spam their own papers or vote arbitrarily for their friends and Profs. How effective Garcia’s system will be remains to be seen, as it has only just launched. The number of papers currently being voted on is small and the number of votes is low, so take a look, and if you feel like digging chemistry, make your vote count.
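Mechanically, I would guess the core of such a site is little more than a vote tally sorted best-first. A toy sketch of that idea – not ChemRank’s actual code – reusing a couple of DOIs from earlier in this post as stand-in references:

```python
# A toy guess at the voting mechanism ChemRank describes: post a reference,
# accumulate votes, sort best-first. NOT ChemRank's actual implementation.
from collections import defaultdict

votes = defaultdict(int)  # maps a paper reference (e.g. a DOI) to its net score

def vote(reference, up=True):
    """Record one up- or down-vote for a paper."""
    votes[reference] += 1 if up else -1

def ranking():
    """Return references sorted with the most favourably voted first."""
    return sorted(votes, key=votes.get, reverse=True)

vote("10.1038/nature05887")
vote("10.1126/science.1142382")
vote("10.1038/nature05887", up=False)
print(ranking())  # ['10.1126/science.1142382', '10.1038/nature05887']
```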