How low can you go? Come to that, how high can you go?
yotta (Y) · zetta (Z) · exa (E) · peta (P) · tera (T) · giga (G) · mega (M) · kilo (k) · hecto (h) · deca (da) · deci (d) · centi (c) · milli (m) · micro (µ) · nano (n) · pico (p) · femto (f) · atto (a) · zepto (z) · yocto (y)
Many readers will probably be familiar with measuring millilitres of solution, more than likely conversant in micromolar concentrations, well aware of picoamps and certainly not averse to a discussion of nanometre dimensions. Probe the majority about zepto and yocto, though, and you may draw a blank. For the analytical chemist, however, zepto and yocto represent the smallest of the small (at the moment). Is there any point to the infinitesimal? asks David Bradley.
Zepto and yocto, I should explain, are the rather bizarre extension of the sliding scale that drops by thousandths from milli to atto and beyond. Zepto is 10⁻²¹ (symbol z) and yocto 10⁻²⁴ (symbol y); on a molar scale, zeptomole quantities represent between about 600 and 600 000 molecules, and yoctomole quantities from a single molecule to about 600. These terrifyingly small amounts characterise actual detection limits for certain analytical methods, and while they demonstrate the technical prowess of the scientists, and the power of the instrumentation that has attained them over the last decade or so, they seem to be taking things a bit too seriously. After all, what possible relevant effect could such minuscule concentrations of any particular analyte have?
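The molecule counts quoted above follow directly from Avogadro's number; a minimal sketch of the conversion (the example amounts are illustrative):

```python
N_A = 6.022e23  # Avogadro's number: molecules per mole

def molecules(moles: float) -> float:
    """Number of molecules in a given amount of substance."""
    return moles * N_A

# One zeptomole (1e-21 mol) is roughly 600 molecules;
# one yoctomole (1e-24 mol) is less than one molecule on average.
print(round(molecules(1e-21)))     # 602
print(round(molecules(1e-24), 3))  # 0.602
```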
According to Steve Morton of Unicam Atomic Absorption, a Cambridge-based instrument manufacturer, analytical scientists are interested in two different types of detection limit. These make the ability to observe yocto and zepto amounts of material crucial in every arena in which the analytical chemist works, from detecting pollutants, enzymes and traces of drugs to watching biochemical reactions in single nerve cells.
First, there is the instrumental detection limit (IDL). “This,” he explains, “is a measure of the detecting power of the instrument” – perhaps the most important selling point of a machine. It can be thought of more specifically as the machine’s ability to distinguish between a sample with a nominal concentration of zero units of the analyte of interest – a so-called blank – and the sample being analysed. (For anyone particularly interested in the details of what is meant by a blank, there are no doubt several heavy statistical analysis books available.) Put simply, however, a blank is ideally a sample that is physically and chemically identical to the sample of interest but does not contain the analyte.
Adam McMahon of Manchester Metropolitan University adds that there are various definitions of detection limits. “All,” he says, “should be based on the principle that the smallest quantity or concentration detectable is that which can be shown to give a significantly larger analytical response than a blank.”
Morton points out that the limitations of an analysis are not necessarily those of the machine but of the quality of this blank. “Good blanks are increasingly difficult to obtain as the detecting power of the instruments improves,” he says. Calcium, for example, is a very common element that can be measured with great sensitivity. With a technique such as graphite furnace atomic absorption spectroscopy, its detection limit is determined by the quality of the blank, not by the instrument as one might intuitively expect.
The second type of detection limit, the method detection limit (MDL), is determined more by the operator and the equipment. This is a measure of the performance of the method and, says Morton, includes all the sample preparation and dilution steps as well as the actual measurement – it is what he refers to as a ‘real world’ limit. It represents the minimum concentration in the sample that can be distinguished from ‘none’. Again, there are numerous statistics books that will lead the interested reader through the ins and outs of standard deviations and what they mean in terms of this ‘none’.
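The statistical principle behind these limits is often reduced to a rule of thumb: a signal counts as detected once it exceeds the blank by some multiple, conventionally three, of the blank's standard deviation. A minimal sketch, with made-up blank readings and a hypothetical calibration slope:

```python
import statistics

def detection_limit(blank_signals, sensitivity, k=3.0):
    """Estimate a detection limit from repeated blank measurements.

    blank_signals: repeated instrument readings of a blank
    sensitivity:   calibration slope (signal per unit concentration)
    k:             coverage factor; k = 3 is a common convention
    """
    s_blank = statistics.stdev(blank_signals)  # scatter of the blank
    return k * s_blank / sensitivity           # in concentration units

# Ten hypothetical blank readings (arbitrary signal units)
blanks = [0.11, 0.09, 0.10, 0.12, 0.08, 0.10, 0.11, 0.09, 0.10, 0.10]
print(round(detection_limit(blanks, sensitivity=2.5), 4))  # 0.0139
```

Note that the result depends entirely on the scatter of the blank readings, which is exactly Morton's point: a noisy blank, not the instrument, sets the floor.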
While miniaturisation is very familiar to electronics engineers, it is only really in the last few years that analytical scientists have begun to apply lithographic and other techniques to creating micromachined devices for separation and analysis. The likes of Andreas Manz at the SmithKline Beecham/Zeneca Centre for Analytical Science at Imperial College London have expended a great deal of research time in reducing the volume of analytical devices. As I mentioned in my ChemWeb round-up of 1997, they can create microchannels in glass chips that work as high performance liquid chromatography machines with the equivalent power of a million theoretical plates. One of the standard analytical tools, the gas chromatograph, can take anything up to half an hour to achieve a separation equivalent to 100 000 theoretical plates. Manz and others working on similar devices can cut the time considerably, or produce far better separations on minute volumes.
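Plate numbers like those quoted are usually calculated from a chromatogram's retention time and peak width; a minimal sketch using the standard half-height formula (the peak values here are invented):

```python
def plate_count(t_r: float, w_half: float) -> float:
    """Theoretical plate number from retention time t_r and
    peak width at half height w_half (same time units for both)."""
    return 5.54 * (t_r / w_half) ** 2

# A hypothetical sharp peak: retention 120 s, half-height width 0.5 s
print(round(plate_count(120.0, 0.5)))  # 319104
```

The quadratic dependence on t_r/w_half is why the narrow peaks from microchannel devices translate into such dramatic plate counts.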
Nanotechnology is not only about tiny machines for chopping up arterial plaques and assembling miniature steam engines. It also has the power to create yet more powerful analytical devices that can measure ever smaller quantities. There is a world of difference between measuring the glucose in a single heart cell and the sugar in a soft drink!
According to Bill Marton, an independent consultant and former manager of the analytical development laboratory at one of the largest pharmaceutical companies, “One place that nanotechnology and measuring single molecules will be important is in neurochemistry.”
When such small volumes are involved in an analysis the signals received are concomitantly small. Labelling an analyte with a glowing tag or an enzyme can boost the signal and cover up the noise without the need for soundproofing. For those chemists working in genomics another approach has been analyte amplification: if you don’t have enough of the stuff, just make more! The Nobel-winning PCR (polymerase chain reaction) boosts the tiny amounts of nucleic acids being studied to the point at which they can be sequenced, or at the least mapped.
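PCR's power comes from exponential doubling: each thermal cycle, ideally, copies every template molecule present. A minimal sketch of the arithmetic, with illustrative numbers:

```python
def pcr_copies(initial: float, cycles: int, efficiency: float = 1.0) -> float:
    """Copies of a DNA target after a number of PCR cycles.

    efficiency = 1.0 means perfect doubling each cycle;
    real reactions fall somewhat short of that."""
    return initial * (1.0 + efficiency) ** cycles

# Ten starting molecules, 30 ideal cycles: over ten billion copies
print(pcr_copies(10, 30))  # 10737418240.0
```

Thirty cycles thus turn a yoctomole-scale starting sample into a comfortably measurable quantity.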
According to William Heineman of the University of Cincinnati, “The key to achieving zepto- and yocto-mole detection limits is to combine two or more strategies.” He says that the likes of capillary electrophoresis, a separation technology that can work on the tiniest of scales but with big molecules such as proteins, can be combined with laser techniques, such as laser-induced fluorescence, to push the limits down.
To achieve the ultimate detection limits, high sensitivity must be combined with selectivity and minimisation of blank signals. High sensitivity can be achieved by an inherent gain mechanism, most frequently provided by an electron multiplier or photomultiplier. Selectivity excludes signals from interfering species and can be achieved by a separation method, such as gas chromatography, liquid chromatography or electrophoresis, by the selectivity of enzyme or antigen reactions, or by spectroscopic selectivity, particularly in atomic spectroscopy.
Contamination of the sample with the analyte (or loss of analyte from the sample) must be minimised by careful sample handling, use of clean reagents and even the use of clean-room facilities. McMahon sees these areas as the way forward to the ultimate detection limit: “These three themes,” he posits, “indicate the directions for future research effort.”
There are a few spare Latin prefixes left for the ambitious analyst looking to go lower than zepto and yocto although a sample containing less than one molecule of analyte is perhaps going a bit far.
For the etymologists among you: zepto is an adaptation of the Latin for seven, with the s changed to a z to avoid confusion with other esses and the ending swapped for an o to make it consistent with nano, pico and the rest. Yocto has similar roots in the Latin octo (eight), but with the y added to clarify the abbreviation.
M. A. Cousino, T. B. Jarbawi, H. B. Halsall and W. R. Heineman, Anal. Chem., 1997, 69, 544A–549A.