Thread: Evolution
Old 18th July 2000, 11:59     #163
Fred
 

Boofhead and Purple Kush:

With respect to K-Ar dating I should mention that I have actually worked on a mass spectrometer doing this kind of dating. The work required that I understand the theory and practice behind the method, so I feel I can speak from a position of knowledge on the subject.

Let's explain a little about the process. As Yautja has mentioned, K-Ar dating relies on the assumption that the argon produced by the decay of natural potassium (the 40K isotope) is trapped from the point at which the rock cools, and that the decay follows a well-measured half-life. Remelting the rock releases the accumulated argon and destroys this relationship, so care must be taken that the rock was formed once and has since only been weathered.
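
To put some numbers on that, here is a rough Python sketch of the age equation using the standard published 40K decay constants. The function name and the example ratio at the end are mine, purely for illustration:

[code]
import math

# 40K decay constants (per year); standard published values.
LAMBDA_EC    = 0.581e-10    # electron-capture branch, 40K -> 40Ar
LAMBDA_BETA  = 4.962e-10    # beta branch, 40K -> 40Ca
LAMBDA_TOTAL = LAMBDA_EC + LAMBDA_BETA

def kar_age_years(ar40_radiogenic, k40):
    # Both arguments in the same units (moles or atoms); only the ratio matters.
    ratio = ar40_radiogenic / k40
    return math.log(1.0 + ratio * LAMBDA_TOTAL / LAMBDA_EC) / LAMBDA_TOTAL

# An invented 40Ar*/40K ratio of 0.01 comes out at roughly 164 million years.
print(round(kar_age_years(0.01, 1.0) / 1e6))
[/code]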

The rock sample is first powdered to aid gas release when heated, and then placed into a metal crucible in a vacuum sample holder ready for introduction into the instrument. The sample holder is pumped down, usually for around 10-12 hours, to get as hard a vacuum as possible. Because the pump-down takes so long, multiple samples are held in the one container to speed the overall measurement process up.

The sample is then heated by radio-frequency induction (the field heats the crucible, which in turn heats the sample) to release the gas. Care must be taken to pump down adequately between samples to avoid cross-contamination. The released gas is then vented into an inlet stage for the removal of non-inert gases, usually by a titanium sublimation getter. Failure to clean the sample up correctly introduces reactive gases into the mass spectrometer, which produce spurious readings at the electrometer.

Further complicating matters, the metal walls of the spectrometer itself leach impurities into the system, and these must be compensated for. The electrometer plate itself can have impurities and defects that alter its measuring properties. To compensate, two initial runs are done: a background and a standard. The background runs the mass spectrometer empty and measures the 'ambient noise' the instrument has by default. Good labs will often do daily mass scans across the entire magnet/high-voltage ioniser range, looking for leaks, as part of their routine.

The standard introduces a sample of fresh air, cleaned and purified down to its inert gases, as the default reference for the ratios expected to be observed. (Actually I'll have to do some checking; I might be confusing this with the Ar-Ar method, which definitely uses an air spike. There is definitely a default standard introduced in K-Ar dating though.) The standard is needed because the detector plate frequently has dead spots on it which lower the measured amount of a particular isotope. (This is usually more of a problem for magnet-deflection mass specs than for fixed-field machines, but both suffer from it.) Based on the standard, which has a known composition, a correction is derived after the background has been removed, giving a scaling factor to be applied to each isotope peak measured.
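
To give a feel for how the background and standard corrections might be applied in software, here is a minimal Python sketch. The isotope set, the classic air 40Ar/36Ar value of 295.5, and every beam reading below are my own illustrative choices; real reduction software does considerably more:

[code]
# Known atmospheric composition used to calibrate against, normalised to 36Ar.
ATMOSPHERIC = {"Ar40": 295.5, "Ar36": 1.0}

def correction_factors(standard, background):
    # Blank-correct the air-standard readings, then work out how much each
    # isotope peak has to be scaled to reproduce the known air composition.
    net = {iso: standard[iso] - background[iso] for iso in standard}
    return {iso: ATMOSPHERIC[iso] / net[iso] for iso in net}

def correct(sample, background, factors):
    return {iso: (sample[iso] - background[iso]) * factors[iso] for iso in sample}

background = {"Ar40": 0.0020, "Ar36": 0.0001}   # instrument running empty
standard   = {"Ar40": 1.5000, "Ar36": 0.0052}   # air standard run
sample     = {"Ar40": 2.4000, "Ar36": 0.0006}   # the unknown

print(correct(sample, background, correction_factors(standard, background)))
[/code]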

Furthermore, peak centering must be done frequently to ensure that the dead centre of each peak is being measured during a run. The Earth's magnetic field fluctuates enough to require rechecks of the centering, often daily, and the temperature of the instrument alters the strength of the deflection magnet (be it fixed-field or variable-field). These too must be compensated for; measuring even slightly off peak will produce wildly skewed results.
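
In essence the centering step boils down to something like the following sketch: sweep across the peak, find where the signal is actually centred, and update the set point. The sweep values here are invented:

[code]
def peak_centre(scan):
    # scan: (field_setting, intensity) pairs from a short sweep across the peak.
    # Returns the intensity-weighted centroid to use as the new set point.
    total = sum(intensity for _, intensity in scan)
    return sum(field * intensity for field, intensity in scan) / total

# Invented sweep: the peak has drifted to ~3.002 from a nominal 3.000 setting.
sweep = [(2.998, 0.2), (3.000, 0.8), (3.002, 1.0), (3.004, 0.7), (3.006, 0.1)]
print(peak_centre(sweep))
[/code]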

Finally, once all of this has been done, you can begin the measurement itself. Timing is important because the mass spectrometer itself acts as a filter, steadily drawing the sample away, so delays between sample inlet and the actual measurement of the run must be compensated for. Even more fun: if an air spike is being used, then because the standard is a fixed volume of gas, every tapping of it reduces the reservoir pressure and delivers a correspondingly weaker standard when inlet into the mass spectrometer. This too must be compensated for.
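
The spike-depletion correction itself is simple enough, something along these lines. The retained fraction per tap is a number I have made up; the real value depends on the reservoir and pipette volumes of the particular line:

[code]
def nth_spike_size(n, first_aliquot=1.0, kept_fraction=0.995):
    # Gas delivered by the nth tap of a fixed-volume pipette on a finite
    # reservoir: every tap leaves `kept_fraction` of the gas behind, so the
    # aliquots shrink geometrically. Both default values are invented.
    return first_aliquot * kept_fraction ** (n - 1)

for n in (1, 100, 500):
    print(n, round(nth_spike_size(n), 3))
[/code]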

Phew. My point in detailing the above procedure? To show that there are quite a few steps where measurement error, instrument error and handler error can creep in. Frequently this level of error means that the instrument is deliberately trusted to only a fraction of its total sensitivity to compensate.
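
As a rough illustration of how those individual uncertainties stack up, here is a sketch that combines a few of them in quadrature, on the assumption that they are independent. The step names and the numbers are invented:

[code]
import math

# Invented relative (1-sigma) errors for a few of the steps described above.
step_errors = {
    "background subtraction": 0.005,
    "standard normalisation": 0.004,
    "peak centering drift":   0.003,
    "amplifier noise":        0.002,
}

combined = math.sqrt(sum(e ** 2 for e in step_errors.values()))
print(f"combined relative error ~ {combined:.1%}")
[/code]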

To make matters worse, the sample gases are tiny and the currents they produce at the electrometer are correspondingly small. We are talking on the order of pico-amperes _after_ signal amplification. And the amplification introduces noise of its own, which must also be compensated for.
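
To see how small that really is, convert a collector current into an ion arrival rate:

[code]
ELEMENTARY_CHARGE = 1.602e-19   # coulombs per singly charged ion

def ions_per_second(current_amperes):
    # Ion arrival rate implied by a given collector current.
    return current_amperes / ELEMENTARY_CHARGE

# A 1 pA beam is only about 6 million singly charged ions per second.
print(f"{ions_per_second(1e-12):.2e}")
[/code]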

This is a very delicate process, and one that has pretty well-known limits. Expecting it to date rocks 20-25 years old is asking the machine, on a standard sample size, to quite literally count individual atoms. That is well beyond the useful resolving power of even the most sensitive mass spectrometers, and it is likely to remain so, simply because at that level the effect of even slight contamination is so great that the repeatability of the experiment is virtually nil - as observed in the 'test' conducted.
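
A quick back-of-the-envelope calculation shows why. The 1 gram sample size and 1% potassium content below are assumptions I have picked for illustration; even so, a 25-year-old rock has grown only a few tens of millions of atoms of radiogenic argon, a quantity that disappears under the blank and the atmospheric argon already in the system:

[code]
AVOGADRO      = 6.022e23
LAMBDA_EC     = 0.581e-10   # 40K -> 40Ar branch, per year
K40_ABUNDANCE = 1.17e-4     # fraction of natural potassium that is 40K
K_MOLAR_MASS  = 39.1        # g/mol

def radiogenic_ar40_atoms(grams_of_rock, weight_fraction_k, age_years):
    # Small-age approximation: 40Ar* ~ N(40K) * lambda_ec * t.
    k40_atoms = (grams_of_rock * weight_fraction_k / K_MOLAR_MASS
                 * K40_ABUNDANCE * AVOGADRO)
    return k40_atoms * LAMBDA_EC * age_years

# 1 g of rock with 1% potassium by weight, 25 years old:
print(f"{radiogenic_ar40_atoms(1.0, 0.01, 25):.1e} atoms of radiogenic 40Ar")
[/code]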

Of course, if the people conducting the test had understood the limitations of the method, their amazement at the results would not have been nearly as great. And if they had told the lab that the samples were less than 500,000-odd years old, any lab worth its salt would have warned that the results would be pretty meaningless for the reasons I have outlined, and would most likely have recommended a better-suited dating method.

|THAT|-fred
'fred is not dead, fred is resurrected!'
"It is only in the tales humans tell, that the hunters win in the end."