
The real power of Systematic Analytical Errors,
the misconception of “Comparability ONLY”, and the fundamental problems caused by
Analytical Method (over)regulation

 

Systematically false analytical data fundamentally result in systematically false decisions.
This must be considered a natural law.

The forensic laboratory - or laboratories - of the FBI in the United States of America delivered systematically false analytical results (hair analyses) for years, which resulted in errors of judgement. Wrong verdicts led to 60 wrongful death sentences. Detailed information about further facts and background details was reported by the Washington Post on April 18, 2015. As these are shocking facts about the consequences of systematically false analytical data, radio stations and television in Germany reported on them on April 20 and 21, 2015. As a poor translation of such information into “broken English” can add further systematic errors, the author of this SITE (rek) uses his mother tongue and reproduces one of the German news reports he received through “HEUTE.de” on April 20, 2015 at 6:47:

“Experten der US-Bundespolizei FBI haben möglicherweise jahrzehntelang falsche kriminaltechnische Analysen geliefert. Fehlerhafte Haaranalysen könnten dazu geführt haben, dass seit den 70iger Jahren viele Angeklagte (in den USA) zu Unrecht verurteilt wurden, berichtet die “Washington Post”.
Meist habe es sich um Fälle von Mord oder Vergewaltigung gehandelt. Die Zeitung spricht von einem der größten forensischen Skandale in den USA. Bei den betroffenen Verfahren habe es auch 32 Todesurteile gegeben”

(In English: “Experts of the US federal police FBI may have delivered false forensic analyses for decades. Flawed hair analyses may have led to many defendants (in the USA) being wrongly convicted since the 1970s, reports the ‘Washington Post’. Most cases involved murder or rape. The paper speaks of one of the largest forensic scandals in the USA. Among the affected proceedings there were also 32 death sentences.”)

Well: the reports above differ in the number of death sentences, but this may also be explained by the fact that final proceedings differ in waiting times of up to many years. The Washington Post reported on serious laboratory errors caused by years-old and technically outdated regulations not only this year but also in 2012 and even in 1997.

A seriously misguided development of the last 20 to 30 years is method regulation across the whole field of analytical chemistry. Many bad results follow from this “systematically wrong political decision”, driven mainly by lobbyists who want to push their hardware and software into this vitally important field of analytical chemistry and keep it there for as long as possible. In the end, conditions develop which allow only and exclusively one given (preferably very expensive) instrument in, if possible, every regulated analytical method.

This is a good place to define such regulated methods as “overregulated”.

If overregulated methods exist for an analytical problem, the standard is to keep them untouched by decree and thus to block any progress, regardless of what happens globally or locally through invention or growing specialist knowledge. This holds especially if progress would result in lower analysis costs and more accurate and precise data. Especially in CHROMATOGRAPHY, if a constantly quality-checked (high-volume) product suddenly shows some or many more peaks after a method change than before, disaster and resistance to any improvement will follow, simply because of the new situation: “we produced wrong data”. Therefore the old method is kept unchanged.

As progress in analytical methods, instruments and materials is visible at any qualified national or international expert conference, the state of the art of (over)regulated analytical methods is “no change, no error testing” and thus a full-scale blockage which keeps the danger of systematically false analytical data at a high level. This may also be why even the methods that are not overregulated, and thus nearly all official analytical regulation, remain free of error testing.

Officially regulated methods have even reached this unbelievable “error” level: if the analytical data change because a “new” or “next” method shows a smaller repeatability standard deviation of the quantitative values, this change is not acceptable and the analytical values are declared as never having been measured.

Important in this respect is also the widely regulated concept of repeatability statistics: it is most often based on N = TWO data sets. If four repetitions were the rule, the analyst would see a possible value drift, which would tell him that the found mean AND the repeatability standard deviation are both false and that the reason for the data drift must be found. The analysis must be redone completely, at best with a complete change of the procedure. Well: N = 3 offers a first idea of whether there is a data drift. So N = 4 measurements must then follow.
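A minimal sketch (Python; the numbers, the function name and the half-split drift criterion are purely illustrative assumptions, not part of any regulation) of why N = 4 repetitions can reveal a drift that N = 2 never shows:

import statistics

def check_repeatability(values, drift_tolerance=0.5):
    # values: replicate results in measurement order (N >= 4 recommended).
    # drift_tolerance: fraction of the standard deviation above which the
    # difference between the two halves of the series is treated as drift.
    if len(values) < 4:
        raise ValueError("with fewer than 4 replicates a drift stays invisible")
    mean = statistics.mean(values)
    sd = statistics.stdev(values)   # repeatability standard deviation
    half = len(values) // 2
    drift = statistics.mean(values[half:]) - statistics.mean(values[:half])
    return mean, sd, abs(drift) > drift_tolerance * sd

# four repetitions of a quantitative result (illustrative values)
mean, sd, drifting = check_repeatability([10.02, 10.05, 10.11, 10.16])
if drifting:
    print("drift suspected: mean and sd are not trustworthy, redo the analysis")

If the drift flag is raised, both the mean and the repeatability standard deviation are untrustworthy, exactly as argued above.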

Error tests themselves are widely unknown. One sample only, one method only: this set of steps is blind to any systematic error. If there were more than one - sufficiently different - method, serious quantitative differences would raise an alarm, and in case of qualitative discrepancies there is no question: at least one systematic identity error exists which damages both the qualitative AND the quantitative results of an analysis.
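A sketch of such a two-method alarm (Python; the coverage factor of 3 and the benzene numbers are assumptions for illustration only):

def compare_methods(result_a, result_b, sd_a, sd_b, k=3.0):
    # result_a, result_b: quantitative results for the same sample from two
    # sufficiently different methods; sd_a, sd_b: their repeatability standard
    # deviations. A difference beyond k combined standard deviations is an alarm.
    combined_sd = (sd_a ** 2 + sd_b ** 2) ** 0.5
    return abs(result_a - result_b) > k * combined_sd

# illustrative: benzene in gasoline, weight %, by two independent methods
if compare_methods(1.20, 0.85, sd_a=0.03, sd_b=0.04):
    print("discrepancy: at least one of the two methods carries a systematic error")

Qualitative discrepancies (a peak found by one method but not by the other) are an even stronger alarm and need no statistics at all.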

Well: there are strict limitations in industrial mass analysis: time, costs and materials consumption, such as solvents. If methods were developed and accepted which drastically reduce time and material consumption, for instance by miniaturisation of instruments, a triple run under a double set of analytical modes would show that a concept producing analytical data with no systematic analytical errors is unbeatably better at reducing economic risk and protecting the analyst's reputation. Situations like the years-long disastrous analysis errors demonstrated by the US FBI, discussed at the start of this chapter, are avoided. What miniaturisation can do, for instance in planar chromatography, can be seen in µPLC (click here). In two ZDF films about Planar Chromatography - one made at the BKA labs in Germany, the second made in protest about the technique shown in Wiesbaden - IfC could show: the DRUG identification at the BKA labs was based on a 3600-second separation run, whereas for the identification of a cheque falsification made by IfC at the ZDF under their camera lamps, IfC needed 25 seconds of development time. ZDF = Zweites Deutsches Fernsehen, BKA = Bundeskriminalamt lab Wiesbaden, IfC = Institut f. Chromatographie.

The best mode and tools for inorganic and organic, industrial and environmental trace analyses is CHROMATOGRAPHY. What most chromatographers either do not know or - if they know it - do not like is the fact that chromatography is especially prone to errors.

Using more than one column or capillary in gas or column liquid chromatography already offers more methods, and if the capillaries or columns used are short and fast as well, and in addition are time-based switchable in series or in parallel [heart cutting], we even reach a qualitative analysis level of more than two methods. The danger of systematically false analytical results becomes clearly detectable through discrepancies. Thus especially quantitative errors caused by peak overlapping - a very frequent problem in GC and HPLC - are most often excluded.
There are lots of legal problems solvable this way.
If, for instance, a certain substance is banned in a main product or is strictly concentration limited, like benzene in gasoline, court cases are avoided when the heart-cut mode of separation delivers the correct and usually much smaller concentration of the substance in question.
Or suppose the real source of product quality trouble, so far investigated with a one-column system, seemed to be a seven-peak impurity group, but heart cutting finds 5 additional overlapped traces: all quality improvement effort can now be directed by a truly substance-correlated action. In this way the author could help the “big synthetic fiber industry” by showing that “the trouble” was caused by oxygen traces in vacuum distillations and not by a limited distillation effect. In the latter case one would have had to enlarge the distillation columns again and again and reduce the production scale. In the former case - oxygen - one had to learn that no inlet-outlet valve is ever really completely air tight. Thus only flushing nitrogen around such inlet-outlet systems solves it.
Years of daily 24-hour multi-process chromatography could be ended and replaced by industrial (high temperature) heart-cut GC. A single capillary connected to a standard packed-column GC already did it.

Let us try to define what systematic errors are, where they come from, and how they can happen.

Error Definition


The wrong place, time and mode of taking a sample from the bulk material to be analyzed can cause both qualitative and quantitative systematic analysis errors.

Most often, GIVING a sample is prone to errors. DNA or protein analyses are falsified if sample material is transferred with standard pipette tips, because of specific protein sorption. Anti-sorption plastics as pipette tip material, as well as in other plastic-based tools, reduce this systematic sample falsification problem [www.mt.com/RaininLR].
 

The widely used glass syringes for taking, manipulating and delivering samples in chromatography specifically sorb parts of the sample material. One may believe that, if the sample is an aqueous one, clean water cleans the syringe. But the next sample may have a composition almost exactly like the former one, yet desorb the adsorbed specific substances slightly more strongly. If just these materials have to be quantified, the result will be systematically falsified. This was the reason for a huge analytical mass disaster caused by official (and exclusively regulating) laboratories in the “Diethylene Glycol Wine” trouble in the nineties. As quite a few newspaper publishers in Germany blindly believe that official state laboratories never produce systematically false analyses, our report was rejected. Only a protest via a top TV station got the ball rolling.
Years later, in court cases, winemakers told the judges that about 90 % of all official analyses were wrong (much too high values, which implies wine falsification by illegal addition of DEG) and had thus caused the closing down of thousands of winemakers' production sites in Europe. DEG = diethylene glycol, a natural trace in every original wine.
The author of these lines was responsible for giving public expert reports to judges, got officially closed wineries reopened, and used a non-regulated diethylene-glycol-in-wine quantification technique which was a full factor of ten more sensitive than the officially regulated one, using a really sensitive quantitative detector instead of the officially regulated MS detection procedure.
He could show that diethylene glycol is a natural compound existing already in grape leaves - if one can measure it sensitively enough, below the 1 ng/L value - and could publicly help to stop the nonsense of a weeks-long TV, radio and newspaper series of shocks to wine friends about this supposedly dangerously toxic compound in a fine culture product.

Even 10 grams of DEG in one liter of wine never did any harm, but DEG was misused by one or a few wine falsifiers to push the price of simple low-quality wines drastically up by sweetening them with DEG. This was done by the barrel and was illegal in every respect. The old classical wine analysis methods could not identify this quality falsification.

 
                             
Comparability Only
 
Comparability Only was (and partially still is) the big, systematically wrong message.

In this case method regulation is a must, even the use of overregulated methods. Producers as well as users of products need ONLY analytical modes which compare products qualitatively and quantitatively, without any checking for a possible systematic error. Avoiding or reducing court cases in the event of detectable discrepancies in the “numbers” was the legal background of this development, which started in the seventies of the last century.

Therefore strictly identical (and overregulated) methods have to be the only analytical practice. Every laboratory controlling the quality of incoming or outgoing chemical products has to use exactly the same method.
If the numbers found are equal, there is no reason for any further quality or even precision control effort. This is a disaster in practice. In nature, only the real composition - down to traces even below the ppm and ppb level - determines the final, realistic result. In this way even the pharma industry took and used main products from critical sources. The drastic move to outsource quality control labs, or to reduce the in-house quality control effort, had its critical impact, caused by this wrong “politics” of “Comparability Only”. Comparability Only stopped method and instrument development and blocked any improvement (“do not touch our rules!”). It pushed experts out of their jobs, as special knowledge and analytical research had no future. This downgraded the science of Analytical Chemistry. Even the number of teaching centers was reduced, and thus product falsifiers have had a good time ever since “Comparability Only” started.



Special Method Regulation

Just one example about a trace in WINE that is considered toxic, and a short report of what resulted from two wrong official statements:

1.: The substance diethylene glycol [DEG] is toxic.

2.: If it is detected, even at its lowest detectability level, it must have been illegally mixed into the wine, as “theoretically” (for biochemical reasons) nature does not produce DEG among the roughly 10,000 traces we regularly find in red, white or rosé wine.

Both statements are wrong. The author of these lines developed a capillary GC method using a highly sensitive flame ionization detector, which could quantify DEG traces even in the grape leaves, and so in all types of natural wine, provided he avoided using the “special method” officially mandated as the ONLY one when checking wine for DEG. His “IfC” method reached a lowest detectability level a factor of 10 below the official one. He used a flame ionization detector instead of the mass spectrometer, which is especially insensitive for DEG, and a top-separation silica capillary impregnated by himself.

Why discuss this topic here?
It makes the danger of systematic errors in analytical chemistry understandable: the one error of stating “DEG is toxic, and if found in wine it has been illegally added to a life product, falsifying its market value” leads to completely wrong legal decisions when, in fact, DEG is a natural trace in natural wine. Well: some falsifiers who added DEG illegally, in amounts up to grams per liter, can be identified by DEG concentrations far above the natural one. But if the official method itself produces values that are too high, it looks as if we have a falsified product in the bottle. (Wine quality is checked in the bottle, not in a barrel - by regulation.)

So for about a year millions of consumers got daily alarm messages from newspapers, radio and television about which wine was illegally “intoxicated”, already based on falsely quantified data near the natural level. This damaged not only the reputation of wine but thousands of winemaker jobs. Their production sites were closed down, or they got into economic trouble because shocked former customers feared a poisoned wine.

The second systematic error made by a few official laboratories, as mentioned above, was external calibration. The analysts did not realize that DEG sorbs quite strongly on glass. The calibration solution of DEG (in water?) kept DEG specifically sorbed on the syringe glass surface. Only wine, with its acids plus ethanol, desorbed this “calibration” DEG, thus falsifying the real sample composition. The finally found DEG value in regular samples was then higher than a natural one and looked like illegal falsification. At least this was the basis for the statement of the winemakers in the final court session about DEG in wine: “90 % of the official DEG analysis data had much too high, wrong DEG values”.
A systematic “mass error”, followed by false legal decisions, caused quite some trouble in quite a few European countries.

The author of these lines helped a winemaker to get his winery reopened, leaving him free to continue producing top-quality wine with its low natural DEG concentration.

The editor-in-chief of Bad Duerkheim's local newspaper refused to publish the data and the story because he was sure that officially measured data, like a DEG value, can never be wrong. He insisted: official analytical data are ALWAYS error free. This is yet another type of systematic error, a pure “belief statement”, and again a basis for systematically false decisions.
Only with the help of a protest report published by the largest German TV station did we get the ball rolling towards what finally became quite a huge court case. The author of these lines, however, had to demonstrate the complete analysis nearly in public, together with a certified food analyst, a lawyer and an additional technical co-worker, running through the whole procedure: from opening the officially sealed sample bottle formerly used by the “official wine laboratory”, through sampling, separation, quantitation and calculation of the correct DEG value. Result: the correct DEG value was significantly lower than the officially measured one and far outside any critical range. The officially sealed bottle was only opened “illegally” after a judge gave his agreement to this public experiment. The quantitation error of far more than a factor of ten could thus be found and witnessed. It started a protest wave with a whole series of follow-up troubles, including lots of analyses for interested winemaker groups. This also included more than 100 wines selected for one Duerkheimer Wurstmarkt - all 100 wines had only their natural trace concentration of DEG, but the timing was critical: at that time we were in the largest public “DEG scandal” period, with its mass information about “Frostschutz (antifreeze) intoxication of German wine”.
By the way: NONE of the roughly 600,000 wine festival visitors, or of its at least 300,000 more dedicated wine drinkers and friends, had any question about this “Frostschutz”. Only TV journalists had questions at this 10-day festival and looked frustrated that there was no story to report.

By the way: every larger chemical company is free to use its own specific analytical methods for all internal measurements. Only when data are to be discussed and used outside internal company questions does the regulated, or even overregulated, analysis method have to be used - until we succeed in stopping this source of additional possible systematic errors and putting all method knowledge onto the internet.

Everything can be found in everything, provided the analytical methods used are sensitive enough.
Everything is toxic above a certain concentration level - including even super-clean water, which can be deadly dangerous.

Many substances declared toxic to life may, below certain limits, be necessary for healthy life, especially in inorganic chemistry. So we need an optimal, very small concentration of mercury.
In real life no product around us is a clean chemical singularity; in fact everything around us is a substance mixture. But toxicity is often measured quantitatively only with non-natural test mixtures, neglecting the fact that in a natural mix of byproducts the toxicity may differ drastically, so that we again get systematically falsified results, with all the consequences of false data.

Thus the low levels of many critical substances that are declared toxic, most often only politically, may themselves be systematically wrong, due to the methods under which toxicity was actually determined in series of animal-based test runs. Often unrealistic test mixtures produce, in this way, yet more systematic errors.

Is this water drinkable?

So what can we do with drinking water tests using chromatography, which separates critical test mixtures and finally delivers the 128 (or 290) toxic trace level data telling us: “not drinkable”?

In the past 100 years we have succeeded in pushing first some thousands, now about 30,000 pharma products into water resources. Waste water sources may easily be too strongly in contact with million-year-old clean, and thus drinkable, ground water.
We found that the production regulations of an engine-producing company sent a critically toxic mix of metal-cleaning solvents (mainly organic halogen compounds) into the ground water resource; it was simply disposed of into the ground. From nearby, “clean mineral water” was bottled and sold in huge numbers. As the (over)regulated water quality test method did not cover methylene chloride and/or chloroform traces, only sensitive taste effects alarmed the environmental analyst, who quickly could (and had to) stop the drinking water bottle production.

About 30,000 pharma products of today enlarge, through biodegradation and other effects, the number of organic traces in formerly drinkable water resources. About the toxicity of these 300,000 traces we know nothing, and there is no hope of keeping up, or even of starting anything useful, to reduce the systematic error of a widely incomplete quality analysis of most of our ground water sources which are in contact with larger industrial areas.
No detector, including the best MS, NMR, UV, fluorescence etc., would guarantee a sensitive enough signal for identification and quantitation, even if multi-dimensional chromatographic separation is applied prior to detection.

No computer program would make these types of data analysis readable or understandable. Our answer to this analytical problem, which looks important today and will surely be of utmost importance tomorrow, when drinkable water becomes too scarce, is this:

NO more separation. Sum up before any quantitation. If we see nothing when testing possibly drinkable water quality this way, we may need only global bio-testing using dirt-sensing animals. By Micro Planar Chromatography we could enrich the organic traces from a full one-mL sample, at room temperature in a clean air flow, onto the thin layer plate. This enriches the water impurities into a sharp circle line. If in IR, UV or fluorescence light we see no substance circle, it looks as if this water has a chance of being or becoming drinkable, once sensitive bio-tests agree. So far, no water sample has been “circle free”.

If, however, we see strong dirt signals in the focussed impurities after a few minutes of PLC enrichment, it is a waste of time and energy to report any result besides this one: not drinkable, or, in case we are already in water trouble: continue cleaning this water. More under µPLC .
Although, as mentioned, chromatography is especially prone to analytical errors, let us nevertheless summarize the analytical possibilities and limitations at the current level of knowledge with respect to systematic errors:

 

Chromatography


Chromatography often looks (today, in high-quality product brochures) like a simple “you only need to push a button” mode of analysis, with no special knowledge necessary. It looks as if clever computer software “does it”. Neither is true.
Software is the most important helper of the chromatographer, but software cleverness is missing. The correctness of the found results depends on the analyst's knowledge, which must cover the whole range: identifying the analytical problem; choosing the correct time, place, source and tool for taking representative samples; and preparing the samples, still correlated with the identified problem, for analysis. This provides the really needed precision and still the accuracy needed to solve the identified problem - of course as quickly as needed, and error tested.

Chromatography is the most widely applicable analytical mode for answering questions about qualitative and quantitative substance composition and about the trace concentration of specific substances in a synthetic bulk material, in pharma products, in water and in air, to mention the top areas.

Whereas in the last century, at least back to the fifties, only trace concentrations below tenths of a weight percent had an impact on the quality control of materials and products, the industrial mass production of materials such as pharma products today pushes the demand on analytical quality control further and further down to the ppm, even the ppb, and finally to the ppt concentration level.

Already tenths of a percent of the wrong optical isomer in a pharma main product could cause a global disaster for the health of millions of newborn babies, although even today the critical discussion continues as to whether an optical isomer at tenths of a weight percent could really have caused this health disaster.

There are very many sources of and reasons for imprecise or inaccurate analytical data. Sufficiently precise and complete analytical results are the basis for correct decision making, which gives the analyst high responsibility. Sometimes the “data readers” are far away from the chromatographer and his sample-taking problems (mode, place, time, frequency ...), so that the necessary information chain is broken and wrong decisions become possible, causing serious trouble. Thus top analysts must oversee the complete question and data chain, from identifying the problem to the decisions made by non-analysts based on the analytical results; only in this way can serious systematic errors be recognized. Otherwise we have data producers who cannot carry responsibility.

Chromatography data can show excellent precision and nevertheless be completely wrong. It takes many years of broad experience in this very wide range of chromatography applications, and very intensive work of one's own in theory, instrumentation and data handling. “Experts” who proudly declare “I have never touched a syringe, I do not need it” are certainly not the best partners to help. Those who declare one theoretical concept to be “the ONLY one” had (probably) no time to read, understand and try the next, easier, more practice-oriented or even more accurate theoretical concept. They have, however, little chance of getting in touch with the latter, as the most serious systematic errors in analysis are (over)regulations, as already mentioned in many places.

Analytical method information should be concentrated in “know how” sources, available to all labs and analysts globally. “Over”-regulated methods have nothing to say in the analyst's lab and head.

Systematic Error Testing

Here is only part of a list of error testing modes which may have broader applicability. The list cannot cover special, substance-dependent error tests.

Water traces in natural gas can cause expensive gas pipe corrosion. Thus correct water concentration data are important and even subjects of court cases. We found that, on a quite dry metallic gas pipe wall, changing water concentrations could be quantified correctly only by a series of consecutive analyses using computerized, fully automatic gas analysers - in our case only after the seventeenth consecutive run: 17 runs, 170 minutes of steady sample gas flow, in a 1.0 mm i.d. smooth, clean steel capillary!
Thus the residence time of specific substances on solid surfaces has to be known for constantly flowing samples, even on well-known surface materials. A statistically safe TREND analysis is a must.
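Such a trend check can be as simple as a least-squares slope over the last consecutive runs; the following sketch (Python; window size, slope limit and the ppm readings are illustrative assumptions) reports a value only once the series has stopped drifting:

def trend_slope(values):
    # least-squares slope of consecutive run results (run index as x)
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def stabilized(values, window=5, slope_limit=0.01):
    # True when the last `window` runs no longer show a meaningful trend
    return len(values) >= window and abs(trend_slope(values[-window:])) < slope_limit

# illustrative ppm water readings climbing towards the wall-equilibrated value
runs = [2.1, 2.8, 3.3, 3.6, 3.73, 3.75, 3.76, 3.76, 3.76, 3.76]
print("report the value" if stabilized(runs) else "keep measuring")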

Chromatography acts always and everywhere where two or three phases interact, that is, wherever differing phase conditions coact: for example an aqueous sample in a glass syringe. Sorption and displacement are chromatography controlled and will happen. This can drastically change the sample composition, and we will be confronted with sample memory effects. Replacing the standard high-precision glass syringes in Planar Chromatography by micro brushes therefore quite remarkably improved the sample correctness with respect to traces, reduced sample memory and reduced sampling errors, as a brush - despite a quite strong substance sorption effect on the brush fibers - can easily be brought into equilibrium with the sample by quick brush rotation in the sample bulk. So sample taking and sample delivery must be checked under real chromatography conditions long before the final chromatographic analysis, and the change of quantities in such steps should be known exactly enough.

If the sample complexity is small enough and all identified compounds are available as clean individual substances, a synthetic quantitative mix can be made and compared with the original sample. If the two analyses are done and the chromatograms look quantitatively equal within the needed precision, please send the author an e-mail. There will be no server overload.
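A minimal sketch of such a comparison (Python; the relative tolerance and the normalized peak areas are illustrative assumptions):

def mixtures_agree(sample_areas, synthetic_areas, rel_tolerance=0.05):
    # both dicts map peak (substance) names to normalized areas; the two
    # chromatograms agree when they show the same peaks and every peak
    # matches within rel_tolerance
    if set(sample_areas) != set(synthetic_areas):
        return False   # qualitative discrepancy
    return all(abs(sample_areas[k] - synthetic_areas[k]) <= rel_tolerance * sample_areas[k]
               for k in sample_areas)

# illustrative normalized peak areas of the original sample and the synthetic mix
sample    = {"A": 0.52, "B": 0.31, "C": 0.17}
synthetic = {"A": 0.53, "B": 0.30, "C": 0.17}
print("no discrepancy found" if mixtures_agree(sample, synthetic)
      else "systematic error suspected")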

If all individual components are known quantitatively, then a micro elemental quantitative analysis of C, H, N, O, S, halogens etc. should give the same result as the value calculated from the chromatographic composition. In this way industrially critical analysis methods could be corrected and optimized.
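A sketch of this cross-check (Python; atomic masses rounded, and the ethanol/propanol mixture with its “measured” values is purely illustrative):

ATOMIC_MASS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999, "S": 32.06}

def mass_fractions(formula):
    # formula: element -> atom count, e.g. ethanol {"C": 2, "H": 6, "O": 1}
    total = sum(ATOMIC_MASS[el] * n for el, n in formula.items())
    return {el: ATOMIC_MASS[el] * n / total for el, n in formula.items()}

def mixture_fractions(components):
    # components: list of (weight fraction, formula) from the chromatographic result
    result = {}
    for w, formula in components:
        for el, f in mass_fractions(formula).items():
            result[el] = result.get(el, 0.0) + w * f
    return result

# calculated elemental composition of the quantified mixture ...
calc = mixture_fractions([(0.7, {"C": 2, "H": 6, "O": 1}),    # ethanol
                          (0.3, {"C": 3, "H": 8, "O": 1})])   # 1-propanol
# ... compared with a micro elemental analysis (illustrative values)
measured = {"C": 0.546, "H": 0.131, "O": 0.323}
ok = all(abs(calc[el] - measured[el]) < 0.01 for el in measured)
print("calculated and measured agree" if ok else "check the chromatographic quantitation")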

More than one column in GC or HPLC can be run with two differing mobile phases - for instance in GC with He and H2, or H2 and N2. If the carrier gas mix is fed not only into the sampling system but also into the connection point of the series-coupled columns, and repeated chromatographic runs under changed gas mixtures are compared, the separation selectivity shows drastic changes, as chromatography is TIME and not single-column polarity dependent - see the chapters “Chromatography Combination” and “Alternative Chromatography Theory” in this SITE. A changed gas flow in the two series-coupled GC columns or capillaries, a changed mobile phase flow speed in series-coupled HPLC columns or capillaries, and changed temperatures along the nonpolar/polar column series do the same, with drastic selectivity changes from run to run, optimizing the specificity of separations and fighting peak overlapping.

If the outlet of a chromatography column or capillary in GC or HPLC is fed into a short, coupled high-speed capillary whose temperature is changed from low to high for each separated peak, or often enough along the whole peak, the overlap receives an add-on separation and produces thousands of follow-up chromatograms as Multichromatography. Well: now grandfather's line chromatograms become unreadable and we need a two-dimensional picture of the peaks-under-peaks analysis. This procedure can be run so sensitively that sea water analyses showed over a thousand “peaks” as a two-dimensional picture.

If the eluate of one column / capillary is refocussed and injected - peak after peak - into a next chromatography system having a different stationary and maybe even a different mobile phase, peak overlapping - the source of sometimes disastrously false analytical results - is resolved, and the whole analytical power of Chromatography can be used. At top international conferences you find training courses on GC x GC, HPLC x HPLC and GC x HPLC, but not yet on GC x PLC or HPLC x PLC, which are also powerful methods. One HPLC peak can be completely stored and focussed on a PLC plate, up to a full one-milliliter volume. The mobile phase of this quite large volume is dried off at room temperature and the HPLC peak substance is refocussed into a sharp, narrow circle. Now this circle is reseparated by a volatile mobile phase in the micro circular mode of planar chromatography. Hopefully you see only one sharp circle of the former HPLC peak; otherwise the HPLC substance is a mix. MS online with micro PLC is possible with high sensitivity. See the MS/PLC training courses in Switzerland and Germany. Some information you will find here at µPLC .

If more than one detector is connected, dead-volume free, at the outlet of chromatographic systems, we add the power of FID with ECD, FID with MS, or MS with MS, and in case the problem is not trace analysis we even succeed with FID plus NMR, MS plus UV, or IR plus heat conductivity detection.

Now compare this wealth of analytical modes with “comparability only” (over)regulated methods.

And think of the discrepancy between the costs of analysis and the costs of systematically false analytical results plus the follow-up wrong decisions, whether politically wrong with global impact or just for a local court case. One to millions can easily be the ratio between the cost of getting the information and the cost of using wrong results.

Well: we can overlook a life-threatening ppt concentration of toxic traces killing life in sea water, or we reach only a ±0.5 weight % repeatability standard deviation instead of ±0.05 weight % for an important substance concentration in an important product, which is nevertheless still unsafe by nearly 0.2 weight %.

Analysts have a lot of responsibility and need the freedom to use their top knowledge, which has to cover a much wider range than just pushing one button.