The main topic of the whole site is the
Detection, Reduction and Correction of
SYSTEMATIC ERRORS in chromatography.
This covers systematic errors in qualitative and in quantitative
GC [gas chromatography, including standard and
fast micro capillary chromatography],
HPLC [liquid column chromatography, including micro HPLC], and
PLC [planar chromatography: linear HPTLC and TLC,
circular and anticircular HPTLC].
We developed a new mode of error detection in chromatography
based on the quantitative quality number “sf4” )1,
which is a special type of “one-step-forward” repeatability standard deviation.
This allowed us first to detect systematic quantitative errors and then to reduce them step by step; in micro capillary gas chromatography, for example, down to the previously unbelievable level of ±0.002 % absolute repeatability standard deviation for one main compound. It is important to realize that a small repeatability standard deviation is directly and linearly correlated with reduced costs, not only in the analysis itself but above all in product costs and economic success. The possible question “who needs such a small repeatability standard deviation?” is therefore fundamentally wrong.
With the “sf4” procedure we located serious systematic quantitative errors in the planar chromatography of pharmaceutical products and in natural gas analysis. This new data quality check number “sf4” has helped in all other chromatography areas as well, including the development and correction of the latest sophisticated micro process GC instruments.
To get “sf4” data one must repeat an analysis at least 7 times (N = 7), but N = 20 runs yield data which allow a deep critical look into hardware and software details. This costs material and manpower, but it is no problem if automatic sampling repetition and a fast chromatography technique are available. It may look like making analytical work too expensive. But “sf4” is used only rarely, and whoever understands the importance of energy costs (for example in the precision analysis of natural gas) will realize that accuracy is quite valuable for the energy bill. Only high precision allows high accuracy to be reached; thus excellent precision pays back fast. In the mass delivery of quality-analyzed products, improving the repeatability standard deviation from a poor ±2 % to an excellent ±0.002 % can bring the producer a great deal of money per day, provided he can economically exploit high precision and accuracy on the market.
Top analytical accuracy and precision also helps the user of a product.
In order to correlate precision data directly and linearly with money, we propose a further control value, the Standard Certainty. Asking GOOGLE with the key words “analytical certainty” and “analytical uncertainty” and looking into the thousands of coexisting definitions, one may understand that the use of the negative expression “uncertainty” may be regulated, but that critical thinking on this point is not yet complete. The positive expression “Standard Certainty” has good reasons to exist.
We therefore define and introduce the “Standard Certainty” )2 value (see the formula below), which is, according to our results, a much better number for qualifying quantitative analytical data than the many differently regulated “Uncertainty Numbers”. GOOGLE tells what Analytical Uncertainty means globally. For the Standard Certainty formula click here.
)1 : “sf4” data are stepwise calculated repeatability standard deviation values, each based on four consecutively repeated values. The groups of four are shifted forward by one value at a time (so consecutive groups share three values) and result in N-3 final “sf4” values; see under “sf4” on this site. These quality values are time correlated and therefore show facts invisible in one single standard deviation “s”. In fact a single “s” value contains NO information about many error sources such as changed sample composition, changed pressure, flow, temperature, polarity, selectivity and so on. Plotting the “sf4” values over time in a graphics display switches on critical thinking. For instance: why is there a huge value hump during the first 5 to 7 values, while at higher numbers of repetitions “sf4” is small and constant? Why a steep increase at the end of a series of 15 to 20 strictly equal runs? Of course there is some calculation work to be done, but we have table calculation software like EXCEL.
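The stepwise calculation described above can be sketched in a few lines of Python. This is our reading of the definition (a window of four consecutive results, shifted forward by one result at a time, N-3 values in total); the function name and the example numbers are illustrative, not from the original site.

```python
# Sketch of the "sf4" one-step-forward repeatability standard deviation.
# Assumption: each sf4 value is the sample standard deviation of a group
# of four consecutive results, the group shifted forward by one result.
from statistics import stdev

def sf4_series(values):
    """Return the N-3 time-ordered 'sf4' values for N repeated results."""
    if len(values) < 4:
        raise ValueError("sf4 needs at least 4 repeated results")
    return [stdev(values[i:i + 4]) for i in range(len(values) - 3)]

# Example: seven repeated area-% results for one main compound (N = 7)
runs = [98.012, 98.015, 98.011, 98.014, 98.013, 98.016, 98.012]
print(sf4_series(runs))   # N - 3 = 4 sf4 values
```

Plotting the returned list against the repetition index gives exactly the kind of time-correlated display discussed above: a hump in the first values, or a rise at the end of the series, points to a systematic error source that a single overall “s” would hide.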
NOTE however: Microsoft's Excel 2003 has a serious weakness. It allows (and even provokes) the use of semicolon and minus characters as delimiters, instead of only the colon, between the first and the last measured value in s-, X- and sf4 calculations. Using the semicolon ends up in mathematical errors (wrong numerical results, because series data are excluded) easily ranging from -20 to +80 % relative. The same weakness was found in the table calculation programs of Apple AppleWorks 6 and, under LINUX, of OpenOffice.org 1.1.
)2 : The “Standard Certainty” = stC stands in contrast to the regulated “Uncertainty”. “stC” plus the found analytical value “u” must be considered equal to the value “V”: thus “u + stC” equals “V” with 99 % probability. We could also call it the “analytical non-sharpness”. Thus if a legal contract about delivering an important substance is fixed to the value W weight-% (or any other quantitative unit), the value W plus “stC” is practically equal to W, because there is no measurable difference between W and W + stC. In the case of environmental analyses, or others based on legal limits which a certain substance concentration or amount must not exceed, the value “W minus stC” must not be reached, as there is no 99 % safe difference between W and W minus stC.
The fundamentals behind this definition of the Standard Certainty “stC” are given under “Statistic”, but its simple formula is already given here:
stC = s * t(99, N-1) / sqrt(N)
s = repeatability standard deviation of analysis
t(99, N-1) = 99% STUDENT factor, taken for N-1 degrees of freedom
N = number of repeated analyses
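As a worked example of the formula, the following sketch computes stC for a small series. The short t-table holds standard two-sided 99 % Student factors for a few degrees of freedom; for arbitrary N a statistics library (e.g. scipy.stats.t) could supply them instead. Function name and numbers are ours, for illustration only.

```python
# Hedged sketch: stC = s * t(99, N-1) / sqrt(N)
from math import sqrt

# Two-sided 99% Student's t factors, df -> t (standard textbook values)
T99_TWO_SIDED = {6: 3.707, 9: 3.250, 19: 2.861}

def standard_certainty(s, n):
    """Standard Certainty from repeatability standard deviation s of n repeats."""
    t = T99_TWO_SIDED[n - 1]      # N-1 degrees of freedom must be taken
    return s * t / sqrt(n)

# N = 7 repeated analyses with s = 0.01 (absolute):
print(round(standard_certainty(0.01, 7), 5))  # 0.01401
```

The example shows the point made above: with seven repeats, the 99 % Standard Certainty is only about 1.4 times the repeatability standard deviation itself, so a small “s” translates directly into a small stC and a sharp, money-relevant analytical value.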