The main topic of the whole site is

the Detection, Reduction and Correction of Errors in


Analytical Chromatography.

This covers systematic errors in qualitative and in quantitative

GC [gas chromatography, including standard and fast micro capillary chromatography]
HPLC [liquid column chromatography, including micro HPLC]
PLC [planar chromatography: linear HPTLC and TLC, circular and anticircular HPTLC]

We developed a new mode of error detection in chromatography based on the quantitative quality number “sf4” )1,

which is a special type of the “one-step-forward” repeatability standard deviation.
This allowed us first to detect systematic quantitative errors and then to reduce them step by step; in micro capillary gas chromatography, for example, down to the previously not believed level of ±0.002 % absolute repeatability standard deviation for one main compound. It is important to realize that a small repeatability standard deviation is directly, and linearly, correlated with reduced analysis costs, but above all with product costs and economic success. The possible question “who needs such a small repeatability standard deviation” is therefore fundamentally wrong.
With the “sf4” procedure we located serious systematic quantitative errors in the planar chromatography of pharmaceutical products and in natural gas analysis. This new data quality check number “sf4” also helped in all other chromatography areas, including the development and correction of the latest sophisticated micro process GC instruments.

To get “sf4” data one must repeat an analysis at least 7 times (N = 7), but N = 20 runs offer data which allow a deep critical look into hardware and software details. This costs material and manpower, but poses no problem if automatic sampling repetition and a fast chromatography technique are available. This may look like making analytical work too expensive. But “sf4” is used rarely, and whoever understands the importance of energy costs, as for example in the precision analysis of natural gas, will realize that accuracy is quite valuable for the energy bill. Only high precision allows one to reach high accuracy; thus excellent precision pays back fast. In the mass delivery of quality-analyzed products, improving the repeatability standard deviation from a poor ±2 % to an excellent ±0.002 % can bring the producer very much money per day, provided he can economically use high precision and accuracy on the market.

Top analytical accuracy and precision also help the user of a product.

In order to bring precision data directly and linearly into correlation with money, we propose a further control value, the Standard Certainty. Asking GOOGLE with the key words “analytical certainty” and “analytical uncertainty” and looking into the thousands of coexisting definitions, one may understand that the use of the negative expression “uncertainty” may be regulated, but that critical thinking in this respect has not been completed. The positive expression “Standard Certainty” has reason to exist.

We therefore define and introduce the “Standard Certainty” )2 value, see the formula below, which is, according to our results, a much better number with which to qualify quantitative analytical data than the many differently regulated “uncertainty numbers”. GOOGLE tells what Analytical Uncertainty means globally. For the Standard Certainty formula click here.

)1 : “sf4” data are stepwise calculated repeatability standard deviation values, each based on four consecutively repeated values. The groups of four overlap, shifted by one value each, and result in N-3 final “sf4” values; see in this site under “sf4”. These quality values are time correlated and therefore show facts unseen by one single standard deviation “s”. In fact there is NO information in a single “s” value about many of the error sources, such as changed sample composition, changed pressure, flow, temperature, polarity, selectivity and so on. Checking “sf4” values over time in a graphics display switches on critical thinking. For instance: why is there a huge value hump during the first 5 to 7 values, while at higher numbers of repetitions “sf4” is small and constant? Why a steep increase at the end of a series of 15 to 20 strictly equal runs? Of course there is some calculation work to be done, but we have table calculation software like EXCEL.
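The stepwise calculation described above can also be sketched in a few lines of code (a minimal illustration; the function name and the example peak areas are ours, not from this site):

```python
from statistics import stdev

def sf4_series(values):
    """Sliding repeatability standard deviation over groups of four
    consecutive repeated results; each group is shifted by one run,
    giving N-3 "sf4" values for N input values, as described above."""
    if len(values) < 4:
        raise ValueError("sf4 needs at least 4 repeated results (N >= 7 recommended)")
    return [stdev(values[i:i + 4]) for i in range(len(values) - 3)]

# Example: 8 repeated peak areas (hypothetical numbers) give 5 sf4 values.
areas = [100.02, 99.98, 100.01, 100.00, 100.03, 99.99, 100.02, 100.01]
print(sf4_series(areas))
```

Plotting these N-3 values against the run number gives the time-correlated display discussed in the text.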

NOTE however: Microsoft's Excel 2003 has a serious weakness. It allows (and provokes) the use of semicolon and minus characters as delimiters, instead of only the colon, between the first and the last measured value in s-, mean and sf4 calculations. The use of the semicolon ends up with mathematical errors (wrong numerical calculation by excluding series data) easily ranging from -20 to +80 % relative. The same weakness was found in the Apple AppleWorks 6 and, under LINUX, the OpenOffice.org 1.1 table calculation programs.

)2 : The “Standard Certainty” = stC stands in contrast to the regulated “Uncertainty”. “stC” plus the found analytical value “u” must be considered equal to the value “V”: “stC + u” equals “V” with 99 % probability. We could also name it the “analytical non-sharpness”. Thus if a legal contract about delivering an important substance is fixed to the value W weight-% (or any other quantitative unit), the value W plus “stC” is practically equal to W, because there is no measurable difference between W and W + “stC”. In the case of environmental analyses, or of other legal limits which a certain substance concentration or amount must not exceed, the value “W minus stC” must not be reached, as there is no 99 % safe difference between W and W minus stC.

The fundamentals behind this definition of the Standard Certainty “stC” are given in “Statistics”, but its simple formula is given already here:

          stC = s * t(99, N-1) / sqrt(N)

          s = repeatability standard deviation of analysis
          t(99, N-1) = 99% Student factor; note that N-1 degrees of freedom must be taken
          N = number of repeated analyses
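A worked sketch of this formula (our own illustration; the Student factors are standard two-sided 99 % table values, listed here only for a few N):

```python
from math import sqrt

# Two-sided 99 % Student's t factors, keyed by N-1 degrees of freedom
# (standard table values; only a few common N are covered here).
T99 = {6: 3.707, 9: 3.250, 14: 2.977, 19: 2.861}

def standard_certainty(s, n):
    """stC = s * t(99, N-1) / sqrt(N) for the N values listed in T99."""
    return s * T99[n - 1] / sqrt(n)

# Example: s = 0.002 % absolute from N = 7 repeated runs.
print(round(standard_certainty(0.002, 7), 5))
```

For the contract example in footnote )2, W + stC computed this way is the smallest value that differs from W with 99 % confidence.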

Sources for qualitative errors in GC, HPLC, PLC and sister modes:

The mobile phase flow speed and the temperature of the separation system are strong factors changing the quality values in chromatography. These are TIME-dependent values. The residence time in the mobile phase (symbol: tm) adds to the residence time in the stationary phase (symbol: ts). The sum of both time values (symbol: tms) is called the raw, non-corrected, non-adjusted retention time. Thus tms = tm + ts. But only ts is correlated with separation. Quality data based on “tms” only, or on relative tms data, are prone to systematic errors. Therefore qualitative inner standards are often used to make identifications safer. However, only ONE qualitatively known inner standard is not enough: two must be added to a sample and co-chromatographed, then qualitative results become safer.
Only top quality chromatography instruments have temperature-controlled electronic pressure and flow controllers which handle the gas pressure at the millibar level, and only highly qualified mobile phase pumps in HPLC can provide a constant flow of that precision level which results in qualified chromatography data.

Complex samples may consist of many very similar chemical compounds which may not show differing “ts” values. Whole groups of substances may this way remain non-separated; their peaks “overlap”. Perfect overlapping results in serious qualitative errors, as even the best highly substance-specific detector, like the mass spectrometer (even as MS/MS online with the separation system), gives wrong answers to the quality question. There are some statements in the literature that spectroscopic detectors like MS, UV, IR solve such overlapping problems. This is simply NOT true in case peaks overlap precisely enough, that is, at the same retention position.

Quantitative error sources are:
- weak integration software which cannot decide accurately enough between the baseline of the chromatogram, the start and the end of a peak, and small peaks close to a large one, either as shoulders or as signals “sitting on the main peak tail”;
- detectors (and signal amplifiers) with too small a quantitative working range (see under working range);
- detectors with fundamentally non-linear calibration characteristics;
- missing or incorrect substance- and detector-specific correction factors for getting weight from peak area;
- as quantitative values depend on correct substance identification, we need accurate and sharp retention time data. Only top integration software is able to calculate the precise retention time value, based on fifth-degree polynomial interpolation of 7 signal-height-over-time values around the peak top. We could show that a retention time precision of ±0.01 second in 5 to 20 minute long runs per sample is necessary for precision identification.
- as quantitative data accuracy depends on the absence of peak overlapping, a very accurate value of the peak width at half height (b05), in seconds down to ±0.001 second, is a very helpful warning tool. Especially in high resolution chromatography, above all in isothermal and temperature-programmed GC, a graphics display of peak width data over the retention time is most informative. Only very qualified integration software is able to get accurate b05 data. The calculation of peak width values based on the assumption that the peak shape is Gaussian is useless. It is not, see later in this site.
- detectors measure substance-specifically. Thus it is possible that even a main compound remains invisible. This causes drastic systematic quantitative errors if not corrected, either by chromatogram combination or by quantitative inner standard techniques.
- columns in GC and HPLC can chemisorb parts of the sample. This effect is a further source of quantitative errors if not kept under control by inner standard techniques.
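The peak-top interpolation named in the list above can be sketched as follows (a minimal illustration of our own using NumPy; the seven-point window and the fifth-degree fit follow the text, while the function name and the test data are assumed):

```python
import numpy as np

def peak_top_time(t, y):
    """Estimate the retention time of a peak maximum by fitting a
    fifth-degree polynomial to the 7 sampled points around the highest
    raw data point, then locating the maximum of the fitted curve."""
    i = int(np.argmax(y))
    lo = max(i - 3, 0)
    ts, ys = t[lo:lo + 7], y[lo:lo + 7]
    t0 = ts[3]                       # center the axis for numerical stability
    coeffs = np.polyfit(ts - t0, ys, 5)
    roots = np.roots(np.polyder(coeffs))
    # keep real roots of the derivative inside the 7-point window,
    # take the one with the largest fitted signal value
    real = [r.real for r in roots
            if abs(r.imag) < 1e-9 and ts[0] - t0 <= r.real <= ts[-1] - t0]
    return t0 + max(real, key=lambda r: np.polyval(coeffs, r))

# Hypothetical Gaussian peak, apex at 123.456 s, sampled every 0.05 s.
t = np.arange(120.0, 127.0, 0.05)
y = np.exp(-0.5 * ((t - 123.456) / 0.8) ** 2)
print(round(peak_top_time(t, y), 3))
```

On this noiseless sketch the fitted apex recovers the true peak time far more sharply than the 0.05 s sampling raster alone would allow, which is the point of the interpolation.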

The latest chromatography symposia especially feature the comprehensive separation techniques, which mean “total separation” using multidimensional modes. The end of a column or capillary is connected with a second one using either different stationary or mobile phases. The attached columns or capillaries are so short and so fast that, during the elution of a substance from the first separation system, several consecutive and complete separations of timely “cut” peak parts are done, detected, integrated and stored. Classical records of chromatograms are no longer possible, but with two-dimensional spot graphics with intensity colors in the graphics display the analyst gets a “picture information”. Two-dimensional planar chromatograms and comprehensive GC or HPLC runs look similar. It may soon be realized that this is no longer accurate analysis for getting a single answer to a single question. Humans can make use of only very few information portions. 10 000 peak data are a good answer to complexity questions, and chemical class information is available easily and is also often important. But maybe that is it. There is much to do to reduce those data drastically, but without loss of applicable information.

But as overlapped peaks most often give wrong analytical answers, it remains: clean single peaks with a POP value of 100 % are needed (POP = purity of peak).
Thus use at least two-dimensional, better multidimensional, separations NEVERTHELESS.
It may be possible that instrument companies learn from modern computer tomography techniques (for example from magnetic resonance techniques and their data taking/storing concepts) how to take 20 to 200 high speed graphics into computer memory and make this huge bunch of data available to the analyst, using just those techniques which allow the medical doctor to see immediately, by a few mouse movements, what he wants to understand.
It might also be that instrument companies change their restriction on the possible precision and accuracy of retention time taking and storing. It is easy to reach a retention time precision of ±0.001 second despite noise, one order of magnitude “sharper” than mentioned above. This is possible with qualified data statistics to get the peak maximum by interpolation after clever noise reduction, without falsification of the real retention time value. Already a fifth-degree polynomial interpolation of only seven raw data points around the peak maximum does it. Precision retention data in well controlled chromatography (if temperature, pressure, flow precision or programming is perfect) then become such a powerful tool for positive identification that, as only one example, GC/MS/MS no longer sounds so important in the routine analysis field. Well: the chromatographer should always keep economic aspects in his background thinking, and reasons why the better mode is the enemy of the less good one.

There are several modes of multi-chromatography techniques known, but often they are not available as commercial products. This is the reason to show one of the many possibilities for micro HPLC, see below.

As mentioned above:

Elution chromatography is time based; now the details.
Classical theoretical concepts, however, are fixed on retention volumes, which are no longer measurable in micro systems, and time is considered to be less important; static single-substance values such as partition coefficients count.
Only planar chromatography is done under conditions of an equal separation time valid for all substances, as no elution is needed. The author repeats the “time basis” because only this makes the figure below understandable.
In elution chromatography, gas chromatography GC and column liquid chromatography HPLC, the separation costs time between the second the sample is injected and the second the signal of a separated substance is measured. Let us call this total chromatography time of any separated substance the raw retention time, as used above and still not yet generally accepted, with the symbol tms.
tms is the sum of two time portions. The first is the residence time in the mobile phase, equal for all substances; we use the symbol tm (tm = residence time in the mobile phase).
A chromatographic separation of differing substances in elution chromatography is only possible if their residence time in the stationary phase differs.
We use the symbol ts (ts = residence time in the stationary phase).
As the time part tm is equal for all substances, there is no separation possible for differing substances with equal ts values. Thus the art of successful chromatography is based on the knowledge of how to make ts values differ for the substances to be separated. As time not only costs money, superior chromatography is fast. But a sample must be taken, introduced and completely molecularly dissolved in the mobile phase. If the complete dissolution into the mobile phase costs time, the ts part of a substance is no longer a sharp value but becomes a time window, starting with the first sample part going into solution or into the mix with the mobile phase, up to the last part of the substance completing the transfer into the mobile phase. This time window overlaps with the ts value differences of differing substances. It deteriorates an otherwise possibly good separation and causes at least quantitative, but also qualitative, systematic errors.
Thinking in time values, we have much better possibilities to understand and master chromatography than thinking in static fundamentals like solubility, partition coefficients and other classical models of theoretical chromatography, which preferably use single-substance data. Chromatography is a highly dynamic process. One enemy is equally active in GC, HPLC and PLC: diffusion, which acts against separation. As separation is possible only by movement, the homogeneity of the local flow speed of the mobile phase is of utmost importance. Local non-homogeneity acts against separation, as it causes differing local flow speeds and thus enlarges the time window of substances with equal ts values. Let us use the term “peak width b05” for “time window” and understand b05 as the peak width at half of the peak height. There is a further reason to master chromatography exclusively time based: time can today be measured extremely accurately and precisely at very low cost. Volume and/or flow cannot be measured accurately and easily.
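In code form the time balance is trivial, but it shows the point that only ts carries separation information (a minimal illustration; names and numbers are our own):

```python
def ts_from(tms, tm):
    """Adjusted residence time in the stationary phase: ts = tms - tm.
    tm (the mobile phase residence time) is equal for all substances
    in one run, so only ts differences can separate substances."""
    return tms - tm

tm = 30.0                      # s, dead time, same for every peak in the run
tms_a, tms_b = 125.0, 125.0    # equal raw retention times
tms_c = 128.5
print(ts_from(tms_a, tm), ts_from(tms_b, tm), ts_from(tms_c, tm))
# A and B share ts = 95.0 s, so no tm value can separate them; C differs.
```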

The figure below shows a capillary HPLC concept with two chemically differing micro packings connected in series. Above the 200 mm long capillary, packed with a polar and an apolar stationary phase, we position a 100 mm long tube oven (an electrically heated metal tube with the twin LC capillary inside). We inject a sample and keep the hot tube in position. We inject the next sample but move the tube, prior to the start of the next chromatogram, by one mm from the apolar to the polar direction. Injecting a hundred samples and moving the tube a hundred times by one millimeter, we change the selectivity of the stationary phase in a hundred steps from highly polar to highly apolar, although chemically the mobile and stationary phases remain completely unchanged. It is easy to understand the function of this physically caused 100 % change of the stationary phase selectivity if we think in time instead of solubility, sorption, partition coefficients and so on.
What changes is the residence time of the substances in the stationary phase, because of the phase temperature. If the apolar phase is hot and the polar phase is cold, the residence time in the apolar phase is small and in the polar phase it is large. Thus the column changes its polarity or selectivity, and the change is very strong.
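A toy numerical model may illustrate the effect (this model is entirely our own assumption, not the author's math: we take the residence time per millimeter as proportional to k0 * exp(B/T), a van't Hoff-like temperature dependence, with arbitrary numbers):

```python
from math import exp

def ts_total(pos_hot, k0_apolar, k0_polar, B=2000.0,
             t_hot=350.0, t_cold=300.0, scale=0.001):
    """Toy model: 200 mm tandem capillary, mm 0-99 packed apolar,
    mm 100-199 polar. A 100 mm hot zone starts at pos_hot (mm);
    residence time per mm is scale * k0 * exp(B / T_local)."""
    ts = 0.0
    for mm in range(200):
        k0 = k0_apolar if mm < 100 else k0_polar
        T = t_hot if pos_hot <= mm < pos_hot + 100 else t_cold
        ts += scale * k0 * exp(B / T)
    return ts

# A substance retained mainly by the polar packing: moving the hot zone
# from the apolar end (0) over the polar end (100) shortens its ts strongly,
# with unchanged mobile and stationary phase chemistry.
for pos in (0, 50, 100):
    print(pos, round(ts_total(pos, k0_apolar=0.2, k0_polar=1.0), 1))
```

Even this crude sketch reproduces the qualitative behavior described above: the oven position alone tunes the effective selectivity.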

This has nothing to do with the column temperature effect well known in HPLC but not yet widely used. A changed physical position of the hot tube changes the overall polarity of this “tandem capillary” by a third-order function of relative ts values in relation to the local temperature, whilst the column temperature in HPLC is a first-order effect on the retention time tms.


Mechanical (fine) adjustment of the polarity / selectivity of an HPLC double column or capillary system
with constant mobile phase composition. This concept is so simple and safe that it would be possible to fine-tune the polarity of the stationary phase by computer action. That would offer auto-optimization, with the additional benefit of programmed separations at constant mobile phase composition. As known, this allows for the best possible quantitation conditions, as no baseline or selectivity drift can happen. This drift, which depends on the mobile phase composition, is one of the reasons why HPLC is quite weak in trace analysis applications.

Retention time and identification.
As tm values have nothing to do with separation, identification is based exclusively on ts values. But as we can only measure “tms = tm + ts” values, we sometimes MUST find out the correct tm value. IfC has a perfectly working concept for this.

With well designed, well treated and perfectly used equipment, the retention time of a peak is its first address for identification. The more precisely the time value is found by a qualified integration program, the more power identification modes have, if based on comparison. Successful is the use of retention time windows within which a quantitative time comparison is done. In gas chromatography the best selection for window based identification are retention index ranges. The time window from retention index 100 up to 1800 is experimentally found using the saturated normal hydrocarbons C1 to C18. Not all stationary phases in GC, however, are qualified for hydrocarbon separations. Instead of those, saturated normal fatty acid methyl esters can be used; in critical cases of highly polar substance analyses, saturated normal-chain fatty alcohols are applicable. They keep data tables for auto-identification stable for many months in mass routine analyses. Using precise time data, the retention index can be calculated to an accuracy and precision of better than 0.1, not seldom better than 0.01 retention index units, thus offering an identification sharpness good for more than a thousand compounds within one carbon number range.


How the retention index data are calculated depends on the mode of GC chromatography: in strictly isothermal runs the retention index is calculated by linear interpolation of the log(ts) values, and already two consecutive test hydrocarbons are enough for correct values within the time window given by the two homologues. Under programmed conditions at least seven consecutively selected test hydrocarbons, like C10 up to C17, are needed, and the calculation is done using fifth-degree polynomial interpolation of seven time data within the tms values for C10 up to C17 (for all of the eight peaks) as an example.
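The isothermal case can be sketched as follows (our own illustration; the formula is the classical Kovats interpolation of log(ts) between two consecutive n-alkanes, and the example times are assumed):

```python
from math import log

def kovats_index(ts_x, ts_z, ts_z1, z):
    """Isothermal retention index by linear interpolation of log(ts):
    z is the carbon number of the n-alkane eluting just before the
    substance; ts values are adjusted (stationary-phase) times."""
    return 100.0 * (z + (log(ts_x) - log(ts_z)) / (log(ts_z1) - log(ts_z)))

# Hypothetical adjusted times: C10 at 120 s, C11 at 240 s, substance at 170 s.
print(round(kovats_index(170.0, 120.0, 240.0, 10), 2))
```

With retention times precise to ±0.01 s, indices from this interpolation reach the sub-0.1-unit sharpness mentioned above.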
There is an easy to understand, most simple correlation of the retention index with the separation power, expressed as a value which is applicable in gas, column liquid and planar chromatography under isocratic and programmed conditions. This value is named the “Trennzahl” (separation number, introduced in the past century by the author). TZ is widely unknown in HPLC and in PLC. TZ is much more useful than values like the theoretical plate number or the plate height. TZ depends on the retention time window where it is measured; thus accuracy and precision are also needed when measuring and maximizing the separation power of GC and HPLC columns / capillaries using the TZ maximum.
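A sketch of the TZ calculation (our illustration, using the classical definition TZ = (tR2 - tR1) / (b05,1 + b05,2) - 1 between two consecutive homologue peaks; the example numbers are assumed):

```python
def trennzahl(tr1, b05_1, tr2, b05_2):
    """Separation number TZ between two consecutive homologue peaks:
    roughly how many peaks fit between them at half-height resolution.
    TZ = (tR2 - tR1) / (b05_1 + b05_2) - 1"""
    return (tr2 - tr1) / (b05_1 + b05_2) - 1.0

# Hypothetical homologue pair: C10 at 300.0 s, C11 at 360.0 s, both b05 = 1.5 s.
print(round(trennzahl(300.0, 1.5, 360.0, 1.5), 1))
```

Because TZ depends on both retention times and b05 values, the precision requirements for both quantities discussed earlier apply here as well.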

More details about the retention index, the HPLC k-value and the “Trennzahl” are given on the specific pages about errors in GC, HPLC and PLC.

[Home] [We can help] [Systematic C-Errors] [Statistics] [Error Detector "sf4"] [Sampling/Calibration] [Qual.Error GC] [Quant.Error GC] [Qual.Error HPLC] [Quant.Error HPLC] [Qual.Error PLC] [Quant.Error PLC] [Integration] [Chrom. Combination] [µPLC Micro Planar LC] [Altern.Chrom.Theory] [Contact IfC] [About the Author]