Retention time reference

You can discuss anything concerning Clarity in this section.
ABurness
Posts: 1
Joined: Wed Aug 27, 2014 11:23 am

Retention time reference

Post by ABurness »

Hi,

We are currently looking into having a Peak Table for each of our product methods, so that our production labs can identify the relevant impurities and solvent residues at a glance.
Obviously we need to build in sufficient robustness so that if the retention of the components shifts between runs we don't 'lose' peaks, but conversely we also can't afford to have the windows so loose that we wrongly assign peaks and assume solvent residues are lower than they are.
As part of our weekly testing of the systems, we run an SST, where retention times are one of the factors controlling pass/fail, so we can be fairly confident the retention times will not shift massively.
Looking at the options within Clarity, it appears that assigning the main product peak as a reference, and allowing it to adjust retention times with a smaller window, will solve these issues; however, there are a few questions:
• Presumably we need to set the method to 'auto' recalibrate rather than manual on the run?
• If the sample is run as a 'sample', will the retentions adjust automatically, or does it need to be run as a 'standard'?
• Does this adjust the calibration table permanently, or is it temporarily adjusted on a 'per-run' basis?
• Are there any other aspects we need to be aware of, or is there a better way of achieving the same result?
Many thanks,
Alex
Alex Burness

Daniel Mentlik
DataApex Support
Posts: 353
Joined: Fri Mar 27, 2009 3:15 pm

Re: Retention time reference

Post by Daniel Mentlik »

Dear Alex,
using a reference peak, as you suggest, is what I would advise as well. Let me answer your additional questions and suggest some points of interest when using this functionality:
ABurness wrote:• Presumably we need to set the method to ‘auto’ recalibrate rather than manual on the run?
The "Automatic/Manual" and "Calibration/Recalibration" settings only refer to actions performed in the Calibration window. When the sample type "Standard" is used in the Single Run dialog or the Sequence window, the action performed is always equivalent to the "Add Existing" operation under "Automatic Recalibration". A Manual operation is not possible because a sequence run is usually unattended and the next run must continue, and a Calibration (as opposed to Recalibration) is not possible because it would ignore the Replace/Average/Weight settings and simply replace the responses. Nothing other than "Add Existing" is possible either, because a recalibration cannot add new peaks to the compound table - those peaks would be missing the Amount value, so the calculation could not be performed anyway.
ABurness wrote:• If the sample is run as a 'sample', will the retentions adjust automatically, or does it need to be run as a 'standard'?
Yes, the retention times will be adjusted automatically regardless of the sample type. In fact, it is the calibration retention time windows that are recalculated, not the chromatogram retention times. Everything is governed by the calibration, which must be linked to the template method. The difference between the "standard" and "unknown" sample types (besides where the file is stored - "standards" in the /CALIB subdirectory, "unknowns" in the /DATA subdirectory) is what the linked calibration is used for. For "unknowns", the calibration in the template method is used only for linking it to the resulting chromatogram. For "standards", the calibration set in the template method is linked to the resulting chromatogram AND the responses of the calibrated peaks in the measured chromatogram are then used to recalibrate the calibration at the set level.
You can read more on calibration in the Clarity Guide; the topics on calibration Cloning and Bracketing may be especially interesting.
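To make the reference-peak idea concrete, here is a minimal Python sketch of the general technique: shift the expected retention times in proportion to how far the reference peak has moved, then match observed peaks inside the (now centred) windows. This is an illustration of the concept only, not Clarity's actual internal algorithm, and all function and variable names are my own assumptions.

```python
def adjust_windows(expected_rts, ref_expected_rt, ref_observed_rt):
    """Scale each expected retention time by the observed/expected ratio
    of the reference peak - a common relative-shift correction."""
    factor = ref_observed_rt / ref_expected_rt
    return [rt * factor for rt in expected_rts]

def assign_peaks(observed_rts, expected_rts, window):
    """Assign each expected compound the nearest observed peak that
    falls within +/- window of its (adjusted) expected retention time."""
    assignments = {}
    for i, exp in enumerate(expected_rts):
        candidates = [p for p in observed_rts if abs(p - exp) <= window]
        if candidates:
            assignments[i] = min(candidates, key=lambda p: abs(p - exp))
    return assignments

# Example: main product peak expected at 5.00 min but observed at 5.10 min,
# so all expected retention times are stretched by a factor of 1.02.
expected = [2.50, 5.00, 7.80]   # impurity, product (reference), solvent residue
adjusted = adjust_windows(expected, ref_expected_rt=5.00, ref_observed_rt=5.10)
print(assign_peaks([2.54, 5.10, 7.95], adjusted, window=0.10))
```

The point of the sketch is the trade-off you describe: a tight window around the adjusted positions still tolerates a systematic drift (captured by the reference peak) without being loose enough to swallow neighbouring peaks.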
ABurness wrote:• Does this adjust the calibration table permanently, or is it temporarily adjusted on a ‘per-run’ basis?
Using the sample type "Standard" or "Blank" (a calibration standard on level 0) modifies the calibration permanently - the changes persist for the next samples being measured. You can modify this behavior somewhat by using calibration cloning and by setting different options in the Calibration Options dialog, but generally this is how the calibration works (changes are cumulative - whatever you do with the calibration gets into it).
By modifying I mean, for example, that you can disable the calibration retention time shifts. This option is normally switched on and means that when you recalibrate, the retention times from the last recalibration standard replace the retention times in the calibration. You can switch it off by unchecking the Update Retention Times checkbox in the Calibration Options dialog.
Another possible modification is the way the recalibration is applied. By default, the new recalibration response value replaces the value currently in the calibration; various averaging options can be set in the Calibration Options dialog.
Last but not least, when using the classic approach to calibration without calibration cloning, opening an old chromatogram will link it to the current calibration (recalibrated several times since then), thus changing the results. However, each chromatogram can also be opened with the stored copy of the calibration from the time it was last saved.
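The Replace/Average/Weight distinction above can be sketched in a few lines of Python. To be clear, this is a conceptual illustration of what such update modes typically mean, not a reproduction of Clarity's documented formulas; the function name, mode names, and the exponential form of the weighted update are my assumptions.

```python
def update_response(current, new, mode="replace", n_prev=1, weight=0.5):
    """Update a stored calibration response with a new recalibration value.

    replace  - discard the stored value entirely
    average  - running mean over all recalibrations so far (n_prev = count)
    weighted - blend old and new, weight = influence of the new value
    """
    if mode == "replace":
        return new
    if mode == "average":
        return (current * n_prev + new) / (n_prev + 1)
    if mode == "weighted":
        return current * (1 - weight) + new * weight
    raise ValueError(f"unknown mode: {mode}")

print(update_response(100.0, 110.0, "replace"))                 # 110.0
print(update_response(100.0, 110.0, "average"))                 # 105.0
print(update_response(100.0, 110.0, "weighted", weight=0.25))   # 102.5
```

Averaging damps the effect of any single drifting standard, which is why it is often preferred over plain replacement when the calibration is recalibrated routinely.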
ABurness wrote:• Are there any other aspects we need to be aware of, or is there a better way of achieving the same result.
Not really - a fairly comprehensive description is available in the Clarity Guide. Please note that you can use multiple reference peaks if needed.

I hope this helps, do not hesitate to ask for more information.
Daniel Mentlík
DataApex
