The IUPAC has defined chromatography as "a method, used primarily for the separation of the components of a sample, in which the components are distributed between two phases, one of which is stationary while the other moves. The stationary phase may be a solid, or a liquid supported on a solid, or a gel, and may be packed in a column, spread as a layer, or distributed as a film. The mobile phase may be gaseous or liquid".
A modern HPLC apparatus is equipped with one or more glass or stainless steel reservoirs, each of which contains 500 ml or more of solvent. The reservoirs are often equipped with a means of removing dissolved gases, usually O2 and N2, which interfere by forming bubbles in the column and detector systems. These bubbles cause band spreading and, in addition, interfere with the performance of the detector.
Some HPLC instruments are equipped with a precolumn, which contains a packing chemically identical to that in the analytical column. The particle size is larger, so the pressure drop across the precolumn is negligible with respect to that across the analytical column. The precolumn serves mainly to remove impurities from the solvent and thus prevent contamination of the analytical column.
Often the limiting factor in the precision of liquid chromatographic measurements lies in the reproducibility with which samples can be introduced onto the column packing. It must also be noted that overloading the column with sample causes band broadening.
HPLC columns are made of high-quality stainless steel, polished internally to a mirror finish. Standard analytical columns are 4-5 mm in internal diameter and 10-30 cm in length. Shorter columns (3-6 cm in length) packed with small particles (3-5 µm) produce similar or better efficiencies, in terms of the number of theoretical plates (about 7000), than those of 20 cm columns containing 10 µm irregular particles, and are used when short analysis times and high sample throughput are required. Microbore columns of 1-2 mm internal diameter and 10-25 cm in length offer the advantages of lower detection limits and lower solvent consumption, the latter being important when expensive HPLC-grade solvents are used.
9.1.10 Detectors: [69-72]
The most widely used detectors for liquid chromatography are based upon the absorption of ultraviolet or visible radiation. Both photometers and spectrophotometers are available from commercial sources. The former often make use of the 254 nm and 280 nm lines from a mercury source, because many organic functional groups absorb in this region. Deuterium or tungsten filament sources with interference filters also provide a simple means of detecting absorbing species; some modern instruments carry several filters that can be rapidly switched into place. Spectrophotometric detectors are considerably more versatile than photometers and are also widely used in high-performance instruments. Often these are diode-array instruments that can display an entire spectrum as an analyte exits the column. Another detector that has found considerable application is based upon the change in the refractive index of the solvent caused by analyte molecules; in contrast to most of the other detectors, its drawback is a somewhat limited sensitivity. Several electrochemical detectors have also been introduced that are based on potentiometric, conductometric and voltammetric measurements.
The signals from a detector are recorded as deviations from a baseline. Two-pen recorders are used with instruments having two detectors. The position of a peak along the curve relative to the starting point identifies the particular component; with proper calibration, the height or area of the peak is a measure of the amount of that component in the sample.
9.1.12 Chromatographic Parameters:
9.1.12.1 Retention time (tR):
This is the time from injection to the emergence of the peak maximum of the component. It is the sum of the time the component spends in the mobile phase (tM) and the time it spends in the stationary phase (t'R): tR = tM + t'R.
9.1.12.2 Adjusted retention time (t'R):
It is the time the component spends in the stationary phase and is given by

t'R = tR - tM
The value of tM is obtained by measuring the time to elute an unretained substance, e.g. air or methane.
The capacity factor (k') is the ratio of the time the component spends in the stationary phase to the time it spends in the mobile phase:

k' = t'R / tM
9.1.12.3 Retention volume (VR):
This is the volume of carrier gas required to elute one half of the compound from the column, i.e. the volume passed up to the peak maximum, and is given by

VR = tR x f

where f is the flow rate of the mobile phase.
9.1.12.4 Adjusted retention volume (V'R):
This allows for the gas hold-up volume of the column, which is due to the interstitial volume of the column and the volumes of the injector and detector systems. It is given by

V'R = t'R x f
9.1.12.5 Relative retention volume:
Retention volumes for compounds are often expressed relative to the retention volume of a standard compound examined on the same column under the same conditions. This ratio is given by

Vrel = V'R(compound) / V'R(standard) = t'R(compound) / t'R(standard)
Relative retention volumes can therefore be represented by ratios of distances on the recorder chart and are the same as relative retention times.
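As a worked illustration of the relations above, the sketch below computes t'R, k', VR and V'R from hypothetical values (tR = 6.0 min, tM = 1.5 min, f = 1.0 ml/min), chosen only to show the arithmetic:

```python
def retention_parameters(t_r, t_m, f):
    """Basic retention quantities from the retention and hold-up times."""
    t_r_adj = t_r - t_m            # adjusted retention time t'R = tR - tM
    k = t_r_adj / t_m              # capacity factor k' = t'R / tM
    v_r = t_r * f                  # retention volume VR = tR x f
    v_r_adj = t_r_adj * f          # adjusted retention volume V'R = t'R x f
    return t_r_adj, k, v_r, v_r_adj

t_r_adj, k, v_r, v_r_adj = retention_parameters(t_r=6.0, t_m=1.5, f=1.0)
print(t_r_adj, k, v_r, v_r_adj)  # 4.5 3.0 6.0 4.5
```

With these numbers the component spends three times as long in the stationary phase as in the mobile phase (k' = 3).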
9.1.12.6 Height equivalent to a theoretical plate (HETP):
The column is considered as being made up of a large number of parallel layers or 'theoretical plates'. As the mobile phase passes down the column, the components of a mixture distribute themselves between the stationary and mobile phases in accordance with their partition coefficients, so that equilibrium is established within each plate. The equilibrium, however, is dynamic, and the components move down the column at definite rates depending on the rate of movement of the mobile phase.
A column may thus be considered as being made up of a large number of theoretical plates in each of which distribution of the sample between the liquid and gas phases occurs. The number of theoretical plates (n) in a column is given by the relationship

n = 16 (tR / W)^2 = 5.54 (tR / W1/2)^2

where W = peak width, i.e. the segment of the peak base formed by projecting the straight sides of the peak to the baseline, and W1/2 = peak width at half height.
Chromatographers measure the quality of a separation by the resolution (Rs) of adjacent bands:

Rs = 2 (t2 - t1) / (W1 + W2)

where t1 and t2 are the retention times of the first and second adjacent bands and W1 and W2 are their baseline band widths.
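The plate-number and resolution expressions can be evaluated in the same way; the peak times, widths and the 250 mm column length below are hypothetical values used only for illustration:

```python
def plate_count(t_r, w_half):
    """n = 5.54 (tR / W1/2)^2, from the peak width at half height."""
    return 5.54 * (t_r / w_half) ** 2

def resolution(t1, t2, w1, w2):
    """Rs = 2 (t2 - t1) / (W1 + W2), from baseline band widths."""
    return 2.0 * (t2 - t1) / (w1 + w2)

n = plate_count(t_r=10.0, w_half=0.2)              # about 13850 plates
hetp = 250.0 / n                                   # HETP = L / n for a 250 mm column
rs = resolution(t1=9.0, t2=10.0, w1=0.4, w2=0.4)   # 2.5, i.e. baseline separation
```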
9.1.12.7 Column efficiency (N):
Two related terms are widely used as quantitative measures of the efficiency of chromatographic columns: the number of theoretical plates (N) and the height equivalent to a theoretical plate (H, or HETP). The two are related by the equation

N = L / H

where L is the length of the column packing.
Selectivity measures the relative retention of two components; it is a function of the chromatographic surface (column), the mobile phase and the temperature.
9.1.13 HPLC METHOD DEVELOPMENT: 
HPLC method development often follows the series of steps summarized in fig no 9.1. A systematic approach to HPLC method development should be based on knowledge of the chromatographic process. In most cases a considerable amount of experimentation may be needed, but a good method development strategy should require only as many experimental runs as are necessary to achieve the desired final result.
1. Information on sample; define separation goals.
2. Need for special HPLC procedure or sample pretreatment.
3. Choice of detector.
4. Choose LC method; preliminary runs; estimate best separation conditions.
5. Optimize separation conditions.
6. Requirements for special separation procedures:
7a. Recovery of purified material
7b. Quantitative method
7c. Qualitative method
8. Validated method released to routine laboratories.
9.1.14 METHOD VALIDATION 
Method validation is defined as the process of proving that an analytical method is acceptable for its intended use. Method validation demonstrates that the developed method is specific, linear, precise, accurate and sensitive.
The different parameters of analytical method validation are discussed below:
9.1.14.1 Accuracy:

Accuracy is the measure of the exactness of an analytical method, i.e. the closeness of agreement between the value found and the value that is accepted either as a conventional true value or an accepted reference value. It is measured as the percent of analyte recovered by assay, by spiking samples in a blind study. For the assay of the drug substance, accuracy measurements are obtained by comparison of the results with the analysis of a standard reference material, or by comparison to a second, well-characterized method. For the assay of the drug product, accuracy is evaluated by analyzing synthetic mixtures spiked with known quantities of components. For the quantitation of impurities, accuracy is determined by analyzing samples (drug substance or drug product) spiked with known amounts of impurities. (If impurities are not available, see specificity.)
To document accuracy the ICH guideline on methodology recommends collecting data from a minimum of nine determinations over a minimum of three concentration levels covering the specified range (for example, three concentrations, three replicates each).
The data should be reported as the percent recovery of the known, added amount, or as the difference between the mean and true value with confidence intervals.
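The percent-recovery reporting described above amounts to a simple calculation; the spiked amounts below are hypothetical:

```python
from statistics import mean

def percent_recovery(found, added):
    """Percent of the known, added amount recovered by the assay."""
    return 100.0 * found / added

# Hypothetical spike results at one concentration level (mg added vs mg found)
added = [10.0, 10.0, 10.0]
found = [9.9, 10.1, 10.0]
recoveries = [percent_recovery(f, a) for f, a in zip(found, added)]
print(round(mean(recoveries), 1))  # 100.0
```

Under the ICH scheme this calculation would be repeated at each of the three (or more) concentration levels.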
9.1.14.2 Precision:

Precision is the measure of the degree of repeatability of an analytical method under normal operation and is normally expressed as the percent relative standard deviation (% RSD) for a statistically significant number of samples. According to the ICH, precision should be examined at three levels: repeatability, intermediate precision and reproducibility. Repeatability refers to the results of the method operating over a short time interval under the same conditions (intra-assay precision). It should be determined from a minimum of nine determinations covering the specified range of the procedure (for example, three levels, three repetitions each) or from a minimum of six determinations at 100 % of the test or target concentration. Intermediate precision refers to the results of within-laboratory variation due to random events such as different days, analysts, equipment, etc. In determining intermediate precision, an experimental design should be employed so that the effects (if any) of the individual variables can be monitored.
Reproducibility refers to the results of collaborative studies between laboratories. Documentation in support of precision studies should include the standard deviation, the relative standard deviation (coefficient of variation) and the confidence interval.
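The % RSD used to document precision can be computed as below; the six replicate results are hypothetical:

```python
from statistics import mean, stdev

def percent_rsd(values):
    """Percent relative standard deviation (coefficient of variation)."""
    return 100.0 * stdev(values) / mean(values)

# Six hypothetical replicate assay results at 100 % of the target concentration
replicates = [99.2, 100.4, 99.8, 100.1, 99.5, 100.0]
print(round(percent_rsd(replicates), 2))  # about 0.43
```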
9.1.14.3 Specificity:

Specificity is the ability to measure accurately and specifically the analyte of interest in the presence of other components that may be expected to be present in the sample matrix. It is a measure of the degree of interference from such things as other active ingredients, excipients, impurities and degradation products, ensuring that a peak response is due to a single component only, i.e. that no co-elution exists. Specificity is measured and documented in a separation by the resolution, plate count (efficiency) and tailing factor. Specificity can also be evaluated with modern photodiode-array detectors that mathematically compare spectra collected across a peak as an indication of peak homogeneity. The ICH also uses the term specificity and divides it into two separate categories: identification and assay/impurity tests.
For identification purposes, specificity is demonstrated by the ability to discriminate between compounds of closely related structures, or by comparison to known reference materials. For assay and impurity tests, specificity is demonstrated by the resolution of the two closest-eluting compounds. These compounds are usually the major component or active ingredient and an impurity. If impurities are available, it must be demonstrated that the assay is unaffected by the presence of spiked materials (impurities and/or excipients). If the impurities are not available, the test results are compared to those of a second well-characterized procedure. For impurity tests, the impurity profiles are compared head-to-head.
9.1.14.4 Limit of Detection:
The limit of detection (LOD) is defined as the lowest concentration of an analyte in a sample that can be detected, not quantitated. It is a limit test that specifies whether or not an analyte is above or below a certain value. It is expressed as a concentration at a specified signal-to-noise ratio, usually two- or three-to-one. The ICH has recognized the signal-to-noise ratio convention, but also lists two other options for determining the LOD: visual non-instrumental methods and a means of calculating the LOD. Visual non-instrumental methods may include LODs determined by techniques such as thin layer chromatography (TLC) or titration. LODs may also be calculated from the standard deviation of the response (SD) and the slope of the calibration curve (S) at levels approximating the LOD, according to the formula LOD = 3.3(SD/S). The standard deviation of the response can be determined from the standard deviation of the blank, from the residual standard deviation of the regression line, or from the standard deviation of the y-intercepts of regression lines. The method used to determine the LOD should be documented and supported, and an appropriate number of samples should be analyzed at the limit to validate the level.
9.1.14.5 Limit of Quantitation:
The limit of quantitation (LOQ) is defined as the lowest concentration of an analyte in a sample that can be determined with acceptable precision and accuracy under the stated operational conditions of the method. Like the LOD, the LOQ is expressed as a concentration, with the precision and accuracy of the measurement also reported. Sometimes a signal-to-noise ratio of ten-to-one is used to determine the LOQ. This ratio is a good rule of thumb, but it should be remembered that the determination of the LOQ is a compromise between the concentration and the required precision and accuracy: as the LOQ concentration level decreases, the precision decreases. If better precision is required, a higher concentration must be reported for the LOQ. This compromise is dictated by the analytical method and its intended use. The ICH has recognized the ten-to-one signal-to-noise ratio as typical and, as for the LOD, lists the same two additional options for determining the LOQ: visual non-instrumental methods and a means of calculating the LOQ. The calculation is again based on the standard deviation of the response (SD) and the slope of the calibration curve (S), according to the formula LOQ = 10(SD/S). Again, the standard deviation of the response can be determined from the standard deviation of the blank, from the residual standard deviation of the regression line, or from the standard deviation of the y-intercepts of regression lines.
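Both formulas can be applied to an ordinary calibration curve; in the sketch below the concentrations and responses are hypothetical, and SD is taken as the residual standard deviation of the regression line:

```python
def calibration(conc, resp):
    """Least-squares slope, intercept and residual standard deviation."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
    sd = (ss_res / (n - 2)) ** 0.5   # residual SD, n - 2 degrees of freedom
    return slope, intercept, sd

conc = [0.5, 1.0, 2.0, 4.0, 8.0]        # hypothetical concentrations, ug/ml
resp = [10.2, 19.8, 40.5, 79.9, 160.1]  # hypothetical detector responses
slope, intercept, sd = calibration(conc, resp)
lod = 3.3 * sd / slope   # LOD = 3.3(SD/S)
loq = 10.0 * sd / slope  # LOQ = 10(SD/S)
```

Because both limits share the same SD/S term, the LOQ computed this way is always about three times the LOD.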
9.1.14.6 Linearity and Range:
Linearity is the ability of the method to elicit test results that are directly proportional to analyte concentration within a given range. Linearity is generally reported as the variance of the slope of the regression line. Range is the interval between the upper and lower levels of analyte (inclusive) that have been demonstrated to be determined with precision, accuracy and linearity using the method as written. The range is normally expressed in the same units as the test results obtained by the method. The ICH guidelines specify a minimum of five concentration levels, along with certain minimum specified ranges. For an assay, the minimum specified range is 80-120 % of the target concentration. For an impurity test, the minimum range is from the reporting level of each impurity to 120 % of the specification (for toxic or more potent impurities, the range should be commensurate with the controlled level).
For content uniformity testing, the minimum range is 70-130 % of the test or target concentration, and for dissolution testing it is ±20 % over the specified range of the test. That is, in the case of an extended-release product dissolution test with a Q-factor of 20 % dissolved after six hours and 80 % dissolved after 24 hours, the range would be 0-100 %.
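For the assay case, the ICH minimum-range rule above reduces to a simple check; the helper below is a hypothetical illustration, not part of any guideline:

```python
def meets_ich_assay_range(levels_pct, n_min=5, low=80.0, high=120.0):
    """At least five calibration levels spanning 80-120 % of target (assay case)."""
    return len(levels_pct) >= n_min and min(levels_pct) <= low and max(levels_pct) >= high

print(meets_ich_assay_range([80, 90, 100, 110, 120]))  # True
print(meets_ich_assay_range([90, 100, 110]))           # False
```

The `low` and `high` defaults would be changed for content uniformity (70-130 %) or widened per the dissolution rule.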
9.1.14.7 Ruggedness:

Ruggedness, according to the USP, is the degree of reproducibility of the results obtained under a variety of conditions, expressed as % RSD. These conditions include different laboratories, analysts, instruments, reagents, days, etc. In its guideline on definitions and terminology, the ICH did not address ruggedness specifically. This apparent omission is really a matter of semantics, however, as the ICH chose instead to cover the topic of ruggedness under precision, as discussed previously.
9.1.14.8 Robustness:

Robustness is the capacity of a method to remain unaffected by small, deliberate variations in method parameters. The robustness of a method is evaluated by varying parameters such as percent organic solvent, pH, ionic strength, temperature, etc., and determining the effect (if any) on the results of the method. As documented in the ICH guidelines, robustness should be considered early in the development of the method. In addition, if the results of a method or other measurements are susceptible to variations in method parameters, these parameters should be adequately controlled and a precautionary statement included in the method documentation.