For as long as the practice of medicine has existed, physicians have dreamt of being able to replace an opaque cornea. Egyptian manuscripts dating from around 2000 BC suggest the transplantation of skin and corneas (Albert and Edwards 1996). At that time, corneal scars and associated blindness were probably caused mainly by trachoma and injuries; however, treatments were based on religious rituals and limited to topical application of mysterious mixtures. The first description of a surgical approach to the treatment of ulcers and scars on the 'transparent membrane' was offered by Galen (130-200 AD). He suggested superficial keratectomy and 'abrasio corneae' as procedures to restore the transparency of an opaque cornea (Albert and Edwards 1996). Poor understanding of the anatomy and physiology of the cornea limited any progress in the treatment of corneal disorders until the 18th century, when the first microscopic observations became available. A procedure very similar to our current approach to corneal transplantation was derived from an idea put forward by Erasmus Darwin, the grandfather of Charles Darwin. He suggested the complete removal of an opaque cornea by trephination: 'Could not a small piece of cornea be cut out by a kind of trephine about the size of a thick bristle, or a small crow quill, and would it not heal with a transparent scar?' (Albert and Edwards 1996). However, it was not until 1905 that Eduard Zirm performed the first successful human corneal graft (Moffatt et al. 2005). This significant milestone came only after many decades of unsuccessful trial and error, and it was ultimately made possible by parallel advances in medicine such as anaesthesia and antisepsis. Even so, keratoplasty did not become a relatively 'routine' and successful procedure for several more decades.
Several reports on large series of corneal grafts in the early 1930s indicated success rates of 20-40% despite the lack of any form of immunosuppressive treatment (Moffatt et al. 2005). With the introduction of topical glucocorticoid treatment, the one-year graft survival rate increased to over 90% (Price et al. 1993; Randleman and Stulting 2006; Thompson et al. 2003; Williams et al. 1995). Corneal transplantation has become the most commonly performed surgical procedure in transplantation medicine, with estimates of more than 60,000 transplants performed world-wide every year (1988). The excellent one-year graft survival rate, the fact that corticosteroid drops are often not perceived as an immunosuppressive treatment, and the indiscriminate use of the term 'immune privilege' have led to a common misconception that corneal grafts do not, or only rarely, undergo transplant rejection. However, the reality is quite different. The Australian Corneal Graft Registry, a large follow-up study that now contains prospectively collected data from more than 10,000 patients with corneal grafts, has shown that the five- and ten-year corneal graft survival rates are only 74% and 62%, respectively (Coster and Williams 2005). We have now amassed more than a century's worth of experience in transplantation of the cornea, making it the oldest procedure in transplant surgery and a milestone in human medicine. However, there are still limitations to corneal transplantation, of which corneal allograft rejection poses the greatest challenge (Coster and Williams 2005).
1.2 Corneal structure and anatomy
The cornea is the clear, dome-shaped structure at the front of the eye, lying between the environment-protective conjunctiva and the aqueous humour of the anterior chamber (Figure 1a). The tissue's main function is to refract light as it passes into the eye towards the retina. Its avascularity contributes to the cornea's transparency and is a vital component in maintaining clarity of vision. The cornea is usually 0.5-0.7 mm in thickness but can expand to more than 1 mm during rejection-induced inflammation or damage; such thickening is indicative of graft failure.
The cornea consists of five layers: the epithelium, Bowman's layer, the stroma, Descemet's membrane and the endothelium (Figure 1b). The epithelium comprises a 6-7-cell-thick stratified squamous layer on the exterior of the cornea and provides a protective barrier against the environment and pathogens. Epithelial cell loss and damage are repaired by migration and replacement of epithelial cells from the peripheral limbus. The stroma forms the majority of the tissue and is composed of a highly organised arrangement of collagen fibrils and differentiated fibroblasts, or keratocytes, aligned parallel to the corneal surface. The orderly arrangement of stromal components and the state of corneal hydration are both essential determinants of the transparency and visual clarity of the cornea. As in the epithelium, fibroblast migration, replication and differentiation replace keratocytes lost upon stromal damage. Immune-mediated rejection of corneal allografts can occur in the epithelial, stromal and endothelial cell layers. Epithelial rejection involves the anterior layer of the cornea, whereas stromal rejection is associated with the migration of leucocytes through the stroma. In endothelial rejection, endothelial cells are destroyed by alloreactive cells that pass through the anterior chamber and adhere to the endothelial surface. This mechanism of rejection has the greatest impact on the stability and function of the graft, because corneal endothelial cells are non-replicative and any damage to them is irreversible (George and Larkin 2004). Immune-mediated rejection of the corneal endothelial cell layer is the most common pathological cause of corneal allograft failure (Bourne et al. 2001).
Figure 1. Anatomical representation of the eye
(a) A simple anatomical description of the eye, depicting the structural location of the cornea. Image courtesy of www.wuphysicians.wustl.edu. (b) Cross-sectional view of the corneal cellular layers: epithelium, Bowman's membrane, stroma, Descemet's membrane and the corneal endothelial monolayer. Image courtesy of www.bu.edu/histology/p/08002loa.htm.
1.3 Immune privilege in corneal transplantation
The fact that 20-30% of human corneal allografts survived in the 1930s without any form of immunosuppressive treatment (Price et al. 1993; Williams et al. 1995) is an immunological feature unparalleled in other fields of transplantation. A corneal graft is therefore often described as an immune-privileged tissue transplanted to an immune-privileged site. Immune-privileged tissues, such as the cornea, cartilage, testis, ovary and tumours, are tissues which, when grafted to non-privileged sites of the body, experience extended survival. Immune-privileged sites are organs or tissues, such as the eye, the brain, the pregnant uterus or tumours, in which non-privileged tissue grafts experience extended survival (Streilein 1993, 2003b).
Many factors contribute to the immune privilege of the ocular site and, at the same time, to the immune privilege of the corneal tissue. However, it is also important to note that immune privilege does not prevent immune responses and its significance can be overstated (Williams and Coster 1997). Nevertheless, it is clear that there are mechanisms that prevent or attenuate the allogeneic response to donor cornea. These include mechanisms that contribute to privilege by keeping the immune system ignorant of the presence of a graft, mechanisms that deviate the immune response into a non-destructive pathway, and those that suppress immune effectors that do manage to penetrate the graft (George and Larkin 2004).
1.3.1 Immunological ignorance
The concept of ocular immune privilege and the importance of the absence of blood vessels in maintaining privilege originated from Medawar's famous experiments (Billingham et al. 1951). In these experiments he showed that skin grafts, which would otherwise have been rejected quickly, survived after being transplanted into a corneal pocket or into the anterior chamber of the eye, as long as the grafts did not become vascularised.
Subsequently, Maumenee demonstrated that rabbits with long-surviving avascular corneal grafts rejected secondary skin grafts in the same time frame as rabbits without corneal grafts, indicating that corneal grafts transplanted into avascular corneal beds did not sensitise recipient animals (Maumenee 1951). In the same experiment, he showed that corneal avascularity did not result in immune ignorance, as long-surviving corneal grafts were rejected when recipients rejected their skin grafts. Together his findings suggested that corneal avascularity enabled a local immune-modulating mechanism (the lack of blood inflow prevents immune effector cells from accessing corneal antigens) to enhance acceptance of corneal grafts, but that this immune privilege could be broken even without the presence of blood vessels in the host cornea. Blood vessels in the recipient cornea are easy to detect clinically, but the same stimuli that result in neovascularisation also result in proliferation of lymph vessels and migration of APC into the host central cornea, which is normally devoid of lymph vessels and APC (Jager 1992; Rodrigues et al. 1981). More recently, Cursiefen et al. demonstrated the presence of pathological, clinically invisible corneal lymphatic vessels in human vascularised high-risk corneas, in addition to corneal haemangiogenesis (Cursiefen et al. 2002). Immunologically, these lymphatic vessels act as the afferent arc of the immune reflex circle and enable direct access of donor-derived APC and antigenic material to the regional lymph node, where an immune response is mounted. It has been shown that corneal haem- and lymphangiogenesis occurring before or after keratoplasty significantly increase the risk of immune rejection (Bachmann et al. 2008; Cursiefen et al. 2004).
Until recently, specific markers to identify lymphatic vessels in the cornea were unknown. Recent studies have identified several molecular markers with good lymphatic specificity that are useful in studies of corneal lymphangiogenesis (Baluk and McDonald 2008). One of the first signalling events promoting the transformation of venous endothelial cells to a lymphatic phenotype is expression of the prospero-related homeobox-1 (Prox 1) transcription factor (Hong et al. 2002). Its expression persists in the nuclei of lymphatic endothelial cells. The vascular endothelial growth factor receptor 3 (VEGFR-3) gene, also key in development, remains present in lymphatic endothelial cells; while initially expressed during early blood vessel development, its expression becomes more lymphatic-specific as development proceeds (Karpanen et al. 2006). The ligand of VEGFR-3, VEGF-C, is present in the lining of the corneal lymphatics (Karkkainen et al. 2004). However, like other VEGF species, it is also present in inflammatory cells that may infiltrate the cornea. Lymphatic vessel endothelial hyaluronan receptor 1 (LYVE-1) has been used extensively as a marker for lymphatic endothelium in the cornea (Cursiefen et al. 2002), although its expression is less robust in larger collecting lymphatics and it is also expressed by some immune cells, such as dendritic cells and macrophages. Corneal lymphangiogenesis is therefore closely linked to haemangiogenesis but also involves partially distinct molecular mechanisms. Together they serve as major routes for disrupting corneal immune privilege; however, a very recent study showed that lymphangiogenesis has the greater influence in mediating immune rejection after corneal transplantation (Dietrich et al. 2010). Studies of lymphangiogenesis inhibitors, such as the VEGF tyrosine kinase inhibitor ZK 261991, have shown improved corneal allograft survival (Hos et al. 2008).
The relative dearth of conventional dendritic cells in the normal cornea is a component of ocular immune privilege. If, for example, the number of dendritic cells in the graft is increased, whether by experimental induction of APC migration into the donor cornea before transplantation or by transplantation of peripheral donor cornea, then rejection is more rapid (Ross et al. 1991).
1.3.2 Immune deviation
Immune deviation, frequently termed anterior chamber-associated immune deviation (ACAID), describes the phenomenon whereby introduction of antigen into the anterior chamber of the eye can induce systemic, antigen-specific immune deviation towards suppression of a DTH reaction. This was shown by the elegant experiments of Streilein and coworkers (Niederkorn 1999; Streilein et al. 1980). ACAID is the product of a complex series of cellular interactions that begin in the anterior chamber of the eye, where antigen is captured and processed by F4/80+ APCs which, under the influence of aqueous humour cytokines, express a unique array of cytokines and cell surface molecules (Niederkorn 2002; Streilein 2003a). The F4/80+ ocular APCs then migrate to the thymus and spleen, where they promote the generation of CD4+CD25+ Tregs and CD8+ Tregs. Other cells involved include B cells, NKT cells and gamma delta T cells (Ashour and Niederkorn 2006; Lin et al. 2005; Skelsey et al. 2003a, 2003b; K. H. Sonoda et al. 1999; K. H. Sonoda et al. 2001; K. H. Sonoda and Stein-Streilein 2002b; Wang et al. 2001).
ACAID has been shown to prolong graft survival in mice mismatched for their major and minor histocompatibility antigens (Dana et al. 1997; Niederkorn and Mellon 1996).
1.3.3 Suppression of immune responsiveness
The cornea expresses a number of molecules that block immune effectors. These include soluble molecules that prevent complement activation as well as the constitutive expression of Fas ligand on corneal epithelium and endothelium (Li et al. 2003). Fas ligand interacts with Fas on infiltrating leucocytes, inducing their apoptosis. It has been shown that apoptosis induced by Fas ligand results in antigen-specific tolerance in the eye, a phenomenon not observed when T cells undergo Fas ligand-independent necrosis (Li et al. 2003). Blockade of the Fas/Fas ligand interaction results in rejection of corneal allografts that would otherwise have survived (Stuart et al. 1997; Yamagami et al. 1997). Table 1 summarises the most important soluble components that contribute to immune privilege of the eye.