Corneal Transplantation Biology


The concept of replacing an opaque cornea has been the aspiration of surgeons for as long as the practice of ophthalmology has existed. The first suggestion of transplanting corneas was found in Egyptian manuscripts dating back over 4000 years (Albert and Edwards, 1996). At that time, trachoma and injuries were the primary causes of corneal scars and blindness; however, treatments had no scientific basis and were instead based on religious rituals and topical applications of various concoctions. In the second century AD, Galen offered the first description of a surgical procedure for the treatment of ulcers and scars on the 'transparent membrane'. He suggested superficial keratectomy and 'abrasio corneae' as procedures to restore the transparency of an opaque cornea (Albert and Edwards, 1996). A limited understanding of the anatomy and physiology of the cornea stalled any progress in the treatment of corneal disorders until the first microscopic observations became available in the 18th century. Interestingly, it was an idea put forward by Erasmus Darwin, the grandfather of Charles Darwin, that led to a procedure very similar to our current approach to corneal transplantation. He suggested the complete removal of an opaque cornea by trephination: 'Could not a small piece of cornea be cut out by a kind of trephine about the size of a thick bristle, or a small crow quill, and would it not heal with a transparent scar?' (Albert and Edwards, 1996). After decades of trial and error, the first successful human corneal graft was performed by Dr. Eduard Zirm in 1905 (Moffatt et al., 2005). This significant landmark, however, did not make keratoplasty a relatively 'routine' success for several more decades; it was parallel advances in medicine, such as anaesthesia and antisepsis, that eventually allowed Zirm's success to be reproduced widely.

Several reports on large series of corneal grafts in the early 1930s indicated success rates of 20-40% despite the lack of any form of immunosuppressive treatment (Moffatt et al., 2005). With the introduction of topical glucocorticoid treatment, the one-year graft survival rate increased to over 90% (Williams et al., 1995, Price et al., 1993, Thompson et al., 2003, Randleman and Stulting, 2006). Corneal transplantation has become the most commonly performed surgical procedure in transplantation medicine, with estimates of more than 60,000 transplants performed world-wide every year (1988). The excellent one-year graft survival rate, the fact that corticosteroid drops are often not perceived as an immunosuppressive treatment, and the indiscriminate use of the term 'immune privilege' have led to a common misconception that corneal grafts do not, or only rarely, undergo transplant rejection. However, the reality is quite different. The Australian Corneal Graft Registry, a large follow-up study that now contains prospectively collected data on more than 10,000 corneal graft recipients, has shown that the five- and ten-year corneal graft survival rates are only 74% and 62%, respectively (Coster and Williams, 2005). As elegantly put by Coster and Williams: 'we have now amassed more than a century's worth of experience in transplantation of the cornea, making it the oldest procedure in transplant surgery and representative of a milestone in human medicine. However, there are still limitations to corneal transplantation, of which corneal allograft rejection poses the greatest challenge' (Coster and Williams, 2005).

1.2 Corneal structure and anatomy

The cornea is the clear, dome-shaped structure at the front of the eye, lying between the environment-protective conjunctiva and the aqueous humour of the anterior chamber (Figure 1a). The tissue's main function is to refract light as it passes into the eye towards the retina. Its avascularity contributes to the cornea's transparency and is a vital component in maintaining clarity of vision. The cornea is usually 0.5-0.7 mm thick but can expand to more than 1 mm during rejection-induced inflammation or damage; such swelling is indicative of graft failure.

The cornea consists of five layers: the epithelium, Bowman's layer, the stroma, Descemet's membrane and the endothelium (Figure 1b). The epithelium comprises a stratified squamous layer, six to seven cells thick, on the exterior of the cornea and provides a protective barrier against the environment and pathogens. Epithelial cell loss and damage are repaired by migration and replacement of epithelial cells from the peripheral limbus. The stroma forms the majority of the tissue and is composed of a highly organised arrangement of collagen fibrils and differentiated fibroblasts (keratocytes) aligned parallel to the corneal surface. The orderly arrangement of stromal components and the state of corneal hydration are both essential in determining the transparency and visual clarity of the cornea. As in the epithelium, fibroblast migration, replication and differentiation replace keratocytes lost through stromal damage. Immune-mediated rejection of corneal allografts can occur in the epithelial, stromal and endothelial cell layers. While epithelial rejection involves the anterior layer of the cornea, stromal rejection is associated with the migration of leucocytes through this layer. In endothelial rejection, endothelial cells are destroyed as a result of alloreactive cells passing through the anterior chamber and adhering to the endothelial surface. This mechanism of rejection has the greatest impact on the stability and function of the graft, as any damage to endothelial cells is irreversible because the cells are non-replicative (George and Larkin, 2004). Immune-mediated rejection of the corneal endothelial cell layer is the most common pathological cause of corneal allograft failure (Bourne et al., 2001).



Figure 1. Anatomical representation of the eye

(a) A simple anatomical description of the eye, depicting the structural location of the cornea. Image courtesy of www.wuphysicians.wustl.edu. (b) Cross-sectional view of the corneal cellular layers: epithelium, Bowman's membrane, stroma, Descemet's membrane and the corneal endothelial monolayer. Image courtesy of www.bu.edu/histology/p/08002loa.htm.

1.3 Immune privilege in corneal transplantation

The fact that 20-30% of human corneal allografts survived in the 1930s without any form of immunosuppressive treatment (Price et al., 1993, Williams et al., 1995) is an immunological feature unparalleled in other fields of transplantation. A corneal graft is therefore often described as an immune-privileged tissue transplanted to an immune-privileged site. Immune-privileged tissues, such as the cornea, cartilage, testis, ovary and tumours, are tissues which, when grafted to non-privileged sites of the body, experience extended survival. Immune-privileged sites are organs or tissues, such as the eye, the brain, the pregnant uterus or tumours, in which non-privileged tissue grafts experience extended survival (Streilein, 2003a, Streilein, 1993).

Many factors contribute to the immune privilege of the ocular site and, at the same time, to that of the corneal tissue itself. However, it is also important to note that immune privilege does not prevent immune responses, and its extent can be overstated (Williams and Coster, 1997). Nevertheless, it is clear that mechanisms are present that prevent or attenuate the allogeneic response to donor cornea. These include mechanisms that contribute to privilege by keeping the immune system ignorant of the presence of a graft, mechanisms that deviate the immune response into a non-destructive pathway, and mechanisms that suppress immune effectors that do manage to penetrate the graft (George and Larkin, 2004).

1.3.1 Immunological ignorance

The concept of ocular immune privilege, and the importance of the absence of blood vessels in maintaining privilege, originated from Medawar's famous experiments (Billingham et al., 1951). In these experiments he showed that skin grafts, which would otherwise have been rejected quickly, survived after being transplanted into a corneal pocket or into the anterior chamber of the eye, as long as the grafts did not become vascularised.

Subsequently, Maumenee demonstrated that rabbits bearing long-surviving avascular corneal grafts rejected secondary skin grafts in the same time frame as rabbits without corneal grafts, indicating that corneal grafts transplanted into avascular corneal beds did not sensitise recipient animals (Maumenee, 1951). In the same experiment, he showed that corneal avascularity did not equate to immune ignorance, as long-surviving corneal grafts were rejected when recipients rejected their skin grafts. Together, his findings suggested that corneal avascularity enables a local immune-modulating mechanism (the lack of blood inflow prevents immune effector cells from accessing corneal antigens) that enhances acceptance of corneal grafts, but that this immune privilege can be broken even without the presence of blood vessels in the host cornea. Blood vessels in the recipient cornea are easy to detect clinically, but the same stimuli that cause neovascularisation also cause proliferation of lymph vessels and migration of APC into the host central cornea, which is normally devoid of both (Rodrigues et al., 1981, Jager, 1992). More recently, Cursiefen et al. demonstrated that, in addition to haemangiogenesis, pathological and clinically invisible lymphatic vessels are present in vascularised human high-risk corneas (Cursiefen et al., 2002). Immunologically, these lymphatic vessels act as the afferent arc of the immune reflex circuit and give donor-derived APC and antigenic material direct access to the regional lymph node, where an immune response is mounted. Corneal haem- and lymphangiogenesis occurring before or after keratoplasty have been shown to significantly increase the risk of immune rejection (Bachmann et al., 2008, Cursiefen et al., 2004).

Specific markers to identify lymphatic vessels in the cornea were, until recently, unknown. A number of studies have since identified several molecular markers with good lymphatic specificity that are useful in studies of corneal lymphangiogenesis (Baluk and McDonald, 2008). Hong et al. showed that one of the first signs of the onset of lymphangiogenesis, which induces the transformation of venous endothelial cells to a lymphatic phenotype, is expression of the prospero-related homeobox-1 (Prox1) gene, which lymph vessels continue to express in their nuclei (Hong et al., 2002). Karpanen et al. demonstrated that vascular endothelial growth factor receptor 3 (VEGFR-3), also important in lymphangiogenesis, remains present in lymphatic endothelial cells; although it is present in the early development of blood vessels, its expression becomes more lymphatic-specific with further development (Karpanen et al., 2006). VEGF-C, the ligand of VEGFR-3, has been shown to be present in the lining of the corneal lymphatics (Karkkainen et al., 2004). However, like other VEGF species, its expression is not confined to haem- and lymph vessels; it is also present in inflammatory cells that may infiltrate the cornea. Lymphatic vessel endothelial hyaluronan receptor 1 (LYVE-1) has been used extensively as a marker for lymphatic endothelium in the cornea (Cursiefen et al., 2002), although its expression is less specific in larger collecting lymphatics and it can also be expressed by other immune cells, such as dendritic cells and macrophages. It therefore appears that corneal lymphangiogenesis is closely linked to haemangiogenesis, but that each process has its own partially distinct molecular mechanisms. Together they constitute a significant means of disrupting corneal immune privilege, although a very recent study showed that lymphangiogenesis has the greater influence in mediating immune rejection after corneal transplantation (Dietrich et al., 2010). Consistent with this, studies of inhibitors of lymphangiogenesis, such as the VEGF tyrosine kinase inhibitor ZK 261991, show improved corneal allograft survival (Hos et al., 2008).

The relative dearth of conventional dendritic cells in the normal cornea is a component of ocular immune privilege. If the number of dendritic cells in the graft is increased, for example by experimental induction of APC migration into the donor cornea before transplantation, or by transplantation of peripheral donor cornea, rejection is more rapid (Ross et al., 1991).

1.3.2 Immune deviation

Immune deviation, frequently termed anterior chamber-associated immune deviation (ACAID), describes the phenomenon whereby introduction of antigen into the anterior chamber of the eye induces systemic, antigen-specific immune deviation towards suppression of the delayed-type hypersensitivity (DTH) reaction. This was shown by the elegant experiments of Streilein and coworkers (Streilein et al., 1980, Niederkorn, 1999). ACAID is the product of a complex series of cellular interactions that begins in the anterior chamber of the eye, where antigen is captured and processed by F4/80+ APCs which, under the influence of aqueous humour cytokines, display a unique array of cytokines and cell surface molecules (Niederkorn, 2002, Streilein, 2003b). These F4/80+ ocular APCs then migrate to the thymus and spleen, where they promote the generation of CD4+CD25+ Tregs and CD8+ Tregs. Other cells involved include B cells, NKT cells and gamma delta T cells (Lin et al., 2005, Sonoda et al., 1999, Sonoda et al., 2001, Sonoda and Stein-Streilein, 2002a, Wang et al., 2001, Skelsey et al., 2003a, Skelsey et al., 2003b, Ashour and Niederkorn, 2006).

ACAID has been shown to prolong graft survival in mice mismatched for their major and minor histocompatibility antigens (Niederkorn and Mellon, 1996, Dana et al., 1997).

1.3.3 Suppression of immune responsiveness

The cornea expresses a number of molecules that block immune effectors. These include soluble molecules that prevent complement activation, as well as constitutively expressed Fas ligand on the corneal epithelium and endothelium (Li et al., 2003). Fas ligand interacts with Fas on infiltrating leucocytes, inducing their apoptosis. Apoptosis induced by Fas ligand has been shown to result in antigen-specific tolerance in the eye, a phenomenon not observed when T cells undergo Fas ligand-independent necrosis (Li et al., 2003). Blockade of the Fas/Fas ligand interaction (using FasL knockout mice) results in rejection of corneal allografts that would otherwise have survived (Stuart et al., 1997, Yamagami et al., 1997). Table 1 summarises the most important soluble components that contribute to the immune privilege of the eye.