2014 Canadian Society of Transplantation Annual Scientific Conference (CST ASM 2014)

Le Centre Sheraton Hotel, Montréal, Québec, February 27 - March 1, 2014

Abstract#: 4
Psychosocial outcomes of living kidney donation: Results from a study of donor-recipient dyads
Deborah Ummel, Marie Achille
University of Montreal
Background: Given the shortage of renal grafts from deceased donors, living kidney donation (LKD) is increasingly being promoted and practised in Western countries. Nonetheless, studies show that LKD entails significant challenges to psychosocial and interpersonal adjustment within the donor-recipient relationship. While authors have stressed the importance of studying donors and recipients conjointly as interactive dyads, few studies have done so thus far. The goal of this paper is to present psychosocial outcomes of living donation by examining the donor and the recipient as an interactive dyad, and in particular how LKD affects the relationship between the donor and the recipient.

Methods: The present research is qualitative and follows a phenomenological approach. Ten members of donor-recipient dyads were interviewed individually. Interviews were audio recorded and transcribed verbatim. The dyads included in the study were diversified (type of relationship between the donor and the recipient; time since donation). Data were analyzed following the principles of Interpretative Phenomenological Analysis as developed by Smith (2009).

Results: Results highlight the importance of the particular interpersonal and social context within which the donation took place in shaping the discourse of each donor-recipient dyad. In related dyads, the gesture seemed to be interpreted as a continuation of the role donors adhere to in a larger social context. For example, themes of reciprocity/equality/rivalry were especially common within sibling relationships. In these established relationships, donating seemed natural and automatic, and receiving was easily integrated. In contrast, when the donor and recipient were unrelated, meaning making was more difficult to achieve and there was no easily accessible point of reference for understanding receiving.

Conclusions: Results provide in-depth information that can in turn be shared with future candidates for donation and transplantation to help them prepare for the experience and inform their decision-making process. Results also remind us of the importance of considering not only the physical and psychological experience of donors and recipients, but also the larger interpersonal and social context within which the donation takes place, in order to derive a more accurate model of health after transplantation.

Abstract#: 5
A novel hyperbranched polyglycerol-based solution for donor organ preservation: A comparison with University of Wisconsin solution in hypothermic preservation of mouse donor hearts
Sihai Gao 1, Qiunong Guan 2, Irina Chafeeva 3, Donald Brooks 3, Christopher Nguan 2, Jayachandran Kizhakkedathu 4, Caigan Du 2
1 Department of Thoracic and Cardiovascular Surgery, Tongji Hospital, Huazhong University of Science and Technology, Wuhan, P.R. China
2 Department of Urologic Sciences, University of British Columbia, Vancouver, BC, Canada
3 Centre for Blood Research, Department of Pathology and Laboratory Medicine, University of British Columbia, Vancouver, BC, Canada
4 Centre for Blood Research, Department of Pathology and Laboratory Medicine, University of British Columbia, Vancouver, BC, Canada; Department of Chemistry, University of British Columbia, Vancouver, BC, Canada
Background: Donor organ injury during hypothermic preservation negatively impacts transplant function recovery and survival. Hyperbranched polyglycerol (HPG) is a novel, biocompatible polymer. The aim of this study was to compare an HPG-based solution with University of Wisconsin (UW) solution in the hypothermic preservation of donor hearts.

Methods: Human endothelial cell cultures were used as an in vitro model. Heart transplantation in mice was used as an in vivo model. Cell death was indicated by lactate dehydrogenase (LDH) release.

Results: Preservation of mouse hearts with HPG solution at 4°C reduced tissue damage compared to preservation with UW solution. In isotransplantation, transplanted hearts preserved in HPG solution had better functional recovery than those in UW solution, which was associated with lower degrees of tissue injury and neutrophil infiltration. In allotransplantation, HPG solution-preserved donor hearts survived longer than those preserved in UW solution: nine of ten transplants from the UW solution group failed within 24 h, whereas only four of nine transplants in the HPG group were rejected, and three of them survived with function for 20 days in cyclosporine-treated recipients (P = 0.0175). In cultured cells, more cells survived during preservation with cold HPG solution than with UW solution, which correlated with the maintenance of cell membrane fluidity and intracellular adenosine triphosphate.

Conclusion: Preservation with HPG solution provides significantly better protection against cold ischemic injury in donor organs, suggesting that HPG solution is a promising alternative to UW solution for hypothermic storage of donor organs for transplantation.

Abstract#: 6
The Impact of a Dedicated Team on Living Organ Donation
Céline Durand 1, Jacobien Verhave 1, Héloïse Cardinal 2, Jo-Ann Fugère 3, Michel Pâquet 3, Marie-Chantal Fortin 2
1 Centre de recherche du CHUM
2 Centre de recherche du CHUM; Nephrology and Transplantation Division of the CHUM
3 Nephrology and Transplantation Division of the CHUM
Living kidney transplantation (LKT) offers the best medical outcomes for organ recipients. Historically, our centre had a low rate of LKT (between 10% and 20% of all renal transplantations performed). In 2009, in an effort to increase living organ donation (LOD), a dedicated team was created. Its mandate was to promote LOD at our centre and at referring centres, to coordinate assessments of living organ donors, to facilitate the process, and to ensure long-term follow-up after the donation. The aim of this study was to document the impact of this team by comparing LOD rates at our hospital from 2005 to 2008 and from 2009 to 2012.

Using our electronic database, we conducted a retrospective analysis of all living organ donors who contacted our centre from 01-01-2005 to 31-12-2008 and from 01-01-2009 to 31-12-2012. Follow-up was conducted until 01-10-2013.

During the 2005–2008 period, 191 individuals interested in donating a kidney to 150 recipients contacted our centre (an average of 1.27 donors per recipient). A total of 50 renal transplantations were performed using organs from these living donors (26.2%). During the 2009–2012 period, 305 individuals (including 13 altruistic donors) interested in donating a kidney to 202 recipients contacted our centre (an average of 1.5 donors per recipient). A total of 72 (24%) renal transplantations were performed, and one more is planned in the coming month, using organs from these living donors, including 8 LKTs through the Living Donor Paired Exchange (LDPE) program. Roughly 12.5% of the potential donors are still waiting for an assessment or are in the process of being evaluated.

The implementation of a dedicated LOD team increased the number of potential donors who contacted our centre by 59.7% (from 191 to 305), resulting in 46% more LKTs. These data support the creation of dedicated LOD teams to increase LKT.
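The reported percentage increases can be recomputed directly from the counts given above (a minimal arithmetic sketch; the counts are those stated in the abstract, and the LKT figure includes the one planned transplant):

```python
# Recompute the percentage increases reported in this abstract
# from the counts stated above (2005-2008 vs. 2009-2012).
donors_before, donors_after = 191, 305  # potential donors who contacted the centre
lkt_before, lkt_after = 50, 72 + 1      # LKTs performed (the +1 is the planned transplant)

donor_increase = (donors_after - donors_before) / donors_before * 100
lkt_increase = (lkt_after - lkt_before) / lkt_before * 100

print(f"Potential donors: +{donor_increase:.1f}%")  # ~59.7%
print(f"LKTs: +{lkt_increase:.1f}%")                # 46.0%
```

The donor increase works out to roughly 59.7%, and the 46% LKT increase matches the abstract when the planned transplant is counted with the 72 performed.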

Abstract#: 8

Extracorporeal Membrane Oxygenation to Lung Transplantation at the University of Alberta

Jackson Wong, Maria Castro, Laura Weingarten, Kathy Jackson, Laurance Lequier, Holger Buchholz, Ivan Rebeyka, Dennis Modry, Ken Stewart, John Mullen, Steve Meyer, Ali Kapasi, Dale Lien, Justin Weinkauf
University of Alberta
Introduction: Outcomes of extracorporeal membrane oxygenation (ECMO) bridging to lung transplantation (LTx) vary across centers, and children have a particularly poor prognosis.

Methods: We conducted a retrospective study of our adult and pediatric experience with ECMO bridging to LTx between 2002 and 2012.

Results: A total of 350 patients received a lung or heart-lung transplant at our institution between 2002 and 2012. One pediatric and seven adult patients were bridged to LTx with ECMO; age 4–63 years (median 24.5), three males. Primary conditions at referral were cystic fibrosis (3), primary pulmonary hypertension (1 adult, 1 pediatric), idiopathic pulmonary fibrosis (1), Eisenmenger's syndrome (1) and Wegener's granulomatosis (1). Conditions leading to ECMO were cardiac arrest (1, pediatric), hemoptysis/pulmonary hemorrhage (3), and respiratory failure (4). One patient had 4 days of ECMO before a second double LTx for graft failure. Time on ECMO pre-transplantation was 1–40 days, median 3.5. Types of ECMO used were venoarterial (7 VA) and venovenous (1 VV). All patients were mechanically ventilated at the time of ECMO. A 4-year-old pediatric patient was on VA ECMO for 40 days (35 days on a semiambulatory VA ECMO). Transplant types included five double lung, one single lung, and two heart-lung transplants. Donor height mismatch was -27 to +15 cm. Hyperoxia test pO2s in donors were 333–497 mmHg, median 413. Ischemic time was 166–711 minutes, median 354. Total cardiopulmonary bypass (CPB) time during transplant was 240–494 minutes, median 263. The pediatric patient was back on CPB for 24 minutes post implantation, and the chest was left open for five days. One adult patient required reexploration, and the chest was closed on day 3. Two adult patients required tracheostomy. One adult patient died on day 1 following a single lung transplant for Eisenmenger's syndrome. Seven patients survived to hospital discharge after 23–100 days: time on ventilator was 22–1200 hours (median 216) and ICU stay was 6–18 days; the one pediatric patient who had a cardiac arrest recovered completely; one patient had renal failure requiring dialysis; and 6 (75%, adult and pediatric) survived at one year post LTx.

Conclusions: ECMO bridging to LTx can be successful. In our experience, patients bridged from ECMO to single LTx and to urgent re-LTx have poor outcomes.

Abstract#: 10
Assessment of myocardial performance during ex vivo heart perfusion
Christopher White 1, Yun Li 2, Alison Müller 2, Emma Ambrose 2, Brett Hiebert 1, Trevor Lee 3, Rakesh Arora 1, Ganghong Tian 4, Jayan Nagendran 5, Larry Hryshko 2, Darren Freed 5
1 Cardiac Surgery, St. Boniface Hospital, University of Manitoba, Winnipeg, Canada
2 Institute of Cardiovascular Sciences, St. Boniface Research Center, University of Manitoba, Winnipeg, Canada
3 Anesthesia and Perioperative Medicine, St. Boniface Hospital, University of Manitoba, Winnipeg, Canada
4 National Research Council Institute for Biodiagnostics, Winnipeg, Manitoba
5 Cardiac Surgery, Mazankowski Alberta Heart Institute, University of Alberta, Edmonton Canada
Ex vivo heart perfusion has been proposed as a means to resuscitate non-utilized donor hearts and expand the pool of organs available for transplant. However, a reliable means of demonstrating myocardial functional recovery and organ viability prior to transplantation is required. Therefore, we sought to identify metabolic and functional parameters that were predictive of myocardial performance during ex vivo heart perfusion.
Six normal pig hearts (220±13 grams) and 8 donation-after-circulatory-death hearts (244±13 grams) were procured and perfused ex vivo at 37°C with a donor blood-STEEN solution (hemoglobin concentration of 45 g/L). Hearts were transitioned from Langendorff mode into a working heart mode for assessments after 1, 3, and 5 hours of ex vivo perfusion. Myocardial performance was determined by measuring the cardiac output indexed to heart weight at a left atrial pressure of 8 mmHg and an aortic diastolic pressure of 40 mmHg. Myocardial functional parameters were assessed using a conductance catheter placed in the left ventricle. Metabolic function was assessed by measuring myocardial oxygen consumption and lactate production. Linear regression with stepwise selection analysis was performed to determine which metabolic and functional parameters best correlated with myocardial performance.
The minimum rate of pressure change (dP/dtmin) was the best functional predictor of myocardial performance (R2=0.915), while the isovolumic relaxation time (Tau; R2=0.780), maximum rate of pressure change (dP/dtmax; R2=0.621), preload recruitable stroke work (PRSW; R2=0.566), end-diastolic pressure-volume relationship (EDPVR; R2=0.226), and end-systolic pressure-volume relationship (ESPVR; R2=0.144) correlated to a lesser degree. Myocardial oxygen consumption (R2=0.745) was the best metabolic predictor of myocardial performance, while lactate metabolism failed to demonstrate any correlation (R2=0.004). The combination of dP/dtmin and myocardial oxygen consumption was the most reliable predictor of myocardial performance (R2=0.937).
The combination of the dP/dtmin and myocardial oxygen consumption produced the most reliable assessment of myocardial performance during ex vivo heart perfusion. Further studies are required to determine thresholds of these parameters that predict successful transplantation.

Abstract#: 11
Jayan Nagendran 1, Sabin Bozso 2
1 Division of Cardiac Surgery, University of Alberta
2 Faculty of Medicine and Dentistry, University of Alberta
Lung transplantation remains the only treatment for advanced end-stage lung disease from a variety of etiologies. A profound lack of donor organs remains the greatest challenge in providing lung transplantation, with stagnant rates of lung transplantation at many large centers. As more patients are being referred for lung transplantation, there is a growing rate of deaths on the recipient waitlist.

A 65-year-old female with pulmonary fibrosis who had been on the recipient waitlist for over 2 years was the prospective recipient. She had recently deteriorated and was in imminent need of mechanical ventilation. The donor lungs were procured from a 69-year-old female who had been in a massive motor vehicle accident. This event had caused multiple contusions and a parenchymal laceration leading to an air leak requiring wedge resection. Consequently, the best PO2 challenge gas (i.e., P/F ratio) was only 267.
Figure A: Condition of the donor lungs immediately before placement on the Lung OCS
Figure B: Condition of the donor lungs immediately after removal from the Lung OCS

These lungs were chosen given the recent deterioration of the prospective recipient. The ex-vivo normothermic lung perfusion (EVLP) run lasted over 10.5 hours, making it the longest successful clinical EVLP case of very marginal lungs completed in Canada to date. Lung function parameters were continuously monitored throughout the run. The donor lungs met acceptable P/F ratios and as such were transplanted into the recipient. After 2 months in hospital, the recipient was successfully discharged home, where she continues to function well 6 months post-transplant.

This case report adds to the growing literature on the value of EVLP, as ex-vivo perfusion was possible with successful clinical transplantation after 10.5 hours on the Lung Organ Care System (OCS). Furthermore, the fact that these were severely marginal lungs is an encouraging advance in increasing our limited donor lung pool and may ultimately lead to improved rates of transplantation for our growing waitlists of patients requiring lung transplantation.

Abstract#: 12
Factors Affecting Discharge Destination Following Lung Transplantation
Min Tang 1, Nadir Mawji 1, Samantha Chung 1, Ryan Brijlal 1, Jonathan Lim Sze How 1, Sunita Mathur 2, Lisa Wickerson 3, Lianne Singer 4, Tania Janaudis-Ferreira 5
1 Department of Physical Therapy, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
2 Department of Physical Therapy, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada; Respiratory Medicine, West Park Health Centre, Toronto, Ontario, Canada
3 Department of Physical Therapy, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada; Lung Transplant Program, Toronto General Hospital, University Health Network, Toronto, Ontario, Canada
4 Lung Transplant Program, Toronto General Hospital, University Health Network, Toronto, Ontario, Canada
5 Department of Physical Therapy, Faculty of Medicine, University of Toronto, Toronto; Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, St. John’s Rehab Program, Toronto; Respiratory Medicine, West Park Health Centre, Toronto
Introduction/Background: Lung Transplant (LT) recipients who require additional care may be referred to inpatient rehabilitation prior to discharge home. The purpose of this study was to determine the factors affecting discharge destination following LT.
Methods: A retrospective chart review was conducted on individuals who received a single LT, double LT, or heart-lung transplant at our centre between 2006 and 2009. The following data were extracted: pre-transplant diagnosis, age at transplant, pre-transplant exercise capacity measured by six-minute walk distance, and other data pertaining to patient demographics, clinical characteristics, and healthcare utilization. LT recipients were categorized by discharge destination into either a “home” (HG) or “rehabilitation” group (RG) for analysis.
Results: Medical charts of 243 patients were identified: 197 (81%) were discharged home, 42 (17%) were discharged to inpatient rehabilitation, and four (2%) were discharged to an ‘other’ destination. The median age of the HG was lower (53 years, IQR=38-61) than that of the RG (57 years, IQR=50-64), p<0.05. The HG had a shorter median post-transplant intensive care unit length of stay (4 days, IQR=15-28) compared to the RG (56 days, IQR=43-88), p<0.001. The RG had a lower median baseline six-minute walk distance (245 m, IQR=172-338) compared to the HG (329 m, IQR=242-413), p=0.001. Using chi-square analysis, individuals with cystic fibrosis (CF) and chronic obstructive pulmonary disease (COPD) were found to be more likely to be discharged home, whereas the interstitial lung disease (ILD), pulmonary arterial hypertension (PAH), and ‘other’ groups were more likely to be discharged to rehabilitation (p < 0.001).
Conclusion: The present study identified pre-transplant diagnosis, age at time of transplant, pre-transplant functional exercise capacity, and post-transplant intensive care unit length of stay to be factors affecting discharge destination following LT. The identification of these factors has the potential to facilitate early discharge planning and optimize continuity of care.

Abstract#: 14
Allergy and Autoimmunity Following Solid Organ Transplantation – Prevalence, Natural History and Risk Factors
Achiya Amir 1, Nufar Marcus 2, Eyal Grunebaum 2, Anne Dipchand 3, Diane Hebert 4, Vicky Ng 1, Thomas Walters 1, Yaron Avitzur 1
1 Division of Gastroenterology, Hepatology and Nutrition, The Hospital for Sick Children, Department of Pediatrics, University of Toronto, Toronto, Canada.
2 Division of Allergy and Clinical Immunology, The Hospital for Sick Children, Department of Pediatrics, University of Toronto, Toronto, Canada.
3 Division of Cardiology, The Hospital for Sick Children, Department of Pediatrics, University of Toronto, Toronto, Canada.
4 Division of Nephrology, The Hospital for Sick Children, Department of Pediatrics, University of Toronto, Toronto, Canada.
Introduction: Immunologic disorders, including allergic and autoimmune diseases, have been sporadically reported in pediatric recipients of solid organ transplants; they commonly lead to significant morbidity and, rarely, to mortality. Data regarding their prevalence, natural course, and risk factors are limited and have not been collected systematically. We therefore conducted a cross-sectional retrospective study to assess these parameters.

Methods: The study cohort included all children (<18 years) who underwent liver, heart, kidney, or intestinal transplantation at a pediatric tertiary medical center between 2000 and 2012, with a follow-up period of 6 months or more post-transplant. Patients with a pre-transplant history of allergy or autoimmunity were excluded.

Results: 273 patients (111 liver recipients, 103 heart, 52 kidney, and 7 multiple-organ) with a median follow-up period of 3.6 years were included in the study. A total of 92 (34%) patients developed allergy or autoimmune disease after transplantation, with a high prevalence among liver (41%) and heart transplant recipients (40%) compared to kidney recipients (4%; P<0.001). Post-transplant allergies included eczema (n=44), food allergy (22), eosinophilic gastrointestinal disease (11) and asthma (28). Autoimmunity occurred in 20 (7.3%) patients, presenting mainly as autoimmune cytopenia (n=10). In a multivariate analysis, female gender, young age at transplantation, family history of allergy, EBV infection, and an elevated eosinophil count more than 6 months post-transplantation were associated with an increased risk of immune dysregulation. Two patients (0.7%) died from autoimmune hemolytic anemia, and in 50 patients (18%) the allergy or autoimmunity did not improve over time.

Conclusions: Allergy and autoimmunity after solid organ transplantation are common in pediatric liver and heart recipients. They pose a significant health burden on transplant recipients and suggest a state of immune dysregulation post-transplant.

Abstract#: 15
Nanovesicles released by apoptotic endothelial cells induce anti-LG3 production and accelerate vascular rejection
Mélanie Dieudé 1, Christina Bell 2, Shijie Qi 1, Nicolas Pallet 1, Julie Turgeon 1, Chanel Béland 1, Matthieu Rousseau 3, Christiane Rondeau 4, Claude Perreault 2, Yves Durocher 5, Michel Desjardins 4, Eric Boilard 3, Pierre Thibault 2, Marie-Josée Hébert 1
1 Research Centre, Centre hospitalier de l'Université de Montréal (CRCHUM)
2 Institut de Recherche en Immunologie et Cancérologie (IRIC), Université de Montréal
3 CHUL Research Center/CHUQ
4 Université de Montréal
5 Biotechnology Research Institute, Montréal
Mounting evidence suggests that autoimmune humoral responses, such as the production of anti-LG3 antibodies, enhance the severity of vascular injury during allograft rejection episodes. Apoptosis of the endothelium is enhanced during vascular rejection and can trigger the production of various membrane vesicles (MV) of potential importance in modulating humoral responses. Here, we aimed to characterize the protein markers and immunogenicity of the different types of MV released by apoptotic endothelial cells (apoEC) and their impact on rejection.

MV released by apoEC were analyzed by small-particle flow cytometry (spFACS) and purified by sequential centrifugation from serum-free medium conditioned by apoEC. Electron microscopy (EM) and differential proteomic MS/MS analyses were performed on apoptotic MV. Aortas from female BALB/c mice were transplanted into fully MHC-mismatched female C57Bl/6 mice in the absence of immunosuppression. Purified donor or recipient apoptotic MV were injected intravenously post-surgery every other day for 3 weeks, and recipients were sacrificed 3 weeks post-transplantation.

Two groups of MV released downstream of caspase-3 activation by apoEC were identified by spFACS and EM: apoptotic bodies (≥800 nm) and apoptotic nanovesicles (≤100 nm). Proteomic analysis revealed strikingly different protein profiles in apoptotic nanovesicles vs. bodies. LG3 (the C-terminal fragment of perlecan) was highly enriched in apoptotic nanovesicles. To evaluate the immunogenic potential of apoptotic MV, aortic allograft recipients were injected with apoptotic nanovesicles, apoptotic bodies or vehicle. Injection of apoptotic nanovesicles generated from either the donor or the recipient strain significantly increased anti-LG3 IgG titers compared to both control groups, demonstrating a specific and allo-independent immune response. Recipients injected with apoptotic nanovesicles also showed increased neointima formation and infiltration with CD3+ cells.

Collectively, these results identify apoptotic endothelial nanovesicles as a novel inducer of humoral responses leading to increased anti-LG3 production and accelerated vascular rejection.

Abstract#: 16
Health professionals’ proposals for the implementation of an altruistic unbalanced paired kidney exchange program
Céline Durand 1, Noémie Boudreault 1, Andrée Duplantie 2, Marie-Chantal Fortin 3
1 Centre de recherche du CHUM
2 Programmes de bioéthique de l'Université de Montréal
3 Centre de recherche du CHUM, Service de néphrologie du CHUM
Kidney transplant recipients in the O blood group are at a disadvantage when it comes to kidney exchange programs (KEPs), since they can only receive organs from O donors. A way to remedy this situation is through altruistic unbalanced paired kidney exchange (AUPKE), where a compatible pair consisting of an O donor and a non-O recipient is invited to participate in a KEP. The aim of this study was to gather empirical data about health professionals’ views on AUPKE.

A total of 19 transplant professionals working in 4 Canadian transplant programs, and 19 non-transplant professionals (referring nephrologists and pre-dialysis nurses) working in 5 Quebec dialysis centres took part in semi-structured interviews between 11/2011 and 06/2013. The content of these interviews was analyzed using a qualitative data analysis method.

Respondents’ recommendations focused on: (i) the logistics of AUPKE (e.g., not delaying the transplant for the compatible pair; retrieving organs locally; providing a good quality organ to the compatible pair; maintaining anonymity between pairs); (ii) medical teams (e.g., promoting KEPs within transplant teams; establishing a consensus among members; fostering collaboration between dialysis and transplant teams); (iii) the information provided to compatible pairs (e.g., ensuring that information is neutral); (iv) research (e.g., looking into all transplant options for O recipients; studying all potential impacts of KEPs and AUPKE); and (v) resources (ensuring there are sufficient resources in the system to manage the increased number of renal transplants performed). Transplant professionals were particularly concerned about the information provided to compatible pairs, whereas non-transplant professionals were mostly concerned about the lack of benefits for compatible pairs.

The results of this study can be used to develop future guidelines for the implementation of an AUPKE program in Canada. It will also be important to take into account the views of other stakeholders, such as patients and potential donors, to ensure the appropriate implementation of AUPKE.

Abstract#: 17
Current Perspectives of Urology Involvement in Renal Transplantation: A Survey of Canadian Senior Residents
Jennifer Bjazevic, Thomas McGregor
University of Manitoba
Introduction: Medical advancements in transplantation have led to increasing complexity of the field and further surgical specialization. Consequently, the role of urology in renal transplantation has become highly variable with the growth of surgeons specializing in multi-organ transplantation. However, renal transplantation remains a mandatory component of residency training, as determined by the Royal College of Physicians and Surgeons of Canada. We determined the involvement of urology faculty and residents in renal transplantation, and perceptions of the role of urology in transplantation, across Canada.
Methods: An anonymous questionnaire was administered to all thirty-one final-year Canadian urology residents at the Queen’s Urology Examination Skills Training (QUEST) program. The survey was devised to assess urological involvement in, and resident exposure to, renal transplantation. Responses were closed-ended and utilized a validated five-point Likert scale. Descriptive statistics and Pearson’s chi-squared test were used to analyze the responses and demonstrate correlations.
Results: All residents completed the survey. Urologists were involved in performing renal transplant surgery at most training centers across Canada (77.4%). The majority of residents believed that urology should remain highly involved with transplant (77.4%), and that it should be a mandatory component of residency training (64.5%). There was a positive correlation between the involvement of urology in renal transplantation at a resident’s training centre, and the opinion that urology should continue to play an important role in this field (r=0.51, p=0.003). However, barely half of the residents (51.6%) felt they had sufficient exposure to transplant surgery. Only 41.9% would feel comfortable performing transplant surgery after residency, and these residents were involved in an average of 30 transplant surgeries and 16 laparoscopic donor nephrectomies. A minority of residents had plans for fellowship training (9.7%) or future careers (12.9%) involving renal transplant.
Conclusion: Renal transplantation remains a limited component of the majority of residency training programs in Canada, and the number of residents intending to pursue fellowship training or a future career involving transplantation remains small. Consequently, strong exposure to renal transplantation during urology residency training is vital to ensuring that urology remains highly involved in the field.

Abstract#: 18
Comparison of the ability of expanded peripheral versus thymic Tregs to suppress immune responses in transplantation
Romy Hoeppli 1, Esme Dijke 2, Jessica Qing Huang 1, Alicia McMurchy 1, Lori West 3, Megan Levings 1
1 Department of Surgery, University of British Columbia, Vancouver, BC
2 Department of Pediatrics, University of Alberta, Edmonton, AB; Alberta Transplant Institute, Edmonton, AB
3 Department of Pediatrics, University of Alberta, Edmonton, AB; Alberta Transplant Institute, Edmonton, AB; Department of Surgery, University of Alberta, Edmonton, AB.
Introduction: Transplantation is often subject to the risk of graft rejection or graft-versus-host disease (GVHD). Cell-based therapy with FOXP3+ T regulatory cells (Tregs) to induce tolerance to alloantigens could eliminate these complications. However, expanding enough human Tregs from blood to use in patients is challenging due to limited growth and potential for contamination with effector T cells. Discarded pediatric thymuses from cardiac surgery could provide an alternative source of Tregs which are less likely to be contaminated with effector T cells. However, whether thymic Tregs are as effective as peripheral Tregs at suppressing responses to transplanted antigens is unknown.

Methods/Results: Two different protocols were used to expand human peripheral Tregs: Tregs were stimulated with anti-CD3/28-coated beads or with artificial antigen-presenting cells (APCs) that express human CD58, CD86, and the human CD32 Fc receptor to immobilize soluble anti-CD3 mAbs. We consistently achieved the highest Treg expansion with artificial APCs. This condition was further optimized by the use of serum-free OpTmizer T Cell Expansion Medium, resulting in over 100-fold expansion after 14 days. These culture conditions were similarly effective at expanding thymic Tregs, which in comparison to peripheral Tregs retained a significantly higher proportion of FOXP3+ cells. To compare their suppressive function in vivo, we established a humanized-mouse model of GVHD, which involves irradiation of immunodeficient NSG mice followed by injection of 10 × 10⁶ PBMCs. After ~2 weeks, human T cells engraft, and the mice lose weight and show clinical signs of GVHD. NSG mice will next be injected with PBMCs in the absence or presence of different ratios of expanded peripheral or thymic Tregs. Clinical GVHD scores will be monitored, and upon sacrifice, flow cytometry and histology will be performed to quantify the relative effectiveness of peripheral versus thymic Tregs.

Conclusion: We have developed optimized expansion conditions for peripheral and thymic Tregs and established a humanized-mouse model to test their function in vivo. Comparison of the potency of Tregs from peripheral blood and thymuses will reveal whether thymuses are a suitable source for continued development of Treg cell-based therapy.

Abstract#: 19
Autophagy fosters myofibroblast differentiation through mTORC2 activation and downstream upregulation of CTGF
Monique Bernard, Mélanie Dieudé, Katia Hamelin, Katy Underwood, Marie-Josée Hébert
Fibrosis is a key hallmark of failing allografts. Recent evidence implicates autophagy in myofibroblast differentiation leading to fibrosis. Autophagy is a conserved catabolic pathway activated in response to stress or starvation where damaged organelles and proteins are degraded as a means of sustaining metabolism. The molecular pathways governing the association between autophagy and myofibroblast differentiation remain largely uncharacterized. Here, we sought to characterize the mediators and signaling pathways implicated in autophagy-induced myofibroblast differentiation.

We exposed WI-38 human fibroblasts to serum free medium, a classical inducer of autophagy, for up to 4 days. Serum-starved fibroblasts showed increased LC3 II/I ratios and decreased p62 levels, confirming enhanced autophagy. This was associated with myofibroblast differentiation characterized by increased expression of α-smooth muscle actin (αSMA), collagen I, collagen III and formation of stress fibers. Inhibiting autophagy with three different PI3KIII inhibitors (3-MA, wortmannin, LY294002) or through Atg7 silencing prevented differentiation. Autophagic fibroblasts showed increased expression and secretion of Connective Tissue Growth Factor (CTGF), and CTGF silencing prevented myofibroblast differentiation. Phosphorylation of the mTORC1 target P70S6 kinase was abolished in starved fibroblasts. Phosphorylation of Akt at Ser473, an mTORC2 target, was reduced after initiation of starvation but was followed by spontaneous rephosphorylation after 2 days of starvation, suggesting mTORC2 reactivation with sustained autophagy. Inhibiting mTORC2 activation with long-term exposure to rapamycin or by silencing rictor, a central component of the mTORC2 complex, abolished Akt rephosphorylation. Rictor silencing and treatment with rapamycin both prevented CTGF and αSMA upregulation, demonstrating the central role of mTORC2 activation in CTGF induction and myofibroblast differentiation. Finally, inhibition of autophagy with PI3KIII inhibitors or Atg7 silencing blocked Akt rephosphorylation.

Collectively, these results identify starvation-induced autophagy as a novel activator of mTORC2 signalling leading to CTGF induction and myofibroblast differentiation.

Abstract#: 20
Intermittent Subnormothermic Ex Vivo Liver Perfusion Reduces Endothelial Cell Death and Decreases Bile Duct Injury after Pig Liver Transplantation with DCD Grafts
Vinzent N. Spetzler, J. Matthias Knaak, Nicolas Goldaracena, Kristine Louis, David R. Grant, Markus Selzner
Department of Surgery, Multi Organ Transplant Program, Toronto General Hospital, Toronto, ON, Canada
Ischemic-type biliary lesions (ITBL) are the main obstacle for the utilization of DCD liver grafts for transplantation. We developed a novel technique of subnormothermic ex vivo liver perfusion (SNEVLP) for the preservation of liver grafts suitable for a clinical setting.
Methods: Using a porcine transplant model, liver grafts were either stored for 10 hr at 4°C (CS, n=5) or preserved with a combined total of 7 hr cold storage plus 3 hr SNEVLP (33°C, n=5). To simulate a clinical sequence including graft transportation and recipient hepatectomy time, SNEVLP was performed in between two periods of cold storage of 4 hr and 3 hr, respectively. Parameters of hepatocyte (AST, INR), endothelial cell (hyaluronic acid, CD31 immunohistochemistry), Kupffer cell (beta-galactosidase), and biliary (alkaline phosphatase, bilirubin) injury and function were determined. Seven-day survival was assessed.
Results: Seven-day animal survival was similar between the CS and SNEVLP groups (40 vs 80%, p=0.8). No difference was observed between the CS and SNEVLP groups regarding maximum INR (1.7 vs 2, p=0.9) or maximum AST within 48 hr (2500±1100 vs 3010±1530 U/L, p=0.3). In contrast, 7 hr after reperfusion the CS vs SNEVLP group showed 5-fold higher hyaluronic acid levels (4195±2990 vs 737±450 ng/ml, p=0.01), indicating decreased endothelial cell function in the CS group. Beta-galactosidase levels, as a marker of Kupffer cell activation, were 2-fold higher in CS vs SNEVLP pigs (166±17 vs 95±25 U/mL, p<0.01). CD31 staining of parenchymal biopsies at 8 hr after reperfusion demonstrated severe endothelial cell injury in the CS group only. Three days after transplantation the CS vs SNEVLP group had higher alkaline phosphatase (179±9 vs 80±21 µmol/L, p≤0.05) and bilirubin levels (20±22 vs 6±2 µmol/L). Bile duct histology at the time of sacrifice revealed severe bile duct necrosis in 3 out of 5 animals with CS (picture 1), while no bile duct injury was observed in SNEVLP-treated animals (picture 2).

picture 1 (H&E, x10)

picture 2 (H&E, x10)

Conclusion: SNEVLP preservation of DCD grafts reduces bile duct and endothelial cell injury following liver transplantation. Intermittent SNEVLP preservation could be a novel and clinically applicable strategy to prevent ITBL in DCD liver grafts with extended warm ischemia times.

Abstract#: 21
LG3 Regulates Migration and Homing of Mesenchymal Stem Cells and Neointima Formation during Vascular Rejection
Eve-Annie Pilon 1, Mélanie Dieudé 1, Shijie Qi 1, Katia Hamelin 1, Yves Durocher 2, Mary Zutter 3, Daniel Coutu 4, Claude Perreault 5, Marie-Josée Hébert 1
1 Research Centre, Centre hospitalier de l'Université de Montréal (CRCHUM), Montreal, QC, Canada
2 Biotechnology Research Institute, Montreal, QC, Canada
3 Vanderbilt University School of Medicine, Nashville, TN, USA
4 ETH Zürich, Basel, Switzerland
5 Institut de Recherche en Immunologie et Cancérologie (IRIC), Université de Montréal, Montréal, QC, Canada
Rationale: Transplant vasculopathy (TV) is characterized by neointimal accumulation of recipient-derived α-smooth muscle actin (SMA) progenitor cells. Higher levels of circulating and urinary LG3, a C-terminal fragment of perlecan, are found in rejecting renal transplant patients.

Objective: We aimed to evaluate whether LG3 regulates the migration and homing of mesenchymal stem cells (MSCs) and favors the accumulation of recipient-derived neointimal cells during rejection.

Methods and Results: We used a pure model of TV where mice are transplanted with a fully-MHC mismatched aortic graft followed by intravenous injection of recombinant LG3. Increased neointimal accumulation of α-smooth muscle actin (SMA) positive cells was observed in LG3-injected recipients. When green fluorescent protein (GFP)-transgenic mice were used as recipients, LG3 injection favored neointimal accumulation of GFP+ cells, confirming the accumulation of recipient-derived cells within the allograft vessel wall. Recombinant LG3 increased horizontal migration and transmigration of mouse and human MSC in vitro and enhanced ERK 1/2 phosphorylation. Neutralising β1 integrin antibodies in MSC in vitro or use of MSC from α2 integrin-/- mice (deficient in α2β1 integrins) led to decreased migration in response to recombinant LG3 and significantly decreased ERK 1/2 phosphorylation. To assess the importance of LG3/α2β1 integrin interactions in LG3-induced neointima formation, α2-/- mice or wild-type mice were transplanted with an allogeneic aortic graft followed by intravenous LG3 injections for 3 weeks. Reduced intima-media ratios and decreased numbers of neointimal cells showing ERK 1/2 phosphorylation were found in α2-/- recipients.

Conclusion: These results highlight a novel role for LG3 in neointima formation during rejection. LG3, through interactions with α2β1 integrins on recipient-derived cells leading to activation of the ERK 1/2 pathway, favors the accumulation of recipient-derived αSMA positive cells to sites of immune-mediated vascular injury.

Abstract#: 22
Application of the 2012 KDIGO Guidelines for Chronic Kidney Disease (CKD) Staging has a Significant Impact on Risk Stratification in Prevalent Kidney Transplant Patients.
Lan Song, M. Khaled Shamseddin, David Holland, Eduard Iliescu
Queen's University and Kingston General Hospital
Objective: The 2012 KDIGO guidelines recommend using estimated glomerular filtration rate (eGFR) calculated with the CKD-EPI equation and urinary albumin to creatinine ratio (ACR) for staging and risk stratification for multiple outcomes including death and dialysis for chronic kidney disease (CKD) patients. In kidney transplant patients CKD is common and albuminuria predicts graft loss and death. Our center traditionally used the MDRD equation and ACR was not routinely measured in transplant patients. This study aims to assess the impact of switching from MDRD to CKD-EPI equations and incorporating ACR in the staging and risk stratification in an existing kidney transplant population in Southeastern Ontario.

Methods: This is a cross-sectional study of prevalent kidney transplant patients. The variables were serum creatinine, eGFR (MDRD and CKD-EPI equations, mL/min/1.73 m²), and ACR (mg/mmol). The number of patients in each 2012 KDIGO eGFR stage (G1 ≥ 90, G2 = 60–89, G3a = 45–59, G3b = 30–44, G4 = 15–29, and G5 < 15 mL/min/1.73 m²), ACR stage (A1 < 3.0, A2 = 3.0–30, A3 > 30 mg/mmol), and risk category (low, moderate, high, very high) was calculated.
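The staging thresholds above can be sketched as a small classifier. This is illustrative only; the function names are ours, not part of the study or the KDIGO guideline:

```python
# Illustrative sketch: classify a patient into the 2012 KDIGO eGFR (G) and
# albuminuria (A) stages using the thresholds listed above.
# Function names are hypothetical, not from the study.

def egfr_stage(egfr: float) -> str:
    """eGFR in mL/min/1.73 m^2 -> KDIGO G stage."""
    if egfr >= 90:
        return "G1"
    if egfr >= 60:
        return "G2"
    if egfr >= 45:
        return "G3a"
    if egfr >= 30:
        return "G3b"
    if egfr >= 15:
        return "G4"
    return "G5"

def acr_stage(acr: float) -> str:
    """Urinary albumin-to-creatinine ratio in mg/mmol -> KDIGO A stage."""
    if acr < 3.0:
        return "A1"
    if acr <= 30:
        return "A2"
    return "A3"

# e.g. a patient with eGFR 50 and ACR 5 is staged G3a / A2
```

Under the 2012 KDIGO heat map, the combination of G and A stages (not the G stage alone) determines the low/moderate/high/very-high risk category, which is why incorporating ACR can reclassify patients.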

Results: The cohort comprised 133 subjects (mean age 54.6 years, 30% female, 0.7% African Canadian, 26% diabetic, median transplant age 9.7 years, 35% living donor). Compared to MDRD, CKD-EPI classified subjects to the same stage in 85.8%, to a less severe stage in 13.5%, and to a more severe stage in 0.7%. For risk stratification, compared with eGFR alone, incorporating ACR increased the proportion of patients at “very high” risk from 14% to 31%; overall, 47% of patients moved to a higher risk group.

Conclusions: This study suggests that switching from the MDRD to the CKD-EPI equation has minimal impact on staging and that a large proportion of kidney transplant patients may be at “very high” risk for important clinical outcomes when eGFR and ACR are considered together. The main limitation of this study is that the 2012 KDIGO risk stratification system has not been validated in kidney transplant patients, an area to be addressed by future studies.

Abstract#: 23
Proteomic analysis of machine cold perfusion fluid: differences between DCD and DBD kidneys
Steve Arcand 1, Patrick Luke 2, Gavin Beck 2, Jolanta Sawicka 1, Preston O'Brien 1, Grzegorz Sawicki 1, Mike Moser 1
1 University of Saskatchewan
2 Western University
Introduction: Recent studies have suggested that different mechanisms exist for the injury that occurs to kidneys obtained from donation after cardiac death (DCD) and donation after brain death (DBD). The details of the different mechanisms, however, remain to be elucidated. The purpose of our study is to investigate kidney injury that occurs during cold preservation and identify differences between kidneys from DCD and DBD.
Methods: Perfusate samples were collected immediately after the kidney was removed from the pump from DBD (n=9) and DCD (n=4) donors. After a purification process to remove the starch from the perfusates, two-dimensional gel electrophoresis was performed on each sample. Protein expression was analysed using PDQuest measurement software. Spot levels that correlated with each type of donor were identified and then sent for mass spectrometry.
Results: Three spots that were significantly associated (p<0.05) with DCD donors emerged from the analysis; they were identified as fatty acid binding protein, apolipoprotein, and proapolipoprotein.
Conclusion: Although these biomarkers have previously been noted in renal ischemia-reperfusion injury, we have identified them here as being released prior to reperfusion. The known role of these proteins in lipid peroxidation suggests that this may be part of the mechanism of injury to DCD kidneys and may hint at a potential target for pharmacological intervention, which could be applied while the kidney is being cold perfused.

Abstract#: 24
Thrombolytic Protocol Minimizes Ischemic-type Biliary Strictures in Liver Transplantation from Donation-after-Cardiac Death (DCD) Donors
John Seal 1, Trevor Reichman 2, Ian McGilvray 1, Mark Cattral 1, Paul Greig 1, David Grant 1, George Loss 2, Markus Selzner 1, et al.
1 University of Toronto
2 Ochsner Medical Center
We investigated the impact of intraoperative tissue plasminogen activator (tPA) injection into the hepatic artery on outcomes of liver transplantation with organs retrieved after cardiac death (DCD). We conducted a retrospective analysis at the University of Toronto (TO) and Ochsner Medical Center (OC). Between 2009 and 2013, 85 DCD liver transplants were identified with tPA injection (N=30 TO, 55 OC) and compared to 33 DCD liver transplants without tPA. There was no significant difference between the groups (tPA vs non-tPA, p≥0.2) with regard to donor age (36.4 vs 38.0 years), donor warm ischemia time (23.1 vs 23.3 min), cold ischemia time (5.1 vs 4.4 h), or recipient MELD score (20.0 vs 21.9). Intra-operative blood loss (3.5 vs 2.7 L; p=0.09) and transfusion requirements (3.1 vs 2.8 units; p=0.63) were also similar. Overall biliary strictures (12.1 vs 33.3%, p=0.04) and specifically diffuse ischemic-type bile duct strictures (ITBS) (5.9% vs 24.4%, p<0.01) were less common in the tPA group. The rate of focal ITBS was similar (9.4% vs 9.1%; p=0.99). Re-transplantation was performed less frequently in the tPA group (2.4% vs 12.1%, p<0.01). At 1 and 3 years, the tPA vs non-tPA group had improved graft survival (93.9 vs 68.7% and 80.1 vs 55.2%; p=0.01) and patient survival (98.2 vs 82.1% and 88.9 vs 76.5%; p=0.07). In conclusion, tPA injection into the hepatic artery during DCD liver transplantation reduces ITBS and improves graft and patient survival without increasing the risk of bleeding.

Abstract#: 25
Teens Taking Charge: Managing My Transplant Online: Usability Testing Results from Adolescents with a Solid Organ Transplant
Moira Korus, Elizabeth Cruchley, Jennifer Stinson, Samantha Anthony, Anna Gold
SickKids, Toronto, ON

Background: Adolescents with solid organ transplants (SOT) demonstrate high rates of medication non-adherence and higher rates of graft loss compared to all other age groups. Self-management interventions encompass information-based material designed to achieve disease-related learning and changes in the participant’s knowledge and skill acquisition, while providing meaningful social support. These interventions have had some success in chronic disease populations by reducing symptoms and promoting self-efficacy and empowerment. Using findings from a needs assessment, we developed three modules (Diet, Medication, and Lifestyle) of an Internet-based self-management program for youth with SOT. This program contains information, graphics, peer experiences and self-management strategies. The purpose of this study was to determine the usability and acceptability of the online program from the perspectives of youth with SOT.
Methods: Participants were recruited from SOT clinics at one large pediatric tertiary care centre in Canada. Three iterative cycles (seven patients per iteration) of usability testing took place to refine the website prototype. Study procedures involved participants finding items from a standardized list of features and communicating any issues they encountered, followed by a semi-structured interview to generate feedback about what they liked and disliked about the program.
Results: 21 post-transplant teens (mean age 15; 7 female) found the website content to be trustworthy; they liked the picture content and found the videos of peer experiences particularly helpful. Teenagers had some difficulty finding information within sub-modules and suggested a simpler design with easier navigation.
Conclusions and Future Direction: This web-based intervention is appealing to teenagers and may foster improved self-management of their SOT. Nine additional modules are being developed and will undergo usability testing before the program is finalized. In the future, a randomized controlled trial will determine the feasibility and effectiveness of this online self-management program on adherence, graft survival and quality of life.

Abstract#: 26
Outcomes of Methicillin-Resistant Staphylococcus aureus Colonization in the Lung Transplant Recipient
Isabel Coman 1, Larry Lands 2, Céline Bergeron 3, Anna Yiannopoulos 4, Me-Linh Luong 5, Charles Poirier 4
1 Hôpital Notre-Dame, Centre Hospitalier de l’Université de Montréal, Université de Montréal
2 Montreal Children's Hospital, McGill University Hospital Center
3 Hôtel-Dieu, Centre Hospitalier de l’Université de Montréal
4 Hôpital Notre-Dame, Centre Hospitalier de l’Université de Montréal
5 Hôpital Saint-Luc, Centre Hospitalier de l’Université de Montréal
Background: An increasing number of infections in lung transplant recipients are caused by methicillin-resistant Staphylococcus aureus (MRSA). However, the post-operative outcomes of MRSA-colonized patients remain poorly described.

Methods: To better portray the evolution of MRSA-colonized lung transplant recipients, we conducted a 4-year retrospective observational study of our cohort of patients colonized with MRSA, analyzing their outcomes in the first year following lung transplant.

Results: Of the 128 lung transplantations carried out in our facility over 4 years, 23 were in patients colonized with MRSA, a prevalence of 18%. Of these 23 patients, 6 died within the first year after transplant, for a one-year survival rate of 74% in this subgroup. The MRSA-colonized cohort’s average intensive care unit length of stay was 11 days (range 1-134, median 5 days) and the average hospital length of stay was 31 days (range 13-134, median 22 days). This cohort had a hospitalization rate of 0.82 per patient-year, with 30% of patients accounting for all hospitalizations. A total of 71 respiratory infections were treated over 1 year of follow-up, including 21 (30%) MRSA infections. Nine (39%) patients developed reperfusion injury after the transplant and 9 (39%) developed bronchial stenosis during follow-up. Only 5 cases of biopsy-proven acute rejection and 1 case of bronchiolitis occurred during the first year after transplant in our cohort. All deaths were attributable to complicated pneumonias, 50% of which were caused by MRSA.

Conclusion: In the first year after lung transplant, our cohort of MRSA-colonized patients showed a high rate of respiratory tract infections, only a minority of which were due to MRSA, as well as a survival rate comparable to that of the general lung transplant population.

Abstract#: 27
Collaborative care between transplant nephrologists and primary care physicians – A gap analysis
Olusegun Famure 1, Myra Caballero 1, Anna Li 1, Lesley Adcock 2, Jeffrey Schiff 1, Rosalind Tang 1, Joseph Kim 1
1 Kidney Transplant Program, Division of Nephrology, University Health Network, Toronto, Canada
2 Partner Family Health Care Centre, Toronto Western Hospital, University Health Network, Toronto, Canada
Collaborative care between transplant centres and primary care physicians (PCPs) is imperative for the adequate management of chronic diseases in kidney transplant recipients (KTR). However, concern exists regarding the lack of training and guidelines provided to PCPs in the management of KTR. Therefore, there is a need to assess the quality of, and barriers to, the primary care provided to KTR.

Two self-administered questionnaires on the primary care management of KTR were developed and implemented. One survey investigated the perspectives of KTRs at an urban transplant centre on PCP performance, comfort level with their PCP, support received for self-management, and barriers to ideal care. The second survey targeted PCPs of KTR assessing their attitudes and practice patterns towards similar domains in addition to their communication with transplant centres.

A total of 502 patients and 209 physicians completed the survey (response rates of 77% and 22%, respectively). The majority (76%) of patients indicated that a PCP was involved in their care. Patients felt comfortable receiving care for non-transplant related issues, vaccinations, and periodic health examinations from their PCP, and PCPs likewise felt comfortable providing such care. While only 23% of patients felt uncomfortable with their PCP managing their immunosuppressive medication, the majority (75.3%) of PCPs felt uncomfortable doing so. PCPs tended to rate their support for patient self-management higher than patients did. 73% of physicians responded that they were currently providing care to KTR. The majority of physicians specified that they rarely (57%) or never (20%) communicate with transplant centres. PCPs’ most commonly stated barriers to delivering optimal care to KTR were insufficient guidelines provided by the transplant centre (68.9%) and lack of knowledge in managing KTR (58.8%). The resources suggested by PCPs to improve their comfort level in managing KTR were written guidelines and continuing medical education activities related to transplantation.

Our results suggest that there is insufficient communication between transplant centres, PCPs, and patients. The modes for providing the resources needed to bridge the knowledge gap for primary care physicians in the management of such patients need to be further explored.

Abstract#: 28
Physical Activity Level and its Correlates in Children and Adolescents Post Liver Transplant
Catherine Patterson 1, Stephanie So 1, Jane E. Schneiderman 2, Samantha Stephens 3
1 The Hospital for Sick Children, Rehabilitation Medicine
2 The Hospital for Sick Children, Physiology and Experimental Medicine; Kinesiology and Physical Education, University of Toronto
3 The Hospital for Sick Children, Health Evaluative Sciences
Background: The health benefits of physical activity (PA) are well established for both healthy children and those with chronic disease. Low levels of PA have been reported in children post liver transplant (post-LTx); however, no studies have objectively measured PA or identified variables that impact it.
Purpose: To objectively determine PA and fitness levels and to examine potential correlates of PA in children post-LTx.
Methods: 20 children (7 males, mean age 14.2 yrs ± 2.2) > 1 year post-LTx (mean 10.1 ± 4.3 yrs) were studied. Peak oxygen consumption (VO2peak) was measured through graded cardiopulmonary cycle ergometry. Muscle strength, endurance and flexibility were assessed via the Fitnessgram®. Moderate to vigorous PA (MVPA) and steps/day were determined with an Actigraph (GT3X) accelerometer worn for 7 consecutive days. Questionnaire measures included: Children’s Self-Perceptions of Adequacy in and Predilection for Physical Activity Scale, Pediatric Quality of Life Multidimensional Fatigue Scale and Physical Activity Perceived Barriers and Benefits Scale. All measures were compared to normative values for healthy children.
Results: VO2peak was low (mean 33.3 ± 7.5 ml/kg/min; 76.9 ± 15.5% predicted) and participants took fewer steps per day (mean 6790.6 ± 2941.8) compared to healthy children (11,220 steps/day). MVPA (23.7 ± 9.6 minutes/day) was accumulated at moderate intensity only and no subjects met national recommended PA guidelines. Six participants (30%) attained the healthy fitness zone for abdominal strength and 1 participant (5%) for pushups (Fitnessgram® criterion measures). Fatigue (69.5 ± 14.9) and self-efficacy (56.6 ± 9.5) scores were lower than reported levels in healthy children and similar to several other chronic disease populations. The most commonly reported perceived barrier to PA was “I am tired.” A positive correlation was found between self-efficacy and MVPA (r=0.59, p=0.016) and between self-efficacy and fatigue (r=0.51, p=0.025).
Conclusion: Children post-LTx show below-normal levels of PA and VO2peak, and perceived fatigue is a common barrier to PA. Self-efficacy correlates with fatigue and MVPA. Further investigation into the potential mediating effects of these correlates is warranted to guide the development of innovative and effective PA intervention strategies to maximize long-term health outcomes in children post-LTx.

Abstract#: 29
Cardiovascular Risk Scores in Stable Kidney Transplant Recipients
Mowad Benguzzi, Holly Mansell, Abubakar Hassan, Rahul Mainra, Ahmed Shoker
University of Saskatchewan, Department of Medicine, Division of Nephrology, Saskatoon Health Region, Saskatchewan Kidney Transplant Program
Background: The Framingham Risk Score (FRS) and Major Adverse Cardiovascular Event (MACE) score are formulae used to estimate cardiovascular risk in renal transplant recipients (RTR). Existing literature suggests that the FRS underestimates cardiovascular events (CVE). We hypothesize that MACE scores are higher than FRS scores because estimated glomerular filtration rate (eGFR) contributes to the MACE but not the FRS.

Objective: To compare cardiovascular risk scores by both formulae in our cohort of stable RTR.

Methods: A cross-sectional chart review was undertaken of 270 consecutive RTR transplanted from 1979 to 2012. A high-risk MACE score was defined as ≥20%. Standard statistical analyses including multivariate analysis (MVA) and stepwise analysis were performed.

Results: Data were collected between Jan 2011 and Aug 2013. Mean transplant duration was 9.51 ± 6.65 yrs. Mean eGFR by isotope dilution mass spectrometry (IDMS) was 59.19 ± 28.26 mL/min. 46.3% had eGFR above 60 mL/min. Mean FRS was 16.11 ± 13.41%. 41.5% were classified as high risk. 34.4% and 47.6% of patients with eGFR higher and lower than 60 mL/min, respectively, had high FRS. Univariate analysis (UVA) showed no significant association between eGFR and FRS (p=0.261). In the MVA, FRS correlated significantly with age (p<0.001), body surface area (p=0.001), body mass index (p=0.018), and TC:HDL ratio (p<0.001).

Mean MACE score was 14.80 ± 15.32%. 24.8% were classified as high risk. 11.2% and 36.6% of patients with eGFR above and below 60 mL/min, respectively, had high MACE scores. In the MVA, MACE scores correlated significantly with age (p<0.001) and eGFR (p=0.001). Stepwise analysis revealed an eGFR contribution to the MACE score of −9.781 × ln(eGFR) + 44 (p < 0.01).
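The stepwise relationship reported above can be written as a one-line function. This is a minimal illustrative sketch; the function name is ours, not from the study:

```python
import math

# Illustrative only: the eGFR contribution to the MACE score from the
# stepwise analysis reported above (-9.781 * ln(eGFR) + 44, p < 0.01).
# Function name is hypothetical, not from the study.

def mace_egfr_contribution(egfr: float) -> float:
    """eGFR in mL/min -> its contribution to the MACE risk score."""
    return -9.781 * math.log(egfr) + 44
```

The negative logarithmic coefficient means the contribution shrinks as eGFR rises, so a graft with eGFR 30 mL/min contributes more to the predicted score than one with eGFR 90 mL/min, consistent with the conclusion that diminished graft function raises MACE scores.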

Conclusions: eGFR contributed significantly to MACE scores but not to FRS. The higher FRS compared with MACE scores suggests that traditional Framingham variables contribute more to the total score than does diminished transplant eGFR. We suggest a prospective validation study.

Abstract#: 30
Quality of Life and Health Status are similar between directed and non-directed or paired exchange Canadian living kidney donors
Olwyn Johnston 1, Lianne Barnieh 2, John Gill 3, Robert Richardson 4
1 Division of Nephrology, University of British Columbia, Vancouver General Hospital
2 University of Calgary
3 Division of Nephrology, University of British Columbia, St. Paul's Hospital
4 Division of Nephrology, University of Toronto, Toronto General Hospital
The impact of non-directed and paired donations on quality of life (QOL) and health status (HS) in the context of kidney paired donation (KPD) remains unclear. We hypothesized no difference in QOL and HS between non-directed/paired KPD donors and directed donors post-nephrectomy.
Adult living kidney donors from 3 Canadian transplant centres completed validated QOL (WHOQOL-BREF) and HS (Short Form-36 Health Survey) questionnaires at the time of nephrectomy (baseline) and 6 months post-nephrectomy (follow-up). QOL and HS summary scores were calculated at baseline and follow-up, stratified by domain and donor type (directed versus non-directed/paired KPD). Scores between different donor types and between all donors and Canadian norms were compared using a t-test.
Among 94 donors (n=68 directed; n=6 non-directed and n=20 paired), QOL domains were similar between donor types at baseline and follow-up. Baseline and follow-up HS scores were similar between donor types except for lower social functioning in directed donors at baseline (p=0.03) (Table 1). Compared to Canadian norms, living kidney donors reported superior HS in all domains at baseline (p<0.01) and in 6 domains (P<0.05) at follow-up. Living donors were similar to Canadian norms for physical role limitations and energy/fatigue at follow-up (p>0.08).
QOL and HS are similar between directed and non-directed or paired exchange donors and HS is superior for living donors in almost all domains compared to Canadian norms. These findings support the expansion of the practice of paired exchange and non-directed kidney donation in Canada.

Abstract#: 31
Association Between the Seven Year Major Adverse Cardiovascular Event (MACE) Prediction Score and Circulating Inflammatory Markers in Renal Transplant Recipients
Holly Mansell 1, Mowad Benguzzi 2, Nicola Rosaasen 3, Rahul Mainra 4, Abubaker Hassan 4, Ahmed Shoker 4
1 College of Pharmacy and Nutrition, University of Saskatchewan; Saskatchewan Transplant Program
2 University of Saskatchewan
3 Saskatchewan Transplant Program
4 College of Medicine, University of Saskatchewan; Saskatchewan Transplant Program
Background: Framingham risk scores (FRS) and the 7-year Major Adverse Cardiovascular Events Calculator (MACE) predict cardiovascular events (CVE) in renal transplant recipients (RTR). Our recent work showed that FRS do not correlate with plasma inflammatory marker levels.

Objective: To investigate whether MACE scores correlate with inflammatory chemokine levels in our RTR.

Methods: The MACE calculator was used to calculate the 7-year probability of CVE in 101 RTR. Forty-four immuno-inflammatory markers were measured by Luminex technique. Statistical analyses included stepwise analysis after a multivariate determination of significant demographic and inflammatory variables.

Results: The mean predicted risk of a CVE was 14.8 ± 15.4% [95% CI 12.9–16.7]. In the univariate analysis, MACE scores correlated significantly with age, HbA1c, serum creatinine, urea, eGFR (as measured by IDMS), systolic blood pressure, serum phosphate, thrombopoietin (TPO), chemokine ligands CCL2, CCL5 and CCL11, vascular endothelial growth factor (VEGF) and granulocyte colony stimulating factor (G-CSF) (p<0.05). After multivariate analysis, however, only age, serum creatinine, eGFR and TPO remained significant (p<0.05). Consistent with our previous analysis, the FRS showed no significant association with any inflammatory marker or with eGFR.

Conclusion: Contrary to the FRS, the MACE score is associated with levels of inflammation, supporting its relevance in RTR. TPO emerged as a potential marker of future CVE in RTR. A prospective study is warranted to investigate the impact of these findings and to determine whether MACE more accurately predicts CVE.

Abstract#: 32

Canadian Society of Transplantation (CST) Members’ Views on Anonymity in Organ Transplantation
Mena Gewarges 1, Jennifer Poole 2, Enza De Luca 1, Margrit Shildrick 3, Susan Abbey 4, Oliver E. Mauthner 1, Heather J. Ross 1
1 Department of Cardiology and Transplantation, University Health Network, University of Toronto
2 School of Social Work, Faculty of Community Services, Ryerson University
3 Department of Thematic Studies - Gender Studies, Linköping University
4 Department of Psychiatry, University Health Network, University of Toronto
Introduction/background: Anonymity has been central to medical, psychosocial and societal practices in transplantation. In Canada, anonymity between transplant recipients, organ donors and donor families is legally mandated in most provinces. Any communication between donor families and organ recipients is vetted, depersonalized and anonymized prior to being shared. Exploring heart transplant recipients’ experiences with transplantation revealed their desire to learn the identity of their donor and their unease and distress surrounding the anonymity of their donor. Given that physicians and health care practitioners are stakeholders in developing policy and practice guidelines surrounding transplantation, our research group sought to explore the Canadian Society of Transplantation (CST) members’ views on anonymity in the context of organ transplantation in Canada.

Methods: This study involved the electronic distribution of an eighteen-item survey to the CST membership, specifically asking respondents to consider the possibility and implications of open communication and contact between organ recipients and donor families. Respondents were also given an opportunity to elaborate with written comments.

Results: Of the 541 CST members surveyed, 106 replied (20%), with a completion rate of 57%. Among these respondents, 70% felt that organ recipients and donor families should only communicate anonymously, while 47% felt that identifying information could be included in correspondence between consenting recipients and donor families. 53% thought that organ recipients and donor families should be allowed to meet should they be interested in doing so; however, 27% of respondents were against this, and 20% neither agreed nor disagreed with them meeting. With the advent of social media facilitating communication, 38% of respondents felt that a re-examination of current policies and practices concerning anonymity in transplantation is necessary.

Conclusions: Further research and discussion concerning the views of organ recipients and donor families on the mandate of anonymity is both timely and relevant, and may influence future policy.

Abstract#: 33
The Anti-Human Globulin (AHG) Enhanced C1qScreen™ Assay Improves the Detection of Complement Binding Donor Specific HLA Antibodies.
Robert Liwski 1, Sandra Lee 1, Roxanne Sperry 1, Peter Nickerson 2, Robert Bray 3, Howard Gebel 3
1 Department of Pathology, Dalhousie University
2 Department of Medicine, University of Manitoba
3 Department of Pathology, Emory University
Aim: The C1qScreen™ assay detects complement-binding HLA antibodies. Recent studies suggest that the presence of C1q-binding donor specific antibodies (DSA) is associated with poor post-transplant outcomes. However, in some patients with documented episodes of antibody-mediated rejection or graft failure, C1q-binding DSA are not detected by the C1qScreen™, suggesting poor assay sensitivity. The goal of this study was to develop an enhanced C1q assay to improve the detection of complement-binding HLA DSA.
Method: Nine patient sera with well-characterized post-transplant DSA (by IgG single antigen bead (SAB) assay) were tested for C1q binding using the standard C1qScreen™ assay and two modified protocols: 1) a wash-modified (WM) protocol and 2) an anti-human globulin (AHG) C1q-enhanced (ACE) protocol. The number of class I/II DSA detected and the DSA MFI values obtained with each protocol were compared.
Results: The C1qScreen™ assay was positive for only 30% of Class I (9/30) and 52% of Class II (15/29) IgG DSA specificities. The average DSA MFI values were markedly reduced compared with the IgG-SAB assay. The WM protocol exhibited higher average DSA MFI values (3.4-fold) compared to the standard protocol, but only 2 additional DSA specificities were detected. In contrast, the ACE protocol identified 10 additional Class I (19/30; 63%) and 4 additional Class II (19/29; 66%) DSA specificities, with an average increase in DSA MFI of 4.7-fold compared with the standard C1qScreen™ assay.
Conclusions: The C1qScreen™ assay exhibits suboptimal sensitivity and fails to identify many post-transplant complement-fixing HLA DSA. Its sensitivity for detecting DSA can be improved with protocol modifications, including the introduction of wash steps and/or AHG enhancement. The ACE protocol was the most sensitive of the protocols tested and detected the majority of post-transplant HLA DSA. Future studies will investigate the clinical significance of HLA DSA detected by the ACE protocol.

Abstract#: 34
Anti-Human Globulin (AHG) Enhanced C1qScreen™ assay positivity correlates with CDC-AHG crossmatch results.
Robert Liwski 1, Sandra Lee 1, Roxanne Sperry 1, Anne Halpin 2, Luis Hidalgo 2, Patricia Campbell 3
1 Department of Pathology, Dalhousie University
2 Department of Laboratory Medicine and Pathology, University of Alberta
3 Department of Medicine, University of Alberta
Complement-binding donor specific HLA antibodies (DSA) detected by the C1qScreen™ assay are associated with poor post-transplant outcomes. However, in some patients with documented episodes of antibody-mediated rejection or graft failure, C1q-binding DSA are not detected by the C1qScreen™. We have recently developed a more sensitive anti-human globulin (AHG)-enhanced C1qScreen™ (ACE) protocol, which identifies more complement-binding DSA than the standard assay. In this study, we investigated the functional relevance of HLA antibodies identified by the ACE protocol by correlating results with CDC and CDC-AHG Lambda Cell Tray (LCT) testing.
Ten sera from highly sensitized post-transplant patients with well-characterized HLA antibodies (IgG single antigen bead (SAB) assay) were tested using the C1qScreen™ and ACE protocols. CDC and CDC-AHG crossmatches were performed using 1W60 LCT panels. Actual PRA values for CDC and CDC-AHG LCT were calculated (#positive reactions/#valid reactions) for each serum and compared to predicted PRA values based on C1q-positive HLA antibody specificities.
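The PRA comparison in the methods above reduces to simple proportions. A minimal sketch in Python (function and variable names are ours, for illustration only; the predicted-PRA logic assumes a cell is counted positive when it carries any antigen targeted by a C1q-positive antibody):

```python
def actual_pra(n_positive, n_valid):
    """Actual PRA (%) from CDC or CDC-AHG LCT panel reactions."""
    return 100.0 * n_positive / n_valid

def predicted_pra(panel_phenotypes, c1q_positive_specificities):
    """Predicted PRA (%): fraction of panel cells carrying at least
    one HLA antigen targeted by a C1q-positive antibody.

    panel_phenotypes: list of sets of HLA antigens, one per panel cell.
    c1q_positive_specificities: set of antigens with C1q-positive antibodies.
    """
    hits = sum(1 for cell in panel_phenotypes
               if cell & c1q_positive_specificities)  # any shared antigen
    return 100.0 * hits / len(panel_phenotypes)
```

The mean differences reported in the results are then simply actual minus predicted PRA, averaged over the ten sera.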
PRA predicted from antibody specificities identified with the ACE assay correlated well with CDC-AHG PRA (mean difference = 2.9%, range = -1.7-8.6%) but not with CDC PRA (mean difference = 26.2%, range = 7-48%). PRA predicted from C1qScreen™ results correlated reasonably well with CDC PRA (mean difference = -4.8%, range = -39.7 – 17.2%) but not with CDC-AHG PRA (mean difference = -28%, range = -62 – 5%). Importantly, only 6/383 (1.6%) positive CDC-AHG reactions were not predicted by the ACE assay results. In contrast, a significant proportion of positive CDC reactions, 49/240 (20.4%), were predicted to be negative by the standard C1qScreen™, suggesting poor assay sensitivity.
There is a good correlation between the CDC-AHG PRA and the predicted PRA based on ACE assay HLA antibody test results, suggesting that the ACE assay may be used to predict CDC-AHG crossmatch results.

Abstract#: 35
Preserving renal function with prolonged-release tacrolimus-based immunosuppression in de novo liver transplantation: Initial results from the DIAMOND study
Paul Marotta 1, Vincent Bain 2, Denis Marloe 3, Marie Laryea 4, Urs Steinbrecher 5, Pavel Trunecka 6, Jürgen Klempnauer 7, Giuseppe Tisone 8
1 Multiorgan transplant unit, London Health Sciences Centre, London, Ontario, Canada
2 Liver Unit, University of Alberta, Edmonton, Alberta, Canada
3 Centre de Recherche du CHUM, Hôpital Saint-Luc, Montreal, Quebec, Canada
4 Multi-Organ Transplant Program, Dalhousie University, Halifax, Nova Scotia, Canada
5 British Columbia Transplant Program, Vancouver, British Columbia, Canada
6 Transplant Centre, Institute for Clinical and Experimental Medicine (IKEM), Prague, Czech Republic
7 Department of General, Visceral and Transplant Surgery, Hannover Medical School, Hannover, Germany
8 Policlinico di Tor Vergata, Rome, Italy
Background: DIAMOND was a multicentre, randomized study investigating renal function with once-daily, prolonged-release tacrolimus (QD; oral)-based immunosuppression.
Methods: Patients received: Arm 1: tacrolimus QD (initial dose: 0.2 mg/kg/day); Arm 2: tacrolimus QD (0.15-0.175 mg/kg/day) plus basiliximab; Arm 3: tacrolimus QD (0.2 mg/kg/day, delayed to Day 5) plus basiliximab. All patients received MMF (IV for 3-5 days, then oral) and a single bolus of corticosteroid. Primary analysis (full-analysis set; FAS): eGFR (MDRD4) at Week 24. Secondary endpoints (per-protocol set) included graft and patient survival, and acute rejection (AR). Mortality rates were calculated using the safety-analysis set.
Results: 901 patients were randomized; FAS: 295, 286 and 276 in Arms 1-3, respectively (23, 17 and 17 from Canada). Baseline characteristics were comparable, with mean baseline eGFR of 90.6, 89.3 and 89.9 mL/min/1.73 m² in Arms 1-3, respectively. Mean tacrolimus QD trough levels were initially lower in Arm 2 vs Arms 1 and 3; by Day 14, levels were comparable between arms and remained stable. At Week 24, eGFR was higher in Arms 2 and 3 vs Arm 1 (76.4 and 73.3 vs 67.4 mL/min/1.73 m²; p<0.001 and p<0.047, respectively; ANOVA), and eGFR was numerically higher in Arm 2 vs 3 (p=ns); renal function was preserved in all arms (Figure). Kaplan-Meier estimates of graft survival in Arms 1-3 were 86.5%, 87.7% and 88.6% (p=ns; Wilcoxon-Gehan); patient survival: 89.3%, 89.1% and 90.4% (p=ns); and survival without AR: 79.9%, 85.7% and 79.6% (Arm 2 vs 1: p=0.0249; Arm 3 vs 1: p=ns; Arm 2 vs 3: p=0.0192). The overall mortality rate was 5.1%; mortality for males vs females was 5.8% vs 3.6%. AEs were comparable between arms, with a low incidence of diabetes mellitus and no major neurologic disorders.
Conclusion: An initial low dose of prolonged-release tacrolimus (0.15-0.175 mg/kg/day) plus MMF and induction therapy (without maintenance steroids) was associated with better renal function and a significantly lower incidence of AR over 24 weeks vs the other regimens. No advantages were observed with delaying the initiation of tacrolimus QD.

Abstract#: 36
Impact of Acute Kidney Injury Following Liver Transplantation on Long-Term Outcomes
Emilie Trinh, Ahsan Alam, Jean Tchervenkov, Marcelo Cantarovich
McGill University Health Center
Background: The incidence of acute kidney injury (AKI) after orthotopic liver transplantation (OLT) ranges from 17% to 64%. AKI is associated with prolonged hospitalization and increased early mortality. However, the long-term outcomes of AKI on mortality and chronic kidney disease (CKD) remain to be determined.

Purpose: In our cohort study, we examined the impact of AKI on long-term patient (pt) survival and on the incidence of stage 4 and 5 CKD.

Methods: We studied 491 OLT recipients at a single center between 01/1990 and 08/2012, and pts were followed for up to 20 years. We identified 278 pts (56.6%) with AKI, defined as either an increase in serum creatinine (SCr) ≥26.5 µmol/L within 48 hours or an elevation in SCr to ≥1.5× baseline within 7 days (KDIGO criteria).
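The KDIGO creatinine criteria described above amount to a simple classification rule. A minimal Python sketch (names are ours; this simplified version compares each post-transplant measurement against the pre-transplant baseline):

```python
def meets_kdigo_aki(baseline_scr_umol_l, scr_measurements):
    """Illustrative KDIGO AKI check (creatinine criteria only).

    scr_measurements: list of (hours_post_transplant, scr_umol_l) tuples.
    Returns True if either criterion is met:
      - SCr rise >= 26.5 umol/L within 48 hours, or
      - SCr >= 1.5x baseline within 7 days (168 hours).
    """
    for hours, scr in scr_measurements:
        # Absolute-rise criterion, 48-hour window
        if hours <= 48 and scr - baseline_scr_umol_l >= 26.5:
            return True
        # Relative-rise criterion, 7-day window
        if hours <= 168 and scr >= 1.5 * baseline_scr_umol_l:
            return True
    return False
```

Note that the full KDIGO definition allows the 48-hour absolute rise to be measured between any two values, not only against baseline; the sketch keeps the simpler baseline comparison for clarity.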

Results: In a multivariable Cox proportional hazards model, survival was worse in pts with AKI (HR: 1.46, 95% CI: 1.08-1.96, p=0.014, Figure 1). The median survival time was 13.2 yrs for pts with AKI, and 17.9 yrs in pts without AKI. Severe (stage 3) AKI was associated with worse pt survival (HR: 2.29, 95% CI: 1.46-3.58, p=0.001), while AKI stages 1 and 2 were not statistically different. The risk of developing stage 4-5 CKD was also higher in pts with AKI compared to non-AKI pts (17.5% vs. 9.1%) with a HR of 2.28 (95% CI: 1.30-4.00, p=0.004, Figure 2).

Conclusions: Our findings suggest that AKI after OLT is associated with poor long-term outcomes, including worse pt survival and higher incidence of CKD stage 4-5. Strategies to prevent and manage OLT pts with AKI need to be developed.

Abstract#: 37
Achievement of renal function recovery and long-term graft survival after renal transplantation.
Susan Wan 1, Marcelo Cantarovich 1, Istvan Mucsi 1, Dana Baran 1, Steven Paraskevas 2, Jean Tchervenkov 2
1 Department of Medicine, Division of Nephrology, Multi-Organ Transplant Program, McGill University Health Center.
2 Department of Surgery, Multi-Organ Transplant Program, McGill University Health Center.
BACKGROUND: Short-term correlates of long-term outcomes are necessary to identify kidney transplant (KTx) recipients at risk for long-term complications, and processes or intervals crucial to the longevity of the graft. In this respect, the recovery of the kidney from ischemic injury sustained during transplantation has not been quantified or evaluated. The achievement of a best estimated glomerular filtration rate (eGFR) relative to the function in the donor is a novel concept, the importance of which is unknown.

AIM: To calculate renal function recovery (RFR) based on recipient and donor eGFR, and to evaluate the impact of RFR on long-term death censored graft survival (DCGS).

METHODS: We studied adult deceased donor kidney transplant (KTx) recipients transplanted between January 1990 and June 2012. The last donor serum creatinine prior to procurement was used to estimate donor eGFR. The predicted eGFR was calculated as donor eGFR/2. The recipient eGFR was calculated using the average of the best three eGFR values observed during the first 3 months post-KTx. The abbreviated MDRD equation was used to estimate both donor and recipient eGFR. Renal function recovery (RFR) was defined as the achievement of predicted eGFR as determined by the donor eGFR. Recipients who achieved RFR were compared to those who did not, with DCGS as the primary outcome.
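The RFR calculation described in the methods can be sketched in a few lines of Python (an illustration under the definitions above; function names are ours, and the MDRD eGFR values are assumed to be computed elsewhere):

```python
def achieved_rfr(donor_egfr, recipient_egfrs_first_3_months):
    """Illustrative renal function recovery (RFR) check.

    donor_egfr: donor eGFR (MDRD) from the last SCr before procurement.
    recipient_egfrs_first_3_months: recipient eGFR values (MDRD)
    observed during the first 3 months post-transplant.

    Predicted eGFR = donor eGFR / 2 (a single kidney is transplanted);
    recipient eGFR = mean of the best three observed values.
    RFR is achieved when recipient eGFR reaches the predicted value.
    """
    predicted = donor_egfr / 2
    best_three = sorted(recipient_egfrs_first_3_months, reverse=True)[:3]
    recipient = sum(best_three) / len(best_three)
    return recipient >= predicted
```

In the study, recipients returning True under this definition were compared against those returning False, with death-censored graft survival as the primary outcome.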

RESULTS: 1023 KTx were performed during the study period, of which 705 were studied after exclusion for missing data, living donors and combined transplants. The predicted eGFR was achieved in 57% of patients, and 138 graft failures (20%) occurred during the follow-up period. Recipients who achieved the predicted eGFR had significantly better DCGS compared to those who did not (HR for graft failure 0.57, 95% CI 0.40-0.79, P=0.0008, Figure 1). ECD kidneys, recipient age, diabetes, hypertension, HLA-mismatch, acute rejection and transplant era were also significant determinants of graft survival in a multivariate model (Table 1).

CONCLUSION: Recovery of predicted eGFR based on donor eGFR correlates with improved DCGS in KTx recipients.

Abstract#: 38
Esophageal manometry and impedance pH monitoring in patients listed for lung transplantation: a single center review
Dale Lien 1, Justin Weinkauf 1, Ali Kapasi 1, Kathy Jackson 1, Jackson Wong 1, Douglas Helmersen 2, Mitesh Thakrar 2, Mark Fenton 3, Jayan Nagendran 1, Stephen Meyer 1, John Mullen 1
1 University of Alberta
2 University of Calgary
3 University of Saskatchewan
Objective: Gastroesophageal reflux disease (GERD) has been considered a risk factor for bronchiolitis obliterans syndrome (BOS) post lung transplant. Our current approach is to examine all patients for GERD pre-transplant by history, routine esophageal motility and impedance-pH testing, to assess the frequency and severity of GERD.

Method: A retrospective data review of 91 consecutive patients listed for lung transplant. Data collected included demographics, prior diagnosis of GERD and indication for lung transplant. Lower esophageal sphincter (LES) pressures; peristaltic contractions (%); simultaneous/non-conducted waves (%); upper esophageal sphincter (UES) pressure; mean contractile amplitude (MCA); ineffective esophageal motility (IEM); DeMeester composite scores; esophageal acid exposure; and reflux episodes were analyzed.

Results: Of the 91 patients, 62% were male; mean age was 58 years (range 19-68). The underlying diagnoses were COPD (37%), pulmonary fibrosis (44%), cystic fibrosis (7%), pulmonary hypertension (5%) and other (5%). GERD had been identified in 42/91 (46%) of patients prior to referral to the program. Testing demonstrated resting LES pressure <13 mmHg (normal 13 to 43 mmHg) in 24%; peristaltic contractions under 80% (normal >80%) in 35%; spontaneous or non-conducted contractions >20% (normal <20%) in 51%; UES pressure <30 mmHg (normal 30 to 104 mmHg) in 28%; IEM in 26%; DeMeester composite score >14.7 (normal <14.7) in 19%; increased esophageal acid exposure in 20%; and an increased total number of reflux episodes in 26%. Six patients had no peristaltic contractions: five had PF and one COPD. The incidence of decreased LES pressure was very similar across the diagnostic groups; PF patients showed the highest incidence of abnormal peristaltic contractions and of an increased total number of reflux episodes. The proportion of patients on a PPI at the time of testing was highest in the PF and CF groups (55% and 42%, respectively), compared with 23% in the COPD group.
There was little difference between patients with a diagnosis of GERD at the time of referral and those without; reflux episodes were about 10% higher in the GERD group.

Conclusions: Awareness of the association between GERD and PF and CF appears to be good, given the higher percentage of patients on treatment. Patients with PF tended to have more abnormal results than the other diagnostic groups, in particular a higher incidence of IEM.

Abstract#: 39
Defect in the Th17 pathway worsens renal interstitial fibrosis and tubular atrophy after ischemic acute kidney injury
Minh-Tri Nguyen 1, Elise Fryml 1, Sossy Sahakian 1, Jean Tchervenkov 1, Rene Michel 2, Steven Paraskevas 1
1 Multi-Organ Transplant Program, Department of Surgery, McGill University Health Centre, Montreal, QC, Canada
2 Department of Pathology, Lyman Duff Medical Sciences Building, McGill University, Montreal, QC, Canada
Background: Ischemic acute kidney injury (AKI), which is inherent to kidney transplantation, is associated with the detrimental development of interstitial fibrosis/tubular atrophy (IF/TA). IL-17A, the signature cytokine of the Th17 cell, is fibrogenic in heart and lung transplant models. It also promotes acute damage after ischemic AKI, but its role in the progression to renal IF/TA is unknown.

Methods: Wild-type (WT), IL-17AKO, and BATFKO (Th17 lineage deficient) mice on a C57BL/6 background underwent unilateral renal pedicle clamping for 30 minutes, followed by reperfusion for up to 42 days. Injured (IK) and control (CK) kidneys were digested in collagenase, and lymphocytes isolated by density gradient. Phenotypic analysis of Th17, Th1, Th2, and Treg cells was performed by flow cytometry. IF/TA was quantified by a blinded pathologist on H&E and Masson’s Trichrome kidney sections.

Results: CD4+IL-17A+ Th17 cells progressively infiltrated the IK but not the CK in WT mice, peaking at 14 days after reperfusion (Fig. 1A). The majority co-expressed the Th17-specific transcription factor RORγt and co-secreted IL-17F. At 42 days after reperfusion, significant IF/TA developed in the IK of WT mice, while the CK remained IF/TA-free. In comparison to WT mice, IL-17AKO mice had worse IF/TA, while deficiency in both IL-17A and IL-17F (BATFKO) only worsened tubular atrophy (Fig. 1B-C). IL-17F alone did not explain the worse fibrosis, as its expression in IL-17AKO mice was reduced compared to WT mice. There was no difference between WT and KO mice in the expression of IFN-γ (Th1), IL-4 (Th2), or FoxP3 (Treg) by kidney-infiltrating CD4+ T cells after reperfusion.

Conclusion: Following ischemia-reperfusion injury, there is a long-lasting infiltration of Th17 cells in the IK. Contrary to evidence in heart and lung transplant models, a defect in the Th17 pathway is fibrogenic after renal AKI, and worsens tubular atrophy. Targeting the Th17 pathway to reduce acute damage after AKI could therefore worsen chronic injury.

Abstract#: 40
Tumor necrosis factor receptor type 2 expression on regulatory T cells predicts acute kidney injury after kidney transplantation
Minh-Tri Nguyen, Elise Fryml, Sossy Sahakian, Shuqing Liu, Jean Tchervenkov, Steven Paraskevas
Multi-Organ Transplant Program, Department of Surgery, McGill University Health Centre, Montreal, QC, Canada
Background: Ischemia-reperfusion injury-related acute kidney injury (AKI) after kidney transplantation leads to worse graft outcomes. Tumor necrosis factor (TNF) is classically a mediator of AKI. Recent evidence, however, suggests that interaction between TNF and TNF receptor type 2 (TNFR2) expressed on AKI-protective regulatory T cells (Tregs) increases their survival and immune suppressive function. We investigated whether pre-transplant TNFR2 expression on peripheral Tregs was reduced in kidney transplant recipients suffering from AKI and had any predictive value.

Methods: 53 consecutive deceased donor kidney transplant recipients were divided into AKI (n=37) and immediate graft function (IGF, n=16) groups based on post-transplant dialysis and 24-hour serum creatinine. Donor, organ procurement, and recipient characteristics were similar between both groups except for donor age and cold ischemic time (CIT). Pre-transplant recipient peripheral blood TNFR2 expression was quantified by flow cytometry, gating on CD4+CD127- Treg. Treg suppressive function was measured by in vitro suppression of autologous CD4+CD25- CFSE-labeled effector T cell proliferation by CD4+CD25+ Treg.

Results: Pre-transplant TNFR2 expression on Tregs correlated with Treg suppressive function (r=0.38, p=0.02) in a subset of 37 recipients, and was decreased in AKI compared to IGF recipients (p<0.05; Fig. 1A). It also accurately predicted AKI in ROC curve analysis (AUC=0.71, p<0.02; Fig. 1B) with a sensitivity of 97.3% and a specificity of 56.3% at a cut-off value of 91.6%. This cut-off value remained a significant predictor of AKI in multivariate logistic regression (OR=119, p<0.01) adjusting for dissimilar variables between our groups (donor age, CIT). Combining TNFR2 expression on Tregs with donor age and CIT to form a logistic regression model improved the predictive accuracy for AKI (AUC=0.95, p<0.01; Fig. 1C).
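The cut-off analysis above is a standard threshold classification. A generic sensitivity/specificity computation is sketched below (names and the sample data in the usage note are ours, not the study's values; per the abstract, lower TNFR2 expression predicts AKI, so values below the cut-off are flagged positive):

```python
def cutoff_performance(tnfr2_pct, had_aki, cutoff=91.6):
    """Sensitivity and specificity of predicting AKI when TNFR2
    expression on Tregs falls below the cut-off.

    tnfr2_pct: per-recipient TNFR2 expression on Tregs (%).
    had_aki: matching booleans, True if the recipient had AKI.
    """
    tp = sum(1 for x, aki in zip(tnfr2_pct, had_aki) if aki and x < cutoff)
    fn = sum(1 for x, aki in zip(tnfr2_pct, had_aki) if aki and x >= cutoff)
    tn = sum(1 for x, aki in zip(tnfr2_pct, had_aki) if not aki and x >= cutoff)
    fp = sum(1 for x, aki in zip(tnfr2_pct, had_aki) if not aki and x < cutoff)
    sensitivity = tp / (tp + fn)  # AKI cases correctly flagged
    specificity = tn / (tn + fp)  # IGF cases correctly cleared
    return sensitivity, specificity
```

Sweeping the cut-off over all observed values and plotting sensitivity against 1 - specificity yields the ROC curve from which the reported AUC is derived.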

Conclusion: TNFR2 expression on Tregs is a rapid surrogate marker of Treg suppressive function, and predicts AKI pre-transplant independent of or in combination with donor age and CIT. Its measurement could further guide organ allocation to prevent AKI.

Abstract#: 41
Jessica GY Luc, Joanne Y Zhao, Evangelos D Michelakis, Darren H Freed, Jayan Nagendran
University of Alberta, Edmonton, AB
Background: 2-Methoxyestradiol (2ME2) is an endogenous metabolite of estrogen found at lower levels in men than in women. It has been studied in cancer as an antimitotic agent that induces both apoptosis and senescence specifically in proliferative cancer cells, without toxicity to normal cells. As its effect in a transplant rejection setting remains unknown, we hypothesized that 2ME2 can inhibit activated T-cell function.

Methods: Human peripheral blood mononuclear cells (Cedarlane) were cultured and pre-treated with 2ME2 overnight (18 h) before activation with Cell Stimulation Cocktail (4 h) (eBioscience). The culture medium was collected for ELISA assays, and whole-cell lysates were collected for western immunoblotting. Five-day cultured cells were stained with CellTrace Violet proliferation dye for flow cytometry (FCM). Live-cell and fixed-cell imaging was performed using an LSM-510 confocal microscope (Carl Zeiss) with tetramethylrhodamine methyl ester (TMRM) stain (10 nM, Molecular Probes) and TUNEL assay. Annexin V/PI staining was analyzed by FCM.

Results: Levels of the T-cell activation markers TNF-α (p=0.0025) and IFN-γ (p=0.0216) were reduced in 2ME2-treated activated T-cells relative to controls. In addition, compared to controls, activated T-cell proliferation was significantly blunted upon treatment with 2ME2, with an observed 10% decrease in apoptosis, no change in necrotic events, and no decrease in mitochondrial membrane potential or caspase-9 activity. These results collectively suggest that 2ME2 acts independently of a mitochondria-mediated apoptotic mechanism.

Conclusions: Our study is the first to show that 2ME2 can decrease the immune response of activated T-cells in a rejection setting. The mechanism by which 2ME2 exerts its anti-rejection effects may be related to its ability to induce cell-cycle arrest through cellular senescence, a phenomenon that warrants further investigation. As 2ME2 has a low side-effect profile, it may be a candidate oral immunomodulatory adjunctive therapy for individuals undergoing solid organ transplantation.

Abstract#: 42
Squamous cell carcinoma following prolonged voriconazole treatment in a pediatric lung transplant patient
Jackson Wong 1, Paul Kuzel 2, John Mullen 3, Dale Lien 4, Muhammad Mahmood 5, Loretta Fiorillo 1
1 Department of Pediatrics, University of Alberta
2 Division of Dermatology, Department of Medicine, University of Alberta
3 Department of Surgery, University of Alberta
4 Department of Medicine, University of Alberta
5 Department of Laboratory Medicine and Pathology, University of Alberta
PURPOSE: Although skin cancer presents only rarely in children post-transplantation, treatment with voriconazole is thought to increase this risk. This study reports a rare case of squamous cell carcinoma that developed in a pediatric lung transplant (LTx) patient with a history of voriconazole exposure.

METHOD: We conducted a retrospective study of a 14 year old boy with cystic fibrosis treated with double LTx. The patient underwent LTx at age 10, after which he was maintained on tacrolimus, mycophenolate mofetil and oral prednisone. Pre-LTx, the unusual fungus Blastobotrys was isolated from his airway secretions, appearing again in the chest tube fluid post-LTx. This was treated successfully with IV and nebulized liposomal amphotericin B (L-AmpB), IV caspofungin and 23 months of oral voriconazole.

Voriconazole was stopped when the patient developed severe photosensitivity, despite the use of sunscreen. However, 3 months later the patient developed an Aspergillus airway infection, which again required treatment with IV voriconazole, along with IV caspofungin and nebulized L-AmpB. Prophylactic oral voriconazole was continued for 1 year, along with lifelong nebulized L-AmpB. During this time, the patient developed numerous large irregular lentigines on sun-exposed areas and signs of accelerated skin ageing. At 44 months post-LTx, he developed a verrucous lesion near his lower left eyelid.

RESULTS: Skin biopsy revealed well-differentiated squamous cell carcinoma, which was successfully treated with surgical excision.

CONCLUSION: Skin cancer is a rare occurrence in pediatric patients who have undergone solid organ transplantation. Severe photosensitivity due to voriconazole and insufficient sun protection predisposed our patient to develop squamous cell carcinoma. Excision of the lesion was curative. When voriconazole is used post-transplant, it should be discontinued once a patient's risk of fungal disease diminishes. Sunscreen alone is insufficient to protect against sunburn. Exposed skin areas should be well covered. Early reporting and regular examinations by a dermatologist are recommended.

Abstract#: 43
Stable function and phenotype of expanded thymic CD25+FOXP3+ regulatory T cells (Tregs) under inflammatory conditions
Esme Dijke 1, Alicia McMurchy 2, Tess Ellis 1, Karin Boer 3, Ingrid Larsen 1, Ivan Rebeyka 1, David Ross 1, Carla Baan 3, Megan Levings 2, Lori West 1
1 University of Alberta, Alberta Transplant Institute, Edmonton, AB
2 University of British Columbia, Vancouver, BC
3 Erasmus MC Medical Center, Rotterdam, Netherlands
Introduction: Treg-based cellular therapy to suppress graft-directed immunity could reduce the need for life-long immunosuppressive medication. Challenges include isolating pure Tregs and expanding them to clinically relevant numbers while maintaining stable function and phenotype, especially in inflammatory environments. Recently we demonstrated that abundant CD25+FOXP3+ cells can be isolated from discarded pediatric thymuses and expanded to highly suppressive Tregs. To define stability of these cells under inflammatory conditions, we investigated their function and phenotype after culturing under Th1-polarizing conditions.
Methodology: Thymuses (n=5) were obtained during pediatric cardiac surgery. After mechanical dissociation, CD25+ thymocytes (TC) were isolated by magnetic cell separation and expanded with α-CD3, IL-2, rapamycin and artificial APCs. After 7 days, rapamycin was removed; cells were further cultured without (CD25+ TCexp) or with IL-12 (CD25+ TCexp.IL12). CD25-depleted cells were controls. Phenotype was defined by flow cytometry. Stability of FOXP3 expression was assessed by analyzing the methylation status of the Treg Specific Demethylated Region (TSDR) within the FOXP3 gene. Expanded cells were co-cultured with α-CD3/CD28-stimulated PBMC to determine suppressive capacity, analyzing proliferation and IL-2 production by flow cytometry and ELISA, respectively.
Results: Isolated CD25+ TC were FOXP3+Helios+CTLA-4+PD-1dimTGF-β-. After culturing, CD25+ TCexp expanded 11- to 59-fold with high viability (79-97%) and maintained high FOXP3 expression. The TSDR was demethylated in 82-97% of the cells. IFN-γ- and IL-2-producing cells were infrequent (0.5-7% and 1-4%, respectively). CD25+ TCexp.IL12 showed a higher expansion capacity (26- to 107-fold; p=0.06); viability (84-96%), FOXP3 expression and the methylation status of the TSDR (89-100% demethylated) were comparable with CD25+ TCexp. In contrast to controls, addition of IL-12 did not increase the frequencies of IFN-γ- and IL-2-producing cells within CD25+ TCexp.IL12 (2-6% and 1-4%, respectively). TGF-β was upregulated on both CD25+ TCexp and CD25+ TCexp.IL12. Both CD25+ TCexp and CD25+ TCexp.IL12 potently suppressed proliferation and IL-2 production by PBMC, even at a 1:10 ratio of Tregs:PBMC.
Conclusion: Expanded CD25+FOXP3+ Tregs isolated from pediatric thymuses maintain stable phenotype and function under inflammatory conditions, including stable FOXP3 expression, absence of IFN-γ-producing cells and potent suppressive capacity. These results indicate that discarded thymuses are potentially an excellent source of Tregs for cellular therapy.

Abstract#: 44
Long term success in lung transplantation in a child with CF and persistent Blastobotrys lung infection
Jackson Wong 1, Jeff Fuller 1, Atiliano Lascon 1, Alf Conradi 1, John Mullen 1, Dale Lien 1, Atul Humar 2
1 University of Alberta
2 Multi-organ Transplant Program, University Health Network, Toronto
Introduction: Fungal infections cause significant morbidity in patients with cystic fibrosis (CF) pre and post lung transplant (LTx).

Methods: A retrospective study of a child with CF and Blastobotrys species (Btbs), an emerging fungal pathogen, pre- and post-LTx.

Results: A 9 year old boy with CF and severe bilateral bronchiectasis inhaled soil particles while playing in a pit. He lost 450 cc of FEV1 (31% predicted). His sputum persistently grew Pseudomonas and 2 strains of Btbs, confirmed by bronchoalveolar lavage culture. He developed severe mucus plugging, permanent bilateral lower lobe collapse and oxygen dependency requiring nocturnal BiPAP, nebulized Pulmozyme and hypertonic saline. He had severe sinus disease on CT. Based on susceptibility results, he was commenced on antifungal therapy: IV caspofungin, nebulized liposomal amphotericin (L-AmB) and oral voriconazole. Sinus aspiration pre-LTx isolated no Btbs. He received a double LTx 14 months later. The explanted lung showed Btbs on microscopy and culture but no angioinvasive disease. The airways were filled with neutrophils. Post-LTx, he received tacrolimus, CellCept and prednisone, but no induction immunosuppression. He received 2 weeks of IV antibiotics and 3 weeks of IV L-AmB. Fluid from the right chest tube grew Btbs. He was extubated on day 3, chest tubes were removed on day 9, and he was discharged from the PICU on day 10 and home on day 36.
Post-LTx, the patient received 3 months of nebulized L-AmB, 9 weeks of IV caspofungin and 22 months of oral voriconazole. He had one grade 3 rejection episode, successfully treated with IV pulse steroids. Chest HRCT found shadows in the peripheral right lower lobe and pleura that improved over 12 months. BAL (9) and transbronchial biopsies (7) found no evidence of fungus post-LTx. Sinus CT findings were unchanged. He was well, with an FEV1 of 75% and no evidence of Btbs on BAL and lung biopsies 2 years post-LTx. At 28 months post-LTx he was successfully treated for an episode of Aspergillus airway sepsis and was put on life-long nebulized L-AmB. His FEV1 was 65% at 57 months post-LTx.

Conclusions: With aggressive combination antifungal therapy, Btbs was treated successfully with long term survival post-LTx.

Abstract#: 45
Influence of concomitant medications on tacrolimus levels after pediatric solid organ transplantation
Steven Habbous , Mina Safi , Alan Fung , Seema Mital
Division of Cardiology, The Hospital for Sick Children, Toronto ON
Background: Tacrolimus is a widely used immunosuppressant following solid organ transplantation. It has a narrow therapeutic window, with challenges attaining target levels despite therapeutic drug monitoring. We evaluated whether use of concomitant medications (CM) during the first 48 hours of tacrolimus initiation impacts tacrolimus drug levels.
Methods: Pediatric solid organ transplant recipients enrolled in our transplant Biobank from 2010-2013 were included. Medical records were reviewed for clinical, demographic, and medical history. Patients undergoing re-transplants were ineligible.
Results: Thirty-seven patients were included (21 heart, 15 liver, 1 kidney). During the first 48h after tacrolimus initiation (median starting dose, 0.09mg/kg), methylprednisolone (n=5) and amlodipine (n=5) were the most frequently used CM, followed by lansoprazole (n=3), fluconazole (n=2), nifedipine (n=2), and amiodarone (n=1). The median circulating tacrolimus level at 36-48 hours post drug initiation was 7.9 ng/mL (range 0.9–29.2 ng/mL). Among patients receiving no CM known to interact with tacrolimus, 15/28 (54%) experienced out-of-range levels during the first 48h after tacrolimus initiation. In contrast, among patients receiving at least one CM, 8/9 (89%) experienced out-of-range levels (p=0.06). The median tacrolimus dose at 48 hours was 0.21 mg/kg for patients with ≤3 CM vs 0.35 mg/kg for >3 CM (p=0.14). Results from the larger biobank cohort are being analyzed.
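For a 2×2 comparison like the out-of-range proportions above (15/28 without interacting CM vs 8/9 with at least one), Fisher's exact test is a standard choice given the small counts. The abstract does not state which test produced p=0.06, so the following is only an illustrative standard-library sketch, not the authors' analysis:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(x):  # P(first cell = x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical layout for the comparison above:
# rows = (no interacting CM, >=1 interacting CM), cols = (out-of-range, in-range)
p = fisher_exact_p(15, 13, 8, 1)
```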
Conclusion: Medications that interact with tacrolimus may influence the tacrolimus drug levels attained after drug initiation. The presence of such CM should be considered in the choice of starting tacrolimus dose.

Abstract#: 46
IL-37 controls IL-18 induced pro-inflammatory cytokine expression by tubular cells and attenuates renal ischemia-reperfusion injury
Yunbo Yang 1 , Zhu-Xu Zhang 2 , Anthony Jevnikar 3
1 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre, Western University, London
2 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre; Dept. of Pathology, Western University
3 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre; Department of Medicine, Western University
IL-37 is a newly described member of the IL-1 family, which is anti-inflammatory through its inhibition of pro-inflammatory cytokine production. While human IL-37 has no murine homologue, shared amino acid sequences with IL-18 allow binding to both human and mouse IL-18 receptors as well as IL-18 binding protein (IL-18BP). IL-37 binds to IL-18Rα with low affinity and can also form a trimeric complex with IL-18BP and IL-18Rβ to block IL-18 activity. IL-37 is expressed by mononuclear cells, dendritic cells and breast carcinoma cells, but to date no report has described IL-37 expression in renal tubular epithelial cells (TEC) or any capacity to attenuate kidney ischemia-reperfusion injury (IRI). We have found that mouse TEC have basal expression of IL-18Rα, IL-18Rβ and IL-18BP, and TNF-α- and hypoxia-inducible expression of IL-18. Exogenous IL-37 (300 ng/mL) down-regulated the IL-18-induced expression of TNF-α, IL-6 and IL-1β in murine NG TEC (p<0.05) and human PT2 TEC (p<0.01). Importantly, for the first time, we found that LPS, IFN-γ and IL-18 induced the expression of IL-37 in human PT2 TEC. Consistent with an inhibitory role, mRNA silencing of IL-37 in TEC resulted in augmented mRNA expression of TNF-α, IL-6 and IL-1β induced by IL-18. In contrast, enhanced expression of IL-37 in human PT2 TEC transfected with the pCMV6-XL5-IL37 plasmid decreased mRNA expression of pro-inflammatory cytokines (TNF-α, IL-6 and IL-1β) induced by IL-18 (p<0.05). We then tested whether IL-37 could reduce renal IRI in vivo using a uni-nephrectomy mouse model with renal artery clamping (45 min, 33°C). Mice were transfected with pCMV6-XL5-IL37 (tail vein, 24 hours prior). Augmented kidney expression of IL-37 in these mice resulted in lower serum creatinine levels at 48 hours compared to vector controls (73.8±52.5 µmol/L vs 169.2±56.1 µmol/L, n=8/grp, p=0.009).
Our results confirm that TEC express IL-18 and IL-18R, and for the first time report that TEC express the IL-18 contra-regulatory proteins IL-37 and IL-18BP. Collectively these data suggest IL-37 production by TEC may be a previously unrecognized intra-renal control mechanism to attenuate inflammation. Augmenting kidney IL-37 levels may represent a novel strategy that exploits an endogenous pathway to broadly prevent renal inflammatory injury and IRI during transplantation.

Abstract#: 47
Assessing School Readiness and Health–Related Quality of Life in Children Transplanted Under the Age of 2 Years at the Hospital for Sick Children: A Pilot Study
Elizabeth Cruchley 1 , Anna Gold 2 , Alaine Rogers 3 , Stephanie Rankin 1 , Maria De Angelis 1 , Binita Kamath 4 , Yaron Avitzur 4 , Vicky Lee Ng 4
1 Transplant and Regenerative Medicine Centre, Hospital for Sick Children, Toronto
2 Transplant and Regenerative Medicine Centre, Hospital for Sick Children, Toronto; Department of Psychology, Hospital for Sick Children, Toronto
3 Department of Rehabilitation, Hospital for Sick Children, Toronto
4 Transplant and Regenerative Medicine Centre, Hospital for Sick Children, Toronto; Department of Pediatrics, University of Toronto, Toronto
Background: Advances in pediatric liver transplantation (LTx) have led transplant teams to examine outcomes beyond survival, specifically neurodevelopmental outcomes and quality of life. This pilot study examined early neurocognitive abilities in a cohort of children who have had LTx, compared with controls (children with liver disease without LTx).
Methods: Participants were 3 - 6.99 years of age, >1 year post isolated LTx with age at LTx < 2 years. Control patients had chronic cholestatic liver disease (LD), still with native liver. All participants completed a range of neurocognitive and motor tasks, with parent and teacher questionnaires.
Results: A total of 18 participants, 13 LTx (mean age 4.69 years) and 5 LD (mean age 4.49 years), completed the testing battery. Indications: biliary atresia (17 participants) and alpha-1 antitrypsin deficiency (1 control). Overall intellectual abilities fell within the age-expected range for both groups, but LTx participants performed significantly poorer on two tasks assessing visual construction (p=.037) and processing speed (p=.067). LTx patients had a greater frequency of below-average scores on visual perception (29% vs .8% for LD) and motor coordination (24% vs .8% for LD). Parent and teacher reports indicated greater concerns among the LTx group for mood (e.g. low mood, anxiety), quality of life and adaptive behaviour skills.
Conclusions: Pre-school children with LTx before 2 years of age were within population norms across many domains, but demonstrated challenges in the areas of visual and motor functioning, with perceived problems of executive, emotional and adaptive functioning. These potential challenges may compromise early academic and functional skills acquisition. Thus, routine early neurodevelopmental assessment is vital among preschool-aged children post LTx to establish appropriate early educational and rehabilitation supports.

Abstract#: 48
The Comparative Effectiveness of Double vs. Single Deceased Donor Kidney Transplantation in the Modern Era
Shafi Malik , S. Joseph Kim
Division of Nephrology and the Kidney Transplant Program, Toronto General Hospital, University Health Network, University of Toronto
BACKGROUND: The use of double (DKT) vs. single (SKT) kidney transplantation has not been well studied in the modern era.

METHODS: This is a cohort study of all incident adult deceased donor kidney transplant recipients from 1 Jan 2000 to 31 Dec 2010 (followed until 31 Dec 2011) using the Scientific Registry of Transplant Recipients. The relation between DKT status and delayed graft function (DGF) was quantified using logistic regression models. Cox proportional hazards models were fitted to determine the association of DKT status with total graft failure, death-censored graft failure, death with graft function, and total mortality. To compare outcomes of DKT and SKT recipients with kidneys of similar quality, we used propensity score (PS) matching to balance all measured confounders.
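The propensity-score balancing step can be illustrated with a minimal greedy 1:1 nearest-neighbour matcher with a caliper. This is only a sketch of the general technique; the registry analysis may use a different matching algorithm and caliper, and all patient identifiers and scores below are hypothetical:

```python
def greedy_ps_match(treated, control, caliper=0.01):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.

    `treated` and `control` map a hypothetical patient id to its
    estimated propensity score. Each treated patient is matched to the
    closest still-available control within the caliper; matched
    controls are removed from the pool (matching without replacement).
    """
    available = dict(control)
    pairs = []
    for tid, score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        cid, cscore = min(available.items(), key=lambda kv: abs(kv[1] - score))
        if abs(cscore - score) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs
```

Matching each DKT recipient to one SKT recipient this way yields the 1,163 + 1,163 subcohort structure described in the Results.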

RESULTS: The total study cohort and PS subcohort comprised 75,081 (SKT 73,911, DKT 1,170) and 2,326 (SKT 1,163, DKT 1,163) patients, respectively. In the total study cohort, DKT recipients were older, more likely diabetic, and had longer cold ischemia times. DKT donors were also older, had lower pre-terminal kidney function, and were more likely expanded criteria donors. DKT (vs. SKT) was associated with a decreased adjusted relative odds of DGF in the total study cohort (OR = 0.69 [95% CI: 0.60, 0.80]). DKT showed a protective effect on graft failure endpoints but no significant effect on mortality endpoints (see Table). The protective effect of DKT was significantly diminished in patients with DGF. Similar findings were observed in the PS subcohort.

CONCLUSIONS: DKT is associated with reduced rates of DGF and graft failure vs. SKT of similar quality. The occurrence of DGF may modify the protective effect of DKT on graft outcomes. The potential impact on life expectancy from transplanting two SKTs vs. one DKT using kidneys with qualities similar to those in this cohort requires further study.

Abstract#: 49
Patient survival after heart transplantation: the effect of older recipient age
Michel Carrier , Anique Ducharme , Guy Pelletier , Michel White , Louis Perrault
Montreal Heart Institute
Introduction: There is no strict upper age limit for candidates for heart transplantation; therefore, all programs have accepted and proceeded to heart transplantation with patients in their sixties. The objective of the present study is to review survival of older recipients (60 years and older) and to assess the possibility of offering permanent LVAD implantation as an alternative to heart transplantation in this age group of patients.
Methods: We reviewed the experience of the Montreal Heart Institute with 356 patients who underwent heart transplantation between 1983 and 2010. Patient survival and characteristics were evaluated according to age at transplantation.
Results: There were 33 patients aged 60 years and older and 290 younger patients who underwent heart transplantation during the study period. One- and 5-year patient survival averaged 85%±6 and 72%±8 in patients 60 years and older, and 93%±1 and 88%±2 in younger patients (p=0.005). The INTERMACS registry for mechanical assisted circulatory support reported survival averaging 78% and 43% of patients one and five years after LVAD (left ventricular assist device) implantation (n=8,609).
Conclusion: Although heart transplantation remains the treatment of choice for patients with end-stage heart disease, permanent LVAD implantation is becoming a serious alternative to transplantation, especially in older recipients.

Abstract#: 50
Assessment of Upper Limb Function in Lung Transplant Candidates
Polyana Mendes 1 , Lianne Singer 2 , Lisa Wickerson 3 , Denise Helm 2 , Tania Janaudis-Ferreira 4 , Dina Brooks 5 , Sunita Mathur 5
1 Graduate Department of Rehabilitation Science, University of Toronto
2 Toronto Lung Transplant Program, University Health Network
3 Department of Physical Therapy, University of Toronto and Toronto Lung Transplant Program, University Health Network
4 St John’s Rehab Hospital, Department of Clinical Research
5 Graduate Department of Rehabilitation Science and Department of Physical Therapy, University of Toronto

Introduction: Upper limb muscle size, strength and exercise capacity have not been objectively assessed in lung transplant (LTx) candidates. The objectives of this study were to (1) compare upper limb muscle size, arm exercise capacity and muscle strength in a cohort of LTx candidates to healthy control subjects; and (2) examine predictors of arm exercise capacity in LTx candidates.
Methods: Thirty-four LTx candidates enrolled in a pulmonary rehabilitation program (60 ± 8 years; 59% males) with the following diagnoses (IPF=26, COPD=6, Bronchiectasis=2, Bronchiolitis Obliterans=1, Bronchoalveolar Carcinoma=1) and 12 healthy control subjects (56 ± 9.5 years; 50% males) were included in the study. All subjects underwent measures of biceps muscle thickness and muscle strength of the elbow flexors using ultrasound and a hand held dynamometer (HHD), respectively. Subjects also performed the 6-minute walk test (6MWT) and the unsupported upper limb exercise test (UULEX).
Results: Biceps muscle thickness was 10% smaller in LTx candidates than in healthy controls but did not reach statistical significance (2.54±0.39 vs. 2.80±0.33 cm²; p=0.06). Elbow flexion strength of LTx candidates was 47% lower than controls (177.4±74 vs. 260.7±79 Newtons; p=0.01). LTx candidates had a shorter time to fatigue in the UULEX compared to controls (553.6 vs. 702 sec; p<0.01). A moderate correlation was found between biceps muscle thickness and strength (r=0.60, p<0.01). Muscle strength also correlated with the UULEX (r=0.58, p<0.01) and percent-predicted 6MWT (r=0.46; p<0.01) in LTx candidates. Muscle strength was the only significant predictor of the UULEX (b=0.59, p<0.01).
Conclusion: Although the LTx candidates were actively participating in rehabilitation, upper limb muscle strength and exercise capacity were impaired; specific training strategies may therefore be required pre- and post-transplant to target improvements in upper limb function.

Abstract#: 51
Eculizumab Decreases Early Antibody-mediated Rejection in Sensitized Deceased Donor Kidney Transplant Recipients
D Glotz 1 , C Legendre 2 , M Manooke 3 , L Rostaing 4 , G Russ 5 , D Roelen 6 , K Nelson 7 , W Marks 8
1 Saint-Louis Hospital and INSERM U 940, Paris, France
2 Université Paris Descartes and Hôpital Necker, Paris, France
3 Guy’s Hospital, London, UK
4 Toulouse University Hospital, Université Paul Sabatier and Institut National de la Santé et de la Recherche Médicale, Toulouse, France
5 The Queen Elizabeth Hospital, Woodville, Australia
6 Leiden University Medical Centre, Leiden, The Netherlands
7 Puget Sound Blood Center, Seattle, WA, USA
8 Alexion Pharmaceuticals, Inc., Cheshire, CT, USA
Background: Antibody-mediated rejection (AMR) develops in ~30% of sensitized transplant patients; the main cause is complement activation by donor-specific antibodies (DSAs). We evaluated the safety and efficacy of eculizumab, a monoclonal antibody that binds C5 and blocks terminal complement activation, for preventing acute AMR in sensitized recipients of deceased donor kidney transplants.
Methods: In a multicenter, open-label, phase 2, single-arm study of adult (≥18 years) candidates for deceased donor kidney transplantation with donor sensitization (N=47), select enrollment criteria were: stage V chronic kidney disease; prior exposure to human leukocyte antigens; presence of anti-donor antibodies by Luminex® and/or B- or T-cell positive cytometric flow crossmatch with negative CDC at transplantation; no ABO incompatibility, previous eculizumab treatment, or history of severe cardiac disease, splenectomy, bleeding disorder, or active infection. Patients received eculizumab (1200 mg on day 0 before reperfusion; 900 mg on days 1, 7, 14, 21, 28; 1200 mg at weeks 5, 7, 9) and induction/maintenance immunosuppressants. The primary endpoint was the treatment failure rate (biopsy-proven AMR, graft loss, patient death, and/or loss to follow-up) at week 9 post-transplantation. Secondary endpoints included mean serum creatinine levels and treatment-emergent adverse events (TEAEs). Preliminary results are reported.
Results: Patients (mean age, 49.8 years) had a mean (SD) of 2.3 (1.6) DSAs and a mean (SD) total DSA median fluorescence intensity of 5,873 (4,675). Five patients (10.6%) met criteria for week 9 post-transplantation treatment failure (biopsy-proven AMR, n=3; graft loss and/or death, n=2). Mean serum creatinine level at month 3 was 1.74 mg/dL.
Conclusion: Eculizumab appeared to be a safe and effective prophylaxis against early acute AMR in sensitized recipients of deceased donor kidney transplants.

Abstract#: 52
Yupin Tong 1 , Xiaoli Pang 2 , Jutta Preiksaitis 1
1 Division of Infectious Diseases, Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
2 Department of Laboratory Medicine and Pathology, University of Alberta, Edmonton, Alberta, Canada Provincial Laboratory for Public Health (ProvLab), Edmonton, Alberta, Canada
Background: Although cytomegalovirus (CMV) DNA testing in plasma is used extensively for CMV disease prevention, diagnosis and treatment monitoring in solid organ transplant (SOT) recipients, the exact biologic form of CMV DNA in plasma is unknown.

Methods: We developed an assay combining DNase I digestion and quantitative real-time PCR (CMV-QPCR) to differentiate CMV free DNA from encapsidated virions in plasma samples. The Qiagen DNA mini kit was used for viral DNA extraction and the CMV-QPCR assay for quantitation. The amount of CMV free DNA was determined by subtracting virion DNA (after digestion) from total DNA (without digestion); DNase I degrades all free viral DNA but not encapsidated CMV virions. 104 serial CMV DNA-positive stored plasma samples collected from 20 SOT patients with different pre-transplant CMV risk serostatus (10 donor (D)+/recipient (R)-, 7 D+/R+, 2 D-/R+, and 1 D-/R-) were tested.
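The subtraction at the core of the assay can be written out explicitly. The function and variable names below are illustrative only, not part of the assay protocol:

```python
def free_cmv_dna(total_copies, virion_copies):
    """Non-encapsidated (free) CMV DNA, per the subtraction described
    above: DNase I degrades free DNA but spares encapsidated virions,
    so free = total (undigested sample) - virion (digested sample).
    Units are whatever the CMV-QPCR assay reports (e.g. copies/mL).
    """
    return max(total_copies - virion_copies, 0)

def fraction_free(total_copies, virion_copies):
    """Fraction of all detectable CMV DNA that is free DNA."""
    if total_copies == 0:
        return 0.0
    return free_cmv_dna(total_copies, virion_copies) / total_copies
```

A sample with 10,000 copies/mL before digestion and 50 copies/mL after digestion would thus be 99.5% free DNA, in line with the 98.82%–100% range reported in the Results.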

Results: The optimized assay achieved 99.99%-100% efficiency in degrading spiked CMV DNA in plasma samples diluted 1:4. Non-encapsidated free DNA was the primary biologic form of CMV in plasma, representing 98.82%-100% of all detectable CMV DNA in all samples. CMV virions were not detected in 88 samples (84.6%) but were found at extremely low levels (≤0.5% virions) in 16 samples (15.4%).

Conclusion: Delayed separation of plasma from whole blood, resulting in contamination by virions from lysed cells, is a possible explanation for virion presence in some plasma samples. Although measurement of CMV DNA in plasma is a clinically useful surrogate of replicating infectious virus, CMV DNA in plasma is almost exclusively non-infectious free DNA. This has implications for the design of assays for measuring CMV DNA in this compartment and for the interpretation of dynamic patterns of CMV DNA in individual patients.

Abstract#: 53
BKV recruits lymphocytes for viral propagation
Deyaa El Deen Morsy , Lee Anne Tibbles
University of Calgary
BKV nephropathy (BKN) is currently the leading cause of early renal allograft loss. The process starts after immune suppression following renal transplantation, which causes BK virus (BKV) reactivation, resulting in lytic injury of renal tubular epithelial cells followed by tubular inflammation and allograft destruction. Although the role played by lymphocytes in BKN pathogenesis remains largely uncharacterized, both cellular and humoral immunity have been demonstrated to be involved in early BKV elimination. Our laboratory is interested in characterizing the role played by lymphocytes in BKV pathogenesis. Using real-time polymerase chain reaction (rt-PCR) and immunoblotting, we demonstrated that BKV can infect human B and T cell lines. BKV was also shown to infect and replicate in primary lymphocytes from peripheral blood at higher efficiency than in cell lines. Viral replication was associated with remarkable changes in cell shape and behaviour, with cells acquiring a refractile, fusiform appearance and becoming more adhesive. B lymphocytes infected with BKV differentiated into both memory and plasma cells, but a significant proportion of these differentiated cells were not specific for BKV proteins. These observations may provide a clue to the accumulation of plasma cells in renal transplants affected by BK nephropathy. The potential transmission of BKV virions between immune lymphocytes was confirmed by rt-PCR of CFSE-labelled BKV-infected and non-infected purified lymphocytes that were previously incubated together and sorted using flow cytometry. To examine the mechanism of viral transmission, we used fluorescence microscopy to image lymphocytes from PBMCs infected with BKV in vitro. We found that BKV induced lymphocytes to form long, thin tubules (tunneling nanotubes, TNTs) that transmitted both BKV virions and large T-antigen between neighbouring cells. Viral transmission through TNT structures might represent a viral escape mechanism that evades host humoral immunity.
Other mechanisms of viral transmission are still under investigation. This project is anticipated to establish a baseline for understanding host-viral interactions in patients with BKN, which could also help in designing tools for assessing and following the progression of BKN in allograft recipients.

Abstract#: 54
Assessing the impact of early out-of-range tacrolimus levels on organ rejection after heart transplant
Mina Safi , Steven Habbous , Alan Fung , Seema Mital
Hospital for Sick Children
Background: Tacrolimus (FK) is used as an immunosuppressant after solid organ transplantation. Routine induction therapy often precludes the need for aggressive uptitration of FK in the early post-transplant period. In pediatric heart transplant recipients who routinely receive peri-operative induction therapy, we assessed the association between out-of-range FK levels in the first two weeks after transplant and frequency of biopsy-proven rejection during one-year follow-up.

Methods: We studied pediatric heart transplant recipients enrolled in the Transplant Centre Biobank Registry between 2010-2013. All patients routinely receive anti-thymocyte globulin post-transplant. FK trough levels during first two weeks post-transplant were captured. Out-of-range levels were defined as levels necessitating FK dose change. Time to attain stable therapeutic levels (i.e. ≥2 consecutive FK levels in therapeutic range without dose change) was calculated.
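The stability criterion above (≥2 consecutive trough levels in therapeutic range without a dose change) is algorithmic enough to sketch directly. The tuple layout and ng/mL units below are illustrative assumptions, not the registry's data model:

```python
def day_of_stable_levels(troughs, lo, hi):
    """Day on which the stability criterion is first met: >=2
    consecutive trough levels within [lo, hi] with no dose change.

    `troughs` is a chronological list of (day, level, dose_changed)
    tuples, where level is the FK trough (e.g. ng/mL) and
    dose_changed flags a dose adjustment at that measurement.
    Returns None if stable levels are never attained in the window.
    """
    run = 0
    for day, level, dose_changed in troughs:
        if dose_changed or not (lo <= level <= hi):
            run = 0  # a dose change or out-of-range level resets the run
        else:
            run += 1
            if run >= 2:
                return day
    return None
```

For example, a patient with an out-of-range day-1 level requiring a dose change, then in-range levels on days 2 and 3 with no further change, attains stable levels on day 3.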

Results: Of 30 patients who met inclusion criteria, endomyocardial biopsies were performed in 24 (80%). Biopsies were performed at 1-month (67%, 16/24), 3-month (67%, 16/24), 6-month (54%, 13/24), and 12-month (47%, 14) intervals post-transplant. 21% of patients had no rejection (0R), 54% mild rejection (1R) and 25% moderate rejection (2R). The median number of dose changes in the first two weeks after transplant was 3 (range 1–7). The median frequency of dose changes was higher in rejectors versus non-rejectors (3 vs. 2, p=0.03). Patients with >3 dose changes had a higher risk of rejection (100% vs. 67%, p=0.05). Time to achieve stable therapeutic levels was longer in patients with >3 dose changes versus ≤3 dose changes (median 7 vs. 3 days, p=0.01). Time to achieve steady state was not different in rejectors vs. non-rejectors (median 5 vs. 3 days, p=0.39).

Conclusions: Non-therapeutic FK levels in the first two weeks post-transplant are associated with higher risk of mild/moderate rejection during 1-year follow-up. These findings highlight the importance of achieving therapeutic FK levels early in the post-transplant period, even in patients receiving routine induction therapy.

Abstract#: 55
Immune Sensitization and Mortality on the Waiting List for Kidney Transplantation
Ruth Sapir-Pichhadze , Kathryn Tinckam , Alexander Logan , Andreas Laupacis , S. Joseph Kim
Divisions of Nephrology and General Internal Medicine, Department of Medicine, University of Toronto
Background: Cardiovascular mortality is the leading cause of death in end-stage renal disease (ESRD) patients. Inflammation has been shown to play a role in cardiovascular disease. Anti-HLA antibodies, measured as panel reactive antibodies (PRA), may also be a marker of inflammation. PRA is routinely monitored in ESRD patients on the waiting list to facilitate transplantation with a compatible donor. Whether PRA is a risk factor for cardiovascular mortality in ESRD is unknown.

Methods: Using the Scientific Registry of Transplant Recipients, we conducted a retrospective cohort study in first-time adult kidney transplant candidates. The relationship between PRA modeled as a time-dependent categorical variable (PRA 0%, PRA 1 to 19%, PRA 20 to 79%, and PRA 80 to 100%) and cardiovascular mortality was assessed in competing risk Cox proportional hazards models (i.e., Fine and Gray model). Transplantation and non-cardiovascular mortality were considered competing events. All-cause mortality was a secondary endpoint. The analysis was repeated in subcohorts of transplant candidates who were unsensitized and at low risk for cardiovascular mortality at baseline.
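To make the competing-risks idea concrete, a crude cumulative incidence can be computed directly when every subject's outcome is observed. This sketch deliberately ignores censoring, which the Fine and Gray subdistribution hazards model used in the abstract handles properly; outcome labels are hypothetical:

```python
def crude_cumulative_incidence(outcomes, cause, t):
    """Crude cumulative incidence of `cause` by time t when other
    outcome types act as competing events: the fraction of the cohort
    experiencing `cause` by t. Valid only with complete follow-up
    (no censoring before t); illustrative, not the Fine-Gray model.

    `outcomes` is a list of (time, type) pairs, e.g. type in
    {"cv_death", "other_death", "transplant", "censored"}.
    """
    n = len(outcomes)
    if n == 0:
        return 0.0
    return sum(1 for time, typ in outcomes if typ == cause and time <= t) / n
```

The key point, shared with the Fine and Gray approach, is that patients who are transplanted or die of non-cardiovascular causes remain in the denominator rather than being treated as censored.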

Results: In a sample of 34,700 kidney transplant candidates, competing risks Cox proportional hazards models showed an increase in the hazard ratios (HR [95% confidence interval]) for cardiovascular mortality (HR 1.05 [0.87, 1.28], 1.48 [1.20, 1.83], and 1.52 [1.13, 2.04]) and all-cause mortality (HR 1.02 [0.95, 1.10], 1.34 [1.23, 1.45], and 1.65 [1.48, 1.84]) across ascending PRA categories with PRA 0% as the referent, respectively. A similar relationship was noted in patients who were unsensitized and at low risk for cardiovascular mortality at baseline.

Conclusions: Our findings suggest that PRA may be an independent risk factor for mortality on the waiting list for kidney transplantation. Current organ allocation schemes may need to consider this additional risk of mortality in sensitized candidates awaiting kidney transplantation. Whether transplantation modifies the risk of cardiovascular mortality in sensitized patients requires further study.

Abstract#: 56
A MicroRNA of Cytomegalovirus (CMV) Regulates a Major Host Transcription Factor Interfering with Macrophage Polarization
Luiz Lisboa 1 , Deepali Kumar 2 , Xiaoli Pang 3 , D. Lorne Tyrrell 1 , Atul Humar 2
1 Department of Medicine, University of Alberta
2 Multi-Organ Transplant Program, University of Toronto
3 Department of Laboratory Medicine & Pathology, University of Alberta
The CMV genome encodes a number of microRNAs that are hypothesized to regulate host immune responses. Our previous investigations in organ transplant recipients revealed the expression of CMV microRNAs in the blood at the time of diagnosis of CMV disease. We showed that the viral microRNA hcmv-miR-UL22A independently predicted virologic recurrence after discontinuation of antiviral therapy. We hypothesized that this viral microRNA may regulate the host immune response. The objective of this work was to investigate the regulatory roles of hcmv-miR-UL22A over human host proteins.

We performed in vitro studies utilizing the CMV-susceptible cell lines MRC-5 (fibroblast) and THP-1 (monocyte) and transient transfection of synthetic microRNA mimics or inhibitors. Protein levels were assessed by iTRAQ LC-MS proteomics and immunoblotting. Computational analyses were performed for microRNA target prediction and transcription factor enrichment. Gene expression was assessed by RT-qPCR.

Transfection of hcmv-miR-UL22A led to a significant reduction in expression levels of multiple proteins in fibroblasts. Computational analysis was, in most cases, unable to predict mRNA sequences directly targeted by the viral microRNA. Based on this, targeting of a common transcriptional enhancer was suspected, and an enrichment analysis revealed C-MYC as a transcription factor shared by the affected genes (z=-2.519, p<0.001). Fibroblast expression of C-MYC protein isoforms was differentially affected by transfection of hcmv-miR-UL22A, and targeting of predicted C-MYC mRNA sequences is being confirmed by luciferase reporter assays. C-MYC is a key transcription factor for macrophage polarization, a process known to be hijacked by CMV infection, thereby facilitating viral dissemination. Importantly, hcmv-miR-UL22A limited the IL-4-induced M2 polarization of PMA-differentiated macrophages, with a ~2.5-fold reduction in IL-10 mRNA levels.

Our efforts to better understand the association of hcmv-miR-UL22A with virological recurrence in organ transplant recipients uncovered the regulation of a major human transcription factor, C-MYC, by a CMV microRNA. This finding has direct implications for the process of macrophage polarization, known to be affected by CMV. This work shows how CMV may subvert the immune response to facilitate viral dissemination and may contribute to a better understanding of CMV reactivation and recurrence post-transplant.

Abstract#: 57
Jennifer Harrison 1 , Holly Mansell 2 , Christian Coursol 3 , Marie-Josee Deschenes 4 , Jennifer Gibson 5 , Erica Greanya 6 , Lee Anne Tibbles 7 , Jenny Wichart 8 , Tom Blydt-Hansen 9
1 Toronto General Hospital, University Health Network, Toronto ON; Leslie Dan Faculty of Pharmacy, University of Toronto, Toronto ON
2 University of Saskatchewan, College of Pharmacy and Nutrition, Saskatoon SK; Saskatchewan Transplant Program, Saskatoon SK
3 Royal Victoria Hospital, Montreal QC
4 Ottawa Hospital, Ottawa ON
5 Health Sciences Centre, Winnipeg MB
6 Vancouver Island Health Authority, Victoria BC
7 Department of Physiology and Pharmacology, University of Calgary, Calgary AB
8 Alberta Health Services, Calgary AB
9 Winnipeg Children’s Hospital, Winnipeg MB; Pediatric Nephrology, University of Manitoba, Faculty of Medicine, Pediatrics and Child Health, Winnipeg MB
Background: Adverse symptoms of immunosuppressants (ASI) are relevant for management of transplant recipients due to potential impact on quality of life and medication adherence. ASI have received relatively little attention in the literature compared to other immunosuppressant toxicities.

Methods: A review of pivotal trials for common immunosuppressants (tacrolimus, cyclosporine, mycophenolate compounds, sirolimus) was performed to identify common ASI (≥15% reported incidence). Perceptions of health care providers on ASI (including prednisone and azathioprine) in post-transplant patients were evaluated via survey. Questions addressed ASI frequency (proportion of patients requiring active management), patient quality of life, etiology, and ease of management. The survey was distributed electronically to members of the Canadian Society of Transplantation.

Results: Forty-nine respondents representing all solid-organ transplant specialties (53% physicians, 29% pharmacists, 18% nurses) completed the survey evaluating 12 ASI (tremor, diarrhea, nausea, constipation, dyspepsia, insomnia, peripheral edema, dyspnea, arthralgia, acne, mouth sores, paresthesias). Diarrhea, dyspepsia and insomnia were perceived as frequent (requiring management in >20% of patients) by 23%, 16% and 10% of respondents, respectively. Mouth sores, arthralgia and constipation were perceived to require management infrequently (<2% of patients) by 56%, 56% and 54%, respectively. The ASI believed to most negatively impact quality of life were diarrhea, dyspnea and insomnia (cited as having a 'major impact' by 47%, 39% and 31% of respondents, respectively). All respondents perceived immunosuppressants as the main contributor to the etiology of tremor, diarrhea, acne and mouth sores. Non-immunosuppressive drugs were cited as most important in the etiology of peripheral edema and constipation, and primary disease-related factors in the etiology of dyspnea. Clinicians reported the greatest success in managing diarrhea, dyspepsia and nausea and the least success for arthralgia, paresthesias and dyspnea.

Conclusion: Clinician survey data show that ASI occur frequently, can be challenging to manage and are perceived to have an important impact on patient quality of life.

Abstract#: 58
An analysis of Natural Killer Cell subsets and their relationship to cytomegalovirus (CMV) immunity in Solid Organ Transplant patients
Shanil Keshwani 1 , Luiz Lisboa 2 , Nicolas Mueller 3 , Deepali Kumar 1 , Atul Humar 1
1 Multi-Organ Transplant Program, University of Toronto
2 Department of Medicine, University of Alberta
3 Department of Medicine, University Hospital Zurich, Switzerland.
Prediction of CMV reactivation after transplant by testing CMV-specific immune responses may be useful. In addition to CMV-specific CD4+ and CD8+ T-cell responses, NK cell responses may be important for control of CMV. Recently, a subset of NK cells expressing the NKG2C receptor has been implicated in a type of memory function specific to CMV control. We tested the ex-vivo cellular immune response to CMV using live virus stimulation.

In collaboration with the CNTRP and the Swiss Cohort Study, pre-transplant cryopreserved PBMCs from 275 transplant patients enrolled in 5 Swiss centers were analyzed. All patients were CMV-seropositive. PBMCs were exposed for 6 hours to CMV strain Towne at MOI 0.03, prior to live/dead and antibody staining for 10-color flow cytometry immunophenotyping of T-cell subsets and NK cell subsets including NKG2C+ cells. Unstimulated and PMA/Ionomycin-stimulated PBMCs served as controls. Only live cells were retained in the analysis.

Overall, live virus challenge of PBMCs resulted in significant interferon-γ production by CD4+ (p<0.001) and CD8+ T-cells (p=0.002) and by NK-cell subpopulations (p<0.05; Wilcoxon matched-pair for all comparisons). It significantly altered the relative proportions of CD4+ (reduced) and CD8+ (increased) T-cells, and decreased the frequency of T regulatory cells compared to non-stimulated controls. In the NK cell compartment, increased frequencies of both CD56dimCD16neg and CD56brightCD16dim cells were seen with live virus stimulation, with concurrent increases in expression of interferon-γ and of the potential memory marker NKG2C. The frequency of T-reg cells was significantly and directly correlated with the frequency of CD56dim and inversely correlated with CD56bright NK cell populations, in both unstimulated and stimulated cells. Furthermore, patients could be differentiated into responders and non-responders based on interferon-γ production by CD4+ and CD8+ T-cells (in 20% and 64% of patients, respectively) and on NKG2C expression.

Ex vivo live-virus CMV challenge allowed for meaningful redistribution and stimulation of T and NK cell subpopulations and indicated a crosstalk between T-reg and NK cells. Pre-transplant samples from CMV-seropositive transplant patients showed heterogeneity of cellular responses. These responses have the potential to determine the risk of CMV reactivation post-transplant. Clinical correlation is currently being analyzed.

Abstract#: 59
Christophe Legendre 1 , David Cohen 2 , Thorsten Feldkamp 3 , Denis Fouque 4 , Richard Furman 5 , Osama Gaber 6 , Larry Greenbaum 7 , Timothy Goodship 8 , Hermann Haller 9 , Maria Herthelius 10 , Maryvonne Hourmant 11 , Christoph Licht 12 , Bruno Moulin 13 , Neil Sheerin 8 , Antonella Trivelli 14 , Camille L. Bedrosian 15 , Chantal Loirat 16
1 Université Paris Descartes, Hôpital Necker, Paris, France
2 Columbia University Medical Center, New York, USA
3 University Hospital Schleswig Holstein Christian-Albrechts-University Kiel, Germany
4 Centre Hospitalier Lyon-Sud and Université de Lyon, Lyon, France
5 Weill Cornell Medical College, New York, NY, USA
6 Methodist Hospital, Houston, TX, USA
7 Emory University School of Medicine, Atlanta, GA, USA
8 Newcastle University, Newcastle upon Tyne, UK
9 Hannover Medical School, Hannover, Germany
10 Karolinska University Hospital, Stockholm, Sweden
11 CHU Hôtel Dieu-Nantes, Nantes, France
12 Hospital for Sick Children and University of Toronto, Toronto, Canada
13 Hôpitaux Universitaires de Strasbourg, Nouvel Hôpital Civil, Strasbourg, France
14 Istituto G. Gaslini, Genoa, Italy
15 Alexion Pharmaceuticals, Inc., Cheshire, CT, USA
16 Assistance Publique-Hôpitaux de Paris, Hôpital Robert Debré, Paris, France
BACKGROUND: Case reports suggest eculizumab (Ecu) inhibits complement-mediated thrombotic microangiopathy (TMA) and prevents graft loss in patients with TMA post-transplantation (Zuber, Am J Transplant 2012;12:3337).
METHODS: Ecu efficacy/safety were evaluated in 2 single-arm, 26-wk, phase 2 trials with long-term extensions in 37 aHUS patients aged ≥12y. Outcomes in non-transplant (NT, n=10+12) and prior transplant (T, n=7+8) patients with 2y of ongoing Ecu treatment were analyzed.
RESULTS: Ecu improved renal function in all groups. In patients with progressing TMA, improvement was greater in NT vs T (P=0.0165; Table). Baseline eGFR was not predictive of eGFR change in any group. Earlier treatment was associated with greater eGFR gain (P<0.05).
CONCLUSIONS: Ecu is effective in improving renal function in NT and T patients. Data support early initiation of Ecu to preserve renal function and suggest prophylactic Ecu treatment in aHUS patients undergoing transplantation may be of benefit.

Table. Baseline characteristics and 2-y efficacy outcomes in Ecu-treated patients with or without prior transplantation.

| | Progressing TMA^a (median Ecu duration 100 wk) | | Long disease duration^b (median Ecu duration 114 wk) | |
|---|---|---|---|---|
| | NT (n=10) | T (n=7) | NT (n=12) | T (n=8) |
| Baseline characteristics | | | | |
| Age, mean, y (range) | 28 (17–68) | 37 (21–47) | 27 (13–52) | 40 (25–63) |
| Female | 60% | 86% | 84% | 25% |
| No identified complement mutation or autoantibody | 20% | 28% | 17% | 63% |
| Pts with >1 transplant, n (%) | NA | 1 (14.3)^c | NA | 5 (62.5)^d |
| Chronic kidney disease stage (2, 3a, 3b, 4, 5), n | 0,1,3,3,3 | 0,0,1,2,4 | 0,2,3,3,4 | 2,0,3,3,0 |
| Platelet count, ×10^9/L, mean±SD | 102.0±33.5 | 119.1±29.4 | 225.0±74.7 | 232.4±86.9 |
| eGFR, mL/min/1.73 m^2, mean±SD | 27.0±14.8 | 17.1±12.9 | 25.9±16.6 | 38.2±21.0 |
| Time from transplantation to screening, mean±SD^e | NA | 18.5±28.6 | NA | 14.7±11.2 |
| Time from clinical manifestation of aHUS to screening, mean±SD^e | 0.67±0.47 | 1.71±1.4 | 14.4±14.3 | 12.9±12.0 |
| 2-y efficacy outcome | | | | |
| eGFR increase, mL/min/1.73 m^2, mean±SD | 48.3±38.4 | 14.8±18.7 | 7.3±7.5 | 3.9±23.8 |

eGFR, estimated glomerular filtration rate; MDRD formula used in adults >18y and Schwartz formula in children.
^a Platelet count <150×10^9/L after ≥4 plasma exchange/infusion sessions in the prior wk, with average platelet count decrease >25% prior to most recent TMA presentation.
^b Receiving chronic plasma exchange/infusion on a stable regimen with no platelet count decrease >25% during the 8-wk observation period before Ecu treatment.
^c 1 pt had 2 prior transplantations.
^d 3 pts had 2 prior transplantations.
^e Months.

Abstract#: 60
DN Treg cells specifically suppress cognate alloreactive memory T cells and prolong allograft survival by secreting Granzyme B.
Ye Su , Anthony Jevnikar , Xuyan Huang , Dameng Lian , Zhuxu Zhang
Mathew Mailing Centre for Translational Transplantation Studies, Lawson Health Research Institute, London Health Sciences Centre; Departments of Medicine, and Pathology, of Western University, London, Ontario, Canada.
[Background] Since the emphasis of clinical organ transplant survival moved from short term to long term, memory T (Tm) cells, which mediate several key mechanisms such as transplant vasculopathy and allograft nephropathy, have been recognized as an essential culprit of long-term organ rejection. Tm cells are inherently resistant to conventional therapies such as apoptosis induction and co-stimulation blockade, yet indispensable to the recipient’s protective immunity against environmental pathogens. In this study, we show that double-negative (DN) Treg cells might offer a solution to this dilemma.

[Methods and Results] DN Tregs specifically suppressed cognate B6-anti-BALB/c CD8+ Tm cells both in vitro (suppression rate: 53.9±1.2%) and in vivo, with CD8+ Tm cell-mediated graft rejection delayed from 21.5±1.7 days to 63.0±4.7 days in a BALB/c-to-B6 Rag1-/- skin transplantation model, but had no effect on anti-C3H Tm cells. Both the CD8+ Tem and Tcm compartments were significantly suppressed by DN Tregs. We further identified the suppression as Granzyme B (GzmB)- and perforin (PFN)-dependent. Intriguingly, CD4+ Tm cells resisted DN Tregs both in vitro and in vivo. For the first time, we found that CD4+ Tm cells highly express Serine Protease Inhibitor-6 (Spi6), an inhibitor of GzmB (5.03±1.15-fold of CD4+ Teff at the protein level). Spi6 deficiency sensitized CD4+ Tm cells to killing by DN Tregs both in vitro (suppression rate: from 10.5±5.8% to 37.5±6.2%) and in vivo (skin-allograft survival from 28.3±5.7 days to 75.5±7.9 days), and this restored sensitivity was again abrogated in vitro (suppression rate: 4.5±3.2%) and in vivo (skin-allograft survival 22.5±2.4 days) by PFN deficiency in DN Tregs.

[Conclusion] In summary, DN Tregs specifically suppress cognate alloreactive CD8+ Tm cells via the GzmB and PFN pathway, and effectively delay CD8+ Tm cell-mediated graft rejection without compromising Tm cell-dependent protective immunity. The fact that CD4+ Tm cells resist DN Tregs by highly expressing Spi6 and can be sensitized by Spi6 deficiency underlines the essential role of Spi6 in CD4+ Tm cell fate, and sheds light on the possibility of combining DN Treg and Spi6-targeting therapy to further improve long-term allograft survival.

Abstract#: 61
A prospective cohort conversion study of twice-daily to once-daily extended-release tacrolimus: Role of ethnicity
Jeffrey Zaltzman 1 , Lauren Glick 2 , Fernanda Shamy 2 , Ahmed Sokwala 3 , Tushar Malvade 3 , Ramesh Prasad 1
1 St. Michael's, University of Toronto
2 St. Michael's
3 University of Toronto
Tacrolimus is a widely used calcineurin inhibitor (CNI) in kidney transplantation available as twice-daily Prograf®(Tac-BID) and once-daily Advagraf®(Tac-OD). Although therapeutically equivalent, some patients require dose adjustments to achieve similar trough concentrations [C0] after conversion. Tacrolimus exposure is impacted by ethnicity in the de-novo setting but the role of ethnicity in determining dose requirements and adjustments in a conversion setting is unknown.
496 renal transplant recipients (RTRs) were prospectively converted from Tac-BID to Tac-OD, with dose adjustments targeted to achieve similar [C0] at 12 months post-conversion. Renal function, acute rejection and Tac dose adjustments by ethnicity were analyzed.
There were similar numbers of recipients from living and deceased donors. Mean transplant duration was 7 years; 60% of RTRs were Caucasian and 40% were identified as belonging to an ethnic minority cohort. There was no change in eGFR post-conversion to Tac-OD. At 12 months, 35/488 (7%) RTRs had a dose reduction, 101/488 (21%) required a dose increase, and 77 (15.7%) had at least a 30% increase in dose over baseline. Dose increases of >30% varied by ethnicity, from 8.0% in South Asians to 27.5% in East Asians (p=0.03), despite the latter cohort having similar baseline Tac-BID dosing (3.59 mg/d) compared to the entire cohort (3.53 mg/d).
Ethnicity may play an important role in dosing requirements in converting from Tac-BID to Tac-OD, unrelated to baseline dose. Further investigation is required to determine the reasons for ethnic variability when patients are converted between tacrolimus preparations.

Abstract#: 62
Comparison of our new protocol of 6 months of valganciclovir vs 3 months for CMV prevention: rate of late infection, effect on WBC, and need for dose reduction of mycophenolate
Marie-Josée Deschênes
The Ottawa Hospital

Cytomegalovirus (CMV) is the most frequent viral infection following renal transplantation. It is a major cause of morbidity and is a preventable cause of mortality in solid organ transplant (SOT) recipients. Adequate prophylaxis significantly prevents CMV disease in the first 3 months post transplant.

Because of concerns about late-onset CMV disease after 3 months of antiviral prophylaxis in CMV D+/R- patients, a trial was performed comparing 200 vs 100 days of valganciclovir prophylaxis (Humar et al., AJT 2010). Kidney transplant recipients had significantly less CMV disease with 200 days of prophylaxis.

Based on these results the AST guidelines changed their prophylaxis guidelines to 6 months duration.

At The Ottawa Hospital (TOH), our protocol was also changed in January 2012 to extend the duration to 6 months in accordance with these new guidelines. Our patients appeared to have more frequent leucopenia, requiring a reduction of their mycophenolate dose, with a similar number of CMV infections.

Study Objectives:
  • Compare the effect of 6 months of prophylaxis vs 3 months in our transplant patients:
    • Rate of late CMV infection
    • Effect on WBC and the need for dose reduction of mycophenolate

Study Method:
  • Retrospective chart review
  • Inclusion criteria: all patients on 6 months of valganciclovir transplanted in 2012 (n=38)
    • Control group: all patients with 3 months of valganciclovir prophylaxis transplanted in 2011 (n=21)
  • Exclusion criteria:
    • Patients transferred to other centres within 1 year
    • Graft loss
    • Patients deceased within 1 year

Results:
  • Effect on WBC
    • A significant decrease in WBC was found with 6 months of prophylaxis vs 3 months: 57.9% (22/38 patients) vs 38% (8/21 patients).
    • All these patients required mycophenolate dose reduction (average time: 3.1 months post-transplant).
  • Effect on late CMV
    • 18.4% (7/38 patients) developed late CMV after 6 months of prophylaxis (8-16 months post-transplant) vs 14.2% (3/21 patients) with 3 months.

Conclusions:
  • Late-onset CMV disease was not substantially reduced with 6 months of prophylaxis.
    • Symptomatic patients should be followed closely with CMV PCR once prophylaxis is finished.
  • Leucopenia was significantly more frequent with 6 months of prophylaxis.
  • The optimal duration of prophylaxis still needs to be fully evaluated.

Abstract#: 63
Determinants of Discard of Kidneys from Expanded Criteria Donors Undergoing Donation after Circulatory Death
Sunita Singh , S. Joseph Kim
Division of Nephrology and the Kidney Transplant Program, Toronto General Hospital, University Health Network, University of Toronto
Background: Given the ongoing shortage of donor kidneys, the use of combined expanded criteria donor (ECD) and donation after circulatory death (DCD) kidneys is potentially an important strategy to expand the deceased donor pool. Prior research from our group (Singh et al. Am J Transplant 2013;13:329-36.) showed that DCD kidneys have a slightly increased risk of graft failure vs. non-DCD kidneys, but the relative risk was not significantly greater amongst ECD vs. non-ECD kidneys. The Kidney Donor Risk Index (KDRI) in the ECD/DCD group was similar to those in the ECD/non-DCD group, suggesting careful selection of ECD/DCD kidneys and a potential pool of discarded kidneys that may be acceptable for transplantation.
Methods: A cross-sectional study of combined ECD/DCD kidneys recovered from adult donors between 1-Jan-2000 and 31-Dec-2011 was conducted using the Scientific Registry of Transplant Recipients. Donor kidney characteristics and KDRI scores of discarded and transplanted ECD/DCD kidneys were assessed using parametric and non-parametric methods as appropriate. Multivariable logistic regression models were used to determine the adjusted odds of discard based on donor factors.

Results: There were 894 combined ECD/DCD kidneys included in the study, of which 306 (34.2%) were discarded. The median KDRI in transplanted vs. discarded organs was 1.78 (IQR 0.69) vs. 1.98 (IQR 0.89), respectively (P < 0.001) although the distribution of KDRI scores between discarded and transplanted kidneys showed considerable overlap (Figure). In multivariable logistic regression models, the odds of discard were higher if ECD/DCD kidneys were recovered from diabetic donors (aOR 1.68 [95% CI: 1.16, 2.45], P= 0.007) and hepatitis B or C positive donors (aOR 2.73 [95% CI: 1.37, 5.43] and aOR 4.16 [95% CI: 1.56, 11.08], P = 0.004 for both). Pulsatile machine perfusion was associated with decreased odds of discard (aOR 0.32 [95% CI: 0.22, 0.44], P < 0.001). Other factors such as age, race, death by cerebrovascular accident, hypertension and weight were not associated with an increase in odds of discard.

Conclusion: This study demonstrates a high discard rate of ECD/DCD kidneys, some of which may be acceptable for transplantation given their favorable donor characteristics and KDRI scores.

Abstract#: 64
Clinical Prediction Models of Patient and Graft Survival in Kidney Transplant Recipients: A Systematic Review
Sunita Singh 1 , David Naimark 1 , J. Charles Victor 2 , S. Joseph Kim 1
1 Division of Nephrology, Department of Medicine, University of Toronto
2 Institute for Clinical Evaluative Sciences, University of Toronto
Background: Identification, at an early stage, of kidney transplant recipients at higher risk for mortality and graft failure remains a challenge for the clinician. Several clinical prediction models of patient and graft survival have been developed to help identify these high-risk individuals. Evaluation of the quality, validity, and performance of these models is essential prior to clinical use. The purpose of this study is to systematically review existing clinical prediction models of patient and graft survival in kidney transplant recipients.

Methods: Ovid Medline and EMBASE were searched from 1966 to 2013 for English language articles. Eligible studies included clinical prediction models of patient and graft survival (total and death-censored) in adult and/or pediatric kidney transplant recipients with at least 100 patients and a minimum of three predictors. Studies where clinical prediction was not the focus were excluded. Two independent reviewers extracted data on model performance measures, risk of bias and clinical usefulness.

Results: A total of eleven studies were identified, including three studies (15 models) of patient survival, nine studies (26 models) of graft survival in primarily deceased donor transplant recipients, and two studies (5 models) of graft survival in living donor recipients. Overall model discrimination was modest, with c-statistics of 0.60 to 0.75 for patient survival models, 0.61 to 0.90 for graft survival models in primarily deceased donor recipients (Table), and 0.71 to 0.88 for graft survival models in living donor recipients. Calibration was reported in eight of the eleven studies, and external validation of the models was done in four studies. One study met the criteria for clinical usefulness (utility and usability).

Conclusions: The majority of existing clinical prediction models of patient and graft survival have modest discriminatory ability. Reporting of other measures of model performance (i.e., calibration, model fit, and reclassification) is variable. External validation of models is inconsistent and lacking in a large contemporary cohort of Canadian kidney transplant recipients. Further study is needed to validate existing models and/or develop clinically useful prediction models of graft and patient survival in Canadian kidney transplant recipients.

Abstract#: 65
RIPK3 regulates microvascular endothelial cell death and cardiac allograft rejection
Alexander Pavlosky 1 , Anthony Jevnikar 2 , Zhuxu Zhang 3
1 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre; Department of Pathology; University of Western Ontario. London Ontario, Canada
2 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre; Department of Pathology; Department of Medicine, University of Western Ontario. London Ontario, Canada
3 Matthew Mailing Centre for Translational Transplantation Studies, London Health Sciences Centre1; Department of Pathology2; Department of Medicine, University of Western Ontario3. London Ontario, Canada
Despite recent advances in immunosuppression, over 50% of patients who undergo allogeneic cardiac transplantation experience graft loss within 11 years. Cell death in donor grafts results in tissue damage and ultimately graft rejection, and can occur as an active molecular process through apoptotic, autophagic, and the newly identified Receptor Interacting Protein kinase 1 and 3 (RIPK1/3)-mediated necroptotic pathways. These variations in cell death may be important for graft survival, as necroptosis can lead to the release of chemotactic and activating danger molecules that activate host immune cells. This pathway has yet to be studied in transplantation.

In this study, necroptosis was induced in murine cardiac microvascular endothelial cells (MVEC) under anti-apoptotic conditions following TNFα treatment. Necroptotic cell death and release of the danger molecule high mobility group box 1 (HMGB1) were inhibited by the RIPK1/3-inhibiting molecule necrostatin-1 and by genetic deletion of RIPK3. In addition, tissue necrosis, release of HMGB1, and graft cell infiltrate were attenuated in RIPK3 null heart allografts following transplantation. Finally, a brief sirolimus treatment markedly prolonged survival of RIPK3 null cardiac allografts in BALB/c recipients compared with wild-type C57BL/6 donor grafts. These data suggest that RIPK1/3 contributes to inflammatory injury in cardiac allografts through necroptotic death and the release of danger molecules. The ability of immunosuppression to provide rejection protection or permit tolerance is influenced by the level of cell death and inflammation. We therefore suggest that targeting RIPK-mediated necroptosis may be an important therapeutic strategy in organ transplantation.

Abstract#: 66
An Analysis of Major Surgical Bleeding in Kidney Transplantation: Incidence, Risk Factors, and Outcomes
Laureen Hachem 1 , Olusegun Famure 2 , Yanhong Li 1 , Anand Ghanekar 1 , Markus Selzner 1 , S. Joseph Kim 2
1 Multi-Organ Transplant Program, Toronto General Hospital, University Health Network, University of Toronto
2 Division of Nephrology and the Kidney Transplant Program, Toronto General Hospital, University Health Network, University of Toronto
Background: While surgical bleeding in kidney transplantation is of great concern due to potential adverse effects on the graft, this issue has not been thoroughly studied.

Methods: Patients who underwent a kidney transplant at a Canadian transplant centre from 1 Jan 2000 to 30 Sep 2012 (follow-up until 31 Dec 2012) were included in this study. Major surgical bleeding was defined as a drop in hemoglobin ≥ 20 g/L over a 24-hour period within 3 days of transplantation, followed by an ultrasound indicating a significant hematoma/collection (volume ≥ 33.2 cm3). Multivariable logistic regression analysis was used to assess risk factors for bleeding. To examine the effects of surgical bleeding on graft loss/death, multivariable Cox proportional hazards models were used.

Results: A total of 59 of 1,203 (4.9%) kidney transplant recipients who met the inclusion criteria had major surgical bleeding. The majority of cases (89.8%) occurred within 1 day post-transplantation. Living donor transplants (OR 0.30 [95% CI: 0.16, 0.55], P < 0.001) and higher recipient BMI (OR 0.54 per 10 kg/m2 increase in BMI [95% CI: 0.29, 0.99], P = 0.05) were both associated with a significantly lower risk of bleeding. Pre-operative anti-coagulant or anti-platelet use was associated with an increased risk of bleeding, but the association was not statistically significant (OR 1.82 [95% CI: 0.56, 5.90], P = 0.32). Surgical bleeding was associated with a higher risk of graft loss and/or death (HR 1.61 [95% CI: 1.00, 2.59], P = 0.05).

Conclusion: While the incidence of surgical bleeding in kidney transplantation is relatively low, it is associated with a potentially increased risk of graft loss/death. These findings can aid in identifying high-risk patients for major surgical bleeding in order to optimize their care during the peri-operative period.

Abstract#: 67
Treg activity in peripheral blood lymphocytes assayed by DNA methylation and FOXP3 expression in the first 6 months post renal transplantation.
Mark Lipman 1 , Michel Marcil 1 , Minh-Tri Nguyen 2 , Yan Zhang 3 , Jean Tchervenkov 2 , Steven Paraskevas 2
1 Lady Davis Institute for Medical Research, Jewish General Hospital, McGill University.
2 Division of Transplantation, Department of Surgery, McGill University Health Centre, McGill University.
3 Lady Davis Institute for Medical Research, Jewish General Hospital, McGill University
Regulatory T cells (Treg) are hypothesized to promote allograft acceptance. FOXP3 is a key transcription factor for Treg development and function, but in clinical transplantation FOXP3 can be expressed by effector cells as well as Treg. A more accurate reflection of Treg activity in humans may be the degree of epigenetic DNA methylation of the TSDR (Treg-specific demethylated region) locus of the FOXP3 gene.

Here, we describe both the methylation profile of the TSDR and FOXP3 expression in peripheral blood lymphocytes (PBL) of 45 renal transplant recipients treated with three different induction therapies: anti-thymocyte globulin (n=20) or alemtuzumab (n=22), both lymphocyte-depleting agents (LDA), and basiliximab (BSLX), a non-depleting agent (n=3). PBL were procured on Day 0 pre-transplantation and then on Days 1, 7, 14, 30, and 180 post-transplantation.

Methylation was examined at 10 CpG sites within the chrX:49,117,020-49,117,375 fragment encompassing the TSDR locus. Methylation was quantified by the EpiTYPER method (Sequenom®) and MALDI-TOF mass spectrometry. FOXP3 transcript levels were measured by qPCR.

A definite pattern emerged in patients treated with LDA, wherein methylation increased significantly at 6/10 CpG sites (p<0.001, Kruskal-Wallis) in the early post-transplant period (up to Day 30) and then decreased towards, but remained significantly above, baseline levels at three CpG sites. This pattern was inversely correlated with FOXP3 transcripts, which plummeted to 10% of baseline expression on Day 1, with subsequent recovery to 50% at Day 30 and 70% at Day 180.

Conversely, in the limited number of BSLX-treated patients analyzed to date, we observed a decrease in methylation through Day 14 (longer term methylation analysis is pending) and a comparatively modest 40% decrease in FOXP3 transcripts from baseline on Day 1 and 7, with a complete recovery to baseline by Day 14.

These data suggest that the profound decrease in FOXP3 transcripts in the early post-transplant period in LDA-treated patients is not simply a function of lymphopenia induced by LDA, but reflects an intrinsic decrease in Treg activity as evidenced by increased methylation. Moreover, they suggest that Treg activity may play a more prominent role in the immunosuppressive action of BSLX compared to LDA.

Abstract#: 68
BK virus infection enhances TGF-β signaling
Ryan H. Cunnington 1 , Wayne R. Giles 2 , Lee Anne Tibbles 1
1 University of Calgary, Department of Medicine
2 University of Calgary, Department of Kinesiology
Background: BK polyoma virus is nearly ubiquitous in the general population and is not associated with disease in healthy individuals. However, with standard immunosuppression therapy in renal transplant patients, BK virus reactivation may become a significant complication leading to kidney fibrosis and failure. The current treatment regimen for patients with BK nephropathy includes a reduction in immunosuppressive therapy; however, no effective intervention exists that targets virus-induced fibrosis. The mechanisms underlying BK-induced renal fibrosis are currently not well understood. Transforming growth factor β (TGF-β) induction of fibrosis has been well studied in many disease models but is understudied in BK virus nephropathy. TGF-β signaling involves the binding of extracellular TGF-β to type II receptors, which then triggers an intracellular signaling cascade through Smad-dependent and Smad-independent pathways leading to collagen synthesis and secretion. TGF-β signaling is inhibited at the receptor and nuclear levels by Smad7 and Ski, respectively. We hypothesize that BK virus modulates protein expression of the TGF-β pathway, thereby promoting collagen synthesis and fibrosis leading to renal graft failure.

Methods: Cultured human renal tubular epithelial cells (HPTCs) were infected with BK polyoma virus for varying time points and total protein was collected and separated using SDS-PAGE for analysis by Western blot.

Results: BK virus infection increased expression of the TGF-β type II receptor, the main receptor for extracellular TGF-β ligand, and of the TGF-β signal transducer phospho-Smad3. The same conditions caused a reduction in the endogenous TGF-β inhibitor, Ski, in BK virus-infected cells. These data suggest that BK virus promotes fibrosis by increasing the cell’s ability to respond to TGF-β binding and by inhibiting negative feedback mechanisms that regulate this pathway.

Conclusions: These data provide evidence that the BK virus “primes” the TGF-β pathway for enhanced synthesis of extracellular matrix leading to fibrosis and renal graft failure.

Abstract#: 69
A mouse model for ABO-incompatible transplantation (ABOi Tx): Hyperacute rejection following ABOi heart Tx
Bruce Motyka 1 , Taylor Rocque 1 , Fahim H. Rahman 1 , Sheila Wang 1 , Kesheng Tao 1 , Thuraya Marshall 1 , Jean Pearcey 1 , Banu Sis 1 , Michael Mengel 1 , Anthony J.F. d'Apice 2 , Peter J. Cowan 2 , Lori J. West 1
1 University of Alberta, Edmonton, AB, Canada
2 St Vincent's Hospital, Melbourne, Australia
Introduction: ABOi heart Tx (HTx) results in rapid graft loss by hyperacute antibody-mediated rejection (HAR) when accidentally performed in adults. In contrast, ABOi HTx can be performed safely in infants and results in tolerance to the donor blood group antigen(s). A mouse model would be useful for the study of antibody-mediated rejection (AMR) and tolerance in the setting of ABOi Tx. We developed transgenic mice [A-Tg, C57BL/6 (B6) background] that express human A-antigen on vascular endothelium. ‘A into O’ Tx can be approximated using A-Tg mice as donors and B6 wild-type (WT) mice as recipients. We hypothesized that administration of high levels of anti-A antibody (Ab) to WT recipients with A-Tg heart grafts would result in HAR.

Methods: B6 mice with undetectable anti-A titres were heterotopically transplanted with A-Tg hearts (n=6) and, 6-8 days later, injected intravenously with mouse anti-A monoclonal IgM Ab plus rabbit complement. Heart grafts were monitored by palpation [score of 4 (strongest) to 0 (absence of pulsation)] and harvested 24 hours post-injection. Grafts were assessed for AMR by H&E staining and by immunohistochemistry for deposition of complement C4d and for the macrophage marker CD68. Serum anti-A Ab titres were determined by hemagglutination.

Results: In A-Tg heart recipients, anti-A Ab titres increased from ≤1:2 just prior to anti-A antibody injection to ≥1:1024 at 60 minutes post-injection. Heart palpation scores were 4 prior to injection and ranged from 1-4 (median 2.5) post-injection. Grafts demonstrated both morphological and immunophenotypic features of AMR. Morphological features included interstitial infiltrate, myocytolysis, interstitial hemorrhage, endothelial injury/swelling, capillaritis, and the presence of neutrophils. Immunophenotypic features included C4d deposition and the presence of interstitial CD68+ cells (macrophages).

Conclusions: These findings are indicative of HAR following passive anti-A Ab administration. This model will prove useful for the study of AMR in the setting of ABOi Tx.

Abstract#: 70
The First Report of Alemtuzumab as Rescue Therapy in Refractory Kidney-Pancreas Allograft Rejection
Neal E. Rowe 1 , Kelly Mclean 2 , Jason Archambault 1 , Ghaleb Aboalsamh 1 , Vivian McAlister 2 , Alp Sener 1 , Patrick Luke 2
1 Western University
2 London Health Sciences Centre
Alemtuzumab is a humanized anti-CD52 monoclonal antibody that depletes T and B lymphocytes. While the use of this agent has been well established for induction immunosuppression at many centres, its utilization for the treatment of kidney allograft rejection has only been reported in a limited number of small series. To our knowledge, this is the first report of alemtuzumab rescue therapy for mixed cellular and humoral rejection in simultaneous pancreas-kidney (SPK) recipients.
Three SPK patients have been treated with alemtuzumab for refractory mixed cellular and humoral rejection at our centre. All patients had received induction immunosuppression with anti-thymocyte globulin. Two patients were maintained on tacrolimus, mycophenolate and corticosteroid, while a third patient was on a steroid-sparing regimen. All patients developed mixed cellular and humoral rejection at various time points post-transplant. Treatment included administration of corticosteroids and anti-thymocyte globulin in conjunction with plasma exchange, intravenous immunoglobulin (IVIG), and cyclophosphamide. Despite aggressive immunosuppression, patients showed minimal improvement on kidney biopsy and continued to lose allograft function. Each patient then received alemtuzumab via peripheral line. Kidney and pancreas function stabilized (as measured by serum creatinine, lipase and amylase) with no evidence of rejection on follow-up biopsy. Our first patient, treated late in the course of rejection, subsequently lost both allografts owing to poor residual function but demonstrated resolution of rejection on biopsy.
Refractory mixed cellular and humoral rejection in SPK recipients can be successfully treated with alemtuzumab. Our initial experience suggests early initiation of therapy is required for allograft salvage. However, long term follow-up and further study is required before adopting this regimen for general use.

Abstract#: 71
Influence of CYP3A Genetic Polymorphisms on Tacrolimus-Amlodipine Drug Interaction in Pediatric Heart Transplant Recipients
Alan Fung , Tina Marvasti , Lisa D’Alessandro , Ashok Manickaraj , Mina Safi , Steven Habbous , Seema Mital
The Hospital for Sick Children
Introduction: Tacrolimus (FK) and amlodipine, both metabolized by CYP3A enzymes, are commonly used medications in transplant recipients for immunosuppressive and anti-hypertensive therapy, respectively. Previous studies have reported that CYP3A5 single nucleotide polymorphisms (SNPs) influence FK levels, while CYP3A4 SNPs influence amlodipine response. The influence of CYP3A4/5 combined genotypes in patients receiving both FK and amlodipine is not known. We characterized the association of CYP3A4/5 genotype with FK levels in patients receiving both FK and amlodipine.
Methods : Heart transplant recipients <18 years old were prospectively enrolled in a multi-centre Transplant Biobank Registry. Those receiving both FK and amlodipine were eligible for inclusion. All patients were genotyped for rs776746 A>G (CYP3A5) and rs2246709 A>G (CYP3A4). FK levels were captured before and 2-14 days after amlodipine initiation. Genotype associations with pre-amlodipine FK levels (ng/ml) and change in FK levels were assessed.
Results: 61 patients (59% male; 78% white, 13% Asian, 6% black, 4% other) were eligible. 17% were CYP3A5 expressors (AA/AG); 50% were CYP3A4 expressors (AG/GG), and 14% were both CYP3A4/5 expressors. In the overall cohort, amlodipine initiation was not associated with a significant increase in FK levels (9.7 ng/ml vs 10.8 ng/ml, p=0.10). However, when stratified by genotype, CYP3A5 expressors showed a significant increase in FK levels following amlodipine initiation (3.64 ng/ml, p=0.01). Further stratification by CYP3A4 genotype revealed that the greatest increase in FK levels post-amlodipine initiation was seen in CYP3A4/5 expressors (3.73 ng/ml, p=0.03).
Conclusions: CYP3A4/5 expressors are at risk for increase in FK levels following initiation of amlodipine. Patients in this genotype group may require FK dose adjustments when initiating amlodipine, highlighting the need for CYP3A pharmacogenotyping in clinical practice.

CYP3A5 genotype        CYP3A5 (AA/AG)                   CYP3A5 (GG)
CYP3A4 genotype        CYP3A4 (AG/GG)   CYP3A4 (AA)     CYP3A4 (AG/GG)   CYP3A4 (AA)
FK pre-amlodipine      8.2              12.9            10.0             10.0
FK post-amlodipine     11.9             16.2            10.8             10.3
FK change              3.73*            3.25            0.82             0.31

Abstract#: 72
Red Cell Distribution Width is Associated with Obstructive Sleep Apnea in Kidney Transplant Recipients
Miklos Z Molnar 1 , Akos Ujszaszi 2 , Katalin Fornadi 3 , Marta Novak 4 , Istvan Mucsi 5
1 University of Toronto, Toronto, ON, Canada;
2 Semmelweis University Budapest, Hungary
3 Semmelweis University Budapest, Hungary; University of Toronto
4 University of Toronto, Toronto, ON, Canada; Semmelweis University Budapest, Hungary
5 McGill University Health Centre, Montreal, QC, Canada; Semmelweis University Budapest, Hungary
Background: Red cell distribution width (RDW) is a marker of heterogeneity in the size of circulating erythrocytes and is associated with mortality in various patient populations. We assessed the association between obstructive sleep apnea (OSA) and RDW in stable kidney transplant recipients to determine whether RDW is associated with the intermittent hypoxemia generated by OSA.
Methods: Cross-sectional study of 100 kidney transplant patients who underwent polysomnography. Socio-demographic information and data about medication, comorbidity and laboratory parameters were collected.
Results: The mean age was 51±13 years, 43% were women, and the prevalence of diabetes was 19%. The mean RDW was 14±1%, and the median (interquartile range) Apnea-Hypopnea Index (AHI) was 3 (13). We found an incremental linear association between AHI and RDW in an unadjusted linear regression model (B=0.027, 95%CI: 0.012-0.04; p<0.001).

After adjustment for relevant factors, such as age, gender, eGFR, comorbidity, abdominal circumference, serum albumin, serum CRP, blood hemoglobin and soluble transferrin receptor levels, AHI was still associated with RDW (B=0.022, 95%CI:0.007-0.037; p=0.005).
Conclusions: RDW is associated with the apnea-hypopnea index, an objectively assessed measure of the severity of OSA. It is tempting to speculate that RDW increases in response to intermittent hypoxemia (a hallmark of OSA). Further studies are needed to unravel whether the link between RDW and mortality is a reflection of underlying OSA.

Abstract#: 73
Blood group A transgenic mice as a model for ABO-incompatible transplantation (ABOi Tx): study of antibody-mediated rejection (AMR)
Bruce Motyka 1 , Fahim H. Rahman 1 , Annetta Kratochvil 1 , Kesheng Tao 1 , Jean Pearcey 1 , Thuraya Marshall 1 , Banu Sis 1 , Michael Mengel 1 , Anthony J.F. d'Apice 2 , Peter J. Cowan 2 , Lori J. West 1
1 University of Alberta, Edmonton, AB, Canada
2 St Vincent's Hospital, Melbourne, Australia
Introduction: The ABO blood group system is generally a barrier to safe organ Tx between incompatible donors and recipients. However, in infants ABOi heart Tx (HTx) can be performed safely as anti-blood group antibody levels are low or absent. Following ABOi HTx, immune tolerance develops to the donor A/B antigen(s), by mechanisms not well understood. Mice do not normally express ABO antigens, therefore we developed transgenic mice (A-Tg, C57BL/6 [B6] background) expressing human A-antigen on vascular endothelium. ‘A into O’ ABOi HTx can be approximated using A-Tg mice as donors and wild-type (WT) B6 mice as recipients. Herein, we investigated AMR following Tx of A-Tg hearts into WT mice. We hypothesize that A-Tg grafts will undergo AMR in WT recipients with circulating anti-A antibodies. Methods: Juvenile WT mice were induced (sensitized) to produce anti-A antibodies by subcutaneous injection of human A red blood cells. Serum anti-A antibody titres were assessed by hemagglutination assays. Sensitized WT recipients received a heart transplant from an A-Tg (n=15) or WT (n=6) donor, and graft pulsation was monitored by palpation. Grafts were harvested following cessation of beating or at 7-14 days post-transplant (median 13 days); AMR was assessed by histology. Results: Sensitization resulted in high anti-A Ab titres (median 1:512). There was morphological evidence of AMR in 6 of 15 A-Tg heart grafts transplanted into WT recipients; of these, 2 ceased beating within 24 hours post-transplant and showed morphological signs of hyperacute rejection. Grafts with morphological features of AMR also showed evidence of C4d deposition. No WT heart grafts showed evidence of AMR. Conclusions: These findings indicate that incompatible A-Tg heart grafts can undergo AMR following Tx into WT mice with high anti-A antibody titre; however, not all A-Tg grafts showed morphological features of AMR within the study period.
Ongoing studies will address AMR in A-Tg mice beyond 14 days post-Tx. We expect this model to become a valuable resource for studies related both to rejection and tolerance in the setting of ABOi transplantation.
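Hemagglutination titres such as the median 1:512 reported above are reciprocal doubling dilutions (1:2, 1:4, 1:8, ...), so they are conventionally summarized on a log2 scale. A minimal sketch with illustrative values (not the study's measurements):

```python
# Summarizing hemagglutination titres on a log2 scale (illustrative
# reciprocal titres, e.g. 512 means a 1:512 dilution still agglutinates).
import math
import statistics

titres = [512, 256, 1024, 512, 128]
log2_titres = [math.log2(t) for t in titres]       # doubling-dilution steps

# Median is taken on the log2 scale, then converted back to a titre.
median_titre = 2 ** statistics.median(log2_titres)
print(f"median titre 1:{int(median_titre)}")       # median titre 1:512
```

Averaging on the log2 scale reflects the fact that each dilution step halves the antibody concentration, so a one-step difference is the assay's unit of resolution.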

Abstract#: 74
Urine Metabolites for Diagnosis of T Cell-Mediated Rejection in Pediatric Kidney Transplants
Tom Blydt-Hansen 1 , Atul Sharma 2 , Rupasri Mandal 3 , Ian Gibson 4 , David Wishart 3
1 University of Manitoba, Pediatrics and Child Health
2 University of Manitoba
3 The Metabolomics Innovation Center
4 University of Manitoba, Pathology
Acute rejection remains a significant threat to allograft outcome in pediatric kidney transplant recipients. Non-invasive screening for acute rejection is highly desired to avoid requirement for surveillance biopsies. Since urine metabolites may change as a result of kidney injury, we hypothesize that urine metabolomic approaches may provide robust biomarkers of acute rejection.

Urine samples (n=277) from 54 patients <18 years at transplant with concurrent surveillance or indication biopsies were assayed for 101 urine metabolites by direct-injection mass spectrometry (DI-MS). Groups analyzed were T cell-mediated rejection (TCMR=33) and “No rejection” (NR=186). NR included polyoma BKV infection (BKV=3), recurrent glomerulonephritis (RGN=3) and antibody-mediated rejection (AMR=17). Borderline tubulitis (BTUB=58) was analyzed separately. Metabolite profiles were compared using partial least squares-discriminant analysis in MetaboAnalyst 2.0, ROCCET, and R (pls, pROC, and chemometrics libraries).

Patient samples were median 25 months post-transplant (IQR; 6, 52 months). TCMR was readily distinguishable from NR (permutation testing, p<0.01). The top 5 discriminating metabolites (TCMR vs. NR) had auROC = 0.73, 0.71, 0.68, 0.66, and 0.66, respectively (all p<0.01). A multivariate classifier (double cross-validation) combining these features yielded auROC = 0.811 (p<0.01), with predictive accuracy 0.86, sensitivity 0.61, specificity 0.90 (see Figure). By varying the threshold, sensitivity may be optimized (0.94), but with a drop in specificity (0.44). Excluding BKV, RGN and AMR from the analysis yielded similar classification. BTUB samples tested against the original classifier showed distinct separation into either a TCMR or an NR phenotype, with 45% classified as TCMR.
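The threshold trade-off described above (sensitivity 0.61/specificity 0.90 at one cut-point vs. 0.94/0.44 at another) is a generic property of ROC analysis and can be illustrated on synthetic classifier scores. The use of scikit-learn here is an assumption for illustration; the authors worked in R with pROC.

```python
# Illustration of the sensitivity/specificity trade-off when varying a
# classifier threshold (synthetic scores, not the study's classifier).
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
y = np.r_[np.ones(33), np.zeros(186)]                  # 33 TCMR vs 186 NR, as in the abstract
scores = np.r_[rng.normal(1.2, 1, 33), rng.normal(0, 1, 186)]

fpr, tpr, thresholds = roc_curve(y, scores)
print(f"auROC = {auc(fpr, tpr):.3f}")

# Lowering the threshold raises sensitivity (tpr) at the cost of
# specificity (1 - fpr), as reported for the classifier above.
for t, sens, spec in zip(thresholds, tpr, 1 - fpr):
    if sens >= 0.9:
        print(f"threshold={t:.2f}: sensitivity={sens:.2f}, specificity={spec:.2f}")
        break
```

The chosen operating point depends on the screening goal: a surveillance tool favours high sensitivity even at lower specificity, since positives can be confirmed by biopsy.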

Urine profiles provide accurate distinction of TCMR from non-rejection phenotypes. Almost half of borderline samples had a metabolite profile consistent with TCMR. These results require validation for noninvasive screening in children.

Abstract#: 75
Obstructive sleep apnea without excessive daytime sleepiness in kidney transplant recipients
Katalin Ronai 1 , Andras Szentkiralyi 2 , Rezso Zoller 1 , Csilla Turanyi 1 , Julia Szocs 1 , Katalin Fornadi 3 , Miklos Z Molnar 4 , Istvan Mucsi 5 , Novak Marta 6
1 Semmelweis University Budapest, Hungary
2 Semmelweis University Budapest, Hungary; Westfälische Wilhelms-Universität Münster
3 Semmelweis University Budapest; University of Toronto
4 University of Toronto, Toronto, ON, Canada;
5 McGill University Health Centre, Montreal, QC, Canada; Semmelweis University Budapest, Hungary
6 University of Toronto, Toronto, ON, Canada; Semmelweis University Budapest, Hungary
Introduction: Obstructive sleep apnea (OSA) increases cardiovascular risk; timely diagnosis and effective therapy for OSA are therefore important. In the general population, the most characteristic daytime symptom of OSA is excessive daytime sleepiness. OSA is frequent in chronic kidney disease, including in kidney transplant recipients (Tx). We conducted a large polysomnographic study to confirm our clinical impression that OSA may not be accompanied by daytime sleepiness in Tx.
Methods: 100 stable prevalent kidney transplant recipients were included in the study (57 males, 43 females, mean age 51±13 years, BMI 27±5 kg/m2, GFR 52±19ml/min). OSA was diagnosed by one night polysomnography (PSG); OSA severity was defined by the apnea-hypopnea index (AHI). Daytime sleepiness was measured by the Epworth Sleepiness Scale (ESS). Statistical analysis was performed by STATA 12.0 software.
Results: OSA was present in 43% of patients: mild OSA (5<=AHI<15) in 18%, moderate (15<=AHI<30) in 11% and severe (AHI>=30) in 14%. There was a strong negative correlation between AHI and the average oxygen saturation during sleep (rho=-0.585; p<0.001). BMI was positively correlated with AHI (rho=0.452; p<0.001), as were abdominal and neck circumference. Surprisingly, AHI showed a weak negative correlation with ESS (r=-0.218; p=0.029). The median (interquartile range) ESS scores were 5 (5) in individuals with no OSA, 4 (5) in mild, 4 (5) in moderate and 4.5 (7) in severe OSA (p=NS). In a multivariable linear regression model, the association of AHI and ESS was not significant after adjusting for gender, age, kidney function and BMI.
Conclusion: Among kidney transplant recipients, excessive daytime sleepiness is not associated with OSA. Our results highlight that clinical symptoms are insufficient to screen for OSA in kidney transplant recipients.

Abstract#: 76
Independent association between serum FGF-23 and serum EPO in kidney transplant recipients – a potential link between anemia and cardiovascular health
Istvan Mucsi 1 , Akos Ujszaszi 2 , Maria Czira 3 , Zsofia Szekely 2 , Marta Novak 4 , Miklos Z Molnar 5
1 McGill University Health Centre, Montreal, QC, Canada; Semmelweis University Budapest, Hungary
2 Semmelweis University Budapest, Hungary
3 Semmelweis University Budapest, Hungary; University of Münster, Germany
4 University of Toronto, Toronto, ON, Canada; Semmelweis University Budapest, Hungary
5 University of Toronto, Toronto, ON, Canada
Background: Serum FGF23 levels are remarkably high in patients with chronic kidney disease (CKD) and correlate with mortality. FGF23 is thought to contribute to myocardial hypertrophy in this context. Previous reports suggest that FGF23 expression is stimulated by iron deficiency, potentially through HIF-1alpha signaling. This signaling pathway is directly involved in the response to hypoxia/anemia, and stimulates the expression of erythropoietin (EPO). In this analysis we wanted to assess the association between serum FGF23 and serum EPO in kidney transplant recipients to assess the potential interaction between these two systems of paramount importance in the context of CKD.
Methods: We collected socio-demographic parameters, medical and transplant history and laboratory data from 886 stable, prevalent, EPO naive Tx recipients. Serum FGF-23 was measured using a C-terminal enzyme-linked immunosorbent assay (Immutopics, San Clemente, CA, USA). eGFR was calculated using the 4-variable equation derived from the Modification of Diet in Renal Disease Study. A solid-phase, chemiluminescent immunometric assay (IMMUNOLITE 2000 EPO) was used to measure baseline EPO (EURO/DPC Ltd. United Kingdom).
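For reference, the 4-variable MDRD equation named in the Methods can be sketched as below. The IDMS-traceable coefficient 175 is an assumption on my part (the original 1999 equation used 186); which constant the authors applied is not stated in the abstract.

```python
# Sketch of the 4-variable MDRD eGFR calculation (IDMS-traceable
# coefficient 175 assumed; serum creatinine in mg/dl).
def egfr_mdrd4(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """eGFR in ml/min/1.73 m^2 from serum creatinine, age, sex, and race."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742    # sex correction factor
    if black:
        egfr *= 1.212    # race correction factor
    return egfr

# A creatinine of 1.4 mg/dl in a 51-year-old gives an eGFR close to the
# cohort mean of 53 ml/min/1.73 m^2 reported in the Results.
print(round(egfr_mdrd4(1.4, 51, female=False, black=False), 1))
```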
Results: Mean age was 51±12 years, 60% were male, mean eGFR was 53±20 ml/min/1.73m2 and mean hemoglobin was 137±16 g/l. The median Tx vintage was 77 months [interquartile range 78 mo] and 21% were diabetic. Serum FGF23 was negatively correlated with Hb (r=-0.19, p<0.001) and positively with serum EPO (r=0.19, p<0.001) (Fig. 1).

In multivariable linear regression, an independent association between serum FGF23 and serum EPO remained after statistical adjustment for age, gender, eGFR, iron deficiency, Hb, CRP, albumin, Ca, PO4 and iPTH (B=0.14, 95%CI: 0.086-0.199; p<0.001).
Conclusions: Serum EPO and FGF23 are associated independently of several potential co-variables. It is tempting to speculate that both FGF23 and EPO increase in response to tissue hypoxia. Further studies are needed to unravel the link between FGF23 and serum EPO and to determine whether these associations are relevant to the increased cardiovascular mortality of patients with CKD.

Abstract#: 77
Secular Trends in Cardiovascular Disease Among Kidney Transplant Recipients
Ngan Lam 1 , Kyla Naylor 1 , Salimah Shariff 2 , Eric McArthur 2 , Gregory Knoll 3 , Joseph Kim 4 , Amit Garg 1
1 Western University
2 Institute for Clinical Evaluative Sciences
3 University of Ottawa
4 Toronto General Hospital
Background: Cardiovascular death remains the number one cause of mortality in kidney transplant recipients. Cardiovascular events alone are associated with significant morbidity. Unfortunately, current trends in cardiovascular events after kidney transplantation are poorly understood.

Methods: We conducted a retrospective observational study using Ontario’s linked healthcare databases to follow all first-time kidney-only transplant recipients between 1994 and 2009. Our primary outcome was a composite of death or first major cardiovascular event, defined as one of myocardial infarction, coronary angioplasty, coronary bypass surgery, or stroke within three years of the transplant date.

Results: There were 4949 first-time kidney-only transplant recipients during the study period, of whom 63% were male. The median age steadily increased from 43 years (interquartile range [IQR] 33-54) in 1994 to 53 years (IQR 42-62) in 2009, as did the proportion of recipients aged 65 years or older (3.8% in 1994 to 20.4% in 2009). There was also an increase in the proportion of recipients with diabetes (19.2% in 1994 to 29.9% in 2009) and coronary artery disease (23.7% in 1994 to 37.7% in 2009). A total of 455 recipients (9.9%) died or experienced a major cardiovascular event within three years of transplantation, and this incidence remained stable throughout the study period.

Conclusion: Despite transplant centers accepting recipients who are older with more co-morbidities, the three-year incidence of death or major cardiovascular event has remained stable from 1994 to 2009. These results are reassuring for transplant programs.

Abstract#: 78
A comparison of a pre-transplant weight management clinic to bariatric surgery.
Roy Hajjar 1 , Gabriel Chan 2 , Pierre Garneau 3 , Olivier Court 4 , et al.
1 University of Montreal
2 Hôpital Maisonneuve-Rosemont
3 Hôpital Sacre-Cœur de Montreal
Morbid obesity is epidemic in the renal failure population and is associated with longer operative times and more complications. Controversy has existed for many years regarding the survival of obese patients on dialysis. Recently, a review of patients listed for transplantation in the USRDS showed that a 1-year survival benefit of transplantation existed for all weight groups, though it was diminished in the morbidly obese. Furthermore, the obesity paradox, the apparent survival benefit of obesity, did not exist. Obesity has also been shown to be associated with decreased graft survival. Effective treatment of obesity in this population is therefore needed.
A retrospective clinical review was conducted for patients in the pre-transplant surgical clinic at Hôpital Maisonneuve-Rosemont (2010-2013). The weight loss plan included a target weight, regular follow-up, nutritionist, physical exercise and community health consultation. The maximum limit for listing was 36.0 kg/m2.
A multi-centre retrospective review was done for patients undergoing laparoscopic sleeve gastrectomy between 2009 and 2013 at the Hôpital Sacre-Coeur and the Royal Victoria Hospital.
85 patients were assessed at the pre-transplant clinic, of whom 50 had sufficient follow-up. Of the 10 patients with a BMI >40, none were able to achieve a BMI <36.0. In the non-listed group (BMI 36-39), 43% achieved the weight target. In the listed group (BMI 32-35), 32% achieved weight loss. The overall success rate of weight loss was 30%.
22 patients underwent sleeve gastrectomy with a mean BMI of 46.2. The mean length of stay was 2.9 days, and the majority (63%) had no complication. The complications observed were related to volume management and electrolyte imbalances, likely reflecting the greater fragility of these patients due to chronic renal failure. The mean change in BMI was 9.3 and 11.8 kg/m2 at 3 and 6 months post-bariatric surgery, respectively. The majority (71%) were able to achieve their weight target for transplantation.
Weight loss is difficult to achieve in chronic renal failure, and only a minority of patients reach their targets with diet and exercise. Bariatric surgery can achieve rapid weight loss, though careful attention should be paid to the immediate post-operative period to avoid volume- and electrolyte-related complications.

Abstract#: 79
Delivery of heme oxygenase-1-cell penetrating peptide (HO-1-CPP) into hepatocytes in in vitro and ex vivo models of cold ischemia.
Ananda Venkatachalam 1 , Qianni Hu 1 , Caroline Wood 1 , Sanem Cimen 1 , Ian Alwayn 2
1 Atlantic Centre for Transplantation Research, Department of Surgery, Dalhousie University, Halifax, Nova Scotia, Canada.
2 Atlantic Centre for Transplantation Research, Departments of Surgery, Pathology, Microbiology & Immunology, Surgical Lead Multi Organ Transplant Program, QEII Health Sciences Center, Dalhousie University, Halifax, Nova Scotia, Canada.
In an attempt to increase the number of donor organs, one of the major strategies employed by transplant surgeons is the use of extended criteria donor (ECD) organs. Unfortunately, ECD organs are more susceptible to cold ischemia. Several methods have been described that may reduce cellular injury in experimental models of cold ischemia. One protein that has gained considerable attention is heme oxygenase-1 (HO-1): the induction of HO-1 expression under conditions of cellular stress has identified it as a key gene for protection against injury during cold ischemia.
Our aim is to introduce active and functional HO-1 protein conjugated to a cell-penetrating peptide (CPP) directly into hepatocytes in vitro, and into hepatocytes, endothelial cells, and Kupffer cells ex vivo under hypothermic and anoxic conditions. This novel strategy may also be used for the ex-vivo delivery of other protective proteins in hypothermic perfusion systems.
Our preliminary experiments have established that we can consistently produce a functional HO-1 protein fused to a CPP. To confirm the ability of the fusion protein to cross cell membranes, we performed an in vitro experiment in which rodent (McA-RH7777) and human (Hep G2) hepatocytes were incubated with HO-1-CPP and then stained with anti-HO-1 and anti-His antibodies followed by fluorescent secondary antibodies. Localization of HO-1-CPP was revealed by (red) anti-His staining, which showed a pattern consistent with HO-1 localization to the endoplasmic reticulum. Further evidence of the cell permeability of HO-1-CPP was obtained in ex vivo perfusion experiments: sections of HO-1-CPP-perfused livers were paraffin embedded, sectioned and stained with an anti-HO-1 antibody, which revealed intracellular localization of HO-1-CPP 4-5 cell layers deep around the perfused vessel.
Our ability to successfully deliver an active protein conjugated to a CPP to cells of a whole organ in an ex-vivo hypothermic and hypoxic perfusion model holds great potential for future repair and protection of organs for transplantation. Future studies to determine the ability of HO-1-CPP to modulate the response to ischemia reperfusion injury (IRI) in our in vitro and ex vivo models are planned.

Abstract#: 80
Assessing the impact on result harmonization of an international reference standard for human cytomegalovirus (HCMV) DNA
Jutta Preiksaitis 1 , Randal Hayden 2 , Yupin Tong 3 , Xiao-Li Pang 4 , Jacqueline Fryer 5 , Alan Heath 6 , Angela Caliendo 7 , et al.
1 Division of Infectious Diseases, Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
2 Department of Pathology, St. Jude Children's Research Hospital, Memphis TN USA
3 Department of Medicine, University of Alberta, Edmonton, Alberta, Canada
4 Department of Laboratory Medicine and Pathology University of Alberta, Edmonton, Alberta, Canada
5 National Institute for Biological Standards and Control, UK
6 Biostatistics Group, National Institute for Biological Standards and Control, UK
7 Division of General Internal Medicine, Alpert Medical School at Brown University, Providence, RI, USA
Background: Extreme variability in quantitative CMV DNA results reported on individual samples among laboratories led to the development of the WHO 1st international reference standard (IS) for CMV DNA with the hope that this common assay calibrator would improve result harmonization.

Methods: In order to assess the impact of the IS, a blinded sample panel consisting of 40 positive samples (pooled serially collected samples from unique solid organ transplant patients) and 10 negative samples was tested by six laboratories using eight different assays [six commercial, two laboratory-developed] that had been calibrated to the WHO IS. Dilutions of the WHO IS were also tested by all assays. Two positive panel samples were duplicated; all were gB genotyped. Panels were tested with two of the commercial assays in two laboratories. Results were compared using standard statistical techniques.

Results: There were two false negative and one false positive result. Result variation was greater for the clinical samples than for the WHO IS dilutions, with a minimum result range of 1.02 log10 IU/ml and a maximum of 2.74 log10 IU/ml for clinical samples; 59.4% of all results fell within ±0.5 log10 of the geometric mean (GM) of all results. The standard deviation of reported results was not significantly impacted by either the quantity of DNA in the sample or gB genotype. Each assay had a result bias above or below the GM result for all assays. Nucleic acid extraction methods significantly impacted results reported, even when amplification and detection kits were identical.
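The "within ±0.5 log10 of the geometric mean" summary above amounts to averaging on the log10 scale, where quantitative viral-load results are approximately normally distributed. A minimal sketch with hypothetical results (not the panel data):

```python
# Geometric-mean summary of quantitative results on the log10 scale
# (hypothetical CMV DNA results in IU/ml; not the study data).
import numpy as np

results_iu_ml = np.array([1.2e3, 3.4e3, 8.0e2, 5.6e3, 2.1e3, 9.5e2, 4.4e4])
log10_results = np.log10(results_iu_ml)

gm_log10 = log10_results.mean()               # log10 of the geometric mean
within = np.abs(log10_results - gm_log10) <= 0.5
print(f"GM = {10 ** gm_log10:.0f} IU/ml; {within.mean():.0%} within ±0.5 log10")
```

A ±0.5 log10 band corresponds to roughly a three-fold range in either direction, which is why harmonization efforts report agreement on the log scale rather than in raw IU/ml.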

Conclusions: Although variability in quantitative CMV DNA results reported on individual samples has been reduced by the use of the WHO IS as a universal assay calibrator, clinically relevant variability persists, preventing meaningful inter-laboratory comparison of results. Our study suggests that nucleic acid extraction techniques and amplicon sizes may contribute to this variability. These and other causes of variability must be identified and corrected to achieve further result harmonization.

Abstract#: 81
Development of High-Resolution HLA Typing by Next Generation Sequencing
James Lan 1 , Yuxin Yin 2 , Elaine Reed 2 , Qiuheng Zhang 2
1 University of British Columbia Clinician Investigator Program; UCLA Immunogenetics Center
2 UCLA Immunogenetics Center
Background High-resolution human leukocyte antigen (HLA) typing is required for the identification of allele-specific anti-donor HLA antibodies in solid organ transplantation and for donor-recipient HLA matching in hematopoietic-stem cell transplantation. Current HLA genotyping methods produce ambiguous allele combinations due to phase ambiguity and incomplete exon sequencing. To overcome these barriers, we developed long-range PCR to amplify full-length HLA gene sequences and utilized next generation sequencing (NGS) technology to deliver high-resolution genotyping results across five classical HLA loci.

Methods 15 reference DNAs with HLA typing confirmed by Sanger sequence-based typing (SBT) served as standards for NGS analysis on the IonTorrent PGM platform. First, long-range PCR primers co-amplified the HLA-A, B, C, DRB1, and DQB1 genes from promoter to 3’-UTR. To increase data throughput, we labeled individual amplicons with unique barcodes, permitting all samples to be multiplexed prior to sequencing. Raw sequence data were processed, quality-filtered, and re-assembled by Torrent Suite. Final HLA allele assignment was performed with Omixon software.

Results A total of 5.7 million reads with a mean read length of 253 bp were generated from one PGM run using the Ion 318 chip (400 bp chemistry). 86% of the 1.4 Gb of NGS-sequenced bases passed the Q20 quality threshold (≤1% error rate) before alignment to the reference human genome. The average sequence depth (number of times a base pair is sequenced) was 294 at class I and 453 at class II loci. Class I alleles determined by NGS were 100% concordant with reference data (Table 1). Importantly, NGS yielded unambiguous, 8-digit level typing results where SBT encountered difficulties. NGS-generated class II alleles were 97% concordant at the serological level, but showed higher mis-assignment at high resolution (23% in DRB1, 10% in DQB1). Primer-related allele dropouts and low sequence coverage (<100) appeared to be the main factors leading to downstream allele mis-calling. Both issues are modifiable and should permit improved class II typing once resolved.
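The Q20 cut-off above refers to the standard Phred quality scale, Q = -10·log10(p_error), so a base passing Q20 has at most a 1% probability of being mis-called. This is a general sequencing convention, not specific to this study; a minimal sketch:

```python
# Phred quality scores: Q = -10 * log10(p_error).
# Q20 corresponds to a 1% per-base error probability, Q30 to 0.1%.
import math

def phred_q(p_error: float) -> float:
    """Phred-scaled quality score for a given base-call error probability."""
    return -10.0 * math.log10(p_error)

def error_prob(q: float) -> float:
    """Base-call error probability implied by a Phred quality score."""
    return 10.0 ** (-q / 10.0)

print(phred_q(0.01))    # Q20
print(error_prob(30))   # Q30 error probability
```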

Conclusions NGS can deliver high-resolution, unambiguous class I HLA typing at the 8-digit level with impressive throughput. Sequencing a larger sample set and primer/sequence depth optimization for class II alleles are required before routine clinical application in the immunogenetics laboratory.

Abstract#: 82
Reduced Health Literacy is Related to Poorer Medication Adherence in Adult Renal Transplant Recipients
Maryam Demian 1 , R. Jean Shapiro 2 , Wendy Thornton 1
1 Simon Fraser University
2 University of British Columbia
Medication non-adherence is a common problem following renal transplantation. Poor health literacy (HL), defined as patients’ ability to access, understand, and use health-based information in order to make medically related decisions, is linked to worse disease outcomes in a variety of medical populations. The role of HL in medication adherence is not known. We examined the extent of and risks for reduced HL as well as the association between HL and medication adherence in renal transplant recipients (RTR).
HL was assessed using: i) the Rapid Estimate of Literacy in Medicine – Transplantation (REALM-T), which measures familiarity with 69 transplant related words; ii) the Newest Vital Sign (NVS), which measures numeracy and comprehension of a nutrition label using 6 items; and iii) the Health Literacy Questionnaire (HLQ), which comprehensively captures the construct of HL through nine distinct scales. Medication adherence was measured by the Transplant Effects Questionnaire (TEQ)-Adherence subscale. Pearson correlations were used to examine the relationships between variables related to HL and medication adherence.
67 adult stable RTR (mean age 53 years, 38 males, average years of education 14) were recruited from the transplant clinic. Participants were required to be proficient in English. RTR were identified as being at risk for reduced HL (NVS average score 3.97/6 ± 1.91 SD, where score <4 indicates risk of limited HL). Older age (r= -.43, p<.05) and lower levels of education (r= .54, p<.05) were associated with poorer HL. Specific factors of the HLQ, including lower “healthcare provider support” (r= .32, p<.05), “having insufficient information” (r= .28, p<.05), lower “social support” (r= .30, p<.05), and worse “navigation of the healthcare system” (r= .26, p<.05) were associated with poorer medication adherence.
Reporting an established relationship with at least one healthcare provider, feeling confident that one has the information needed to manage their condition, having strong social support for their health, and successfully navigating the healthcare system were all associated with better medication adherence. These findings suggest that the development of interventions to improve or provide support for patients’ level of health literacy would translate to enhanced medication adherence.

Abstract#: 83
Incidence and Evolution of Native Kidney Renal Cell Carcinoma in Renal Transplant Recipients: A Comparative Study with a General Population
Annie-Claude Blouin 1 , Vicky Mai 1 , Michael Sourial 1 , Frédéric Pouliot 1 , Thierry Dujardin 1 , Jean-François Audet 1 , Jean-Guy Lachance 2 , Réal Noël 2 , Isabelle Côté 2 , Isabelle Houde 2 , Yves Caumartin 1
1 Department of Surgery, Division of Urology, Université Laval, Québec, Québec, Canada
2 Department of Nephrology, Université Laval, Québec, Québec, Canada

Renal Cell Carcinoma (RCC) accounts for only 2-3% of all cancers, but the incidence in renal transplant recipients (RTR) is up to 15 times higher than in the general population. The association between Acquired Cystic Kidney Disease (ACKD) affecting patients with renal failure and the development of RCC might in part explain this increased incidence. Published literature has depicted a disease with unique clinical and pathological characteristics and a better prognosis for RTR, therefore suggesting specific oncogenic mechanisms.

Our primary objective was to describe the clinical and pathological characteristics of native kidney RCC in RTR and compare them with RCC features in the general population. We also wished to evaluate the efficacy of our screening protocol, consisting of native kidney ultrasound every two years. Between 1981 and 2010, 1347 patients received a kidney in our center. Forty-four patients developed RCC, for a total of 47 tumors. We compared them to 334 patients from the general population who underwent partial (n=178) or radical nephrectomy (n=156) for RCC in our center between 2003 and 2010.

Compared to the general population, RTR were younger at the time of RCC diagnosis (55 vs. 60 years old; p<0.001) and males were more frequently affected (86 vs. 62%; p<0.001). Native kidney RCCs were smaller (39 vs. 42 mm; p<0.001) and papillary histology was predominant (55 vs. 16%; p<0.001). There was no difference in disease-free survival and cancer-specific mortality. In addition, 50% of RTR exhibited ACKD. With respect to screening, 45% of RTR had their ultrasound every two years, 23% each year, and 27% did not have routine screening; screening information was unavailable for 5%. Regardless of screening pattern, there was no difference in cancer-specific mortality.

In accordance with current knowledge on native kidney RCC in RTR, we demonstrated a majority of papillary RCC and a higher proportion of ACKD (50%) than RTR without RCC (estimated prevalence 25%). Finally, screening for native kidney RCC with an ultrasound every two years did not confer a survival advantage in our small population.

Abstract#: 84
Examining the Utility of Catalytic Antioxidants in Islet Transplantation
Antonio Bruni 1 , Andrew Pepper 1 , Boris Gala-Lopez 1 , Rena Pawlick 1 , Nasser Abualhassan 2 , A.M. James Shapiro 1
1 Alberta Diabetes Institute Clinical Islet Transplant Program Department of Surgery University of Alberta
2 Alberta Diabetes Institute Clinical Islet Transplantation Program Department of Surgery University of Alberta
Islet transplantation has been demonstrated as an effective modality to treat type-1 diabetes (T1D). However, long-term function has been limited, in part, by the potency of isolated islets and the availability of donor pancreata. Oxidative stress is a mechanism associated with disease states marked by inflammatory processes, including, but not limited to, autoimmune diseases and islet graft dysfunction. The ability to catalytically modulate oxidation-reduction reactions within a cell may control signaling cascades necessary for generating inflammation and provide therapeutic benefit targeted at down-regulation of aberrant immune function. Metalloporphyrin-based catalytic antioxidants (CA) can scavenge a broad range of oxidants and serve to decrease the production of free radicals and, therefore, inflammatory cytokines. It is believed that this may have a positive impact on islet function post-transplant and reduce the prevalence of primary non-function, increasing the incidence of insulin independence from single islet infusions. In addition, we hypothesize that CA may allow for the utilization of islets isolated from expanded-criteria donors (ECDs) and donation-after-cardiac-death (DCD) donors, which are currently discarded clinically, thus potentially increasing the number of T1D patients who can receive an islet transplant. To this end, in both murine and human islet isolation experiments, CA were supplemented during organ procurement, organ storage (preservation), islet isolation (wash media and collagenase) and subsequent islet culture. Preliminary results suggest that redox modulation decreases the oxidative stress typical of these donors, leading to increased beta cell mass recovery post-isolation, robust beta cell health and in vivo transplant function compared to non-treated control donors. Viability measures include glucose-stimulated insulin secretion, membrane dye exclusion assay and beta cell viability by flow cytometry.
By reducing the oxidative stresses that occur during the islet isolation and transplantation processes, we aim to make inroads in the care of patients with T1D by increasing the long-term efficacy of clinical islet transplantation.

Abstract#: 85
SPI-6 (Serpin Protease Inhibitor-6) inhibits granzyme B mediated injury of renal tubular cells and promotes renal allograft survival
Arthur Lau 1 , Karim Khan 1 , Kelvin Shek 1 , Ziqin Yin 1 , Xuyan Huang 1 , Aaron Haig 2 , Weihua Liu 1 , Bhagi Singh 1 , Zhuxu Zhang 1 , Anthony Jevnikar 1
1 Matthew Mailing Centre for Translational Transplant Studies, Western University
2 Department of Pathology, Western University
Introduction: Proteinase inhibitor 9 (PI-9) is an intracellular serpin that inhibits Granzyme B (GrB), a serine protease found in the cytosolic granules of CD8+ T and Natural Killer (NK) cells. PI-9 functions to prevent "misdirected" apoptosis in GrB-expressing cells. The murine homolog of PI-9 is serpin protease inhibitor 6 (SPI-6). Kidney tubular epithelial cells (TEC) are a principal target for cytotoxic cells following transplant. Therefore, TEC resistance to cell-mediated injury may influence graft survival and function. The expression and regulation of SPI-6 in TEC and kidney has not been studied.
Methods/Results: We demonstrate that TEC express SPI-6 protein, the murine homolog of PI-9, basally, with a modest increase following cytokine exposure. TEC expression of SPI-6 blocks granzyme B-mediated death, as TEC from SPI-6 null kidneys have increased susceptibility to cytotoxic CD8+ cells in vitro. We then tested the role of SPI-6 in a mouse kidney transplant model using SPI-6 null or wild-type donor kidneys (H-2b) transplanted into nephrectomized recipients (H-2d). SPI-6 null kidney recipients had reduced renal function at day 8 post-transplant compared to controls (creatinine: 113 ± 23 vs. 28 ± 3 μmol/L, n=5, P<0.01), consistent with greater tubular injury and extensive mononuclear cell infiltration. Finally, loss of donor kidney SPI-6 shortened graft survival time (20 ± 19 vs. 66 ± 33 days, n=8-10, P<0.001).
Conclusion: Our data show for the first time that the resistance of kidney TEC to granzyme B-induced death by cytotoxic T cells is mediated by the expression of SPI-6. We suggest that SPI-6 is an important endogenous mechanism to prevent rejection injury from perforin/granzyme B effectors, and that enhanced PI-9/SPI-6 expression by TEC may provide protection from diverse forms of inflammatory kidney injury and promote long-term allograft survival.

Abstract#: 86
Single incision robotic donor nephrectomy: first reported case in Canada
Jason Archambault , Neal Rowe , Patrick Luke , Alp Sener
Western University

With an ever-expanding number of patients waiting for a renal transplant, programs are continually looking for ways to increase the pool of donor organs. One such strategy is to increase the rate of living donation by limiting the morbidity of the donor operation. In recent years laparoscopic donor nephrectomy has become standard practice, typically requiring 3-4 laparoscopic ports and an additional extraction incision. New technology has allowed the pursuit of less invasive techniques. Here we describe the first reported case of a robotic single-incision donor nephrectomy in Canada. Our first donor was a 48-year-old female who underwent a left donor nephrectomy through a single umbilical incision with the aid of a Gelport and the da Vinci robot. In order to evaluate the benefits of this technique, we are comparing a prospective cohort of single-incision donors to a control group undergoing donor nephrectomy by the standard laparoscopic technique. Our goal is to assess whether there is a recovery benefit in addition to improved cosmesis.