Background. Refractory infectious wounds in renal transplantation (RT) recipients significantly prolong hospital stay, increase medical costs, and threaten allograft survival. Vacuum sealing drainage (VSD) therapy is a new technique for managing wounds based on the application of controlled negative pressure. The aim of this study was to summarize the efficacy and safety of VSD therapy in the management of refractory infectious wounds following RT. Materials and methods. Data from a cohort of 661 consecutive patients who received renal transplants over a period of 3 years were collected and analyzed retrospectively. Results. Of the 661 patients, 16 (2.4%) developed refractory wound infection following RT. Nineteen organisms were identified by culture from all patients, including 10 patients infected with 1 or more bacteria, 2 patients with fungal infection, and 4 patients with both. Specifically, mucormycosis was demonstrated in 4 patients, pan-resistant Klebsiella pneumoniae in 2 patients, and Acinetobacter baumannii in 2 patients. All 16 patients were treated with VSD therapy for a median of 37 days (range, 6-111 days). The number of VSD sets used ranged from 4 to 28 (mean, 11.1). A combination of antibiotics, debridement, and VSD therapy led to 100% (16 of 16) wound healing. No VSD-related adverse events were observed. Conclusions. VSD therapy is an effective and safe adjunct to conventional treatment modalities for the management of refractory wound infection following RT.
Objectives. This study aimed to identify the potential risk factors of acute rejection after deceased donor kidney transplantation in China. Methods. Adult kidney transplantations from deceased donors in our center from February 2004 to December 2015 were enrolled for retrospective analysis. All deceased donations complied with China's Organ Donation Program. No organs from executed prisoners were used. The incidence of clinical and biopsy-proven acute rejection was assessed with the Kaplan-Meier method, and the Cox proportional hazard model was used for multivariate analysis. Results. One-year, 2-year, 3-year, and 5-year incidences of acute rejection were 12.4%, 14.2%, 14.8%, and 17.1%, respectively. Multivariate analysis demonstrated that longer pre-transplant dialysis duration (hazard ratio [HR] 1.009 per month; 95% confidence interval, 1.003-1.015; P = .003), positive pre-transplant panel reactive antibody (PRA) (positive vs negative HR 3.266; 1.570-6.793; P = .023), and increasing HLA mismatches (>= 4 vs < 4 HR 2.136; 1.022-4.465; P = .044) increased the risk of acute rejection, while tacrolimus decreased acute rejection risk compared to cyclosporine (HR 0.317; 0.111-0.906; P = .032). Conclusion. Longer pre-transplant dialysis duration, HLA mismatch, and positive pre-transplant PRA increase the risk of acute rejection, while tacrolimus helps prevent acute rejection compared to cyclosporine in deceased donor kidney transplantation.
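For readers less familiar with per-unit hazard ratios, the dialysis-duration effect above (HR 1.009 per month) compounds multiplicatively under the Cox model's standard log-linear assumption, so the cumulative effect of years on dialysis is a power of the per-month HR. A minimal Python sketch (illustrative only; the exposure durations below are hypothetical examples, not analyses from the study):

```python
def compound_hr(hr_per_unit, n_units):
    """Compound a Cox per-unit hazard ratio over n_units of exposure.

    Under the Cox log-linear assumption the hazard scales as exp(beta * x),
    so an HR of `hr_per_unit` per unit of x compounds as a simple power.
    """
    return hr_per_unit ** n_units

# HR 1.009 per month of pre-transplant dialysis (from the abstract):
print(round(compound_hr(1.009, 24), 2))  # 2 years of dialysis -> ~1.24x hazard
print(round(compound_hr(1.009, 60), 2))  # 5 years of dialysis -> ~1.71x hazard
```

This is why a per-month HR that looks close to 1 can still be clinically meaningful over typical waiting-list durations.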
Background. A variety of complex drug regimens are offered to kidney transplant recipients after transplantation. This study aimed to evaluate the behavioral and physiological outcomes of pharmaceutical care in this population. Methods. A prospective study was conducted in which kidney transplant recipients were categorized according to the pharmaceutical care they received. Patients in the IR group had received irregular pharmaceutical care after transplantation, and patients in the RE group had received regular intervention. Intervention included face-to-face interviews, review of laboratory examinations, identification of drug-related problems, and pharmaceutical consultation. Baseline knowledge of self-care was tested in both groups. Correct concepts and medication guidance were consistently provided to enable patients to understand the importance of rejection prevention and knowledge of medication and renal care after transplantation. After 12 months, the same test was used to evaluate the outcomes of pharmaceutical care, and a satisfaction questionnaire was used to assess the pharmacy service. Results. Patients in the RE group initially possessed better knowledge of self-care (P < .001); however, the difference at 12 months became insignificant (P = .72) after patients in the IR group had also received routine pharmaceutical care. In addition, the serum creatinine level of the RE patients was stable without significant variation (P = .93), whereas it demonstrated a rising trend in IR patients (P < .01). Patients were highly satisfied with the intervention. Conclusions. A consistent post-transplantation pharmaceutical care service substantially improves knowledge of post-transplantation self-care. Pharmaceutical care should be started as early as possible during the pre-transplant period and continued in long-term follow-up.
Four kidneys from 2 young donors suffering from rhabdomyolysis were rejected for transplantation at the time of procurement because of their severely bruised and black gross appearance. Frozen sections revealed focal tubular injury with tubules filled by granular pigmented casts, which immunohistochemistry confirmed to be myoglobin. The 4 kidneys were nevertheless transplanted successfully, and all recipients recovered normal renal function without delay. These cases indicate that a patchy black gross appearance caused by myoglobin casts secondary to rhabdomyolysis is not a contraindication for transplantation.
Glomerulonephritis recurrence has emerged as one of the leading causes of allograft loss. We aimed to investigate the effect of living-related vs deceased donation on the incidence of renal allograft glomerulonephritis and its effect on renal allograft survival. Methods. Adult renal allograft recipients with primary glomerulonephritis who underwent transplantation from February 2004 to December 2015 were enrolled. Exclusion criteria included combined organ transplantation, structural abnormality, diabetic nephropathy, hypertensive nephropathy, obstructive nephropathy, and primary uric acid nephropathy. The incidence of biopsy-proven allograft glomerulonephritis was compared between the living-related donor group and the deceased donor group. Graft survival was assessed with the Kaplan-Meier method, and the Cox proportional hazard model was used to evaluate the effect of posttransplant glomerulonephritis on graft outcome. Results. There were 525 living-related donor kidney transplant recipients (LRKTx) and 456 deceased donor kidney transplant recipients (DDKTx) enrolled. The incidence of IgA nephropathy was 8.8% in the LRKTx group and 1.3% in the DDKTx group (P < .001); the incidence of focal segmental glomerulosclerosis (FSGS) was 3.8% in the LRKTx group and 1.5% in the DDKTx group (P = .03). FSGS increased the risk of graft failure compared with non-FSGS (hazard ratio [HR], 3.703 [1.459-9.397]; P = .006). IgA nephropathy increased the risk of graft failure more than 5-fold beyond 5 years after kidney transplantation compared with non-IgA nephropathy, but it did not affect early allograft survival (HR for >= 5 years, 6.139; 95% CI, 1.766-21.345; P = .004; HR for <5 years, 0.385 [0.053-2.814]; P = .35). Conclusions. A higher incidence of IgA nephropathy and FSGS in renal allografts was observed in living-related donor kidney transplantation compared with deceased donor kidney transplantation. De novo or recurrent IgA nephropathy and FSGS impaired long-term renal allograft survival.
The aim of this study was to determine distinctive risk factors for graft survival after living-related and deceased donor kidney transplantation (KTx). Methods. A total of 536 consecutive living-related and 524 deceased donor kidney transplant recipients from February 2004 to December 2015 in a single center were enrolled for retrospective analysis. Graft survival was assessed with the Kaplan-Meier method, and the Cox proportional hazard model was used to determine independent risk factors of allograft survival. Results. One-, 3-, and 5-year graft survival rates were 98.8%, 98.5%, and 97.2%, respectively, in living-related donor KTx and were 94.9%, 91.3%, and 91.3%, respectively, in deceased donor KTx (log-rank, P < .001). Multivariate analysis demonstrated that risk factors for graft survival in living-related donor KTx were pretransplant dialysis duration (hazard ratio [HR], 1.023 per month; P = .046), delayed graft function (HR, 5.785; P = .02), and acute rejection (HR, 2.706; P = .04); risk factors in deceased donor KTx were recipient age (HR, 1.066 per year; P = .004), recipient history of diabetes mellitus (HR, 3.011; P = .03), pretransplant positive panel reactive antibody (HR, 3.353; P = .02), and donor history of hypertension (HR, 2.660; P = .046). Conclusion. Distinctive risk factors for graft survival of living-related and deceased donor KTx were found.
Objectives. To compare the clinical outcomes of kidney transplantation from living-related and deceased donors. Patients and methods. Consecutive adult kidney transplants from living-related or deceased donors from February 2004 to December 2015 in a single center were enrolled for retrospective analysis. Estimated glomerular filtration rate (eGFR) was compared with linear mixed models controlling for the effect of repeated measurement at different time points. Results. There were 536 living-related and 524 deceased donor kidney transplants enrolled. The 1-year, 3-year, and 5-year graft survival rates were 98.8%, 98.5%, and 97.2% in living-related kidney transplantation (KTx), and 94.9%, 91.3%, and 91.3% in deceased donor KTx (log-rank, P < .001). A significantly higher incidence of delayed graft function (DGF) was observed in deceased donor KTx (20.6% vs 2.6%, P < .001). eGFR in deceased donor KTx was significantly higher than that in living-related KTx (68.0 +/- 23.7 vs 64.7 +/- 17.9 mL/min/1.73 m(2) at 1 year postoperation, 70.1 +/- 23.3 vs 64.3 +/- 19.3 mL/min/1.73 m(2) at 2 years postoperation, and 72.5 +/- 26.2 vs 65.2 +/- 20.4 mL/min/1.73 m(2) at 3 years postoperation; P < .001). Donor age was significantly higher in the living-related KTx group (47.5 +/- 11.0 vs 31.1 +/- 14.4 years, P < .001). Conclusion. Living-related graft survival is superior to deceased donor graft survival at this center, while better 5-year renal allograft function is obtained in deceased donor KTx patients, which may be attributable to the higher age of living-related donors.
Background. The online percent coefficient of variation (%CV) reporting system monitors the variation of the tacrolimus trough level (T0) and instantly identifies kidney transplant recipients (KTRs) with a higher %CV. Consequently, transplant doctors and pharmacists can take action to improve drug variability. The purpose of this study was to determine the efficacy of the system for higher intrapatient variability of T0 in KTRs. Methods. T0 data were collected from KTRs routinely followed up at an outpatient clinic between June 2016 and November 2016. The %CV was calculated with T0 data within 6 months before and after the index date. The last outpatient clinic visit date was before December 1, 2016. KTRs with a %CV of T0 greater than 22% were enrolled. Results. The study consisted of 183 KTRs (96 male, 87 female); the median age was 50 years (interquartile range [IQR], 41.0-57.0), and the median time post-kidney transplantation was 7 years (IQR, 3.0-12.4). The median T0 and creatinine level at baseline were 6.09 ng/mL (IQR, 4.80-7.52) and 1.33 mg/dL (IQR, 1.03-1.72), respectively. After the intervention, the median %CV of T0 decreased significantly, from 32% (IQR, 26%-42%) to 22% (IQR, 15%-33%), P < .001. The improvement in %CV was also significantly greater in KTRs with %CV >= 30% (median, from 41% to 25%) than in KTRs with %CV between 22% and 30% (median, from 26% to 20%), P < .001. Conclusions. The results of this study indicate that continuously aggressive intervention with an online %CV reporting system effectively improves intrapatient variability of T0 in KTRs.
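The %CV metric such a reporting system computes is the standard deviation of a patient's trough levels divided by their mean, times 100. A minimal Python sketch of the calculation (the trough values below are hypothetical, and the study may use a different SD convention, e.g. population rather than sample SD):

```python
from statistics import mean, stdev

def percent_cv(troughs):
    """Percent coefficient of variation of drug trough levels
    (sample standard deviation / mean * 100)."""
    return stdev(troughs) / mean(troughs) * 100

# Hypothetical tacrolimus trough levels (ng/mL) from six clinic visits:
t0_levels = [6.1, 4.8, 7.5, 5.2, 8.9, 6.3]
cv = percent_cv(t0_levels)
print(f"%CV = {cv:.1f}%")  # flag for intervention if above the study's 22% threshold
```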
Background. We calculated the population pharmacokinetic (PK) parameters of mizoribine in adult Chinese patients and compared them with those of Japanese patients to determine whether there are any ethnic differences in blood concentration transition between these 2 populations. Methods. The blood concentrations of mizoribine in 21 Chinese patients who were administered mizoribine after renal transplantation were measured at 304 time points. The absorption lag time, absorption rate constant, apparent distribution volume, and oral clearance were then calculated and compared with the respective Japanese reference values. Results. The absorption lag time, absorption rate constant, and apparent distribution volume calculated in this study were, respectively, 0.353 hour, 0.856 hour(-1), and 0.776 L/kg. The oral clearance was calculated as 2.18 times the creatinine clearance, using creatinine clearance as a covariate. The absorption rate constant, apparent distribution volume, and oral clearance are determinants of the maximum blood concentration, trough concentration, and area under the blood concentration-time curve. The relative absorption rate constant, apparent distribution volume, and oral clearance were 0.9-, 0.9-, and 1.2-fold, respectively, in Chinese patients compared with those in Japanese patients. These values are within the confidence limits, suggesting that there is no significant PK difference between the 2 ethnic groups. Conclusions. Results of this study showed no ethnic difference in blood mizoribine concentration transition between Chinese and Japanese patients. In addition, the population PK parameters obtained in this study are useful for determining the initial dosage or for Bayesian analysis of mizoribine concentrations from sparse sampling.
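The reported parameters describe a one-compartment oral model with first-order absorption and a lag time, with the elimination rate constant obtained as oral clearance divided by the distribution volume. A sketch of the concentration-time equation under these assumptions (the ka, V, and CL/CrCl values are taken from the abstract; the dose, body weight, creatinine clearance, and bioavailability of 1.0 below are hypothetical illustration values):

```python
import math

# Population PK estimates reported in the abstract:
T_LAG = 0.353        # absorption lag time (h)
KA = 0.856           # absorption rate constant (1/h)
V_PER_KG = 0.776     # apparent distribution volume (L/kg)
CL_PER_CRCL = 2.18   # oral clearance as a multiple of creatinine clearance

def mizoribine_conc(t_h, dose_mg, weight_kg, crcl_l_per_h, f=1.0):
    """Blood concentration (mg/L) at time t_h after a single oral dose,
    one-compartment model with first-order absorption and lag time:
    C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*(t - tlag)) - exp(-ka*(t - tlag)))
    """
    v = V_PER_KG * weight_kg                # apparent volume (L)
    ke = (CL_PER_CRCL * crcl_l_per_h) / v   # elimination rate constant (1/h)
    if t_h <= T_LAG:
        return 0.0                          # nothing absorbed before the lag time
    tt = t_h - T_LAG
    return (f * dose_mg * KA) / (v * (KA - ke)) * (math.exp(-ke * tt) - math.exp(-KA * tt))

# Hypothetical patient: 150 mg dose, 60 kg, CrCl 50 mL/min (= 3.0 L/h)
print(round(mizoribine_conc(2.0, 150, 60, 3.0), 2))
```

With such an equation, the Bayesian dose individualization mentioned in the conclusion amounts to updating these population estimates against each patient's sparse measured concentrations.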
The aim of the study was to investigate the effect of immunosuppression therapy early after kidney transplantation, particularly exposure to mycophenolic acid (MPA) and calcineurin inhibitors (CNIs), on posttransplantation de novo HLA antibody production. Methods. A single-center retrospective cohort study was performed at the First Affiliated Hospital of Sun Yat-sen University, enrolling kidney transplant or pancreas-kidney transplant recipients who had surgery between January 2010 and February 2016. Results. A total of 214 recipients were included in the study, with a median follow-up period of 1.06 years. Thirty recipients (14.0%) were positive on posttransplant HLA antibody detection, with a median follow-up period of 1.46 years. Ten recipients (4.7%) lost their allograft function during follow-up, and 6 of them (60%) developed de novo HLA antibody after graft failure. Multivariate analysis showed that acute rejection significantly increased the risk of de novo HLA antibody (hazard ratio [HR], 2.732). Intensified MPA dosing therapy reduced the risk by 59.8% (HR, 0.402); low-dose CNI therapy increased the risk by 33.3% (HR, 1.333), and the effect of extremely low-dose CNI therapy was even larger (HR, 2.242). Conclusion. The risk of de novo HLA antibody can be decreased by reducing the risk of acute rejection. A tendency was seen for low-dose CNI therapy to increase the risk of de novo HLA antibody, but intensified MPA dosing therapy may provide a protective effect by reducing the risk. Prospective studies are required to confirm these effects.
Background. Tumor necrosis factor-alpha-induced protein-8 like-2 (TIPE2) is a negative regulator of innate and cellular immunity, yet the expression pattern of TIPE2 in acute rejection of cardiac allografts remains enigmatic. Methods. We developed cardiac transplantation models and divided them into 3 groups: a naive group, a syngeneic group, and an allogeneic group. We then detected TIPE2 messenger RNA and protein in cardiac allografts. Real-time polymerase chain reaction showed the expression of CD4 and CD8 in the donor heart, and immunofluorescence assay revealed the association between T cells and TIPE2. Results. We first found that the expression of TIPE2 in cardiac allografts was upregulated compared with the syngeneic control and increased in a time-dependent manner. Immunocytochemistry of heart grafts revealed strong expression of TIPE2 in the inflammatory cells, but not in the cardiomyocytes. Finally, we showed that CD4(+) and CD8(+) T cells, which expressed abundant TIPE2, infiltrated cardiac allografts in large numbers. Conclusions. The upregulated TIPE2 in cardiac allografts came mainly from T cells infiltrating the donor heart. This finding indicates that there may be an association between TIPE2 and acute cardiac allograft rejection.
Rikkunshito (TJ-43), an eight-component traditional Japanese herbal medicine, has been used clinically for gastritis, vomiting, and appetite loss. We investigated the effects of TJ-43 on the amelioration of appetite loss in a murine cardiac allograft transplantation model. CBA mice underwent transplantation of a CBA (syngeneic group) or C57BL/6 heart (allogeneic group) and received oral administration of 2 g/kg/d of TJ-43 from the day of transplantation until 7 days afterward. The amount of food intake (FI) and weight change after the operation were recorded from 1 to 28 postoperative days. The allogeneic group had a lower average FI over 1 week compared with the syngeneic group (1.90 +/- 0.43 g vs 2.66 +/- 0.46 g). Average FIs over 1 week in the syngeneic and allogeneic groups treated with TJ-43 were 2.36 +/- 0.44 g and 2.30 +/- 0.13 g, respectively, and those with distilled water were 2.66 +/- 0.46 g and 1.90 +/- 0.43 g, respectively, suggesting that treatment with TJ-43 tended to ameliorate the reduction of FI. Similarly, the ameliorating effect on average FI in syngeneic and allogeneic groups treated for 2 weeks was confirmed. However, treatment with TJ-43 had no effect on FI after 4 weeks. TJ-43 could prevent the reduction of average FI induced by murine cardiac allograft transplantation.
Shigyakusan (also known as Tsumura Japan [TJ]-35), composed of peony, bitter orange, licorice, and Bupleuri radix, is used as an anti-inflammatory agent for cholecystitis and gastritis. We investigated the effect of TJ-35 on the alloimmune response in a murine heart transplantation model. CBA mice that underwent transplantation of a C57BL/6 (B6) heart were assigned to four groups receiving, from the day of transplantation until day 7: no treatment, TJ-35, each single component, or TJ-35 with each component missing. Untreated CBA recipients rejected B6 cardiac grafts acutely (median survival time [MST], 7 days). TJ-35-treated CBA recipients had significantly prolonged B6 allograft survival (MST, 20.5 days). However, the MSTs of CBA recipients administered each single component or TJ-35 with each component missing did not reach that of TJ-35-treated recipients. Adoptive transfer of CD4(+) splenocytes from TJ-35-treated primary allograft recipients resulted in significantly prolonged allograft survival in naive secondary recipients (MST, 63 days). Flow cytometry studies showed that the percentage of the CD4(+)CD25(+)Foxp3(+) cell population was increased in TJ-35-treated CBA recipients. In conclusion, TJ-35 induced prolongation of fully allogeneic cardiac allograft survival and may generate regulatory CD4(+)CD25(+)Foxp3(+) cells in our model. The effect appeared to require all components of TJ-35.
A rat orthotopic liver transplantation model with an extremely short anhepatic phase was established to study its protective effect on recipients and grafts. One hundred fifty adult male Wistar rats were randomly divided into three groups: group A (n = 30), using magnetic rings for suprahepatic vena cava reconstruction; group B (n = 30), using 7/0 Prolene running sutures for suprahepatic vena cava anastomosis as a control; and a sham-operated group (n = 30) as a blank control. Changes in liver enzyme, serum creatinine, endotoxin, and cytokine levels and histopathology were recorded. The serum creatinine, potassium, alanine transaminase, and alkaline phosphatase levels at different time points in group A were lower than those in group B (P < .05). The level of portal vein blood endotoxin in group A was significantly lower than that in group B at each time point (P < .01). All cytokine levels in group B were higher than those in group A, and both groups were higher than the sham-operated group. The mean levels of tumor necrosis factor-alpha (TNF-alpha), interferon-gamma (IFN-gamma), and interleukin-1 beta (IL-1 beta) at 3 hours were higher than at 6 hours in group A. IL-10 and tissue inhibitor of metalloproteinase-1 (TIMP-1) were higher at 3 hours in both groups A and B. Levels of monocyte chemotactic protein-1, L-selectin, and TIMP-1 in group A, and of IL-10, monocyte chemotactic protein-1, L-selectin, and TIMP-1 in group B, were higher in blood than in the liver. Levels of TNF-alpha, IFN-gamma, IL-1 beta, IL-10, and intracellular adhesion molecule-1 in group A, and of TNF-alpha, IFN-gamma, IL-1 beta, and intracellular adhesion molecule-1 in group B, were higher in the liver than in blood. We conclude that an extremely short anhepatic phase has protective effects on recipients and grafts in rat liver transplantation, likely by alleviating ischemia-reperfusion injury and reducing endotoxin release.
Background. Mild hypothermia is known to protect against ischemia-reperfusion (IR) injury, but the exact mechanisms of protection are not yet fully understood, and its use has been limited. Mild hypothermia pretreatment (MHP) was used to investigate the mechanisms of the protective effects against liver IR injury. Methods. Anesthetized male Sprague-Dawley rats were randomly divided into five groups: the normal group (N), sham group (S), MHP group, normothermia pretreatment (NP) + IR group, and MHP + IR group. In the pretreatment groups, mild hypothermia (32.2 +/- 0.3 degrees C) or normothermia (37 +/- 0.5 degrees C) pretreatment was applied for 2 hours. The IR groups then underwent partial (70%) hepatic ischemia for 1 hour and reperfusion for 6 hours. Finally, hepatic injury, apoptosis, and protein expression were assessed. Results. Levels of serum alanine transaminase, hepatic injury, hepatocyte apoptosis, and c-Jun N-terminal kinase (JNK) phosphorylation were significantly higher in the IR groups, but compared with NP, all these IR-induced changes were markedly attenuated by MHP. Serum alanine transaminase levels were 383.4 +/- 13.1 U/L in the MHP + IR group and 951.3 +/- 39.4 U/L in the NP + IR group. The histologic score of liver injury in the MHP + IR group was 4.83 +/- 1.17, whereas in the NP + IR group it was 10.5 +/- 1.05. The proportion of apoptotic cells in the MHP + IR group was 11.58 +/- 0.60, but in the NP + IR group it was 44.95 +/- 1.61. The phosphorylation of JNK was also significantly reduced in the MHP + IR group. All these differences were statistically significant (P < .05). Conclusions. MHP markedly reduced liver IR injury, and these protective effects may be exerted mainly via inhibition of JNK phosphorylation.