Abstract
Current diagnostic approaches to allograft loss lack accurate individual risk stratification based on donor-specific anti-HLA antibody (anti-HLA DSA) characterization. We investigated whether systematic monitoring of DSA with extensive characterization improves performance in predicting kidney allograft loss. This prospective study included 851 kidney recipients transplanted between 2008 and 2010 who were systematically screened for DSA at transplant, at 1 and 2 years post-transplant, and at the time of post-transplant clinical events. We assessed DSA characteristics and performed systematic allograft biopsies at the time of post-transplant serum evaluation. At transplant, 110 (12.9%) patients had DSAs; post-transplant screening identified 186 (21.9%) DSA-positive patients. Post-transplant DSA monitoring improved the prediction of allograft loss when added to a model that included traditional determinants of allograft loss (increase in c statistic from 0.67 [95% confidence interval (95% CI), 0.62 to 0.73] to 0.72 [95% CI, 0.67 to 0.77]). Addition of DSA IgG3 positivity or C1q binding capacity increased the discrimination performance of the traditional model both at transplant and post-transplant. Compared with DSA mean fluorescence intensity, DSA IgG3 positivity and C1q binding capacity adequately reclassified patients at lower or higher risk for allograft loss at transplant (category-free net reclassification index, 1.30 [95% CI, 0.94 to 1.67; P<0.001] and 0.93 [95% CI, 0.49 to 1.36; P<0.001], respectively) and post-transplant (category-free net reclassification index, 1.33 [95% CI, 1.03 to 1.62; P<0.001] and 0.95 [95% CI, 0.62 to 1.28; P<0.001], respectively). Thus, pre- and post-transplant DSA monitoring and characterization may improve individual risk stratification for kidney allograft loss.
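The category-free (continuous) net reclassification index reported above measures whether a new marker moves predicted risks in the right direction: upward for patients who lose their graft, downward for those who do not. A minimal illustrative sketch of that computation (not the study's analysis code; all variable names are hypothetical):

```python
def continuous_nri(risk_old, risk_new, event):
    """Category-free NRI comparing two risk models.

    Sums the net proportion of events whose predicted risk increased
    and non-events whose predicted risk decreased; ranges from -2 to 2.
    """
    events = [(n, o) for n, o, e in zip(risk_new, risk_old, event) if e]
    nonevents = [(n, o) for n, o, e in zip(risk_new, risk_old, event) if not e]
    # Net upward movement among patients with the event (e.g., graft loss)
    net_up_events = (sum(n > o for n, o in events)
                     - sum(n < o for n, o in events)) / len(events)
    # Net downward movement among patients without the event
    net_down_nonevents = (sum(n < o for n, o in nonevents)
                          - sum(n > o for n, o in nonevents)) / len(nonevents)
    return net_up_events + net_down_nonevents

# Hypothetical toy data: the new model raises risk for both events and
# lowers it for both non-events, giving the maximum NRI of 2.0.
nri = continuous_nri([0.10, 0.20, 0.30, 0.40],
                     [0.25, 0.10, 0.45, 0.30],
                     [1, 0, 1, 0])
```

In the study, an NRI near 1.3 for IgG3 positivity versus mean fluorescence intensity indicates that the marker moved a large net fraction of patients toward the correct side of the risk scale.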
Copyright © 2016 by the American Society of Nephrology.