Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6161989 | Kidney International | 2015 | 7 Pages |
Abstract
Intravenous (IV) iron is required for optimal management of anemia in the majority of hemodialysis (HD) patients. While IV iron prescription has increased over time, the best dosing strategy is unknown and any effect of IV iron on survival is unclear. Here we used adjusted Cox regression to analyze associations between IV iron dose and clinical outcomes in 32,435 HD patients in 12 countries from 2002 to 2011 in the Dialysis Outcomes and Practice Patterns Study. The primary exposure was total prescribed IV iron dose over the first 4 months in the study, expressed as an average dose/month. Compared with 100-199 mg/month (the most common dose range), case-mix-adjusted mortality was similar for the 0, 1-99, and 200-299 mg/month categories but significantly higher for the 300-399 mg/month (HR of 1.13, 95% CI of 1.00-1.27) and 400 mg/month or more (HR of 1.18, 95% CI of 1.07-1.30) groups. Convergent validity was provided by an instrumental variable analysis, using HD facility as the instrument, and by an analysis expressing IV iron dose/kg body weight. Associations with cause-specific mortality (cardiovascular, infectious, and other) were generally similar to those for all-cause mortality. The hospitalization risk was elevated among patients receiving 300 mg/month or more compared with 100-199 mg/month (HR of 1.12, 95% CI of 1.07-1.18). In light of these associations, a well-powered clinical trial to evaluate the safety of different IV iron-dosing strategies in HD patients is urgently needed.
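As a reading aid for the hazard ratios above: Cox-model confidence intervals are computed on the log scale, as exp(log(HR) ± 1.96·SE). A minimal sketch, using the ≥400 mg/month group reported in the abstract (HR 1.18, 95% CI 1.07-1.30); the standard error is not reported in the abstract and is backed out here from the published interval purely for illustration.

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Back out the SE of log(HR) from a reported two-sided 95% CI.

    Assumes the CI was built symmetrically on the log scale,
    the standard construction for Cox regression output.
    """
    return (math.log(upper) - math.log(lower)) / (2 * z)

def hr_ci(hr, se, z=1.96):
    """Recompute a 95% CI for a hazard ratio on the log scale."""
    log_hr = math.log(hr)
    return math.exp(log_hr - z * se), math.exp(log_hr + z * se)

# Illustrative values from the abstract's >=400 mg/month group
se = se_from_ci(1.07, 1.30)
lower, upper = hr_ci(1.18, se)
print(f"SE(log HR) ~ {se:.4f}")
print(f"95% CI: {lower:.2f}-{upper:.2f}")  # recovers 1.07-1.30
```

Recovering the published interval from the point estimate and the backed-out SE is a quick consistency check that reported HRs and CIs came from the same log-scale fit.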
Related Topics
Health Sciences
Medicine and Dentistry
Nephrology
Authors
George R. Bailie, Maria Larkina, David A. Goodkin, Yun Li, Ronald L. Pisoni, Brian Bieber, Nancy Mason, Lin Tong, Francesco Locatelli, Mark R. Marshall, Masaaki Inaba, Bruce M. Robinson