Article ID: 3420859
Journal: Transactions of the Royal Society of Tropical Medicine and Hygiene
Published Year: 2007
Pages: 5
File Type: PDF
Abstract

We evaluated peripheral blood tests for diagnosing iron deficiency on medical wards in Blantyre, Malawi, where infection and HIV are prevalent. We compared full blood count, ferritin and serum transferrin receptor (TfR) levels with an assessment of iron stores in bone marrow aspirates. Of consecutive adults admitted with severe anaemia (haemoglobin <7 g/dl), 81 had satisfactory bone marrow aspirates. The main outcome measures were the validity of each test (sensitivity, specificity, and positive and negative predictive values) and likelihood ratios (LR) for iron deficiency. Twenty patients (25%) were iron deficient and 64 (79%) were HIV-positive. Iron deficiency was more common in HIV-negative than in HIV-positive patients (59% vs. 16%; P < 0.001). In HIV-positive patients, the optimal ferritin cut-off was 150 μg/l (sensitivity 20%, specificity 93%, LR 2.7), but no test was accurate enough to be clinically useful. In HIV-negative patients, ferritin was the single most accurate test (cut-off <70 μg/l, sensitivity 90%, specificity 100%, LR if positive ∞, LR if negative 10). TfR measurement did not improve diagnostic accuracy. Mean cell volume was not a good predictor of iron status except in HIV-negative patients (cut-off <85 fl, sensitivity 90%, specificity 71%). In populations with high levels of infection and HIV, an HIV test is necessary to interpret any blood test for iron deficiency. In HIV-negative patients, ferritin is the best blood test for iron deficiency, using a higher cut-off than usual. For HIV-positive patients, it is difficult to diagnose iron deficiency without a bone marrow aspirate.
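
The validity measures quoted in the abstract all follow from a 2×2 table of test result against bone marrow iron status. As a minimal sketch (not the authors' analysis code), the Python function below, here called diagnostic_stats, computes sensitivity, specificity, predictive values and likelihood ratios from such a table; the counts in the usage example are hypothetical, chosen only to reproduce the reported HIV-negative ferritin percentages.

# A minimal sketch (not the authors' analysis code): the validity
# measures in the abstract, computed from a 2x2 table of test result
# versus bone marrow iron status.

def diagnostic_stats(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, predictive values and likelihood ratios."""
    sens = tp / (tp + fn)                                 # iron-deficient patients correctly detected
    spec = tn / (tn + fp)                                 # iron-replete patients correctly excluded
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")   # P(deficient | positive test)
    npv = tn / (tn + fn) if (tn + fn) else float("nan")   # P(not deficient | negative test)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")  # LR of a positive result
    lr_neg = (1 - sens) / spec if spec > 0 else float("inf")  # LR of a negative result
    return {"sensitivity": sens, "specificity": spec,
            "PPV": ppv, "NPV": npv, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical counts chosen to reproduce the HIV-negative ferritin
# figures (sensitivity 90%, specificity 100%); not the study's data.
print(diagnostic_stats(tp=18, fp=0, fn=2, tn=61))
# -> LR+ is infinite and LR- is 0.1; the abstract quotes the latter
#    as its reciprocal, "LR if negative 10".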

Related Topics
Life Sciences; Immunology and Microbiology; Applied Microbiology and Biotechnology