Summary: |
  Junichi Yoshida, Yukiko Harada, Tetsuya Kikuchi, Ikuyo Asano, Takako Ueno, Nobuo Matsubara
  Infection Control Committee, Shimonoseki City Hospital, Shimonoseki, Japan

  Abstract: The aim of this study was to elucidate risk factors, including ward antimicrobial use density (AUD), for central line-associated bloodstream infection (CLABSI), as defined by the Centers for Disease Control and Prevention, in a 430-bed community hospital using central venous lines with closed-hub systems. We calculated AUD as (total dose)/(defined daily dose × patient days) × 1,000 for a total of 20 drugs, nine wards, and 24 months. For each line day, we entered AUD, the device utilization ratio, the number of central line days, and CLABSI status. The ratio of susceptible strains among isolates was subjected to correlation analysis with AUD. Of a total of 9,997 line days over 24 months, CLABSI occurred in 33 cases (3.3‰), 14 (42.4%) of which were on the surgical wards among the nine wards. Of 43 strains isolated, eight (18.6%) were methicillin-resistant Staphylococcus aureus (MRSA); none of the MRSA-positive patients had received cefotiam before the onset of infection. Receiver-operating characteristic analysis showed that a cutoff of 7 central line days had the highest accuracy. In logistic regression analysis, central line days showed an odds ratio of 5.511 (95% confidence interval [CI]: 1.936–15.690), and AUD of cefotiam showed an odds ratio of 0.220 (95% CI: 0.00527–0.922; P=0.038). The ratio of susceptible strains and AUD showed a negative correlation (R2=0.1897). Thus, CLABSI could be prevented by keeping the number of central line days as low as possible. The preventative role of AUD remains to be investigated.

  Keywords: bloodstream infection, central line, antimicrobial use density
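The AUD formula stated in the abstract can be sketched as a small calculation. This is a minimal illustration, not the authors' code; the function name and the example numbers (drug amount, defined daily dose, patient days) are hypothetical.

```python
def aud(total_dose_g: float, ddd_g: float, patient_days: float) -> float:
    """Antimicrobial use density per the abstract's definition:
    (total dose) / (defined daily dose x patient days) x 1,000,
    i.e. DDDs per 1,000 patient days on the ward."""
    return total_dose_g / (ddd_g * patient_days) * 1000


# Hypothetical ward: 250 g of a drug whose WHO defined daily dose is 4 g,
# over 1,500 patient days in the observation period.
print(round(aud(250, 4, 1500), 2))  # 250/(4*1500)*1000 = 41.67
```

Computing AUD per ward and per month, as the study does for 20 drugs, nine wards, and 24 months, then amounts to applying this formula to each (drug, ward, month) combination.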