Computer vision supported pedestrian tracking: A demonstration on trail bridges in rural Rwanda

Date

2020-10-26

Citation of Original Publication

Thomas E, Gerster S, Mugabo L, Jean H, Oates T (2020) Computer vision supported pedestrian tracking: A demonstration on trail bridges in rural Rwanda. PLoS ONE 15(10): e0241379. https://doi.org/10.1371/journal.pone.0241379

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless covered by a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license.

Abstract

Trail bridges can improve access to critical services such as health care, schools, and markets. To evaluate the impact of trail bridges in rural Rwanda, it is helpful to know objectively how and when they are being used. In this study, we deployed motion-activated digital cameras at several trail bridges installed by the non-profit Bridges to Prosperity. We conducted and validated manual counts of bridge use to establish a ground truth, and we adapted an open-source computer vision algorithm to identify and count bridge use captured in the digital images. At the sites where the cameras logged short video clips, the algorithm's counts of bridge crossings per hour correlated reliably with the manual counts, with less than 3% error bias. We applied this algorithm across 186 total days of observation at four sites in fall 2019 and observed a total of 33,800 bridge crossings, ranging from about 20 to over 1,100 individual uses per day. We found no apparent correlation between daily or total weekly rainfall and bridge use, potentially indicating that transportation behaviors are no longer impacted by rainfall conditions once a bridge is installed. Bridge use was higher in the late afternoons and on market and church days, and crossings were roughly equal in each direction. These trends are consistent with the design intent of these bridges.
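
This excerpt does not specify which open-source algorithm the authors adapted, so the following is only a minimal illustrative sketch of per-frame pedestrian detection on a motion-triggered clip, using OpenCV's built-in HOG + linear-SVM people detector. The function name, file name, and the max-simultaneous-detections counting heuristic are all hypothetical and should not be read as the paper's method.

    import cv2

    # OpenCV's built-in HOG descriptor with its pretrained people detector.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def count_people_in_clip(video_path):
        # Hypothetical helper: returns the maximum number of people detected
        # in any single frame of a motion-triggered clip, as a crude proxy
        # for how many individuals appear in the clip.
        capture = cv2.VideoCapture(video_path)
        max_people = 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            # Detect people; winStride/padding/scale trade speed for recall.
            boxes, _weights = hog.detectMultiScale(
                frame, winStride=(8, 8), padding=(8, 8), scale=1.05)
            max_people = max(max_people, len(boxes))
        capture.release()
        return max_people

    print(count_people_in_clip("bridge_clip_0001.mp4"))  # hypothetical file name

A pipeline that actually counts crossings, as the study does, would additionally need to track detections across frames to establish direction of travel; per-frame detection alone, as above, cannot distinguish a completed crossing from a person lingering in view.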