Informedia@TRECVID 2014 MED and MER
dc.contributor.author | Yu, Shoou-I.
dc.contributor.author | Jiang, Lu
dc.contributor.author | Xu, Zhongwen
dc.contributor.author | Lan, Zhenzhong
dc.contributor.author | Xu, Shicheng
dc.contributor.author | Chang, Xiaojun
dc.contributor.author | Martin, Lara J.
dc.date.accessioned | 2025-03-11T14:42:34Z
dc.date.available | 2025-03-11T14:42:34Z
dc.date.issued | 2014
dc.description.abstract | We report on our system for the TRECVID 2014 Multimedia Event Detection (MED) and Multimedia Event Recounting (MER) tasks. On the MED task, the CMU team achieved leading performance in the Semantic Query (SQ), 000Ex, 010Ex, and 100Ex settings; furthermore, our SQ and 000Ex runs are significantly better than the submissions from the other teams. We attribute this performance to four main components: 1) large-scale semantic concept detectors trained on video shots for the SQ/000Ex systems, 2) better features, such as improved trajectories and deep learning features, for the 010Ex/100Ex systems, 3) a novel Multistage Hybrid Late Fusion method for the 010Ex/100Ex systems, and 4) improved reranking methods for Pseudo Relevance Feedback for the 000Ex/010Ex systems. On the MER task, our system uses a subset of the features and detection results from the MED system, from which the recounting is generated. Recounting evidence is presented by selecting the most likely concepts detected in the salient shots of a video; salient shots are those for which the video-level event detector gives a high response.
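As a rough illustration of the shot-level recounting described in the abstract, the Python sketch below scores each shot with a video-level event detector, keeps the highest-responding (salient) shots, and reports the top concepts detected in them. This is a minimal sketch, not the authors' implementation: the names (shot_features, concept_scores, event_detector, concept_names) and the sklearn-style decision_function interface are assumptions.

    import numpy as np

    def recount_video(shot_features, concept_scores, event_detector,
                      concept_names, n_shots=3, top_k=5):
        # Hypothetical inputs (not from the paper):
        #   shot_features:  (num_shots, feat_dim) per-shot feature matrix
        #   concept_scores: (num_shots, num_concepts) concept detector outputs
        #   event_detector: video-level event model with an sklearn-style
        #                   decision_function(X) -> per-shot responses

        # 1) Salient shots: shots with the highest response under the
        #    video-level event detector.
        responses = np.asarray(event_detector.decision_function(shot_features))
        salient = np.argsort(responses)[::-1][:n_shots]

        # 2) Recounting evidence: most likely concepts detected in each
        #    salient shot.
        evidence = []
        for shot in salient:
            top = np.argsort(concept_scores[shot])[::-1][:top_k]
            evidence.append((int(shot), [concept_names[c] for c in top]))
        return evidence

Usage would be along the lines of recount_video(X_shots, C_scores, trained_event_svm, names), returning (shot index, top concept names) pairs as the recounting evidence for one video.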
dc.description.sponsorship | This work has been supported by the Intelligence Advanced Research Projects Activity (IARPA) via Department of Interior National Business Center contract number D11PC20068. The U.S. government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon. Disclaimer: The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of IARPA, DoI/NBC, or the U.S. Government. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number OCI-1053575. Specifically, it used the Blacklight system at the Pittsburgh Supercomputing Center (PSC).
dc.description.uri | https://researchmgt.monash.edu/ws/portalfiles/portal/316305615/315545229_oa.pdf
dc.format.extent | 14 pages
dc.genre | journal articles
dc.genre | preprints
dc.identifier | doi:10.13016/m2pcms-bkos
dc.identifier.uri | http://hdl.handle.net/11603/37750
dc.language.iso | en_US
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department
dc.rights | This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title | Informedia@TRECVID 2014 MED and MER
dc.type | Text
dcterms.creator | https://orcid.org/0000-0002-0623-599X