Neural Normalized Compression Distance and the Disconnect Between Compression and Classification
dc.contributor.author | Hurwitz, John
dc.contributor.author | Nicholas, Charles
dc.contributor.author | Raff, Edward
dc.date.accessioned | 2024-12-11T17:02:31Z
dc.date.available | 2024-12-11T17:02:31Z
dc.date.issued | 2024-10-20
dc.description | 38th Conference on Neural Information Processing Systems (NeurIPS 2024), Machine Learning and Compression Workshop, Dec 10-15, 2024
dc.description.abstract | It is generally well understood that predictive classification and compression are intrinsically related concepts in information theory. Indeed, many deep learning methods are explained as learning a kind of compression, and that better compression leads to better performance. We interrogate this hypothesis via the Normalized Compression Distance (NCD), which explicitly relies on compression as the means of measuring similarity between sequences and thus enables nearest-neighbor classification. By turning popular large language models (LLMs) into lossless compressors, we develop a Neural NCD and compare LLMs to classic general-purpose algorithms like gzip. In doing so, we find that classification accuracy is not predictable by compression rate alone, among other empirical aberrations not predicted by current understanding. Our results imply that our intuitions about what it means for a neural network to "compress" and what is needed for effective classification are not yet well understood.
dc.description.uri | http://arxiv.org/abs/2410.15280
dc.format.extent | 10 pages
dc.genre | conference papers and proceedings
dc.genre | postprints
dc.identifier | doi:10.13016/m2gnus-ec7n
dc.identifier.uri | https://doi.org/10.48550/arXiv.2410.15280
dc.identifier.uri | http://hdl.handle.net/11603/37077
dc.language.iso | en_US
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof | UMBC Faculty Collection
dc.relation.ispartof | UMBC Data Science
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department
dc.relation.ispartof | UMBC Student Collection
dc.rights | This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.subject | Computer Science - Machine Learning
dc.subject | UMBC Discovery, Research, and Experimental Analysis of Malware Lab (DREAM Lab)
dc.subject | UMBC Interactive Robotics and Language Lab (IRAL Lab)
dc.subject | Statistics - Machine Learning
dc.title | Neural Normalized Compression Distance and the Disconnect Between Compression and Classification
dc.type | Text
dcterms.creator | https://orcid.org/0000-0001-9494-7139
dcterms.creator | https://orcid.org/0000-0002-9900-1972 |
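The abstract refers to the Normalized Compression Distance, which measures similarity between two sequences via the lengths of their compressed forms. As an illustration only (not code from the paper), here is a minimal sketch of the classic NCD using gzip as the compressor C(.), following the standard formula NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)); the variable names and test strings are made up for this example.

```python
import gzip

def c(data: bytes) -> int:
    """Length in bytes of the gzip-compressed input, i.e. C(data)."""
    return len(gzip.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance between two byte sequences.

    Values near 0 indicate high similarity; values near 1 indicate
    the sequences share little compressible structure.
    """
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two nearly identical sequences compress well together, so their
# NCD is small; an unrelated sequence yields a larger distance.
a = b"the quick brown fox jumps over the lazy dog" * 20
b_ = b"the quick brown fox jumps over the lazy dog" * 20
z = bytes(range(256)) * 4

print(ncd(a, b_))  # small: nearly identical sequences
print(ncd(a, z))   # larger: unrelated sequences
```

A nearest-neighbor classifier, as interrogated in the paper, then assigns a test sequence the label of the training sequence with the smallest NCD; the paper's contribution is to swap gzip for an LLM-based lossless compressor and compare.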