Unsupervised Natural Language Inference Using PHL Triplet Generation

dc.contributor.author: Varshney, Neeraj
dc.contributor.author: Banerjee, Pratyay
dc.contributor.author: Gokhale, Tejas
dc.contributor.author: Baral, Chitta
dc.date.accessioned: 2025-06-05T14:03:17Z
dc.date.available: 2025-06-05T14:03:17Z
dc.date.issued: 2022-05
dc.description.abstract: Transformer-based models achieve impressive performance on numerous Natural Language Inference (NLI) benchmarks when trained on the respective training datasets. However, in certain cases, training samples may not be available, or collecting them could be time-consuming and resource-intensive. In this work, we address this challenge and present an exploratory study on unsupervised NLI, a paradigm in which no human-annotated training samples are available. We investigate it under three settings — PH, P, and NPH — that differ in the extent of unlabeled data available for learning. As a solution, we propose a procedural data generation approach that leverages a set of sentence transformations to collect PHL (Premise, Hypothesis, Label) triplets for training NLI models, bypassing the need for human-annotated training data. Comprehensive experiments with several NLI datasets show that the proposed approach results in accuracies of up to 66.75%, 65.9%, and 65.39% in the PH, P, and NPH settings, respectively, outperforming all existing unsupervised baselines. Furthermore, fine-tuning our model with as little as ~0.1% of the human-annotated training dataset (500 instances) leads to 12.2% higher accuracy than a model trained from scratch on the same 500 instances. Supported by this superior performance, we conclude with a recommendation for collecting high-quality task-specific data.
dc.description.sponsorship: This research was supported by
dc.description.uri: https://aclanthology.org/2022.findings-acl.159/
dc.format.extent: 14 pages
dc.genre: journal articles
dc.identifier: doi:10.13016/m2qz56-fwst
dc.identifier.citation: Varshney, Neeraj, Pratyay Banerjee, Tejas Gokhale, and Chitta Baral. “Unsupervised Natural Language Inference Using PHL Triplet Generation.” Edited by Smaranda Muresan, Preslav Nakov, and Aline Villavicencio. Findings of the Association for Computational Linguistics: ACL 2022, May 2022, 2003–16. https://doi.org/10.18653/v1/2022.findings-acl.159.
dc.identifier.uri: https://doi.org/10.18653/v1/2022.findings-acl.159
dc.identifier.uri: http://hdl.handle.net/11603/38681
dc.language.iso: en_US
dc.publisher: ACL
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/deed.en
dc.title: Unsupervised Natural Language Inference Using PHL Triplet Generation
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-5593-2804
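The abstract describes procedurally generating PHL (Premise, Hypothesis, Label) triplets by applying sentence transformations to unlabeled premises. The paper's actual transformation set is not reproduced in this record; the sketch below is only a hypothetical illustration of the general idea, with three toy transformations (truncation for entailment, naive negation for contradiction, and premise swapping for neutral) standing in for the transformations the authors use.

```python
# Hypothetical sketch of procedural PHL triplet generation.
# The transformations here are illustrative stand-ins, not the paper's.

def generate_phl_triplets(premises):
    triplets = []
    for i, premise in enumerate(premises):
        words = premise.split()
        # Entailment (toy heuristic): a truncated premise is entailed by it.
        if len(words) > 3:
            hypothesis = " ".join(words[: len(words) // 2 + 1])
            triplets.append((premise, hypothesis, "entailment"))
        # Contradiction (toy heuristic): negate the first copula found.
        for src, tgt in ((" is ", " is not "), (" are ", " are not ")):
            if src in f" {premise} ":
                triplets.append((premise, premise.replace(src, tgt, 1), "contradiction"))
                break
        # Neutral (toy heuristic): pair the premise with an unrelated sentence.
        other = premises[(i + 1) % len(premises)]
        if other != premise:
            triplets.append((premise, other, "neutral"))
    return triplets

premises = [
    "A man is playing a guitar on the street",
    "Two dogs are running across the field",
]
for p, h, label in generate_phl_triplets(premises):
    print(f"{label}: {h}")
```

In this sketch, the label comes for free from the transformation that produced the hypothesis, which is the property that lets such data train an NLI model without human annotation.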

Files

Original bundle

Name: 2022.findingsacl.159.pdf
Size: 543.91 KB
Format: Adobe Portable Document Format