Prompt-based learning, which exploits knowledge from pre-trained language models by providing textual prompts and designing appropriate answer-category mapping methods, has achieved impressive successes on few-shot text classification and natural language inference (NLI). Because of the diverse linguistic expression, there exist many answer tokens for the same category. However, both manual answer design and automatic answer search constrain answer space and therefore hardly achieve ideal performance. To address this issue, we propose an answer space clustered prompting model (ASCM) together with a synonym initialization method (SI) which automatically categorizes all answer tokens in a semantic-clustered embedding space. We also propose a stable semi-supervised method named stair learning (SL) that orderly distills knowledge from better models to weaker models. Extensive experiments demonstrate that our ASCM+SL significantly outperforms existing state-of-the-art techniques in few-shot settings.
Zhen Wang, Yating Yang, Zhou Xi, Bo Ma, Lei Wang, Rui Dong, and Azmat Anwar. 2022. ASCM: An Answer Space Clustered Prompting Method without Answer Engineering. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2455–2469, Dublin, Ireland. Association for Computational Linguistics.
@inproceedings{wang-etal-2022-ascm,
    title = "{ASCM}: An Answer Space Clustered Prompting Method without Answer Engineering",
    author = "Wang, Zhen and Yang, Yating and Xi, Zhou and Ma, Bo and Wang, Lei and Dong, Rui and Anwar, Azmat",
    editor = "Muresan, Smaranda and Nakov, Preslav and Villavicencio, Aline",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2022",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.findings-acl.193/",
    doi = "10.18653/v1/2022.findings-acl.193",
    pages = "2455--2469",
    abstract = "Prompt-based learning, which exploits knowledge from pre-trained language models by providing textual prompts and designing appropriate answer-category mapping methods, has achieved impressive successes on few-shot text classification and natural language inference (NLI). Because of the diverse linguistic expression, there exist many answer tokens for the same category. However, both manual answer design and automatic answer search constrain answer space and therefore hardly achieve ideal performance. To address this issue, we propose an answer space clustered prompting model (ASCM) together with a synonym initialization method (SI) which automatically categorizes all answer tokens in a semantic-clustered embedding space. We also propose a stable semi-supervised method named stair learning (SL) that orderly distills knowledge from better models to weaker models. Extensive experiments demonstrate that our ASCM+SL significantly outperforms existing state-of-the-art techniques in few-shot settings."
}
%0 Conference Proceedings
%T ASCM: An Answer Space Clustered Prompting Method without Answer Engineering
%A Wang, Zhen
%A Yang, Yating
%A Xi, Zhou
%A Ma, Bo
%A Wang, Lei
%A Dong, Rui
%A Anwar, Azmat
%Y Muresan, Smaranda
%Y Nakov, Preslav
%Y Villavicencio, Aline
%S Findings of the Association for Computational Linguistics: ACL 2022
%D 2022
%8 May
%I Association for Computational Linguistics
%C Dublin, Ireland
%F wang-etal-2022-ascm
%X Prompt-based learning, which exploits knowledge from pre-trained language models by providing textual prompts and designing appropriate answer-category mapping methods, has achieved impressive successes on few-shot text classification and natural language inference (NLI). Because of the diverse linguistic expression, there exist many answer tokens for the same category. However, both manual answer design and automatic answer search constrain answer space and therefore hardly achieve ideal performance. To address this issue, we propose an answer space clustered prompting model (ASCM) together with a synonym initialization method (SI) which automatically categorizes all answer tokens in a semantic-clustered embedding space. We also propose a stable semi-supervised method named stair learning (SL) that orderly distills knowledge from better models to weaker models. Extensive experiments demonstrate that our ASCM+SL significantly outperforms existing state-of-the-art techniques in few-shot settings.
%R 10.18653/v1/2022.findings-acl.193
%U https://aclanthology.org/2022.findings-acl.193/
%U https://doi.org/10.18653/v1/2022.findings-acl.193
%P 2455-2469
[ASCM: An Answer Space Clustered Prompting Method without Answer Engineering](https://aclanthology.org/2022.findings-acl.193/) (Wang et al., Findings 2022)