Machine learning in hyperbolic spaces has attracted much attention in natural language processing and many other fields. In particular, Hyperbolic Neural Networks (HNNs) have improved a wide variety of tasks, from machine translation to knowledge graph embedding. Although some studies have reported the effectiveness of embedding into the product of multiple hyperbolic spaces, HNNs have mainly been constructed in a single hyperbolic space, and their extension to product spaces has not been sufficiently studied. Therefore, we propose a novel method to extend a given HNN in a single space to a product of hyperbolic spaces. We apply our method to Hyperbolic Graph Convolutional Networks (HGCNs), extending several HNNs. Our model improved the graph node classification accuracy especially on datasets with tree-like structures. The results suggest that neural networks in a product of hyperbolic spaces can be more effective than in a single space in representing structural data.
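The core idea of a product of hyperbolic spaces can be illustrated with a small sketch. Below is a minimal, hypothetical Python example (not the authors' implementation) showing the standard construction used in product-manifold embeddings: each point lives in several Poincaré balls at once, and the product-space distance is the l2 norm of the per-factor geodesic distances.

```python
import math

def poincare_dist(x, y):
    """Geodesic distance between two points in one Poincaré ball.

    d(x, y) = arcosh(1 + 2*||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
    Points are lists of coordinates with Euclidean norm < 1.
    """
    sq_norm = lambda v: sum(c * c for c in v)
    diff = sq_norm([a - b for a, b in zip(x, y)])
    denom = (1.0 - sq_norm(x)) * (1.0 - sq_norm(y))
    return math.acosh(1.0 + 2.0 * diff / denom)

def product_dist(xs, ys):
    """Distance in a product of hyperbolic spaces.

    xs and ys are sequences of per-factor points; the product metric
    combines the factor-wise geodesic distances via an l2 norm.
    """
    return math.sqrt(sum(poincare_dist(x, y) ** 2 for x, y in zip(xs, ys)))

# A point in H^2 x H^2: one coordinate pair per hyperbolic factor.
p = [[0.1, 0.2], [0.0, 0.3]]
q = [[0.0, 0.0], [0.4, 0.1]]
d = product_dist(p, q)
```

Along a single ball's radius, `poincare_dist([0.0], [r])` reduces to `log((1 + r) / (1 - r))`, which grows without bound as `r` approaches 1; this exponential "room" near the boundary is what makes hyperbolic factors well suited to tree-like data.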
Jun Takeuchi, Noriki Nishida, and Hideki Nakayama. 2022. Neural Networks in a Product of Hyperbolic Spaces. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop, pages 211–221, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
@inproceedings{takeuchi-etal-2022-neural, title = "Neural Networks in a Product of Hyperbolic Spaces", author = "Takeuchi, Jun and Nishida, Noriki and Nakayama, Hideki", editor = "Ippolito, Daphne and Li, Liunian Harold and Pacheco, Maria Leonor and Chen, Danqi and Xue, Nianwen", booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop", month = jul, year = "2022", address = "Hybrid: Seattle, Washington + Online", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2022.naacl-srw.27/", doi = "10.18653/v1/2022.naacl-srw.27", pages = "211--221", abstract = "Machine learning in hyperbolic spaces has attracted much attention in natural language processing and many other fields. In particular, Hyperbolic Neural Networks (HNNs) have improved a wide variety of tasks, from machine translation to knowledge graph embedding. Although some studies have reported the effectiveness of embedding into the product of multiple hyperbolic spaces, HNNs have mainly been constructed in a single hyperbolic space, and their extension to product spaces has not been sufficiently studied. Therefore, we propose a novel method to extend a given HNN in a single space to a product of hyperbolic spaces. We apply our method to Hyperbolic Graph Convolutional Networks (HGCNs), extending several HNNs. Our model improved the graph node classification accuracy especially on datasets with tree-like structures. The results suggest that neural networks in a product of hyperbolic spaces can be more effective than in a single space in representing structural data."}
<?xml version="1.0" encoding="UTF-8"?><modsCollection xmlns="http://www.loc.gov/mods/v3"><mods ID="takeuchi-etal-2022-neural"> <titleInfo> <title>Neural Networks in a Product of Hyperbolic Spaces</title> </titleInfo> <name type="personal"> <namePart type="given">Jun</namePart> <namePart type="family">Takeuchi</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Noriki</namePart> <namePart type="family">Nishida</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Hideki</namePart> <namePart type="family">Nakayama</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <originInfo> <dateIssued>2022-07</dateIssued> </originInfo> <typeOfResource>text</typeOfResource> <relatedItem type="host"> <titleInfo> <title>Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop</title> </titleInfo> <name type="personal"> <namePart type="given">Daphne</namePart> <namePart type="family">Ippolito</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Liunian</namePart> <namePart type="given">Harold</namePart> <namePart type="family">Li</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Maria</namePart> <namePart type="given">Leonor</namePart> <namePart type="family">Pacheco</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Danqi</namePart> <namePart type="family">Chen</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name 
type="personal"> <namePart type="given">Nianwen</namePart> <namePart type="family">Xue</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <originInfo> <publisher>Association for Computational Linguistics</publisher> <place> <placeTerm type="text">Hybrid: Seattle, Washington + Online</placeTerm> </place> </originInfo> <genre authority="marcgt">conference publication</genre> </relatedItem> <abstract>Machine learning in hyperbolic spaces has attracted much attention in natural language processing and many other fields. In particular, Hyperbolic Neural Networks (HNNs) have improved a wide variety of tasks, from machine translation to knowledge graph embedding. Although some studies have reported the effectiveness of embedding into the product of multiple hyperbolic spaces, HNNs have mainly been constructed in a single hyperbolic space, and their extension to product spaces has not been sufficiently studied. Therefore, we propose a novel method to extend a given HNN in a single space to a product of hyperbolic spaces. We apply our method to Hyperbolic Graph Convolutional Networks (HGCNs), extending several HNNs. Our model improved the graph node classification accuracy especially on datasets with tree-like structures. The results suggest that neural networks in a product of hyperbolic spaces can be more effective than in a single space in representing structural data.</abstract> <identifier type="citekey">takeuchi-etal-2022-neural</identifier> <identifier type="doi">10.18653/v1/2022.naacl-srw.27</identifier> <location> <url>https://aclanthology.org/2022.naacl-srw.27/</url> </location> <part> <date>2022-07</date> <extent unit="page"> <start>211</start> <end>221</end> </extent> </part></mods></modsCollection>
%0 Conference Proceedings%T Neural Networks in a Product of Hyperbolic Spaces%A Takeuchi, Jun%A Nishida, Noriki%A Nakayama, Hideki%Y Ippolito, Daphne%Y Li, Liunian Harold%Y Pacheco, Maria Leonor%Y Chen, Danqi%Y Xue, Nianwen%S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop%D 2022%8 July%I Association for Computational Linguistics%C Hybrid: Seattle, Washington + Online%F takeuchi-etal-2022-neural%X Machine learning in hyperbolic spaces has attracted much attention in natural language processing and many other fields. In particular, Hyperbolic Neural Networks (HNNs) have improved a wide variety of tasks, from machine translation to knowledge graph embedding. Although some studies have reported the effectiveness of embedding into the product of multiple hyperbolic spaces, HNNs have mainly been constructed in a single hyperbolic space, and their extension to product spaces has not been sufficiently studied. Therefore, we propose a novel method to extend a given HNN in a single space to a product of hyperbolic spaces. We apply our method to Hyperbolic Graph Convolutional Networks (HGCNs), extending several HNNs. Our model improved the graph node classification accuracy especially on datasets with tree-like structures. The results suggest that neural networks in a product of hyperbolic spaces can be more effective than in a single space in representing structural data.%R 10.18653/v1/2022.naacl-srw.27%U https://aclanthology.org/2022.naacl-srw.27/%U https://doi.org/10.18653/v1/2022.naacl-srw.27%P 211-221