Piecewise graph convolutional network with edge-level attention for relation extraction

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Graph Convolutional Networks (GCNs) are a key method for capturing the non-sequential structure of sentences and recognizing long-distance syntactic information. However, because most existing GCN methods represent syntactic information with unweighted adjacency matrices, the adjacency matrix suffers from two problems: redundant syntactic information and errors inherited from dependency parsing. To address these two problems, we propose a novel model, PGCN-EA, a Piecewise Graph Convolutional Network with Edge-level Attention. Specifically, we first employ a piecewise adjacency matrix based on the entity pair, which dynamically reduces the sentence's redundant features. Second, we propose Edge-level Attention, which assigns different weights to the edges between nodes based on the GCN's input and produces a weighted adjacency matrix, emphasizing the importance of each child word to its target word and alleviating the influence of incorrect dependency parses. Extensive experiments on a benchmark dataset show that our model achieves the best precision-recall (PR) curve among seven baseline models, outperforming them by at least \(2.3\%\).
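
To make the two components concrete, the following is a minimal PyTorch sketch, not the authors' released implementation: the names piecewise_adjacency and EdgeAttentionGCNLayer, the tensor shapes, and the attention parameterization are hypothetical assumptions, and the piecewise rule shown here (keeping only dependency edges inside the segment bounded by the entity pair) is a simplified stand-in for the exact formulation in the paper.

```python
# Minimal sketch (assumptions, not the authors' code) of (1) a piecewise
# adjacency matrix restricted to the entity-pair segment and (2) edge-level
# attention that turns a 0/1 dependency adjacency into a weighted adjacency
# before a standard GCN layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


def piecewise_adjacency(adj: torch.Tensor, e1: int, e2: int) -> torch.Tensor:
    """Zero out edges whose endpoints fall outside the entity-pair segment.

    adj: (n, n) unweighted dependency adjacency matrix.
    e1, e2: token indices of the two entities (assumed e1 <= e2).
    """
    n = adj.size(0)
    keep = torch.zeros(n, dtype=torch.bool)
    keep[e1 : e2 + 1] = True                      # tokens between the entities
    mask = keep.unsqueeze(0) & keep.unsqueeze(1)  # both endpoints must be kept
    return adj * mask


class EdgeAttentionGCNLayer(nn.Module):
    """One GCN layer whose adjacency is re-weighted by edge-level attention."""

    def __init__(self, dim: int):
        super().__init__()
        self.W = nn.Linear(dim, dim)          # GCN feature transform
        self.attn = nn.Linear(2 * dim, 1)     # scores an edge from its two endpoints

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (n, dim) token representations; adj: (n, n) 0/1 adjacency.
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)  # target word i
        hj = h.unsqueeze(0).expand(n, n, -1)  # candidate child word j
        scores = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)
        # Mask non-edges so softmax distributes weight only over real edges.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        weighted_adj = torch.softmax(scores, dim=-1)
        weighted_adj = torch.nan_to_num(weighted_adj)  # rows with no edges -> 0
        return F.relu(weighted_adj @ self.W(h))


if __name__ == "__main__":
    h = torch.randn(12, 64)                        # 12 tokens, 64-dim representations
    dep = (torch.rand(12, 12) > 0.8).float()       # toy dependency adjacency
    dep = piecewise_adjacency(dep, e1=2, e2=9)     # keep the entity-pair segment
    print(EdgeAttentionGCNLayer(64)(h, dep).shape) # torch.Size([12, 64])
```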




Acknowledgements

This work was supported by the National Key R&D Plan (No. 2017YFB0803302) and the National Key Research and Development Plan (No. 2016QY03D0602).

Author information

Authors and Affiliations

  1. Beijing Institute of Technology, Beijing, China

    Changsen Yuan, Heyan Huang, Chong Feng & Qianwen Cao

Authors
  1. Changsen Yuan
  2. Heyan Huang
  3. Chong Feng
  4. Qianwen Cao

Corresponding author

Correspondence to Heyan Huang.

Ethics declarations

Conflict of interest

There is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yuan, C., Huang, H., Feng, C. et al. Piecewise graph convolutional network with edge-level attention for relation extraction. Neural Comput & Applic 34, 16739–16751 (2022). https://doi.org/10.1007/s00521-022-07312-3
