CN111160027A - Cyclic neural network event time sequence relation identification method based on semantic attention - Google Patents

Cyclic neural network event time sequence relation identification method based on semantic attention

Info

Publication number
CN111160027A
CN111160027A (application CN201911335582.8A)
Authority
CN
China
Prior art keywords
vector, trigger word, event, word, trigger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911335582.8A
Other languages
Chinese (zh)
Inventor
徐小良
高通
王宇翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University
Priority to CN201911335582.8A
Publication of CN111160027A
Status: Pending


Abstract


The invention discloses a semantic-attention-based recurrent neural network method for identifying the temporal relation between events. The method mainly comprises the following steps: first, syntactic dependency analysis is performed on the input event sentences and the semantic dependency branch of each trigger word is extracted, and a recurrent neural network then produces the corresponding hidden state vectors; next, an attention weight vector excluding the trigger word is computed, the word segments are fused according to their weights, and the result is spliced with the trigger word vector to obtain the event sentence state vector; finally, the event sentence state vectors are fed into a softmax function to predict the temporal relation. The invention effectively captures the semantic information implicit in event sentences and effectively associates and fuses different word segments, thereby improving the accuracy of event temporal relation identification.


Description

Cyclic neural network event time sequence relation identification method based on semantic attention
Technical Field
The invention relates to the field of natural language processing, in particular to a recurrent neural network event time sequence relation identification method based on semantic attention.
Background
Events, as an important form of knowledge representation, have received much attention in the field of natural language processing. An event is a set of related descriptions about a subject, objectively characterizing what happens to a particular subject (a person, group of persons, or object) in a specific time and place, and it is an important vehicle for conveying information. The event temporal relation is the chronological order in which events occur; it is a semantic relation between events that links the evolution of a subject's events from beginning to end and their interrelations. An example of event temporal relation identification (taken from the TimeBank-Dense corpus) is listed below.
Event sentence 1: Conseco Inc. said it is calling for the redemption on Dec 7 of the 800,000 remaining shares.
Event sentence 2: The company said all conversion rights on the stock will terminate on Nov 30.
In the above example there are two events, triggered by "calling" and "terminate", between which a temporal relation holds: the "calling" event occurs before the "terminate" event. The goal of event temporal relation identification is to accurately identify the temporal relations of related events in a given corpus.
In earlier research, the most common approach to determining event temporal relations was pattern matching, which identifies relations between events through the relations between their trigger words by matching event pairs in text against manually defined templates. A trigger word is the predicate that identifies an event, most commonly a verb or a noun. However, manually defined event relation templates are constrained by the data format and content and tend to suffer from low recall. Template construction is also usually domain-specific: different domains require different templates, and no universal event relation template applies across the various types of events. With the establishment of corpora and knowledge bases, much research turned to machine learning. The basic idea is to analyze the syntactic and lexical features of the relevant sentences to obtain the dependency relations and entity labels of each word segment, and then feed them into classifiers such as an SVM (support vector machine); however, the resulting accuracy is low, only slightly above 40%. With the rapid development of deep learning, researchers applied models such as CNNs and RNNs to event temporal relation identification and further improved the results. Subsequently, some researchers used semantic dependencies to construct the input vector representation by extracting the shortest dependency path associated with the trigger word. However, that method only extracts the unidirectional branch related to the trigger word, ignoring some of the trigger word's neighbors and potentially omitting important semantic information.
Analysis shows that these methods have difficulty capturing the semantic information implicit in event sentences and lack effective association and information fusion between different word segments, so their recognition accuracy is not ideal.
Disclosure of Invention
The invention provides a recurrent neural network event temporal relation identification method based on a semantic attention mechanism, aiming to solve the problems that existing event temporal relation identification methods have difficulty capturing the semantic information implicit in event sentences and lack effective association and information fusion between different word segments.
The technical scheme of the invention is as follows:
Step 1: construct the trigger-word semantic dependency branch. A trigger word is the predicate that identifies an event, usually a verb or a noun. First, syntactic dependency analysis is performed on the input event sentence to obtain a complete dependency parse tree. The position of the trigger word is located, and its parent and sibling nodes are searched upward until the root node is reached; if the trigger word is not a leaf node, its child nodes are recursively searched downward from the trigger position. The two parts of information are combined to form the trigger-word semantic dependency branch. Each word segment in the trigger-word semantic dependency branch has three corresponding vectors: a word vector x_v, a part-of-speech vector x_p, and a dependency-branch vector x_t. The three vectors are spliced to form the input vector x of the word segment, namely:

x = [x_v; x_p; x_t]   (1)
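As a minimal sketch of this splice (in Python with PyTorch, which the patent does not prescribe; all dimensions are illustrative assumptions):

```python
import torch

# Illustrative dimensions; the patent does not specify vector sizes.
V_DIM, P_DIM, T_DIM = 100, 20, 20   # word / part-of-speech / dependency-branch dims

x_v = torch.randn(V_DIM)            # word vector x_v
x_p = torch.randn(P_DIM)            # part-of-speech vector x_p
x_t = torch.randn(T_DIM)            # dependency-branch vector x_t

x = torch.cat([x_v, x_p, x_t])      # Eq. (1): x = [x_v; x_p; x_t]
print(x.shape)                      # torch.Size([140])
```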
Step 2: obtain the hidden state vector. Using a recurrent neural network, training starts from the head and from the tail of the trigger-word semantic dependency branch, yielding the event sentence's forward-propagation information h_left and backward-propagation information h_right; the two vectors are then spliced to obtain the hidden state vector h corresponding to the input vector x, namely:

h = [h_left; h_right]   (2)
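A sketch of this step under the same assumptions: a bidirectional LSTM (one common recurrent neural network; the patent does not name a specific cell) reads the branch from both ends, and its per-token output is exactly the splice [h_left; h_right] of Eq. (2):

```python
import torch
import torch.nn as nn

INPUT_DIM, HIDDEN_DIM = 140, 64

# bidirectional=True runs one pass from the head and one from the tail of the
# branch; the output at each position concatenates the two directions.
bilstm = nn.LSTM(input_size=INPUT_DIM, hidden_size=HIDDEN_DIM,
                 batch_first=True, bidirectional=True)

branch = torch.randn(1, 6, INPUT_DIM)   # the 6 input vectors of one dependency branch
h, _ = bilstm(branch)                   # Eq. (2): h_i = [h_left_i; h_right_i]
print(h.shape)                          # torch.Size([1, 6, 128])
```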
Step 3: compute the attention weight vector excluding the trigger word. Different word segments in an event sentence influence the event's temporal order to different degrees, and this influence is adjusted by introducing an attention weight vector β. Let the hidden state vectors of the trigger-word semantic dependency branch be h = {h_1, h_2, h_3, …, h_m}, where m is the number of word segments. The attention weight vector excluding the trigger word is then computed:

u_i = tanh(W_u h_i + b_u)   (3)

β_i = exp(u_i) / Σ_{j=1, j≠t}^m exp(u_j)   (4)

where W_u is a weight vector, b_u is a bias value, β_i is the attention weight of word segment h_i in the event sentence, and t is the position index of the trigger word in the event sentence.
The word segments are fused to different degrees according to the computed attention weights and spliced with the trigger word vector to obtain the event sentence state vector e*, namely:

e* = [W_1 Σ_{i≠t} β_i h_i ; W_2 h_t]   (5)

where h_t is the trigger-word hidden state vector and W_1 and W_2 are shared, learned weight vectors.
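A sketch of Step 3 under the same assumptions; note that the softmax over non-trigger positions in Eq. (4) and the exact form of Eq. (5) are reconstructions from the claims, not code from the patent:

```python
import torch
import torch.nn as nn

D = 128                               # 2 * HIDDEN_DIM from Step 2
m, t = 6, 3                           # branch length and trigger position index

W_u = nn.Linear(D, 1)                 # realizes u_i = tanh(W_u h_i + b_u), Eq. (3)
W_1 = nn.Linear(D, D, bias=False)     # shared learned weights from Eq. (5)
W_2 = nn.Linear(D, D, bias=False)

h = torch.randn(m, D)                 # hidden state vectors h_1 .. h_m
u = torch.tanh(W_u(h)).squeeze(-1)    # one scalar score per word segment

mask = torch.ones(m, dtype=torch.bool)
mask[t] = False                       # the trigger word is excluded from attention
beta = torch.softmax(u[mask], dim=0)  # Eq. (4): weights over non-trigger segments

fused = (beta.unsqueeze(-1) * h[mask]).sum(dim=0)   # weighted fusion of segments
e_star = torch.cat([W_1(fused), W_2(h[t])])         # Eq. (5): splice the trigger back in
print(e_star.shape)                   # torch.Size([256])
```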
Step 4: classify the result. The experimental corpus is trained in the form of event sentence pairs, i.e., each line of the corpus contains two event sentences. After each event sentence passes through the above steps, the state vectors e*_1 and e*_2 are obtained. The two vectors are spliced and fed into a softmax function for classification, predicting the most likely temporal relation, namely:

e = [W_left e*_1 ; W_right e*_2]   (6)

y = softmax(W_class e + b_class)   (7)

where W_left and W_right are weight vectors applied to the state vectors, W_class is the weight vector for classification, and b_class is the bias value for classification.
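A sketch of the classification head under the same assumptions (the exact placement of W_left and W_right relative to the splice is a reconstruction):

```python
import torch
import torch.nn as nn

D, NUM_REL = 256, 6                   # e* dimension; six TimeBank-Dense relations

W_left  = nn.Linear(D, D, bias=False) # projections of the two state vectors
W_right = nn.Linear(D, D, bias=False)
W_class = nn.Linear(2 * D, NUM_REL)   # classification weights; its bias plays b_class

e1 = torch.randn(D)                   # state vector e*_1 of event sentence 1
e2 = torch.randn(D)                   # state vector e*_2 of event sentence 2

e = torch.cat([W_left(e1), W_right(e2)])    # Eq. (6): splice the projections
probs = torch.softmax(W_class(e), dim=0)    # Eq. (7): distribution over relations
print(probs.argmax().item())          # index of the most likely temporal relation
```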
The invention has the beneficial effects that:
(1) The invention provides a recurrent neural network event temporal relation identification method based on a semantic attention mechanism. The method effectively captures the semantic information implicit in event sentences and establishes effective fusion and association between different word segments.
(2) The method computes attention weights excluding the trigger word. Different word segments in an event sentence influence the event's temporal order to different degrees, and the attention weight vector fuses the different vectors accordingly. Because the trigger word vector is the most important vector in the event sentence, the trigger word is not included in the attention weight computation; instead, it is spliced in separately during the subsequent vector fusion, so the trigger word information is preserved in full.
Drawings
FIG. 1 is a first example of the trigger-word semantic dependency branch used in the semantic-attention-based recurrent neural network event temporal relation identification proposed by the invention.
FIG. 2 is a second example of the trigger-word semantic dependency branch used in the same method.
FIG. 3 is a flow chart of the semantic-attention-based recurrent neural network event temporal relation identification proposed by the invention.
FIG. 4 is a model diagram of the semantic-attention-based recurrent neural network event temporal relation identification proposed by the invention.
Detailed Description
For a better understanding of the present invention, the invention is further explained below with reference to the accompanying drawings and specific examples.
The invention comprises the following steps:
Step 1: construct the trigger-word semantic dependency branch. A trigger word is the predicate that identifies an event, usually a verb or a noun. First, syntactic dependency analysis is performed on the input event sentence to obtain a complete dependency parse tree. The position of the trigger word is located, and its parent and sibling nodes are searched upward until the root node is reached; if the trigger word is not a leaf node, its child nodes are recursively searched downward from the trigger position. Analysis of the experimental results shows that recursing downward two levels works best: the method then effectively captures the semantic information implicit in the event sentence and establishes effective fusion and association between word segments. The two parts of information are combined to form the trigger-word semantic dependency branch. Each word segment in the branch has three corresponding vectors: a word vector x_v, a part-of-speech vector x_p, and a dependency-branch vector x_t. The three vectors are spliced to form the input vector x of the word segment, namely:

x = [x_v; x_p; x_t]   (8)
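The branch search itself can be sketched as follows (the node structure and traversal order are illustrative assumptions, with the downward recursion capped at two levels as recommended above):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    word: str
    parent: "Node | None" = None
    children: "list[Node]" = field(default_factory=list)

def trigger_branch(trigger: Node, max_depth: int = 2) -> set[str]:
    """Collect parents and siblings up to the root, plus descendants of the
    trigger word down to max_depth levels (two levels per the text above)."""
    words = {trigger.word}
    node = trigger
    while node.parent is not None:
        words.add(node.parent.word)                          # parent node
        words.update(c.word for c in node.parent.children)   # sibling nodes
        node = node.parent

    def descend(n: Node, depth: int) -> None:                # downward recursion
        if depth == 0:
            return
        for c in n.children:
            words.add(c.word)
            descend(c, depth - 1)

    descend(trigger, max_depth)
    return words

# Toy usage on a two-node tree:
root = Node("said")
trig = Node("calling", parent=root)
root.children.append(trig)
print(trigger_branch(trig))           # {'said', 'calling'}
```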
For example, consider the TimeBank-Dense corpus:
Event sentence 1: Conseco Inc. said it is calling for the redemption on Dec 7 of the 800,000 remaining shares.
Event sentence 2: The company said all conversion rights on the stock will terminate on Nov 30.
Syntactic dependency analysis of the event sentences yields complete dependency parse trees; the specific tree structures are shown in FIG. 1 and FIG. 2. The specific positions of the trigger words "calling" and "terminate" are then found, and starting from the current position of each trigger word, the word segments related to it are searched according to the rules above. For the first event sentence, the search finds that the word segments related to the trigger word "calling" are "said", "it", "is", "for", and "redemption"; for the second event sentence, the word segments related to the trigger word "terminate" are "said", "rights", "will", "all", "conversion", "on", and "Nov". From the trigger-word dependency branches, the part of speech and dependency relation of each word can be obtained. Converting them into the corresponding vector information gives the vector representation of event sentence 1, e_1 = {x_said, x_it, x_is, x_calling, x_for, x_redemption}, and the vector representation of event sentence 2, e_2 = {x_said, x_rights, x_all, x_will, x_terminate, x_conversion, x_on, x_Nov}.
Step 2: a hidden state vector is obtained. Respectively training from the head and the tail of the semantic dependent branch of the trigger word by utilizing a cyclic neural network to obtain the forward propagation information h of the event sentenceleftAnd back propagation information hrightAnd then splicing the two vectors to obtain a hidden state vector h corresponding to the input vector x, namely:
h=[hleft;hright]⑼
e.g. event sentence e described above1And e2Training can obtain corresponding hidden state vector
Figure BDA0002330830460000052
And
Figure BDA0002330830460000053
Figure BDA0002330830460000054
Step 3: compute the attention weight vector excluding the trigger word. Different word segments in an event sentence influence the event's temporal order to different degrees, and this influence is adjusted by introducing an attention weight vector β. Let the hidden state vectors of the trigger-word semantic dependency branch be h = {h_1, h_2, h_3, …, h_m}, where m is the number of word segments. The attention weight vector excluding the trigger word is then computed:

u_i = tanh(W_u h_i + b_u)   (10)

β_i = exp(u_i) / Σ_{j=1, j≠t}^m exp(u_j)   (11)

where W_u is a weight vector, b_u is a bias value, β_i is the attention weight of word segment h_i in the event sentence, and t is the position index of the trigger word in the event sentence.
The word segments are fused to different degrees according to the computed attention weights and spliced with the trigger word vector to obtain the event sentence state vector e*, namely:

e* = [W_1 Σ_{i≠t} β_i h_i ; W_2 h_t]   (12)

where h_t is the trigger-word hidden state vector and W_1 and W_2 are shared, learned weight vectors.
For the event sentence h_e1 above, applying the formulas gives u_said = tanh(W_u h_said + b_u), u_it = tanh(W_u h_it + b_u), u_for = tanh(W_u h_for + b_u), and u_redemption = tanh(W_u h_redemption + b_u), after which each word segment receives a corresponding attention weight β_said, β_it, β_for, and β_redemption. The state vector e*_1 of event sentence h_e1 is then computed with Eq. (12); similarly, the state vector e*_2 of event sentence h_e2 is obtained.
Step 4: classify the result. The experimental corpus is trained in the form of event sentence pairs, i.e., each line of the corpus contains two event sentences. After each event sentence passes through the above steps, the state vectors e*_1 and e*_2 are obtained. The two vectors are spliced and fed into a softmax function for classification, predicting the most likely temporal relation, namely:

e = [W_left e*_1 ; W_right e*_2]   (13)

y = softmax(W_class e + b_class)   (14)

where W_left and W_right are weight vectors applied to the state vectors, W_class is the weight vector for classification, and b_class is the bias value for classification.
For example, splicing the event sentence state vectors e*_1 and e*_2 above and feeding them into the softmax function yields an array of length 6; the array has length 6 because six relations are defined in the TimeBank-Dense corpus. The result assigns the highest probability to the temporal relation "BEFORE", so the predicted temporal relation between the events "calling" and "terminate" is "BEFORE".
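As a small illustration of this final lookup (the label ordering is an assumption; the six relations themselves are the standard TimeBank-Dense set):

```python
import torch

# The six temporal relations defined in TimeBank-Dense; ordering is illustrative.
RELATIONS = ["BEFORE", "AFTER", "INCLUDES", "IS_INCLUDED", "SIMULTANEOUS", "VAGUE"]

probs = torch.tensor([0.62, 0.10, 0.08, 0.07, 0.06, 0.07])  # illustrative softmax output
print(RELATIONS[probs.argmax().item()])   # -> BEFORE, as in the example above
```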
The experiments use precision P, recall R, and the F1 value as evaluation criteria. Five different experimental tasks were set up, and CNN, LSTM, Bi-LSTM, the DP-based LSTMs model proposed by Cheng Fei, and the method of the invention were each trained and compared; the results are shown in the table:
TABLE 1 comparative results of the experiments
(Table 1 appears as an image in the original publication.)
As the training data show, Bi-LSTM outperforms the traditional CNN model. CNN can only extract position-invariant features when capturing the word features of a sentence and lacks consideration of the global context, whereas the hidden states of Bi-LSTM can fully memorize and learn information from the whole context, giving better performance. The DP-based LSTMs model differs from Bi-LSTM in its input vectors: it extracts only the shortest dependency path, ignoring some neighbor nodes of the trigger word and thereby omitting important semantic information, and it trains with only a single-layer LSTM, so its actual performance is slightly worse. Introducing an attention mechanism on top of the Bi-LSTM model further improves the experimental results: with attention, the Bi-LSTM model can measure how strongly each word segment influences the context, fully mining the contextual relation between the word segments and the event trigger word, and finally predict the temporal relation of the candidate event pair correctly.
The input vectors in this experiment comprise the word vector x_v, the part-of-speech vector x_p, and the dependency-branch vector x_t. The three vectors are combined in different ways to observe the influence of different vector combinations on event temporal relation identification.
TABLE 2 different input vector combinations influence the results
(Table 2 appears as an image in the original publication.)
As the results in Table 2 show, when the word vector, part-of-speech vector, and dependency-branch vector are input together, the contextual semantic information is represented most fully and event temporal relation identification performs best.
The embodiments of the present invention are explained in detail above with reference to the drawings, but the invention is not limited to these embodiments; modifications and substitutions made by others skilled in the art on the basis of the invention fall within the protection scope of the invention.

Claims (1)

1. A recurrent neural network event temporal relation identification method based on semantic attention, the method comprising the following steps:
Step 1: construct the trigger-word semantic dependency branch. Perform syntactic dependency analysis on the event sentence to obtain a complete dependency parse tree; find the position of the trigger word, obtain its parent and sibling nodes, and recursively search upward for its parent nodes until the root node is reached; if the trigger word is not a leaf node, recursively search downward for its child nodes from the trigger word position. Combine the two parts of information obtained by the upward and downward recursive searches to form the trigger-word semantic dependency branch. Each word segment in the trigger-word semantic dependency branch has three corresponding vectors: a word vector x_v, a part-of-speech vector x_p, and a dependency-branch vector x_t. Splice the three vectors to form the input vector x of the word segment, namely:

x = [x_v; x_p; x_t]   (1)

Step 2: obtain the hidden state vector. Using a recurrent neural network, train from the head and from the tail of the trigger-word semantic dependency branch to obtain the forward-propagation information vector h_left and the backward-propagation information vector h_right of the event sentence; then splice the two vectors to obtain the hidden state vector h corresponding to the input vector x, namely:

h = [h_left; h_right]   (2)

Step 3: compute the attention weight vector excluding the trigger word. Let the hidden state vectors of the trigger-word semantic dependency branch be h = {h_1, h_2, h_3, …, h_m}, where m is the number of word segments; compute the attention weight vector excluding the trigger word:

u_i = tanh(W_u h_i + b_u)   (3)

β_i = exp(u_i) / Σ_{j=1, j≠t}^m exp(u_j)   (4)

where W_u is a weight vector, b_u is a bias value, β_i is the attention weight of word segment h_i in the event sentence, and t is the position index of the trigger word in the event sentence. Fuse the word segments to different degrees according to the computed attention weights and splice the result with the trigger word vector to obtain the event sentence state vector e*, namely:

e* = [W_1 Σ_{i≠t} β_i h_i ; W_2 h_t]   (5)

where h_t is the trigger-word hidden state vector and W_1 and W_2 are shared, learned weight vectors.
Step 4: classify the result. After each event sentence passes through the above steps, the state vectors e*_1 and e*_2 are obtained; splice the two vectors and feed them into a softmax function for classification, predicting the most likely temporal relation, namely:

e = [W_left e*_1 ; W_right e*_2]   (6)

y = softmax(W_class e + b_class)   (7)

where W_left and W_right are weight vectors applied to the state vectors, W_class is the weight vector for classification, and b_class is the bias value for classification.
CN201911335582.8A (priority date 2019-12-23, filing date 2019-12-23) — Cyclic neural network event time sequence relation identification method based on semantic attention — Pending — published as CN111160027A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201911335582.8A | 2019-12-23 | 2019-12-23 | Cyclic neural network event time sequence relation identification method based on semantic attention (published as CN111160027A (en))

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201911335582.8A | 2019-12-23 | 2019-12-23 | Cyclic neural network event time sequence relation identification method based on semantic attention (published as CN111160027A (en))

Publications (1)

Publication Number | Publication Date
CN111160027A (en) | 2020-05-15

Family

ID=70557807

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201911335582.8A | Cyclic neural network event time sequence relation identification method based on semantic attention (Pending; CN111160027A (en)) | 2019-12-23 | 2019-12-23

Country Status (1)

Country | Link
CN | CN111160027A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110334213A * | 2019-07-09 | 2019-10-15 | Kunming University of Science and Technology | Recognition method of temporal relationship of Chinese-Vietnamese news events based on two-way cross-attention mechanism

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US11573992B2 | 2020-06-30 | 2023-02-07 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method, electronic device, and storage medium for generating relationship of events
CN111859911B * | 2020-07-28 | 2023-07-25 | Ping An Life Insurance Company of China, Ltd. | Image description text generation method, device, computer equipment and storage medium
CN112507077A * | 2020-12-15 | 2021-03-16 | Hangzhou Dianzi University | Event time sequence relation identification method based on relational graph attention neural network
CN112507077B * | 2020-12-15 | 2022-05-20 | Hangzhou Dianzi University | Recognition method of event sequence relationship based on relational graph attention neural network
CN113761337A * | 2020-12-31 | 2021-12-07 | National Computer Network and Information Security Management Center | Event prediction method and device based on implicit elements and explicit relations of events
CN113761337B * | 2020-12-31 | 2023-10-27 | National Computer Network and Information Security Management Center | Event prediction method and device based on implicit event element and explicit connection
CN114970498A * | 2021-12-20 | 2022-08-30 | Kunming University of Science and Technology | Dependency information-fused news event time sequence relation identification method


Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-05-15

