CN111159425A - A Temporal Knowledge Graph Representation Method Based on Historical Relation and Bi-Graph Convolutional Networks - Google Patents

A Temporal Knowledge Graph Representation Method Based on Historical Relation and Bi-Graph Convolutional Networks

Info

Publication number
CN111159425A
CN111159425A (application CN201911392419.5A)
Authority
CN
China
Prior art keywords
graph
historical
representation
edge
relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911392419.5A
Other languages
Chinese (zh)
Other versions
CN111159425B (en)
Inventor
陈岭
汤星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU
Priority to CN201911392419.5A
Publication of CN111159425A
Application granted
Publication of CN111159425B
Legal status: Active (current)
Anticipated expiration

Abstract

Translated from Chinese


The invention discloses a temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network, comprising: 1) preprocessing the data in the temporal knowledge graph to extract events and historical relations; 2) constructing an original graph and an edge graph that represent, respectively, the interactions between entities and between historical relations; 3) using a parameter-determined historical relation encoder based on a temporal self-attention mechanism to build multi-range temporal dependencies on the original graph and obtain historical relation representations; 4) using a parameter-determined dual-graph convolutional network to obtain entity representations from the historical relation representations, the original graph, and the edge graph; 5) using a parameter-determined semantic matching model to predict, from the entity representations and the historical relations, the relations that will occur between entities over a future period of time, from which a new temporal knowledge graph can be constructed. The method improves model performance and has broad application prospects in fields such as international relation prediction and social network analysis.


Description

Temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network
Technical Field
The invention relates to the field of temporal knowledge graph representation, and in particular to a temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network.
Background
The temporal knowledge graph contains a large amount of time-stamped knowledge, can be regarded as a multi-relational graph, and is widely applied in fields such as international relation prediction and social network analysis. The Global Database of Events, Language, and Tone (GDELT) and the Integrated Crisis Early Warning System (ICEWS) are typical event-based temporal knowledge graphs that represent knowledge over time in the form of event quadruples (head entity, relation, tail entity, timestamp). Temporal knowledge graph representation learning maps the entities and relations in the knowledge graph into low-dimensional, continuous vector representations by introducing time information, and is of great significance for temporal knowledge graph completion.
Traditional temporal knowledge graph representation learning methods explicitly model timestamps as hyperplanes, vector representations, or fixed-format encodings. These methods simply embed the corresponding timestamps into a low-dimensional vector space and ignore temporal dependencies. To address this problem, researchers have proposed temporal knowledge graph representation learning methods based on sequence learning models to model temporal dependencies.
Temporal knowledge graph representation learning methods based on sequence learning models use a sequence learning model to model temporal dependencies and can be divided into two categories. The first category uses a Recurrent Neural Network (RNN) to model the sequence of relations for a given entity pair and thereby obtain entity representations; however, such approaches ignore simultaneous interactions between different entities. The second category uses a Gated Recurrent Unit (GRU) to model the sequence of tail entities given a head entity and a relation, and designs three different aggregation modes to obtain neighbor-based entity representations; however, such methods cannot take into account the effects of different relations. In addition, existing temporal knowledge graph representation learning methods ignore interactions between relations, in particular interactions between historical relations (i.e., timestamped sequences of relations between entity pairs).
Disclosure of Invention
The technical problem to be solved by the invention is how to obtain a temporal knowledge graph representation that simultaneously considers the interactions between entities and between historical relations.
In order to solve the above problem, the present invention provides a temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network, comprising the following steps:
1) preprocessing the data in the temporal knowledge graph and extracting events and historical relations;
2) constructing an original graph and an edge graph to represent, respectively, the interactions between entities and between historical relations, wherein the original graph is constructed by taking entities as nodes and the historical relations between entities as edges, and the edge graph is constructed by regarding the edges of the original graph as nodes and the mutual influence between historical relations as edges;
3) using a parameter-determined historical relation encoder based on a temporal self-attention mechanism to build multi-range temporal dependencies on the original graph and obtain historical relation representations;
4) using a parameter-determined dual-graph convolutional network to obtain entity representations from the historical relation representations, the original graph, and the edge graph;
5) using a parameter-determined semantic matching model to predict, from the entity representations and the historical relations, the relations occurring between entities over a future period of time, and constructing a new temporal knowledge graph from the relations likely to occur over that period.
By constructing the original graph and the edge graph and introducing a dual-graph convolutional network for representation learning, the method can capture interactions between entities and between historical relations simultaneously, which improves model performance. Compared with the prior art, the method has the following advantages:
1) a historical relation encoder based on a temporal self-attention mechanism is introduced to model multi-range temporal dependencies;
2) an original graph is constructed from the historical relations, an edge graph is constructed from the mutual influence between edges, and a dual-graph convolutional network is introduced to model the interactions between entities and between historical relations.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a flowchart of the temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to an embodiment;
FIG. 2 is a schematic diagram of the structure of the historical relation encoder based on a temporal self-attention mechanism according to an embodiment;
FIG. 3 is a block diagram of the dual-graph convolutional network according to an embodiment.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
FIG. 1 is a flowchart of the temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to an embodiment. Referring to FIG. 1, the method comprises a data preprocessing stage, an original graph and edge graph construction stage, a temporal knowledge graph representation learning stage, and an application stage; since the application stage reuses the data preprocessing, graph construction, and representation learning steps, it is not shown separately in FIG. 1. Each stage is described in detail below.
Data preprocessing stage
The specific flow of data preprocessing is as follows:
step 1-1, inputting a temporal knowledge map TKB, and extracting events to obtain an event set.
The temporal knowledge graph TKB comprises a large amount of time-labeled knowledge, and event extraction is performed on the temporal knowledge graph TKB to form an event set. Events are represented in the form of quadruplets (s, r, o, t), where s represents the head entity, r represents the relationship, o represents the tail entity, and t represents the timestamp.
Here s, o ∈ ℰ, where ℰ denotes the set of entities; r ∈ ℛ, where ℛ denotes the set of relations; and t ∈ 𝒯, where 𝒯 denotes the set of timestamps.
Step 1-2: traverse the event set by entity pair (composed of head and tail entities) to extract the historical relations and obtain the complete training data set.
A historical relation is a timestamped sequence of relations between an entity pair. In this step, the historical relation h is extracted by traversing the event set for each entity pair (s, o), and is formally represented as h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)}, where t_i < t_j indicates that relation r_i occurs before relation r_j.
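As an illustration of steps 1-1 and 1-2, the following minimal Python sketch (not part of the patent; the function and variable names are hypothetical) groups event quadruples by entity pair and sorts each group by timestamp to obtain the historical relations:

```python
from collections import defaultdict

def extract_historical_relations(events):
    """Group timestamped events (s, r, o, t) by entity pair and sort by time.

    `events` is assumed to be an iterable of quadruples; the returned dict maps
    each entity pair (s, o) to its historical relation, i.e. the time-ordered
    list of (relation, timestamp) pairs.
    """
    history = defaultdict(list)
    for s, r, o, t in events:
        history[(s, o)].append((r, t))
    # Sort each sequence so that earlier relations come first (t_i < t_j).
    return {pair: sorted(rels, key=lambda rt: rt[1]) for pair, rels in history.items()}

# Example usage with toy event quadruples.
events = [("A", "meets", "B", 3), ("A", "calls", "B", 1), ("B", "visits", "C", 2)]
print(extract_historical_relations(events))
# {('A', 'B'): [('calls', 1), ('meets', 3)], ('B', 'C'): [('visits', 2)]}
```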
Original and edge graph construction stage
The specific flow of the construction of the original graph and the edge graph is as follows:
step 2-1 is to construct the original image G ═ V, E, and a from the history relationship, taking the entity as a node and the history relationship as an edge.
Where V denotes a node set (i.e., an entity set), E denotes an edge set (i.e., a history relation set), and a denotes an adjacency matrix of the original image. If a historical relationship exists between two entities, the two entities are considered to have an edge therebetween, and the adjacency matrix A of the original graph has the following calculation formula:
Figure BDA0002345338270000041
wherein A isi,jConnectivity between entities is represented.
Step 2-2: regard the edges of the original graph as nodes, define edges according to whether the edges of the original graph are connected, and construct the edge graph G_edge = (V_edge, E_edge, A_edge). Here V_edge denotes the node set (i.e., the set of historical relations), E_edge denotes the edge set, whose edges are defined by whether the corresponding edges of the original graph are connected (i.e., share a vertex), and A_edge denotes the adjacency matrix of the edge graph.
In this step, the edge graph is obtained by transforming the original graph: the nodes of the edge graph are the edges of the original graph, and the edges of the edge graph are defined according to whether the edges of the original graph are connected, using the following rule: if two edges of the original graph share a vertex (i.e., the two edges are connected in the original graph), then there is an edge between the two corresponding nodes of the edge graph. The adjacency matrix A_edge of the edge graph is computed as:
A_edge[i→j, u→v] = w_{i→j,u→v} if edges i→j and u→v share a vertex, and 0 otherwise,
where A_edge[i→j, u→v] ≠ 0 indicates connectivity between the historical relations i→j and u→v, and the weight w_{i→j,u→v} reflects the degree of mutual influence between the two historical relations (i.e., two edges of the original graph). The influence is determined by the degree of the shared vertex in the original graph: the larger the degree of the shared vertex, the smaller the influence of edge i→j on edge u→v, and conversely, the larger the influence. The weight w_{i→j,u→v} is a function of the degree of the shared vertex and a hyperparameter α, where 0 < α < 0.5.
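The construction of the two graphs can be sketched in Python as follows. This is illustrative only: the helper names are hypothetical, and because the patent's exact weight formula appears only as an image, the degree-dependent weight below is a stand-in that merely preserves the stated monotonicity (larger shared-vertex degree, smaller weight), not the patent's formula.

```python
import numpy as np

def build_graphs(entities, historical_relations, alpha=0.3):
    """Sketch of steps 2-1 and 2-2 (illustrative only).

    `historical_relations` maps an entity pair (s, o) to its relation sequence;
    each pair becomes one edge i -> j of the original graph.
    """
    idx = {e: k for k, e in enumerate(entities)}
    n = len(entities)

    # Original graph: binary adjacency A over entities.
    A = np.zeros((n, n))
    edges = list(historical_relations.keys())            # each (s, o) is an edge
    for s, o in edges:
        A[idx[s], idx[o]] = 1.0

    degree = A.sum(axis=0) + A.sum(axis=1)                # in-degree + out-degree

    # Edge graph: one node per original edge; connect edges that share a vertex.
    m = len(edges)
    A_edge = np.zeros((m, m))
    for a, (i, j) in enumerate(edges):
        for b, (u, v) in enumerate(edges):
            if a == b:
                continue
            shared = {i, j} & {u, v}
            if shared:
                d = degree[idx[shared.pop()]]
                # Placeholder weight: decreases with the degree of the shared
                # vertex, scaled by the hyperparameter alpha (assumption).
                A_edge[a, b] = 1.0 / (d ** alpha)
    return A, A_edge, edges
```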
Temporal knowledge graph representation learning phase
The specific flow of the tense knowledge graph representation learning stage is as follows:
and 3-1, sampling the training data set for Z times according to a fixed size, sampling one subgraph every time, wherein the subgraph comprises P nodes representing entities and all edges representing historical relations among the nodes, and executing the steps 3-2 to 3-5 for each subgraph.
And 3-2, for the historical relationship h in the subgraph, modeling a multi-range time dependence relationship by using a historical relationship encoder based on a time self-attention mechanism to obtain a representation h of the historical relationship.
The self-attention mechanism can model the context of each part of a sequence by assigning weights to the different parts. As shown in FIG. 2, the historical relation encoder based on the temporal self-attention mechanism uses self-attention as its building block; it consists of a time-based intra-block self-attention mechanism and a time-based inter-block self-attention mechanism, and can model local and global temporal dependencies simultaneously.
In this step, the historical relation h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)} is split into M blocks, formally defined as h = [z_1, z_2, ..., z_M]. Each block contains N relations; for example, the first block is formally defined as z_1 = {(r_1, t_1), ..., (r_{N-1}, t_{N-1}), (r_N, t_N)}.
The time-based intra-block self-attention mechanism assigns a weight to each relation within a block. Taking the first block as an example, the weight of relation r_i is computed from its representation r_i and its relative-time vector p_i through the learnable parameters W_intra and a learnable projection vector, the activation function σ(·), and the bias b_intra, and is then normalized over the block. Here p_i = {(t_1 - t_i), ..., (t_{N-1} - t_i), (t_N - t_i)} collects the relative times between every relation in the first block and relation r_i, and r_i, the representation of relation r_i, is obtained by random initialization. The representation z_k of each block is the weighted sum of the relation representations in the block; for the first block,
z_1 = Σ_{i=1}^{N} α_i^intra r_i,
where α_i^intra denotes the intra-block attention weight of relation r_i.
The time-based inter-block self-attention mechanism assigns a weight to each block. The weight of block z_k is computed from its representation z_k and its relative-time vector q_k through the learnable parameters W_inter and a learnable projection vector, the activation function σ(·), and the bias b_inter, and is then normalized over the M blocks. Here q_k collects the relative times between the first relation of every block and the first relation of block z_k, q_k = {(t_{z_1} - t_{z_k}), ..., (t_{z_M} - t_{z_k})}, where t_{z_k} denotes the timestamp of the first relation in block z_k and t_{z_M} denotes the timestamp of the first relation in block z_M. The representation h of each historical relation is the weighted sum of the block representations:
h = Σ_{k=1}^{M} β_k^inter z_k,
where β_k^inter denotes the inter-block attention weight of block z_k.
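A minimal NumPy sketch of the encoder in step 3-2 is given below. It is illustrative only: the function and parameter names are hypothetical, W_intra and W_inter are assumed to have shapes (hidden, d+N) and (hidden, d+M) respectively, and the additive scoring form stands in for the patent's exact weight formulas, which appear only as images in the original.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def encode_history(rel_embs, times, M, W_intra, v_intra, b_intra,
                   W_inter, v_inter, b_inter):
    """Illustrative two-level temporal self-attention encoder (a sketch).

    rel_embs: (L, d) randomly initialized relation representations of one history.
    times:    (L,) timestamps; L is assumed to be a multiple of M blocks.
    """
    L, d = rel_embs.shape
    N = L // M
    block_reprs, block_first_times = [], []
    for k in range(M):
        r = rel_embs[k * N:(k + 1) * N]           # (N, d) relations of block z_k
        t = times[k * N:(k + 1) * N]              # (N,) their timestamps
        block_first_times.append(t[0])
        weights = np.empty(N)
        for i in range(N):
            p_i = t - t[i]                        # relative times within the block
            x = np.concatenate([r[i], p_i])       # [r_i ; p_i]
            weights[i] = v_intra @ np.tanh(W_intra @ x + b_intra)
        a = softmax(weights)                      # intra-block attention weights
        block_reprs.append(a @ r)                 # z_k = sum_i a_i r_i
    z = np.stack(block_reprs)                     # (M, d)
    tb = np.array(block_first_times)
    scores = np.empty(M)
    for k in range(M):
        q_k = tb - tb[k]                          # relative times between blocks
        x = np.concatenate([z[k], q_k])
        scores[k] = v_inter @ np.tanh(W_inter @ x + b_inter)
    b = softmax(scores)                           # inter-block attention weights
    return b @ z                                  # h = sum_k b_k z_k
```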
step 3-3, representing all historical relations by the set of h, the adjacent matrixes A and A of the original graph and the edge graphedgeThe two-graph convolutional network is input together, and the representation of each node (namely the representation of the entity) is obtained by modeling the interaction between the entities and the historical relationship.
The Graph Convolutional Network (GCN) is a deep neural network that performs convolution on a graph; it can model message passing between nodes and is widely used in fields such as traffic prediction and image classification. The graph convolution used here is computed as:
gc(X, A) = σ(L̂ X Θ)
where gc(·) denotes the graph convolution operation, X is the input node representation, L̂ = D^{-1/2}(A + I)D^{-1/2} is the normalized Laplacian matrix of the graph, I is the identity matrix, D is the diagonal degree matrix of the nodes, i.e., D_ii = Σ_j A_ij, σ(·) denotes the activation function (ReLU is used here), and Θ is the parameter of the graph convolutional network. One layer of graph convolution aggregates messages from the 1-hop neighbors of each node; stacking multiple layers enlarges the neighborhood over which messages are passed. As shown in FIG. 3, the dual-graph convolutional network uses graph convolution as its building block and comprises k layers of original-graph convolution and k-1 layers of edge-graph convolution, so that interactions between entities and between historical relations can be modeled simultaneously.
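For concreteness, a minimal NumPy sketch of this graph-convolution operation (illustrative; the helper names are hypothetical) is:

```python
import numpy as np

def normalized_laplacian(A):
    """L_hat = D^{-1/2} (A + I) D^{-1/2}, with D_ii = sum_j A_ij as in the text."""
    A_tilde = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    d_inv_sqrt[d > 0] = d[d > 0] ** -0.5
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

def gc(X, A, Theta):
    """One graph-convolution layer: gc(X, A) = ReLU(L_hat X Theta)."""
    return np.maximum(normalized_laplacian(A) @ X @ Theta, 0.0)

# Toy usage: 4 nodes, 8-dim input features, 16-dim output.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 8)
Theta = np.random.randn(8, 16)
print(gc(X, A, Theta).shape)   # (4, 16)
```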
In this step, a node-edge indicator matrix M_ne is first constructed to express the correspondence between original-graph nodes and edges when the original graph and the edge graph are fused: M_ne[v, e] = 1 if node v of the original graph is an endpoint of edge e, and 0 otherwise.
Then the set of historical relation representations h, the adjacency matrix A of the original graph, the adjacency matrix A_edge of the edge graph, and the node-edge indicator matrix M_ne are input into the dual-graph convolutional network to obtain the node representations (i.e., the entity representations).
In the original-graph convolution network, the input of layer k is the output X^(k-1) of the layer-(k-1) original-graph convolution together with the output Y^(k-1) of the layer-(k-1) edge-graph convolution (except for the first layer). Y^(k-1) is aligned with the original-graph nodes through the node-edge indicator matrix M_ne and fused with X^(k-1) by the layer-(k-1) edge-graph-to-original-graph fusion convolution with parameter Θ_en^(k-1), where [·,·] denotes the concatenation operation and the concatenated result is activated; the fused representation is then convolved on the original graph, whose normalized Laplacian is L̂, with the layer-(k-1) original-graph convolution parameter Θ_node^(k-1) to produce X^(k). Here X^(0), the initial representation of the original-graph nodes, is obtained by random initialization.
In the edge-graph convolution network, the input of layer k-1 is the output Y^(k-2) of the layer-(k-2) edge-graph convolution together with the output X^(k-1) of the layer-(k-1) original-graph convolution (except for the first layer). X^(k-1) is aligned with the edge-graph nodes through M_ne and fused with Y^(k-2) by the layer-(k-1) original-graph-to-edge-graph fusion convolution with parameter Θ_en^(k-1), and the fused representation is then convolved on the edge graph, whose normalized Laplacian is L̂_edge, with the layer-(k-2) edge-graph convolution parameter Θ_edge^(k-2) to produce Y^(k-1). Here Y^(0), the initial representation of the edge-graph nodes, is obtained by a linear transformation of the historical relation representations h.
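The following NumPy sketch shows how one layer of the dual-graph convolutional network could be wired up. It is a sketch under stated assumptions: the fusion/convolution order and all parameter names (Th_node, Th_edge, Th_en_ne, Th_en_en) are illustrative, since the patent's layer equations are given only as images, and the hidden sizes are kept equal so the shapes line up.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def dual_gcn_step(X, Y, L_node, L_edge, M_ne, Th_node, Th_edge, Th_en_ne, Th_en_en):
    """One illustrative dual-graph convolution step (an assumption, not the
    patent's exact equations).

    X: (n, d) node representations, Y: (m, d) edge representations,
    L_node/L_edge: normalized Laplacians, M_ne: (n, m) node-edge indicator.
    """
    # Original-graph branch: fuse node and aligned-edge features, then convolve.
    X_fused = relu(np.concatenate([X, M_ne @ Y], axis=1) @ Th_en_ne)    # (n, d)
    X_next = relu(L_node @ X_fused @ Th_node)                           # (n, d)
    # Edge-graph branch: fuse edge and aligned-node features, then convolve.
    Y_fused = relu(np.concatenate([Y, M_ne.T @ X_next], axis=1) @ Th_en_en)
    Y_next = relu(L_edge @ Y_fused @ Th_edge)                           # (m, d)
    return X_next, Y_next
```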
Step 3-4: based on the entity representations output in step 3-3, predict the relations that may exist between entity pairs over a future period of time using the semantic matching model.
The semantic matching model adopts the DistMult model, a knowledge graph representation learning model that measures the matching degree between an entity pair and each relation through a bilinear function. The matching score is obtained with the DistMult model, whose score function is defined as:
f(s, r, o) = s^T M_r o    (16)
where s and o denote the representations of the head and tail entities, obtained from the output of the dual-graph convolutional network; M_r is the diagonal matrix corresponding to each relation, M_r = diag(m_r), where diag(·) converts the vector m_r into the diagonal matrix M_r, m_r = M h_r, h_r is the historical relation representation of the entity pair (s, o), and M is a learnable weight matrix. Semantic matching with the DistMult model predicts the relation r that may occur between an entity pair over a future period of time, with r ∈ ℛ.
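As a concrete illustration of the scoring step (a minimal sketch; dimensions and names are assumptions), the bilinear DistMult score with a history-conditioned relation vector can be computed as:

```python
import numpy as np

def distmult_score(s, o, h_r, M):
    """DistMult-style matching score f(s, r, o) = s^T diag(m_r) o with m_r = M h_r.

    s, o: entity representations from the dual-graph convolutional network,
    h_r:  historical relation representation of the pair (s, o),
    M:    learnable weight matrix mapping h_r to the relation vector m_r.
    """
    m_r = M @ h_r                        # relation vector m_r = M h_r
    return float(np.sum(s * m_r * o))    # equivalent to s^T diag(m_r) o

# Toy usage (dimensions are illustrative).
d = 8
s, o, h_r = np.random.randn(d), np.random.randn(d), np.random.randn(d)
M = np.random.randn(d, d)
print(distmult_score(s, o, h_r, M))
```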
and 3-5, generating negative samples for all training samples in the subgraph by a method of randomly replacing head and tail entities, calculating the prediction loss of all samples in the subgraph, and then adjusting network parameters of a historical relationship encoder, a dual-graph convolution network and a semantic matching model based on a time self-attention mechanism according to a loss function.
In this step, cross entropy loss of all samples in the subgraph is calculated according to the prediction result, and the calculation formula is as follows:
Figure BDA0002345338270000091
where D is the set of positive and negative samples, and for positive samples (s, r, o), negative samples are obtained by randomly replacing s and o. Sig (·) is a sigmoid function, y takes a value of {0, 1}, y of a positive sample takes a value of 1, and a negative sample takes a value of 0.
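A hedged sketch of the negative sampling and cross-entropy loss in step 3-5 follows (illustrative; the patent does not specify the number of negatives per positive or the loss normalization, so those are assumptions here):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce_loss_with_negatives(pos_triples, entities, score_fn, rng, neg_per_pos=1):
    """Binary cross-entropy over positive samples and randomly corrupted negatives.

    pos_triples: list of (s, r, o) identifiers; negatives are built by replacing
    s or o with a random entity. score_fn(s, r, o) returns the matching score.
    Example: rng = np.random.default_rng(0).
    """
    samples = [(t, 1.0) for t in pos_triples]
    for s, r, o in pos_triples:
        for _ in range(neg_per_pos):
            if rng.random() < 0.5:
                samples.append(((rng.choice(entities), r, o), 0.0))   # replace head
            else:
                samples.append(((s, r, rng.choice(entities)), 0.0))   # replace tail
    loss = 0.0
    for (s, r, o), y in samples:
        p = sigmoid(score_fn(s, r, o))
        loss -= y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12)
    return loss / len(samples)
```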
Application phase
After the network parameters of the historical relation encoder based on the temporal self-attention mechanism, the dual-graph convolutional network, and the semantic matching model have been determined, i.e., after the model comprising these parameter-determined components has been built, future relations between entities can be predicted. The specific procedure is as follows:
(a) preprocess the data in the temporal knowledge graph and extract events and historical relations;
(b) construct the original graph and the edge graph according to steps 2-1 and 2-2;
(c) use the parameter-determined historical relation encoder based on the temporal self-attention mechanism to build multi-range temporal dependencies on the original graph and obtain the historical relation representations;
(d) use the parameter-determined dual-graph convolutional network to obtain the entity representations from the historical relation representations, the original graph, and the edge graph;
(e) use the parameter-determined semantic matching model to predict, from the entity representations and the historical relations, the relations between entities over a future period of time, and construct a new temporal knowledge graph from the relations likely to occur over that period.
Specifically, the parameter-determined semantic matching model computes matching scores from the entity representations and the historical relation representations and predicts the relations that may occur over a future period of time according to these scores; that is, the relations corresponding to the higher-scoring candidates can be selected as the relations likely to occur, and a new temporal knowledge graph can be constructed from them.
By constructing the original graph and the edge graph and introducing the dual-graph convolutional network for representation learning, the temporal knowledge graph representation method exploits the correlations between entities and between historical relations, which improves model performance, i.e., the accuracy of predicting the relations likely to occur over a future period of time, and has broad application prospects in fields such as international relation prediction and social network analysis.
The above-mentioned embodiments are intended to illustrate the technical solutions and advantages of the present invention. It should be understood that they are only preferred embodiments of the present invention and are not intended to limit it; any modifications, additions, equivalents, and the like made within the scope of the principles of the present invention shall be included in the protection scope of the present invention.

Claims (6)

Translated from Chinese
1. A temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network, comprising the following steps:
1) preprocessing the data in the temporal knowledge graph and extracting events and historical relations;
2) constructing an original graph and an edge graph to represent, respectively, the interactions between entities and between historical relations, wherein the original graph is constructed by taking entities as nodes and the historical relations between entities as edges, and the edge graph is constructed by regarding the edges of the original graph as nodes and the mutual influence between historical relations as edges;
3) using a parameter-determined historical relation encoder based on a temporal self-attention mechanism to build multi-range temporal dependencies on the original graph and obtain historical relation representations;
4) using a parameter-determined dual-graph convolutional network to obtain entity representations from the historical relation representations, the original graph, and the edge graph;
5) using a parameter-determined semantic matching model to predict, from the entity representations and the historical relations, the relations occurring between entities over a future period of time, and constructing a new temporal knowledge graph from the relations likely to occur over that period.

2. The temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to claim 1, wherein in step 3), the historical relation encoder based on the temporal self-attention mechanism uses self-attention as its building block, consists of a time-based intra-block self-attention mechanism and a time-based inter-block self-attention mechanism, and can model local and global temporal dependencies simultaneously;
the historical relation h = {(r_1, t_1), ..., (r_i, t_i), (r_j, t_j)} is split into M blocks, formally defined as h = [z_1, z_2, ..., z_M], each block containing N relations, the first block being formally defined as z_1 = {(r_1, t_1), ..., (r_{N-1}, t_{N-1}), (r_N, t_N)}; the time-based intra-block self-attention mechanism assigns a weight to each relation within a block, computed from the relation representation r_i and the relative-time vector p_i through the learnable parameters W_intra, the activation function σ(·), and the bias b_intra, followed by normalization over the block, where p_i = {(t_1 - t_i), ..., (t_{N-1} - t_i), (t_N - t_i)} is the relative time between each relation in the block and relation r_i, and r_i, the representation of relation r_i, is obtained by random initialization; the representation z_k of each block is obtained as the weighted sum of the relation representations in the block;
the time-based inter-block self-attention mechanism assigns a weight to each block, computed from the block representation z_k and the relative-time vector q_k through the learnable parameters W_inter, the activation function σ(·), and the bias b_inter, followed by normalization over the M blocks, where q_k is the relative time between the first relation of each block and the first relation of block z_k; the representation h of each historical relation is obtained as the weighted sum of the block representations.

3. The temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to claim 1, wherein in step 4), a node-edge indicator matrix M_ne is first constructed to express the correspondence between original-graph nodes and edges when the original graph and the edge graph are fused; the set of historical relation representations h, the adjacency matrix A of the original graph, the adjacency matrix A_edge of the edge graph, and the node-edge indicator matrix M_ne are then input into the dual-graph convolutional network to obtain the node representations;
in the original-graph convolution network, the input of layer k is the output X^(k-1) of the layer-(k-1) original-graph convolution and the output Y^(k-1) of the layer-(k-1) edge-graph convolution (except for the first layer), where X^(0) is the initial representation of the original-graph nodes, obtained by random initialization, Θ_node^(k-1) is the parameter of the layer-(k-1) original-graph convolution, Θ_en^(k-1) is the parameter of the layer-(k-1) edge-graph-to-original-graph fusion convolution, [·,·] denotes the concatenation operation, and the concatenated result is activated;
in the edge-graph convolution network, the input of layer k-1 is the output Y^(k-2) of the layer-(k-2) edge-graph convolution and the output X^(k-1) of the layer-(k-1) original-graph convolution (except for the first layer), where Y^(0) is the initial representation of the edge-graph nodes, obtained by a linear transformation of the historical relation representations h, Θ_edge^(k-2) is the parameter of the layer-(k-2) edge-graph convolution, and Θ_en^(k-1) is the parameter of the layer-(k-1) original-graph-to-edge-graph fusion convolution.

4. The temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to claim 1, wherein in step 5), the parameter-determined semantic matching model computes matching scores from the entity representations and the historical relation representations and predicts the relations that may occur over a future period of time according to the matching scores.

5. The temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to claim 4, wherein the semantic matching model adopts the DistMult model and the matching score is obtained with the DistMult model, whose score function is defined as:
f(s, r, o) = s^T M_r o
where s and o denote the representations of the head and tail entities, obtained from the output of the dual-graph convolutional network, M_r is the diagonal matrix corresponding to each relation, M_r = diag(m_r), diag(·) converts the vector m_r into the diagonal matrix M_r, m_r = M h_r, h_r is the historical relation representation of the entity pair (s, o), and M is the weight matrix.

6. The temporal knowledge graph representation method based on historical relations and a dual-graph convolutional network according to claim 4, wherein during training, the cross-entropy loss of all samples in the subgraph is computed, where D is the set of positive and negative samples, negative samples being obtained for each positive sample (s, r, o) by randomly replacing s and o, sig(·) is the sigmoid function, and y takes values in {0, 1}, with y = 1 for positive samples and y = 0 for negative samples; according to the loss function, the network parameters of the historical relation encoder based on the temporal self-attention mechanism, the dual-graph convolutional network, and the semantic matching model are adjusted to determine the parameters.
CN201911392419.5A, filed 2019-12-30, Temporal knowledge graph representation method based on historical relationship and double-graph convolution network, Active, granted as CN111159425B (en)

Priority Applications (1)

Application: CN201911392419.5A, priority date 2019-12-30, filing date 2019-12-30, title: Temporal knowledge graph representation method based on historical relationship and double-graph convolution network

Publications (2)

CN111159425A, published 2020-05-15
CN111159425B (en), published 2023-02-10

Family ID: 70558928

Family Applications (1)

CN201911392419.5A (Active), priority date 2019-12-30, filing date 2019-12-30, granted as CN111159425B (en)

Country Status (1)

CN: CN111159425B (en)




Also Published As

CN111159425B (en), published 2023-02-10


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant
