CN113673152B - Group level KKS coding intelligent mapping recommendation method based on digital twin - Google Patents

Group level KKS coding intelligent mapping recommendation method based on digital twin

Info

Publication number
CN113673152B
CN113673152B (application CN202110905893.4A)
Authority
CN
China
Prior art keywords
attention
kks
self
coding
digital twin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110905893.4A
Other languages
Chinese (zh)
Other versions
CN113673152A (en)
Inventor
傅骏伟
俞荣栋
柴真琦
郭鼎
王豆
罗一凡
戴程鹏
邵建宇
高凯楠
徐哲源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Zheneng Digital Technology Co Ltd
Zhejiang Energy Group Research Institute Co Ltd
Original Assignee
Zhejiang Zheneng Digital Technology Co Ltd
Zhejiang Energy Group Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Zheneng Digital Technology Co Ltd and Zhejiang Energy Group Research Institute Co Ltd
Priority to CN202110905893.4A
Publication of CN113673152A
Application granted
Publication of CN113673152B
Status: Active
Anticipated expiration

Abstract

The invention relates to a digital-twin-based intelligent mapping recommendation method for group-level KKS codes, which comprises the following steps: constructing a digital twin based on the full power production process; constructing a professional dictionary from the obtained digital twin; and acquiring codes under the original coding rule and codes under the new coding rule with acquisition equipment to construct a KKS coding dataset. The beneficial effects of the invention are as follows: the invention systematizes a complex power generation process, constructs a digital twin based on the full power production process, and instantiates group-level digital twin technology; the invention also constructs a KKS generation model based on an attention stacking network, revitalizes the plant's existing digital infrastructure, opens up data islands, breaks down information barriers, promotes real-time fusion between physical space and information space, and realizes the intelligent mapping task across multiple KKS coding systems.

Description

Group level KKS coding intelligent mapping recommendation method based on digital twin
Technical Field
The invention belongs to the technical field of power plant information technology, and particularly relates to a digital-twin-based intelligent mapping recommendation method for group-level KKS codes.
Background
Digital twinning is the digital representation of a real thing for a specific purpose: the real state of a physical entity is presented in real time in the virtual digital world. With the popularization and application of the digital twin concept, mapping the entities of the information space and the physical space onto one another has become an important piece of foundational work.
Currently, the main foundation supporting digital twin applications in the power industry is the power plant identification system (KKS), which enables efficient identification and management of power plant equipment assets. However, KKS coding itself has a number of limitations:
1. At the time of compilation, only the production needs of the individual power plant were considered, not the universality of the codes;
2. The compiling bodies are electric power design institutes and equipment manufacturers, and operation and maintenance personnel lack the ability to maintain the codes;
3. The KKS coding system cannot respond quickly to iterative updates of new technologies.
Therefore, a precondition for applying new technology, especially digital twin technology, in the power industry is to make the KKS coding system automated and intelligent. Although invention patent CN201711434013.X adopts an OPC UA server, an OPC UA client and a data mapping dictionary to implement a digital twin mapping model, it covers only coding rules within the OPC UA protocol and cannot be extended. In addition, invention patent CN201910956494.3 adopts heterogeneous protocol conversion to realize virtual-real interactive encoding and decoding of data, but it still suffers from the problem of recognizing the same entity under different coding schemes. Instantiating group-level digital twin technology, which requires revitalizing the plant's existing digital infrastructure, opening up data islands, breaking down information barriers, promoting real-time fusion between physical space and information space, and realizing intelligent mapping across multiple KKS coding systems, remains a difficult problem to be solved urgently.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a digital twin-based intelligent mapping recommendation method for group-level KKS codes.
The digital-twin-based intelligent mapping recommendation method for group-level KKS codes comprises the following steps:
Step 1, according to the digital twin modeling process, first systematize the complex power generation flow and construct a digital twin based on the full power production process:
DT = {S_1, S_2, ..., S_K}
In the above, S_i represents the subsystems that make up the digital twin, and DT represents the digital twin based on the full power production process;
Step 2, from the digital twin DT obtained in step 1, construct a professional dictionary Dict;
Step 3, use acquisition equipment to acquire the codes a_1, ..., a_N under the original coding rule and the codes b_1, ..., b_N under the new coding rule, where a_n refers to the nth code encoded according to the original coding rule and b_n refers to the nth code encoded according to the new coding rule; manually (drawing on the experience of production operators) map and match the codes a_n under the original coding rule to the codes b_n under the new coding rule, constructing the KKS coding dataset:
D = {(a_n, b_n) | n = 1, ..., N};
Step 4, perform data preprocessing on the KKS coding dataset D obtained in step 3 to obtain a training set D_train and a test set D_test;
Step 5, use the training set D_train obtained in step 4 to train a KKS generation model based on an attention stacking network, where the KKS generation model is formed by stacking multiple layers of attention networks;
Step 6, from the professional dictionary Dict obtained in step 2 and the probability mapping value p of the fused attention features over the word vector obtained in step 5, generate the predicted KKS code value under the new coding rule with a reconstruction function; compute the similarity of the predicted KKS code value under the new coding rule directly in string form with the minimum edit distance, without converting it into vector form; and obtain the g most similar KKS code values for the KKS code mapping recommendation:
b_hat = R(p, Dict), sim_n = EditSim(b_hat, b_n)
In the above, the predicted KKS code value b_hat is obtained through the reconstruction function R, the similarity of the KKS codes is then calculated with the minimum-edit-distance function EditSim, and finally the g most similar KKS code values are selected for the KKS code mapping recommendation; the recommendation result is stored in a storage unit.
Preferably, when the complex power generation flow is systematized in step 1, it is divided into different power subsystems, among which relations such as parallel and series exist.
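The subsystem decomposition above can be sketched as a simple data structure: subsystems as nodes, with labeled series/parallel relations between them. All class and subsystem names here are illustrative assumptions, not identifiers from the patent:

```python
# Hypothetical sketch: the digital twin DT as a set of power subsystems
# with "series" / "parallel" relations between them.

class Subsystem:
    def __init__(self, name):
        self.name = name
        self.relations = []          # list of (other_subsystem, relation)

    def connect(self, other, relation):
        assert relation in ("series", "parallel")
        self.relations.append((other, relation))

class DigitalTwin:
    """DT = {S_1, S_2, ..., S_K}: the union of all power subsystems."""
    def __init__(self, subsystems):
        self.subsystems = {s.name: s for s in subsystems}

boiler = Subsystem("boiler")
turbine = Subsystem("turbine")
electrical = Subsystem("electrical")
boiler.connect(turbine, "series")      # steam path: boiler feeds the turbine
turbine.connect(electrical, "series")
dt = DigitalTwin([boiler, turbine, electrical])
print(sorted(dt.subsystems))
```

A real group-level twin would of course carry far richer per-subsystem descriptions; the point is only that the flow is decomposed into named, related subsystems before dictionary construction.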
Preferably, the step 2 specifically includes the following steps:
Step 2.1, segment the detailed descriptions of the subsystems contained in the digital twin DT into words to obtain professional word-segmentation data;
Step 2.2, construct a multi-level professional dictionary from the professional word-segmentation data obtained in step 2.1:
Dict = {Dict_main, Dict_sub}
In the above, the multi-level professional dictionary Dict consists of two parts, the main-system dictionary Dict_main and the subsystem dictionary Dict_sub, and the multi-level professional dictionary Dict is stored in a storage device.
Preferably, the professional dictionary Dict in step 2 uses triple key-value pairs formed by the power subsystem number, the word-segment number, and the professional code as its data structure and is stored in a storage unit of the storage device; the storage unit accesses files in JSON format and provides data interface services.
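The JSON key-value storage described above can be sketched as follows. The field names, subsystem numbers, terms and codes are all illustrative assumptions; only the shape (a main-system level plus per-subsystem entries keyed by number, each carrying a professional code) follows the text:

```python
import json

# Hypothetical multi-level professional dictionary, stored as JSON
# key/value pairs: subsystem number -> segment number -> term + code.

professional_dict = {
    "main": {"1": "boiler", "2": "turbine", "3": "electrical"},
    "sub": {
        "1": {"101": {"term": "feedwater pump", "code": "LAC"},
              "102": {"term": "superheater",   "code": "HAH"}},
        "2": {"201": {"term": "condenser",     "code": "MAG"}},
    },
}

# Serialize to a JSON file payload and read it back, as a storage unit
# providing a data interface might do.
payload = json.dumps(professional_dict, ensure_ascii=False, indent=2)
restored = json.loads(payload)
print(restored["sub"]["1"]["101"]["code"])   # LAC
```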
Preferably, the original coding rule in step 3 is the KKS coding rule adopted when the power plant was designed by a design institute, and the new coding rule is the KKS coding rule formulated by the power generation group; the two serve the same purpose but yield different coding results. In step 3, the acquisition unit in the acquisition equipment collects data by running a Python script, and the collected data is transmitted outward through a data interface.
Preferably, the step 4 specifically includes the following steps:
Step 4.1, using the professional dictionary Dict obtained in step 2, segment the coded data in the KKS coding dataset D into words to obtain the corresponding word groups w_1, ..., w_L, which are the words that make up the coded data; a code a_n is divided into L words w_1, ..., w_L; each word w_j corresponds to a number id_j, so the codes of the word groups are numerically encoded, and the numbers id_1, ..., id_L are recombined into the coding vector x_n, yielding a numerically encoded dataset;
Step 4.2, divide the numerically encoded dataset obtained in step 4.1 into a training set D_train and a test set D_test by random sampling;
Preferably, when the word-segmentation results obtained after segmenting the KKS coding dataset D in step 4.1 differ in length, the longest segmentation is taken as the standard and shorter segmentations are extended by adding a corresponding number of placeholders; the ratio of the training set D_train to the test set D_test in step 4.2 is 4:1.
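The preprocessing of step 4.1 (dictionary lookup plus placeholder padding) can be sketched minimally. The vocabulary and sample codes are illustrative; the patent uses the Jieba tool for real segmentation, which is replaced here by pre-segmented word lists:

```python
# Minimal sketch: map each word of a segmented code to its dictionary
# number, then pad every sequence to the longest length with a
# placeholder id (0 here). Vocabulary and codes are illustrative.

vocab = {"<pad>": 0, "boiler": 1, "feedwater": 2, "pump": 3, "A": 4, "valve": 5}

def encode(segments, max_len):
    ids = [vocab[w] for w in segments]
    return ids + [vocab["<pad>"]] * (max_len - len(ids))

segmented = [["boiler", "feedwater", "pump", "A"], ["boiler", "valve"]]
max_len = max(len(s) for s in segmented)          # longest segmentation
dataset = [encode(s, max_len) for s in segmented]
print(dataset)   # [[1, 2, 3, 4], [1, 5, 0, 0]]
```

The resulting fixed-length vectors are what gets randomly split 4:1 into D_train and D_test in step 4.2.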
Preferably, the step 5 specifically includes the following steps:
Step 5.1, with a fully connected layer, perform word-vectorization encoding on the codes a_n encoded according to the original coding rule and the codes b_n encoded according to the new coding rule, obtaining fixed-dimension vectorization results E^1st and E^2nd respectively:
E^1st = FC(a_n), E^2nd = FC(b_n)
In the above formulas, the superscript 1st represents the original coding rule and the superscript 2nd represents the new coding rule;
Step 5.2, compute attention features from the vectorization result E^1st obtained in step 5.1 through the self-attention layer. The fixed-dimension vectorization result E^1st is input to the self-attention layer, which is formed by stacking N self-attention modules, the output of each module serving as the input of the next, and each self-attention module is divided into two layers; the self-attention network yields the attention matrix:
Attn_n = softmax(Q K^T / sqrt(d)) V
In the above, the self-attention network obtains the feature values Q, K, and V through the weights W_Q, W_K, and W_V respectively; the softmax then yields the attention weights based on the feature values Q, K, and V, where d is the dimension of Q and K; the feedforward fully connected layer FFN then calculates the attention feature F_n = FFN(Attn_n), where the subscript n denotes the calculation in the nth self-attention module;
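The self-attention computation of step 5.2 can be illustrated with a pure-Python sketch of one module (minus the feedforward layer). The tiny matrices and identity weights are illustrative assumptions; a real layer would use a tensor library:

```python
import math

# One self-attention step: Q = X*Wq, K = X*Wk, V = X*Wv,
# then softmax(Q * K^T / sqrt(d)) * V.

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def softmax(row):
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(x, wq, wk, wv):
    q, k, v = matmul(x, wq), matmul(x, wk), matmul(x, wv)
    d = len(q[0])                                  # key dimension
    kt = [list(col) for col in zip(*k)]            # K transposed
    scores = [[s / math.sqrt(d) for s in row] for row in matmul(q, kt)]
    weights = [softmax(row) for row in scores]     # attention weights
    return matmul(weights, v)                      # attention matrix

x  = [[1.0, 0.0], [0.0, 1.0]]            # two toy token embeddings, d = 2
wq = wk = wv = [[1.0, 0.0], [0.0, 1.0]]  # identity weights for clarity
out = self_attention(x, wq, wk, wv)
print([[round(v, 3) for v in row] for row in out])
```

With identity weights, each token attends mostly to itself but mixes in the other token, which is exactly the smoothing behaviour the softmax-scaled dot product produces.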
Step 5.3, perform feature-fusion calculation on the vectorization result E^2nd obtained in step 5.1 and the self-attention features F_N obtained in step 5.2; the fusion calculation module is formed by stacking M fused self-attention modules, the output of each module serving as the input of the next, and each fused self-attention module consists of a three-layer structure. E^2nd is taken as the input of the self-attention network, which first yields the attention matrix:
Attn'_m = softmax(Q' K'^T / sqrt(d)) V'
In the above, the self-attention network obtains the feature values Q', K', and V' through the weights W'_Q, W'_K, and W'_V respectively; the softmax then yields the attention weights based on Q', K', and V', where d is the vector dimension of Q' and K'. The attention matrix Attn'_m and the attention features F_N are then fused to obtain the fused attention matrix H'_m; finally, the feedforward fully connected layer FFN calculates the fused attention feature H_m from the fused attention matrix H'_m, where the subscript m denotes the calculation in the mth fused self-attention module;
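The exact fusion operator is not pinned down by the surviving text (the formula images did not survive extraction). A hedged pure-Python sketch, assuming element-wise addition of the two attention matrices followed by a toy ReLU feedforward layer; the operator choice and all values are illustrative assumptions:

```python
# Hedged sketch of the step 5.3 fusion: combine the attention matrix
# computed over the new-rule embedding with the attention features from
# the original-rule branch, then pass through a feedforward layer.

def fuse(attn_new, attn_orig):
    # element-wise addition, assumed here for illustration
    return [[a + b for a, b in zip(r1, r2)]
            for r1, r2 in zip(attn_new, attn_orig)]

def feedforward(x, w):
    # single dense layer with ReLU, standing in for the FFN block
    return [[max(0.0, sum(a * b for a, b in zip(row, col)))
             for col in zip(*w)] for row in x]

attn_new  = [[0.6, 0.4], [0.3, 0.7]]    # toy attention matrices
attn_orig = [[0.5, 0.5], [0.2, 0.8]]
w_ff = [[1.0, -1.0], [0.5, 0.5]]

fused = fuse(attn_new, attn_orig)        # approx [[1.1, 0.9], [0.5, 1.5]]
out = feedforward(fused, w_ff)
print(out)
```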
Step 5.4, from the fused attention features H_M obtained in step 5.3, calculate the probability mapping value p of each fused attention feature over the word vector with the normalized exponential function softmax:
p = softmax(H_M);
Step 5.5, with the probability mapping values p obtained in step 5.4, evaluate the model training result using the cross-entropy as the loss function for training the KKS generation model based on the attention stacking network; determine the stopping condition of training from the number of iterations and the convergence value of the loss function; if training is to continue, repeat steps 5.1 to 5.4 until the number of iterations is reached; once the number of iterations is reached, the KKS generation model based on the attention stacking network is obtained and stored in a computing unit.
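The softmax-plus-cross-entropy loss of steps 5.4 and 5.5 works as follows: the fused features give a score per dictionary word, softmax turns the scores into probabilities p, and the negative log-probability of the true word is minimized. A toy illustration with made-up logit values:

```python
import math

# Cross-entropy over a softmax word distribution, as used to train the
# generation model. The logits and vocabulary size are illustrative.

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(logits, target_id):
    p = softmax(logits)
    return -math.log(p[target_id])

logits = [2.0, 0.5, 0.1]               # scores over a 3-word vocabulary
loss_good = cross_entropy(logits, 0)   # true word is the one the model favors
loss_bad = cross_entropy(logits, 2)    # true word is the one it disfavors
print(round(loss_good, 3), round(loss_bad, 3))
```

The loss is small when the probability mass already sits on the correct word segment and large otherwise, which is what drives the iterative training toward convergence.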
Preferably, each self-attention network layer in step 5.2 combines the output features of the previous layer; the result of each self-attention network layer is normalized through a skip (residual) connection and then combined with the features in the current output. In step 5.5, the word-segmentation result corresponding to the index of the maximum probability in each word-dimension vector is matched, and the segments are spliced in order to reconstruct the KKS code. The computing unit in step 5.5 provides the model operating environment using the TensorFlow framework and performs model compression and acceleration using the TensorRT optimization tool.
Preferably, when the similarity of the predicted KKS code values under the new coding rule is calculated in step 6 with the minimum edit distance in string form, the closer the similarity result is to 1, the more similar the KKS codes are, and the closer it is to 0, the more dissimilar they are.
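The step 6 ranking can be sketched with a standard Levenshtein (minimum edit) distance on the raw code strings, normalized so 1 means identical and 0 maximally different. The normalization by maximum length is an assumption; the patent only states that the result lies between 0 (dissimilar) and 1 (identical). The candidate codes are illustrative:

```python
# Minimum-edit-distance similarity on code strings, then top-ranked
# candidates become the mapping recommendation.

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    if not a and not b:
        return 1.0
    # assumed normalization: 1 = identical, 0 = maximally different
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

candidates = ["10LAC01CP101", "10LAC01CT101", "20MAG05AA001"]
query = "10LAC01CP102"   # hypothetical predicted code from the model
ranked = sorted(candidates, key=lambda c: similarity(query, c), reverse=True)
print(ranked[0])   # the closest code is recommended first
```

Taking the first g entries of `ranked` gives the g most similar KKS code values for the recommendation.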
The beneficial effects of the invention are as follows: the invention systematizes a complex power generation process, constructs a digital twin based on the full power production process, and instantiates group-level digital twin technology; the invention also constructs a KKS generation model based on an attention stacking network, revitalizes the plant's existing digital infrastructure, opens up data islands, breaks down information barriers, promotes real-time fusion between physical space and information space, and realizes the intelligent mapping task across multiple KKS coding systems.
Drawings
FIG. 1 is an overview diagram of the digital-twin-based group-level KKS code intelligent mapping recommendation method;
FIG. 2 is a schematic diagram of a specialized dictionary based on digital twinning;
FIG. 3 is a schematic diagram of an attention stack network;
FIG. 4 is a diagram of the logical relationship of the acquisition unit, the storage unit, and the calculation unit;
FIG. 5 is a schematic diagram of a collection device;
FIG. 6 is a schematic diagram of a memory device;
FIG. 7 is a schematic diagram of a computing device.
Detailed Description
The invention is further described below with reference to examples. The following examples are presented only to aid in the understanding of the invention. It should be noted that it will be apparent to those skilled in the art that modifications can be made to the present invention without departing from the principles of the invention, and such modifications and adaptations are intended to be within the scope of the invention as defined in the following claims.
A shared digital twin for equipment of the same model or the same type is one of the main characteristics of the group-level digital twin architecture. To ensure consistency between digital space and physical space, standardization of the underlying coding is the foundation of digitally twinning the equipment. At present, the production systems and other auxiliary systems of most power plants have used their own coding rules for many years, and modifying the coding specifications of systems in operation is very difficult. In addition, due to technology updates, a plant's original coding system may not cover all devices. To promote group-level digital twin technology, a unified coding mapping must be constructed for the codes of each power plant, but doing so manually is extremely laborious. Therefore, the invention provides a digital-twin-based intelligent mapping recommendation method for group-level KKS codes.
Example 1
A first embodiment of the application provides a digital-twin-based intelligent mapping recommendation method for group-level KKS codes, as shown in FIG. 1:
Step 1, according to the digital twin modeling process, first systematize the complex power generation flow and construct a digital twin based on the full power production process:
DT = {S_1, S_2, ..., S_K}
In the above, S_i represents the subsystems that make up the digital twin, and DT represents the digital twin based on the full power production process;
Step 2, from the digital twin DT obtained in step 1, construct a professional dictionary Dict;
Step 3, use acquisition equipment to acquire the codes a_1, ..., a_N under the original coding rule and the codes b_1, ..., b_N under the new coding rule, where a_n refers to the nth code encoded according to the original coding rule and b_n refers to the nth code encoded according to the new coding rule; manually (drawing on the experience of production operators) map and match the codes a_n under the original coding rule to the codes b_n under the new coding rule, constructing the KKS coding dataset:
D = {(a_n, b_n) | n = 1, ..., N};
Step 4, perform data preprocessing on the KKS coding dataset D obtained in step 3 to obtain a training set D_train and a test set D_test;
Step 5, use the training set D_train obtained in step 4 to train a KKS generation model based on an attention stacking network, where the KKS generation model is formed by stacking multiple layers of attention networks;
Step 5.1, with a fully connected layer, perform word-vectorization encoding on the codes a_n encoded according to the original coding rule and the codes b_n encoded according to the new coding rule, obtaining fixed-dimension vectorization results E^1st and E^2nd respectively:
E^1st = FC(a_n), E^2nd = FC(b_n)
In the above formulas, the superscript 1st represents the original coding rule and the superscript 2nd represents the new coding rule;
Step 5.2, compute attention features from the vectorization result E^1st obtained in step 5.1 through the self-attention layer. The fixed-dimension vectorization result E^1st is input to the self-attention layer, which is formed by stacking N self-attention modules, the output of each module serving as the input of the next, and each self-attention module is divided into two layers; the self-attention network yields the attention matrix:
Attn_n = softmax(Q K^T / sqrt(d)) V
In the above, the self-attention network obtains the feature values Q, K, and V through the weights W_Q, W_K, and W_V respectively; the softmax then yields the attention weights based on the feature values Q, K, and V, where d is the dimension of Q and K; the feedforward fully connected layer FFN then calculates the attention feature F_n = FFN(Attn_n), where the subscript n denotes the calculation in the nth self-attention module;
Step 5.3, perform feature-fusion calculation on the vectorization result E^2nd obtained in step 5.1 and the self-attention features F_N obtained in step 5.2; the fusion calculation module is formed by stacking M fused self-attention modules, the output of each module serving as the input of the next, and each fused self-attention module consists of a three-layer structure. E^2nd is taken as the input of the self-attention network, which first yields the attention matrix:
Attn'_m = softmax(Q' K'^T / sqrt(d)) V'
In the above, the self-attention network obtains the feature values Q', K', and V' through the weights W'_Q, W'_K, and W'_V respectively; the softmax then yields the attention weights based on Q', K', and V', where d is the vector dimension of Q' and K'. The attention matrix Attn'_m and the attention features F_N are then fused to obtain the fused attention matrix H'_m; finally, the feedforward fully connected layer FFN calculates the fused attention feature H_m from the fused attention matrix H'_m, where the subscript m denotes the calculation in the mth fused self-attention module;
Step 5.4, from the fused attention features H_M obtained in step 5.3, calculate the probability mapping value p of each fused attention feature over the word vector with the normalized exponential function softmax:
p = softmax(H_M);
Step 5.5, with the probability mapping values p obtained in step 5.4, evaluate the model training result using the cross-entropy as the loss function for training the KKS generation model based on the attention stacking network; determine the stopping condition of training from the number of iterations and the convergence value of the loss function; if training is to continue, repeat steps 5.1 to 5.4 until the number of iterations is reached; once the number of iterations is reached, the KKS generation model based on the attention stacking network is obtained and stored in a computing unit;
Step 6, from the professional dictionary Dict obtained in step 2 and the probability mapping value p of the fused attention features over the word vector obtained in step 5, generate the predicted KKS code value under the new coding rule with a reconstruction function; compute the similarity of the predicted KKS code value under the new coding rule directly in string form with the minimum edit distance, without converting it into vector form; and obtain the g most similar KKS code values for the KKS code mapping recommendation:
b_hat = R(p, Dict), sim_n = EditSim(b_hat, b_n)
In the above, the predicted KKS code value b_hat is obtained through the reconstruction function R, the similarity of the KKS codes is then calculated with the minimum-edit-distance function EditSim, and finally the g most similar KKS code values are selected for the KKS code mapping recommendation; the recommendation result is stored in a storage unit.
Example two
Building on the first embodiment, a second embodiment of the application applies the digital-twin-based group-level KKS code intelligent mapping recommendation method of the first embodiment to a digital twin standard coding project of a power generation group:
Step 1, systematize the complex power generation process using systems-engineering ideas and construct a digital twin DT of the full power production process, where S represents each subsystem making up the digital twin; taking an ultra-low-emission coal-fired unit as an example, the main system is divided into five systems: boiler, steam turbine, environmental protection, electrical, and chemical;
Step 2, from the digital twin DT obtained in step 1, construct a multi-level professional dictionary Dict, as shown in FIG. 2. The specific steps are as follows:
Step 2.1, segment the detailed classification of the five main systems and related subsystems of the digital twin DT into words to obtain professional word-segmentation data, and construct the triple key-value-pair data structure;
Step 2.2, from the professional word-segmentation data obtained in step 2.1, construct the multi-level professional dictionary Dict, consisting mainly of a main-system dictionary Dict_main and a subsystem dictionary Dict_sub; the dictionary is stored in a storage unit, whose structure is shown in FIG. 6;
Step 3, using the acquisition equipment, whose structure is shown in FIG. 5, acquire the codes a_n under the original coding rule and the codes b_n under the new coding rule; 10000 codes under the old and new rules are matched according to the experience of production operators to construct the KKS coding dataset D; the constructed dataset is transmitted outward through a data interface;
Step 4, perform data preprocessing on the KKS coding dataset D obtained in step 3. The specific steps are as follows:
Step 4.1, using the professional dictionary Dict obtained in step 2, segment the coded data with the Jieba word-segmentation tool to obtain the corresponding word-segmentation results w_j; numerically encode the codes of all word groups according to the professional dictionary Dict to obtain the coding vectors x_n, where each word segment corresponds to its number in the dictionary;
Step 4.2, divide the numerically encoded dataset obtained in step 4.1 into a training set D_train and a test set D_test by random sampling, with 8000 groups of data in the training set and 2000 groups in the test set;
Step 5, use the training set D_train obtained in step 4.2 to train the attention stacking network model, which consists of a stack of multiple layers of attention networks, as shown in FIG. 3. The specific steps are as follows:
Step 5.1, perform word-vectorization encoding on the input original coded data to obtain vectorization results with a fixed dimension of 32. The data dimension of the vectorized code is 64 x 20 x 32, where 64 is the batch size (one batch trains 64 code pairs simultaneously) and each code pair contains 20 groups of code segments;
Step 5.2, compute attention features from the vectorization result obtained in step 5.1 through the self-attention layer, which is formed by stacking 3 self-attention modules end to end; each self-attention module consists of a two-layer structure. The input features first pass through the self-attention network to obtain the attention matrix, with matrix dimension 64 x 20 x 32; the feedforward fully connected layer then calculates the attention features, with feature dimension 64 x 20 x 32;
Step 5.3, perform feature-fusion calculation on the vectorization result obtained in step 5.1 and the self-attention features obtained in step 5.2; the fusion process is formed by stacking 3 fused self-attention modules end to end, each consisting of a three-layer structure. The input features first pass through the self-attention network to obtain the attention matrix, with matrix dimension 64 x 20 x 32; the two attention matrices and attention features are then fused to obtain the fused attention matrix, with matrix dimension 64 x 20 x 32; finally, the feedforward fully connected layer calculates the fused attention features from the fused attention matrix, with feature dimension 64 x 20 x 32;
Step 5.4, from the fused attention features obtained in step 5.3, calculate the probability mapping value p of each feature over the word vector with the softmax function; the dimension of the probability matrix is 64 x 20 x 1365;
Step 5.5, evaluate the model training result using the cross-entropy as the training loss function; the model tends to converge after 12K training iterations, with a loss function value of 0.132; the resulting KKS code generation model based on the attention stacking network is saved in the computing unit for model reuse.
Step 6, with the professional dictionary Dict obtained in step 2 and the probability mapping value p obtained in step 5, generate new KKS codes with the reconstruction function, calculate similarity with the minimum edit distance, and obtain the 8 most similar KKS code values for the KKS code mapping recommendation. The predicted KKS code value b_hat is obtained through the reconstruction function, and it is simultaneously verified that the generated codes conform to the hierarchy of the systems in the digital twin; the similarity of the KKS codes is then calculated with the minimum-edit-distance function; finally, the most similar 10 KKS codes are selected and stored in the storage unit for operators to match against the mapping results. In this embodiment, after standardized coding of a gas power plant subordinate to a group with the KKS generation model based on the attention stacking network, the resulting coding situation is shown in Table 1 below:
TABLE 1 coding Condition Table for a gas Power plant under a group
Table 1 reflects intelligent mapping recommendation statistics for 1442 measuring-point codes under the digital-twin-based group-level KKS code intelligent mapping recommendation method; the recommendation accuracy is 84.3%, which meets the working needs of power plant management staff.
In this embodiment, standardized coding was also performed with the KKS generation model based on the attention stacking network for a hydropower plant subordinate to the group; the resulting coding situation is shown in Table 2 below:
TABLE 2 hydropower plant coding conditions under a certain group
Table 2 reflects intelligent mapping recommendation statistics for 5113 measuring-point codes; the recommendation accuracy is 82.7%, which also meets the working requirements of power plant management staff.

Claims (1)

Step 3, use acquisition equipment to acquire the codes a_n under the original coding rule and the codes b_n under the new coding rule, where a_n refers to the nth code encoded according to the original coding rule and b_n refers to the nth code encoded according to the new coding rule; manually map and match the codes under the original coding rule to the codes under the new coding rule, constructing the KKS coding dataset. The original coding rule in step 3 is the KKS coding rule adopted when the power plant was designed by a design institute, and the new coding rule is the KKS coding rule formulated by the power generation group; in step 3, the acquisition unit in the acquisition equipment collects data by running a Python script, and the collected data is transmitted outward through a data interface;
CN202110905893.4A · Filed 2021-08-09 · Priority 2021-08-09 · Group level KKS coding intelligent mapping recommendation method based on digital twin · Active · Granted as CN113673152B (en)

Priority Applications (1)

Application Number: CN202110905893.4A · Priority Date: 2021-08-09 · Filing Date: 2021-08-09 · Title: Group level KKS coding intelligent mapping recommendation method based on digital twin


Publications (2)

Publication Number · Publication Date
CN113673152A (en) · 2021-11-19
CN113673152B (en) · 2024-06-14

Family

ID: 78541823


Families Citing this family (2)

* Cited by examiner, † Cited by third party

CN114911797B (en)* · priority 2022-05-05 · published 2025-01-28 · 福建安能数通科技有限公司 · A method for integrating KKS code and trellis code
CN115689399B (en)* · priority 2022-10-10 · published 2024-05-10 · China Yangtze Power Co., Ltd. · Rapid construction method of hydropower equipment information model based on industrial Internet platform

Citations (1)

* Cited by examiner, † Cited by third party

CA3158765A1 (en)* · priority 2019-11-25 · published 2021-06-03 · Strong Force Iot Portfolio 2016, Llc · Intelligent vibration digital twin systems and methods for industrial environments

Family Cites Families (3)

* Cited by examiner, † Cited by third party

CA3098670A1 (en)* · priority 2018-05-06 · published 2019-11-14 · Strong Force TX Portfolio 2018, LLC · Methods and systems for improving machines and systems that automate execution of distributed ledger and other transactions in spot and forward markets for energy, compute, storage and other resources
US10901834B2 (en)* · priority 2019-03-13 · published 2021-01-26 · Accenture Global Solutions Limited · Interactive troubleshooting assistant
CN110781680B (en)* · priority 2019-10-17 · published 2023-04-18 · Jiangnan University · Semantic similarity matching method based on twin network and multi-head attention mechanism


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party

Title: Design and application of KKS signal identification for cascade hydropower plants in a river basin; Jiang Chengfeng et al.; China Plant Engineering; 2020-11-23; pp. 8-9*



Legal Events

Code · Title / Description
PB01 · Publication
SE01 · Entry into force of request for substantive examination
TA01 · Transfer of patent application right · Effective date of registration: 2022-08-18 · Address after: Room 307, No. 32, Gaoji Street, Xihu District, Hangzhou City, Zhejiang Province, 310002 · Applicants after: Zhejiang Zheneng Digital Technology Co., Ltd.; ZHEJIANG ENERGY R & D INSTITUTE Co., Ltd. · Address before: 5/F, Building 1, No. 2159-1, Yuhangtang Road, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province · Applicant before: ZHEJIANG ENERGY R & D INSTITUTE Co., Ltd.
GR01 · Patent grant
