Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
As shown in FIGS. 1-4, a digital asset security verification and information monitoring method and system according to the present embodiment may specifically include:
step S101, classifying the digital asset by adopting a multi-dimensional classification model according to the type attribute of the digital asset, obtaining a classification label and a feature vector representation of the digital asset, and establishing a digital asset feature database.
The method comprises the steps of: obtaining multi-dimensional attribute information of digital assets; clustering the digital assets by a K-means clustering algorithm to obtain coarse-granularity classification labels of the digital assets; extracting, for each coarse-granularity class, the common features of the digital assets and constructing a TF-IDF feature vector representation model of the digital assets; classifying the feature vectors of the digital assets by a support vector machine (SVM) classification algorithm to obtain fine-granularity classification labels; storing the classification labels of the digital assets and the corresponding feature vectors in a MongoDB database, and establishing a mapping relation between digital asset labels and features; when a digital asset retrieval request of a user is received, obtaining the digital asset classification label input by the user, obtaining the digital asset feature vectors corresponding to that classification label from the MongoDB database, calculating the cosine similarity between the portrait feature vector of the user and each digital asset feature vector, and recommending the digital assets with the highest similarity to the user; when a digital asset is newly added, extracting its multi-dimensional attribute information with the TF-IDF feature vector representation model to obtain its feature vector, classifying it with the trained classification model, and storing the resulting classification labels and feature vector; and periodically maintaining and updating the database, adjusting the update frequency of each digital asset according to how frequently its attributes change.
In particular, digital assets may contain many dimensions, such as publishers, time of release, total volume, traffic, consensus mechanisms, application scenarios, and so forth. These dimensions may be quantized, e.g., the release time may be represented by a timestamp, and the consensus mechanism may be represented by a one-hot code. Assume that multi-dimensional attribute information for 1000 digital assets is collected. These digital assets are clustered using the K-means algorithm. Setting the number of clusters K to 5, the algorithm will divide the 1000 digital assets into 5 clusters. Assume that one cluster contains mainstream cryptocurrencies such as A digital currency, another contains various decentralized tokens, and another contains NFT-related digital assets. This results in coarse-grained classification labels for the digital assets, such as "mainstream cryptocurrency", "decentralized tokens", "NFT assets", and the like. Coarse-grained classification is performed first because direct fine-grained classification increases computational complexity, while coarse-grained classification helps reduce the problem scale and increase efficiency. For the coarse-granularity class "mainstream cryptocurrency", the common features of the digital assets can be extracted: for example, they typically have a large market value, a high transaction volume, a long history, broad community consensus, and so forth. The feature vector representation model may be constructed using the TF-IDF algorithm, which evaluates the importance of a word to a document in a document set or corpus. Each digital asset may be considered here as a "document" and its attributes as "words". For example, the TF-IDF feature vector of "A digital currency" may contain features such as "market value high", "large transaction amount", and "long history", with their weights calculated by the TF-IDF algorithm.
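The TF-IDF representation described above can be sketched in a few lines of pure Python. This is a minimal illustration with hypothetical attribute tokens, not the patent's implementation; each asset is a "document" of quantized attribute "words":

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute TF-IDF weights for each 'document' (a list of attribute tokens).

    Each digital asset is treated as a document and each quantized
    attribute (e.g. 'market_value_high') as a word.
    """
    n = len(docs)
    df = Counter()                      # document frequency of each attribute
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        vectors.append({
            term: (count / total) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return vectors

# Hypothetical quantized attributes for three assets
assets = [
    ["market_value_high", "volume_large", "history_long"],
    ["market_value_high", "volume_small", "history_short"],
    ["nft", "volume_small", "history_short"],
]
vecs = tfidf_vectors(assets)
```

Attributes shared by every asset receive weight zero, while rarer, more discriminative attributes (such as "nft" here) receive positive weight.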
The feature vectors of the digital assets are then classified using a Support Vector Machine (SVM) algorithm to obtain fine-grained classification labels. For example, under the coarse-grained classification "mainstream cryptocurrency", assets may be further subdivided into "PoW consensus mechanism", "PoS consensus mechanism", and the like. Through the SVM algorithm, A digital currency can be assigned to the fine-grained classification "PoW consensus mechanism". The classification labels (coarse and fine granularity) and corresponding feature vectors of the digital assets are stored in a MongoDB database. MongoDB is a non-relational database that can flexibly store various data types. A collection may be created to store the information of each digital asset, including its name, classification labels, feature vector, etc. For example, the name "A digital currency", its two classification labels "mainstream cryptocurrency" and "PoW consensus mechanism", and its TF-IDF feature vector may be stored in the MongoDB database. This facilitates subsequent retrieval and updating. When the user inputs the classification label "PoW consensus mechanism" for searching, the system retrieves the feature vectors of all digital assets belonging to that class from the MongoDB database. Assume the portrait feature vector of the user includes features such as "high security" and "high degree of decentralization". The system calculates the cosine similarity between the user portrait feature vector and each digital asset feature vector; the higher the cosine similarity, the better the match between the user and the digital asset. The system recommends the Top-N digital assets with the highest similarity to the user. When a digital asset named "ABC token" is added, its multi-dimensional attribute information is extracted.
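The cosine-similarity recommendation step can be sketched as follows. The feature names and values are hypothetical placeholders for the user portrait and asset vectors described above:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two sparse feature vectors stored as dicts."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend_top_n(user_vec, asset_vecs, n=3):
    """Rank assets under a classification label by similarity to the user profile."""
    ranked = sorted(asset_vecs.items(),
                    key=lambda kv: cosine_similarity(user_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:n]]

# Hypothetical user portrait and candidate assets under one class label
user = {"security_high": 1.0, "decentralization_high": 0.8}
candidates = {
    "A digital currency": {"security_high": 0.9, "decentralization_high": 0.7},
    "ABC token": {"security_high": 0.2, "volume_small": 0.9},
}
```

`recommend_top_n(user, candidates, n=1)` ranks "A digital currency" first, since its feature overlap with the user portrait is far larger.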
The feature vector of the "ABC token" can be obtained through the previously constructed TF-IDF feature vector representation model. Then, the trained SVM classification model is used to judge the category of the "ABC token", such as "mainstream cryptocurrency" and "PoS consensus mechanism". Finally, the classification labels and feature vector information of the "ABC token" are inserted into the MongoDB database. The digital asset feature data in the MongoDB database is periodically maintained and updated. For digital assets whose attributes are stable over the long term, like A digital currency, the update frequency can be reduced, while for more frequently changing decentralized tokens, the update frequency should be increased. By this method, the accuracy and timeliness of the data in the database can be guaranteed, thereby improving the effect of the recommendation system.
Step S102, a self-adaptive verification algorithm is designed for the classified digital asset characteristic database, and the comprehensive verification of the legitimacy and ownership of the digital asset is realized by dynamically selecting a matched verification rule and a threshold parameter, so that a digital asset verification result is generated.
The method comprises the steps of: obtaining the digital asset feature database containing feature information of various digital assets; obtaining a preset candidate verification rule set and threshold parameter ranges according to the coarse-granularity classification label of the digital asset; selecting, by a decision tree algorithm, the verification rule subset best suited to the digital asset to be verified from the candidate set, the decision tree being constructed recursively by calculating the information gain ratio of each feature and splitting on the feature with the largest gain ratio until a preset stopping condition is met; dynamically adjusting the threshold parameters of the verification rule subset by a random forest algorithm, which constructs multiple decision trees on randomly selected sample subsets and feature subsets and synthesizes their prediction results into adjustment values for the threshold parameters; judging, with the adjusted threshold parameters, whether the digital asset to be verified satisfies the preset ownership verification rules; if the ownership rules are satisfied, judging and verifying the legitimacy of the digital asset to be verified; calculating a comprehensive verification score for the digital asset by a weighted average method, the weight coefficients being preset according to the importance of ownership verification and legitimacy verification; and comparing the comprehensive verification score with a preset threshold to obtain the final verification result, which is written into a digital asset verification result database.
Specifically, a digital asset feature database is obtained, wherein the database contains feature information of various digital assets. For example, the database stores information on digital assets such as A digital currency, including release time, total amount, traffic, consensus mechanism, application scenario, historical price trend, etc. This information characterizes the digital asset and can be used in the subsequent verification process. A set of preset candidate validation rules and a threshold parameter range are obtained from the coarse-grained classification labels of the digital assets (e.g., "mainstream cryptocurrency", "decentralized tokens", "NFT assets", etc.). For example, for "mainstream cryptocurrency", candidate validation rules may include "market value greater than 100 billion dollars", "number of trading platforms greater than 100", "code open source and security audited", and so forth. The threshold parameter range may be a lower limit on market value, a lower limit on the number of trading platforms, etc. A decision tree algorithm is used to select, from the set of candidate validation rules, the subset of rules best suited to the digital asset to be validated. Assume that a new digital asset named "ABC token" is to be validated, with coarse-granularity classification label "mainstream cryptocurrency". The decision tree algorithm will select the most relevant validation rules based on the feature information of the "ABC token", such as market value, number of trading platforms, and whether the code is open source. For example, if the market value of the "ABC token" is only 10 billion dollars, then the rule "market value greater than 100 billion dollars" is not applicable and is excluded by the decision tree algorithm.
The decision tree algorithm is adopted because it can select the most suitable verification rules according to the different characteristics of each digital asset, improving verification efficiency and accuracy. The threshold parameters of the verification rule subset are then dynamically adjusted by a random forest algorithm. Assuming the decision tree algorithm selects the rule "number of trading platforms greater than 50", the initial threshold is 50. The random forest algorithm dynamically adjusts this threshold by analyzing historical data. For example, if the historical data indicates that "mainstream cryptocurrency" listed on more than 80 trading platforms is more reliable, the random forest algorithm will adjust the threshold to 80. The random forest algorithm is adopted because it synthesizes the prediction results of multiple decision trees, improving the accuracy and stability of the threshold parameters. The adjusted threshold parameters are then used to judge whether the digital asset to be verified satisfies the preset ownership verification rules. Assuming the "ABC token" is listed on 85 trading platforms, the adjusted threshold is satisfied and it is deemed to pass ownership verification. The purpose of ownership verification is to confirm the authenticity and trustworthiness of the digital asset, preventing counterfeiting and fraud. If the ownership verification rules are satisfied, the legitimacy of the digital asset to be verified is then judged and verified. Legitimacy verification primarily examines whether a digital asset complies with relevant laws, regulations, and regulatory policies; for example, verifying whether the issuance of the "ABC token" is compliant or involves illegal activity. The comprehensive verification score of the digital asset to be verified is then calculated by a weighted average method.
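The threshold-adjustment idea can be sketched with a deliberately simplified ensemble. Here each "tree" is a crude stand-in that proposes, from a bootstrap sample of hypothetical historical data, the smallest platform count among assets that proved reliable, and the forest averages the proposals; a real random forest would fit full decision trees, but the bagging-and-averaging structure is the same:

```python
import random

def forest_adjusted_threshold(platform_counts, reliable, n_trees=25, seed=0):
    """Adjust the 'number of trading platforms' threshold from historical data.

    Each simplified 'tree' sees a bootstrap sample and proposes the smallest
    platform count among assets labelled reliable in that sample; the
    ensemble averages the proposals, mirroring the random-forest synthesis
    of multiple trees' predictions described in the text.
    """
    rng = random.Random(seed)
    data = list(zip(platform_counts, reliable))
    proposals = []
    for _ in range(n_trees):
        sample = [rng.choice(data) for _ in data]          # bootstrap sample
        good = [count for count, ok in sample if ok]
        if good:
            proposals.append(min(good))
    return sum(proposals) / len(proposals)

# Hypothetical history: assets listed on 80+ platforms proved reliable
counts = [30, 55, 80, 95, 120, 60, 85]
labels = [False, False, True, True, True, False, True]
threshold = forest_adjusted_threshold(counts, labels)
```

With this history the adjusted threshold lands near 80, consistent with the example in which the initial threshold of 50 is raised.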
Assume the weight of ownership verification is 0.7 and the weight of legitimacy verification is 0.3. If the ownership verification score of the "ABC token" is 100 points and the legitimacy verification score is 90 points, then the comprehensive verification score is 0.7 × 100 + 0.3 × 90 = 97 points. By adopting the weighted average method, the importance of different verification dimensions can be considered together, yielding a more comprehensive verification result. The comprehensive verification score is then compared with a preset threshold to obtain the final verification result for the digital asset. Assuming the preset threshold is 90 points and the comprehensive verification score of the "ABC token" is 97 points, the final verification result is "pass". The final verification result, the features of the "ABC token", the ownership verification result, and the legitimacy verification result are written into a digital asset verification result database. This facilitates subsequent inquiry and analysis, as well as supervision and management by regulatory bodies. The information stored in the database may be used to track the history of the digital asset and identify potential risks and problems.
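The weighted-average scoring and threshold comparison above reduce to a few lines; the weights and threshold are the example's values, not fixed parameters of the method:

```python
def composite_score(ownership_score, validity_score,
                    w_ownership=0.7, w_validity=0.3):
    """Weighted-average verification score, as in the worked example."""
    return w_ownership * ownership_score + w_validity * validity_score

def final_result(score, threshold=90):
    """Compare the composite score against the preset pass threshold."""
    return "pass" if score >= threshold else "fail"
```

With the example's inputs, `composite_score(100, 90)` gives 97 points and `final_result` returns "pass".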
Step S103, based on the correlation between the digital asset verification results and the digital assets, a digital asset correlation graph is constructed, a graph neural network algorithm is adopted to jointly model the correlated digital assets, and the verification weight of each digital asset node is calculated through message passing and feature aggregation.
The method comprises the steps of: obtaining the correlation between the digital asset verification results and the digital assets, and constructing a digital asset correlation graph; preprocessing the graph by filling missing values, filtering outliers, and extracting the attribute features and structural features of the nodes; modeling the preprocessed graph with a graph convolutional neural network algorithm to obtain a graph convolutional neural network model; in each layer of the model, updating the feature representation of each node through a message passing mechanism according to the adjacency relations among nodes, and calculating the correlation weights among nodes through an attention mechanism, which converts the node feature representations into attention weights and aggregates the features of adjacent nodes by weighted summation; concatenating the feature representation of each digital asset node with its attention weights as the input of a fully connected layer, and obtaining the verification weight of each digital asset node from the fully connected layer; marking nodes whose verification weight is above a preset threshold as important nodes, and nodes below the threshold as suspicious nodes; performing key verification on the important and suspicious nodes; and adding the verification results back into the graph as node attributes, continuously optimizing the model according to the new results.
In particular, the digital asset correlation graph is intended to depict the relationships between digital assets, providing a more comprehensive perspective for risk assessment and verification. Nodes in the graph represent individual digital assets, such as A digital currency or a particular NFT collection. Edges represent associations between them, e.g., based on the same underlying technology, a common team of developers, or trading on the same platform. After the graph is constructed, preprocessing is required to improve the accuracy of the model. For example, missing market data for a digital asset may be filled in with that asset's historical market data or extrapolated from the average market value of assets in the same category. For outliers, such as a sudden surge in the transaction volume of a newly issued token, the underlying reasons need to be analyzed, and the data should be filtered out if it is due to market manipulation. Furthermore, it is important to extract the attribute features and structural features of the nodes. Attribute features may include market value, transaction amount, and issue time, while structural features may include a node's degree (the number of edges connected to it) and centrality. These features are used in the subsequent graph convolutional neural network modeling. The graph convolutional neural network is a deep learning model dedicated to processing graph data. In each layer, a node exchanges information with its neighboring nodes and updates its own feature representation. For example, if an NFT project is associated with a reputable development team, the trustworthiness of the NFT project may be improved, as reflected in the change in its feature representation. The attention mechanism is able to capture the important relationships between nodes.
For example, the relationship between a stablecoin and the fiat currency it is anchored to is very important, while its relationship with a small exchange is relatively minor. The attention mechanism gives different weights to different relationships, highlighting critical information. The fully connected layer combines the feature representations of the nodes with the attention weights to generate a verification weight for each node. This weight reflects the importance of the node and its potential risk. Nodes above the threshold are marked as important nodes and nodes below the threshold are marked as suspicious nodes. Key verification is then performed on the important and suspicious nodes. For example, for a token marked as suspicious, its transaction records, smart contract code, etc. may be investigated in depth to determine whether it presents a risk of fraud. The verification results are added to the graph as new node attributes, and the model is dynamically updated based on the new results. For example, if a token that was previously considered important is found to have a security breach, its verification weight is reduced and the model adjusts its parameters accordingly, improving the accuracy of subsequent verifications. This continuous optimization process ensures that the model can adapt to the changing digital asset environment. High-risk and suspicious digital assets can be identified more effectively through the digital asset correlation graph and graph convolutional neural network. For example, suppose a newly launched decentralized trading platform is associated with several suspect parties who previously issued tokens presenting asset security risks. Through correlation graph and model analysis, the potential risk of the trading platform can be discovered in time and an early warning issued.
As another example, a certain NFT project team claims to have strong technical background and community support, but the graph analysis shows that its association with other well-known projects and communities is low, which may suggest that its promotion is exaggerated and that investors should be cautious.
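The attention-weighted message passing and fully connected read-out of step S103 can be sketched with numpy. This is a GAT-style toy layer on a hypothetical 4-node asset graph with random, untrained parameters; a production model would stack several layers and learn the weights, but the mechanics (transform, neighbourhood attention, weighted aggregation, scalar read-out) are as described:

```python
import numpy as np

def attention_layer(H, A, W, a):
    """One attention-based message-passing layer (GAT-style sketch).

    H: (n, d) node features; A: (n, n) adjacency matrix (1 = edge);
    W: (d, d') feature transform; a: (2*d',) attention parameters.
    """
    Z = H @ W                               # transform node features
    n = Z.shape[0]
    A = A + np.eye(n)                       # nodes also attend to themselves
    scores = np.full((n, n), -np.inf)
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                # attention score from the concatenated pair of features
                scores[i, j] = np.tanh(np.concatenate([Z[i], Z[j]]) @ a)
    # softmax over each node's neighbourhood (non-edges contribute 0)
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)
    return alpha @ Z                        # weighted aggregation of neighbours

def verification_weights(H_out, w_fc):
    """Fully connected read-out: one verification weight per node, in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(H_out @ w_fc)))

# Hypothetical 4-asset graph with random (untrained) parameters
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = rng.normal(size=(3, 3))
a = rng.normal(size=(6,))
w_fc = rng.normal(size=(3,))
weights = verification_weights(attention_layer(H, A, W, a), w_fc)
```

Each node's weight falls in (0, 1); comparing it against the preset threshold yields the important/suspicious labelling described above.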
Step S104, transaction history data of the digital assets is obtained from the digital asset correlation graph, association rules and abnormal patterns of digital asset transactions are discovered using time series pattern mining techniques, and a transaction verification rule base is formed.
The method comprises the steps of: obtaining the digital asset correlation graph, and obtaining a target digital asset node and its attribute information from the graph; obtaining the associated digital assets of the target digital asset according to preset association relations; obtaining the transaction history data of the target digital asset and the associated digital assets, the transaction history data comprising transaction time, transaction amount, and counterparties; preprocessing the transaction history data by filling missing values according to preset data filling rules and aligning the transaction times of the target and associated digital assets to generate a normalized time series transaction data set; mining frequent transaction patterns from the time series transaction data set with the Apriori algorithm; comparing the frequent transaction patterns with preset normal transaction pattern rules to judge whether they are abnormal, and marking any pattern that does not conform to the normal transaction pattern rules as an abnormal transaction pattern; manually reviewing the abnormal transaction patterns and adding the manually confirmed ones to a preset abnormal transaction pattern library; and, when new digital asset transaction data is generated, matching it against the patterns in the abnormal transaction pattern library and generating abnormal transaction alert information if a match is found.
Specifically, the digital asset correlation graph resembles a huge relational network, with each node representing a digital asset, such as A digital currency or some NFT collection. The links between nodes represent associations between them, e.g., based on the same underlying technology, a common team of developers, or trading on the same platform. Acquiring the digital asset correlation graph amounts to drawing this network so as to better understand the relationships between digital assets and perform risk assessment and verification. For example, to investigate a new NFT project named "XYZ", the project node and its attribute information, such as release time, total amount, and release price, may be obtained from the graph. To assess the risk of the "XYZ" project more fully, the digital assets associated with it need to be acquired. Suppose the graph shows that the "XYZ" project is associated with a development team named "ABC", which previously developed another NFT project, "DEF". Then the "DEF" project is associated with the "XYZ" project and can also be included in the investigation. Historical transaction data for "XYZ" and "DEF" is obtained, including transaction time, transaction amount, and counterparties. For example, "XYZ" sold 1 NFT on January 1, 2024 at a price of 1 ETH, with buyer address 0x123; "DEF" sold 1 NFT at a price of 5 ETH in January 2023, with buyer address 0x456. These transaction data will be used in the subsequent analysis. There may be missing values in the transaction data; for example, some transactions may not have their amounts recorded. In that case, filling is performed using preset data filling rules. For example, a missing transaction amount may be filled with the historical average transaction price of the asset.
Because the transaction times of different digital assets may differ, the transaction times need to be aligned to generate a normalized time series transaction data set. For example, the transaction data for both "XYZ" and "DEF" are aggregated by hour for subsequent analysis. Frequent transaction patterns are mined from the time series transaction data set using the Apriori algorithm. For example, it may be found that the transaction volumes of "XYZ" and "DEF" often rise substantially within the same time period, which may suggest some kind of correlated trading between the two. The mined frequent transaction patterns are compared with preset normal transaction pattern rules. Normal transaction pattern rules may be formulated based on historical data and market experience. For example, the transaction volume of an NFT project is typically related to its market popularity, community activity, and so on. If the transaction volume of a project suddenly rises, but its market popularity and community activity show no significant improvement, an anomaly may exist. If a frequent transaction pattern does not conform to the normal transaction pattern rules, it is marked as an abnormal transaction pattern. For example, if the transaction volumes of "XYZ" and "DEF" surge synchronously, with no relevant news and no obvious market-driving factors, the pattern may be marked as abnormal. The abnormal transaction patterns are then manually reviewed to confirm whether an anomaly truly exists. For example, an analyst may investigate the transaction records and fund flows of "XYZ" and "DEF" in depth to determine whether there is market manipulation or other violations. The manually confirmed abnormal transaction patterns are added to the pre-established abnormal transaction pattern library.
For example, the synchronous transaction volume surge pattern of "XYZ" and "DEF" is added to the abnormal transaction pattern library for subsequent monitoring. When new digital asset transaction data is generated, it is matched against the patterns in the abnormal transaction pattern library. For example, suppose a new NFT project named "GHI" also sees its transaction volume surge in synchrony with "XYZ" and "DEF". If the new transaction data matches a pattern in the abnormal transaction pattern library, abnormal transaction alert information is generated. For example, the system may issue an alert that the transaction pattern of "GHI" matches the known "XYZ"/"DEF" synchronous transaction pattern in the library and may be risky. In this way, potential abnormal transaction behaviors can be discovered in time, protecting the interests of investors and maintaining the healthy development of the digital asset market.
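The Apriori mining and library matching of step S104 can be sketched as follows. Each "transaction" is the set of events observed in one hour (the event names are hypothetical), and the anomaly library is then checked against new hourly data:

```python
from itertools import combinations

def apriori(transactions, min_support=0.5):
    """Minimal Apriori: frequent itemsets of co-occurring per-hour events."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    k_sets = [frozenset([i]) for i in items]
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        survivors = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(survivors)
        # candidate generation: join surviving k-sets into (k+1)-sets
        keys = list(survivors)
        k_sets = list({a | b for a, b in combinations(keys, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

def alerts(new_events, library):
    """Flag any known abnormal pattern contained in the new hourly events."""
    return [p for p in library if p <= new_events]

# Hypothetical hourly event sets for the XYZ/DEF example
hourly = [
    {"XYZ_volume_up", "DEF_volume_up"},
    {"XYZ_volume_up", "DEF_volume_up"},
    {"XYZ_volume_up"},
    {"DEF_volume_up", "GHI_volume_up"},
]
patterns = apriori(hourly, min_support=0.5)
anomaly_library = {frozenset({"XYZ_volume_up", "DEF_volume_up"})}
```

The pair {XYZ_volume_up, DEF_volume_up} is frequent (support 0.5); once confirmed and placed in the library, `alerts` flags any new hour whose events contain it, as in the "GHI" scenario.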
Step S105, the transaction verification rule base is applied to the digital asset verification process, a multi-algorithm adaptation layer is designed by combining an encryption algorithm and a consensus mechanism of the digital asset, and compatible processing of different types of digital assets is achieved through algorithm registration and dynamic scheduling.
The method comprises the steps of: obtaining the corresponding transaction verification rules from a digital asset registry according to the digital asset type; converting the transaction verification rules into executable code logic by smart contract technology and deploying the code into the blockchain network; determining the consensus mechanism according to the blockchain network to which the digital asset belongs; when a digital asset transaction request is received, invoking the smart contract to verify the request, checking the validity of its format, signature, and attributes; if the request passes verification, broadcasting it to the consensus nodes in the blockchain network, where the consensus nodes perform transaction verification and confirmation according to the consensus mechanism; protecting the verification process with cryptographic techniques so that the validity of the transaction request can be verified without revealing the identity information of either party to the transaction; for cross-chain digital asset transactions, introducing a cross-chain protocol that realizes atomic exchange of digital assets between different blockchain networks through a transaction proof mechanism and a digital asset mapping mechanism; realizing the propagation, synchronization, and verification of digital assets through a P2P network to guarantee decentralization and fault tolerance; and managing the full life cycle of the digital asset through smart contracts, so that the entire verification process is executed and verified automatically.
In particular, the digital asset registry, like an account register of digital assets, records information about various digital assets, including asset types, issuers, totals, and the like. For example, for an NFT collection named "art coin", the registration information would include its name, issuer (e.g., a certain art studio), total release (e.g., 10000), and the artwork information corresponding to each NFT. Obtaining the corresponding transaction verification rules from the digital asset registry means finding the applicable transaction rules for each type of digital asset. For example, for an NFT collection such as "art coin", its transaction rule may be that each transaction can only transfer an integer number of NFTs and cannot split them. The smart contract, like an automatically executed contract, translates the transaction validation rules into code logic deployed on the blockchain. For example, for the transaction rules of "art coin", the smart contract may contain a piece of code checking whether the NFT amount in each transaction is an integer. If someone tries to trade 0.5 "art coins", the smart contract will automatically block the trade. The blockchain network resembles a distributed ledger, with all transaction records recorded on the blockchain. Different blockchain networks employ different consensus mechanisms, such as proof of work (PoW) or proof of stake (PoS). Under PoS, the right to verify transactions is decided by the number of digital assets held by each node. Assuming "art coin" is issued on a blockchain employing the PoS consensus mechanism, nodes holding a greater number of "art coins" are more likely to gain the right to verify transactions. When someone wants to trade "art coin", he sends a transaction request. This transaction request is sent to the smart contract, which validates it.
The smart contract will check whether the format, signature, and attributes of the transaction request are legal. For example, the smart contract may check whether the signature of the transaction request matches the sender's private key and whether the NFT amount of the transaction is an integer. If the transaction request passes verification by the smart contract, it is broadcast to the consensus nodes in the blockchain network. The consensus nodes validate and confirm the transaction according to the blockchain consensus mechanism. For example, under the PoS mechanism, nodes with high coin holdings vote on the transaction, and if enough nodes agree, the transaction is validated and added to the blockchain. Cryptography, like a lock, protects the security of transactions. It can verify the validity of a transaction without disclosing the identity information of either party to the transaction. For example, zero-knowledge proof techniques may prove the legitimacy of a transaction without revealing the specific contents of the transaction. For cross-chain transactions, such as someone wanting to exchange "art coin" for digital currency A, it is necessary to introduce a cross-chain protocol. The cross-chain protocol is just like a bridge, connecting different blockchain networks. Atomic exchange of digital assets in different blockchain networks can be achieved through a transaction proof mechanism and a digital asset mapping mechanism. For example, a user may exchange 10 "art coins" for 1 unit of digital currency A, and the cross-chain protocol will ensure that the two legs of the transaction are either completed simultaneously or fail simultaneously. The blockchain nodes communicate over a P2P network, just like a decentralized network in which every node can connect to every other. This network structure ensures the decentralization and fault tolerance of the network: even if some nodes fail, the network can still operate normally.
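The stake-weighted validator selection described above can be sketched in Python. The node names and stake amounts are illustrative assumptions; the weighted sampling simply models the idea that a larger holding gives a proportionally larger chance of winning the right to verify a transaction:

```python
import random

def select_validator(stakes, rng=None):
    """Pick one validator, with probability proportional to its stake
    (a deliberately simplified PoS model)."""
    rng = rng or random.Random()
    nodes = list(stakes)
    weights = [stakes[n] for n in nodes]
    return rng.choices(nodes, weights=weights, k=1)[0]

# Illustrative stakes: node_C holds the most "art coin",
# so it is the most likely to be chosen as validator.
stakes = {"node_A": 100, "node_B": 300, "node_C": 600}
winner = select_validator(stakes, rng=random.Random(42))
```

Over many rounds, the selection frequency of each node approaches its share of the total stake, which is the property the text attributes to PoS.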
Smart contracts can also be used to manage the full life cycle of digital assets, such as the issuance, transaction, and destruction of "art coin". These business rules and flows can be consolidated into the code of the smart contract for automated management. For example, the smart contract may set the total release of "art coin" and automatically control the NFT release process. The smart contract needs to be security-audited and tested before deployment to the blockchain network to ensure its security. A plurality of nodes can jointly execute and verify the smart contract, further improving security. For example, in the smart contract of "art coin", a security audit flow may be set, and the contract may be formally deployed on the blockchain only after the audit is passed.
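The integer-amount rule for "art coin" can be sketched as plain Python rather than actual on-chain contract code; the field names are hypothetical and the signature check is reduced to a presence test, since real signature verification depends on the chain's cryptography:

```python
def validate_transaction(tx):
    """Check an 'art coin' transfer request against the rules described above:
    required fields must be present and the amount must be a whole,
    positive number of NFTs (no split transactions)."""
    errors = []
    for field in ("sender", "recipient", "amount", "signature"):
        if field not in tx:
            errors.append(f"missing field: {field}")
    amount = tx.get("amount")
    # Rule from the text: each transaction may only transfer an integer number of NFTs.
    if amount is not None and (not float(amount).is_integer() or amount <= 0):
        errors.append("amount must be a positive integer number of NFTs")
    return (len(errors) == 0, errors)

# A legal transfer of 2 NFTs passes; trading 0.5 "art coins" is blocked.
ok, _ = validate_transaction(
    {"sender": "alice", "recipient": "bob", "amount": 2, "signature": "0xabc"})
bad, reasons = validate_transaction(
    {"sender": "alice", "recipient": "bob", "amount": 0.5, "signature": "0xabc"})
```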
Step S106, according to the data structure of the algorithm adaptation layer, heterogeneous digital asset data are mapped into a unified feature space by adopting intelligent data parsing and conversion technology, standardized digital asset data are generated, and a digital asset feature database is formed.
The method comprises the steps of obtaining original data of heterogeneous digital assets, parsing the original data with an XML parser, extracting key attribute fields of the digital assets to construct digital asset attribute feature vectors, carrying out normalization processing on the digital asset attribute feature vectors according to a predefined unified feature space, mapping attribute values of different scales and types into the interval [0,1] to obtain standardized digital asset attribute feature vectors, converting digital asset attributes which cannot be directly mapped into the unified feature space into feature representations of the unified feature space by adopting a One-Hot encoding data conversion method, filling the standardized digital asset attribute feature vectors into the corresponding dimensions of the unified feature space to obtain feature vector representations of the digital assets in the unified feature space, generating standardized digital asset data records from the feature vectors of the digital assets in the unified feature space according to predefined data formats and field names, and importing the standardized digital asset data records into a database to establish a digital asset feature data table.
In particular, heterogeneous digital assets vary in data format and fields, and in order to be able to uniformly process and analyze such data, they need to be converted into a standardized form. First, the raw data of the heterogeneous digital assets needs to be acquired; this data is typically stored in different formats, such as database records, XML files, JSON files, etc. Taking an artwork NFT and a game prop as examples, the data of the artwork NFT may include fields for name, author, creation time, work description, image hash value, etc., and the data of the game prop may include fields for name, type, level, attribute, durability, etc. After the original data is obtained, it needs to be parsed and the fields extracted. For data in XML format, an XML parser, such as a DOM parser or SAX parser, may be used. For example, the data of artwork NFTs may be stored in an XML file, where the information of each NFT is represented by an XML element. The XML parser may traverse the XML file, extracting the key attribute fields of each NFT, such as name, author, image hash value, and so forth. After extracting the key attribute fields, the digital asset attribute feature vectors need to be constructed. For example, the feature vector of an artwork NFT may be expressed as [name, author, authoring time, work description, image hash]. Since the attribute values of different digital assets may have different scales and types, e.g., authoring time is a numeric type while name is a string type, normalization of the feature vectors is required. The normalization process maps attribute values of different scales and types into a uniform interval, e.g., [0,1]. For example, the authoring time may be converted to a proportional value relative to the earliest authoring time, and the name may be converted to a normalized value of its string hash. This results in a standardized digital asset attribute feature vector.
For some digital asset attributes that cannot be mapped directly to the unified feature space, such as the type of a game prop, One-Hot encoding may be used for conversion. Assume three types of game props, namely weapons, armour, and consumables. A three-dimensional vector may be used to represent the type of prop, e.g., a weapon represented as [1,0,0], armour represented as [0,1,0], and a consumable represented as [0,0,1]. The standardized digital asset attribute feature vector and the One-Hot encoded feature vector are filled into the corresponding dimensions of the unified feature space, so that the feature vector representation of the digital asset in the unified feature space can be obtained. For example, the feature vector of an artwork NFT may be expressed as [normalized value of name, normalized value of author, normalized value of authoring time, normalized value of work description, normalized value of image hash value]. The feature vector of a game prop may be expressed as [normalized value of name, normalized value of level, normalized value of attribute, normalized value of durability, 1, 0, 0] (assuming the prop is a weapon). From the feature vectors of the digital asset in the unified feature space, standardized digital asset data records may be generated in accordance with predefined data formats and field names. For example, the feature vectors of both artwork NFTs and game props may be converted into data records containing an asset ID, an asset type, and the feature vector. Finally, the standardized digital asset data records are imported into a database, and a digital asset feature data table is established. Thus, different types of digital assets can be stored, managed and analyzed uniformly. This has the advantage that data retrieval, statistical analysis, machine learning and other operations can be conveniently performed.
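The normalization and One-Hot steps above can be sketched as follows. The attribute ranges (level 1-100, attack 0-50, durability 0-100) and the string-hash trick for names are illustrative assumptions, not fixed parts of the method:

```python
def min_max(value, lo, hi):
    """Map a numeric attribute into [0, 1] (assumes hi > lo and lo <= value <= hi)."""
    return (value - lo) / (hi - lo)

def one_hot(value, categories):
    """One-Hot encode a categorical attribute, e.g. the prop type."""
    return [1.0 if value == c else 0.0 for c in categories]

def name_to_unit(name):
    """Illustrative version of the string-hash normalization from the text:
    reduce the name to a hash, then scale it into [0, 1)."""
    return (hash(name) % 1000) / 1000.0

PROP_TYPES = ["weapon", "armour", "consumable"]  # assumed category order

def prop_to_vector(prop):
    """Fill a game prop into the unified feature space:
    [name, level, attack, durability, type one-hot (3 dims)]."""
    return [
        name_to_unit(prop["name"]),
        min_max(prop["level"], 1, 100),
        min_max(prop["attack"], 0, 50),
        min_max(prop["durability"], 0, 100),
        *one_hot(prop["type"], PROP_TYPES),
    ]

vec = prop_to_vector(
    {"name": "flame sword", "type": "weapon", "level": 40, "attack": 10, "durability": 80})
```

A weapon thus ends with the One-Hot suffix [1, 0, 0], matching the encoding described in the text.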
Step S107, performing deep verification analysis by utilizing the digital asset feature database, obtaining the security attribute information of the digital asset, and dynamically evaluating the security level of the digital asset by adopting a multi-factor risk evaluation model to generate the security rating of the digital asset.
The method comprises the steps of obtaining digital asset feature information from the digital asset feature database to obtain multi-dimensional feature vectors of the digital assets to be evaluated, constructing a training data set from the digital asset feature vectors, training a deep neural network model on the training data set to obtain deep representations of the digital asset features, inputting the deep representations of the digital asset features into a pre-constructed security attribute discrimination model, obtaining security attribute information of the digital assets through inference of the security attribute discrimination model, wherein the security attribute information comprises confidentiality, integrity and availability levels, constructing a pairwise comparison matrix from the obtained digital asset security attribute information by means of the analytic hierarchy process, calculating the eigenvector corresponding to the maximum eigenvalue of the pairwise comparison matrix, carrying out normalization processing to obtain the weights of the evaluation factors, carrying out weighted summation of the security attribute information of the digital assets and the evaluation factor weights to obtain risk scores of the digital assets under each evaluation dimension, carrying out periodic dynamic tracking of the digital asset risk scores through a timed task module, triggering a digital asset security level adjustment flow and updating the digital asset security rating database when a score exceeds a preset threshold value, and reading real-time security rating information of the digital assets from the digital asset security rating database to automatically generate a digital asset security assessment report, the report including the basic information, security attribute levels, risk scores and security rating change records of the digital asset.
Specifically, the digital asset feature database stores multi-dimensional feature vectors for each digital asset. For example, for an artwork NFT, its feature vector may contain an image hash value, author, creation time, artistic style, etc. For a game prop, its feature vector may include name, type, attribute, rarity, etc. These multi-dimensional features of the digital asset under evaluation are acquired to form a feature vector for the asset. For example, the game prop "flame sword" may have a feature vector of ["flame sword", "weapon", "attack +10", "rare"]. The training data set may be constructed using existing digital asset feature vectors. For example, a training data set may be constructed by collecting a large amount of artwork NFT data, including their feature vectors and corresponding market prices. Then, a deep neural network model built with the TensorFlow framework is trained on the training data set to obtain a deep representation of the digital asset features. The deep neural network obtains this deep representation by learning the complex relationship between the feature vectors and the prices. This is equivalent to mapping the original feature vector into a new space in which similar digital assets are closer together. For example, the original feature vector of the "flame sword" may become a high-dimensional vector after passing through the deep neural network, e.g., [2, 8, 1, 9, ...], which can characterize the "flame sword" more effectively. A security attribute discrimination model of the digital assets is then constructed from historical data using a logistic regression model, and the deep representation of the digital asset features is input into the security attribute discrimination model so as to infer the confidentiality, integrity and availability levels of the digital asset.
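A minimal sketch of the logistic-regression judgement described above: score one security attribute from the deep feature vector and bucket it into a level. The feature values, weights, bias and level thresholds are made-up illustrative numbers, not trained parameters:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_level(deep_features, weights, bias):
    """Logistic-regression score for one security attribute,
    bucketed into the 'high' / 'medium' / 'low' levels used in the text.
    The 0.7 and 0.4 thresholds are illustrative assumptions."""
    z = sum(w * x for w, x in zip(weights, deep_features)) + bias
    p = sigmoid(z)
    if p >= 0.7:
        return "high", p
    if p >= 0.4:
        return "medium", p
    return "low", p

# Assumed deep representation of "flame sword" and hand-picked model parameters.
features = [2.0, 8.0, 1.0, 9.0]
confidentiality, p = predict_level(features, weights=[0.1, 0.05, 0.2, 0.05], bias=-0.2)
```

In a real deployment the weights would be fitted on labeled historical data; here they are chosen only so that the example reproduces the "high" confidentiality verdict from the text.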
The security attribute discrimination model can judge the risk level of the digital asset being stolen, tampered with or made inaccessible according to the deep features of the digital asset. For example, an artwork NFT may have a low integrity level if its image hash value is easily tampered with. If a game prop is easily duplicated, its confidentiality level will be low. Assuming that the deep features of the "flame sword" are entered into the model, the model infers that its confidentiality level is "high", integrity level is "medium", and availability level is "high". Then, a pairwise comparison matrix is constructed by the analytic hierarchy process, the eigenvector corresponding to the maximum eigenvalue of the pairwise comparison matrix is calculated, and normalization is carried out to obtain the weight of each evaluation factor. For example, for a game prop, its availability and rarity may be of greater interest, so the weights of availability and rarity may be higher. Assume the weights for confidentiality, integrity and availability are 2, 3 and 5, respectively, after the analytic hierarchy process calculation. The security attribute information of the digital asset and the evaluation factor weights are then weighted and summed to obtain the risk score of the digital asset in each evaluation dimension. The confidentiality, integrity and availability ratings of the "flame sword" are "high", "medium" and "high", respectively, and the corresponding scores may be set to 1, 5 and 1. The risk score of the "flame sword" is then 2×1+3×5+5×1=22. Digital assets are dynamically tracked periodically (e.g., daily) by a timed task module of the system. If the risk score of the "flame sword" exceeds a preset threshold (e.g., 9), a digital asset security rating adjustment process is triggered and the digital asset security rating database is updated. For example, the system may turn the integrity level of the "flame sword" down to "low".
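The analytic hierarchy process step can be sketched as follows. The pairwise comparison matrix is built from the 2:3:5 weights of the example (normalized so they sum to 1, i.e. 0.2, 0.3, 0.5), so the weighted score comes out as 2.2, one tenth of the 22 obtained with the unnormalized weights; power iteration stands in for a full eigensolver:

```python
def ahp_weights(matrix, iters=100):
    """Principal-eigenvector weights of a pairwise comparison matrix,
    computed by power iteration and normalized to sum to 1
    (the AHP weighting step described above)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

# A consistent pairwise matrix whose entries are the ratios of the target
# weights 0.2 : 0.3 : 0.5 (confidentiality, integrity, availability).
M = [
    [1.0, 2 / 3, 2 / 5],
    [3 / 2, 1.0, 3 / 5],
    [5 / 2, 5 / 3, 1.0],
]
weights = ahp_weights(M)   # ~[0.2, 0.3, 0.5]
scores = [1, 5, 1]         # "high", "medium", "high" mapped to 1, 5, 1 as in the text
risk = sum(w * s for w, s in zip(weights, scores))
```

Because this example matrix is perfectly consistent, power iteration converges immediately; for a real, slightly inconsistent judgement matrix AHP also checks a consistency ratio before accepting the weights.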
The digital asset security rating database may record real-time security rating information for the digital asset. Such information may be read from the database to automatically generate a digital asset security assessment report. For example, an assessment report for the "flame sword" would contain its basic information (name, type, etc.), security attribute levels (high, medium, low), risk score (22), and security rating change records (e.g., integrity level down from "medium" to "low"). Such a report may help the user understand the security status of the digital asset and make corresponding decisions.
Step S108, based on the digital asset security rating and the verification result, constructing a blockchain verification proof mechanism, forming a hash proof chain of digital asset verification by generating a hash value containing the verification process, the result and a timestamp, and ensuring tamper resistance and traceability of the verification result.
The method comprises the steps of obtaining the security rating result and the verification result of a digital asset and taking them as the original data of a verification proof, adding a verification process description and the current timestamp to the original verification data to generate complete verification proof data, carrying out an SHA256 hash operation on the verification proof data to obtain a unique hash value as the fingerprint of the verification, splicing the current verification hash value and the previous verification hash value to construct a hash proof chain forming a continuous verification sequence, writing the verification hash value and the hash proof chain into the blockchain through an Ethereum smart contract, defining verification rules and a reward mechanism through the smart contract to incentivize a plurality of nodes to participate in verification and reach consensus on the verification result, querying and tracing the verification data through a blockchain browser to obtain historical verification information of the digital asset, verifying the authenticity of the current verification data by comparing the historical verification hash values, synchronizing the verification data on the blockchain to a local database, carrying out statistical analysis of the verification data through SQL, calculating a security index of the digital asset, evaluating the historical security of the digital asset, and generating a security analysis report.
Specifically, the digital asset security rating result, for example that the "flame sword" has confidentiality "high", integrity "medium", availability "high" and a risk score of 22, and the verification result, such as "verification pass", will be the original data for the verification proof. Adding a verification process description, such as "multi-factor risk assessment of the feature vector of 'flame sword'", and the current timestamp, such as "2024-10-17 10:00:00", generates the complete verification proof data. An SHA256 hash operation is performed on the complete verification proof data to obtain a unique hash value, e.g. "e4b58eb4642304a2270307021b3f42f3", as the fingerprint of this verification. This hash value may be used to uniquely identify the verification. The hash proof chain is constructed by concatenating the current verification hash value "e4b58eb4642304a2270307021b3f42f3" with the previous verification hash value, e.g. "a1b2c3d4e5f678901234567890abcdef", giving "e4b58eb4642304a2270307021b3f42f3|a1b2c3d4e5f678901234567890abcdef", to form a continuous verification sequence. The hash proof chain makes it possible to trace back the history of the verification. The verification hash value "e4b58eb4642304a2270307021b3f42f3" and the hash proof chain "e4b58eb4642304a2270307021b3f42f3|a1b2c3d4e5f678901234567890abcdef" are written into the blockchain through the Ethereum smart contract, ensuring the tamper resistance and traceability of the data. The smart contract defines validation rules, such as the format of the validation data, the validation frequency, etc., and a reward and punishment mechanism, such as awarding rewards to nodes that validate correctly and punishing nodes that validate incorrectly. These mechanisms may motivate multiple nodes to participate in the verification and agree on the verification result.
For example, it may be prescribed that each node participating in the verification must stake a certain amount of Ether; if its verification result is correct it obtains a reward, and if its verification result is incorrect its staked Ether is deducted. The historical verification information for the digital asset is obtained by using a blockchain browser to query and trace the verification data, such as the verification records of the "flame sword" over the past month. Such information includes the timestamp of each verification, the verification hash value, the security rating result, etc. The authenticity of the current verification data may be verified by comparing it against the historical verification hash values, for example comparing the current hash value "e4b58eb4642304a2270307021b3f42f3" with the previous hash value stored in the blockchain. If the hash values do not match, the data may have been tampered with. The verification data on the blockchain, such as the historical verification records of the "flame sword", are synchronized to the local database. The historical security of the digital asset is assessed by performing statistical analysis of the verification data using SQL, for example calculating the average risk score and the security rating change trend of the "flame sword" over the past month. A security analysis report is generated from the statistical analysis results. For example, a security analysis report for the "flame sword" may contain its security rating trend graph over the past month, its average risk score, and possible security risks. Such information may help the user understand the historical security status of the digital asset and make corresponding decisions. For example, if the risk score of the "flame sword" continues to rise, the user may need to take steps to improve its security.
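The hash proof chain of step S108 can be sketched locally in Python; writing the values to Ethereum through a smart contract is outside this sketch, and the record fields are illustrative. Each link hashes the full verification record with SHA256 and splices the digest with the previous link's digest, as described above:

```python
import hashlib
import json
import time

def make_proof(rating, result, process, prev_hash, timestamp=None):
    """Build one link of the hash proof chain: hash the complete verification
    record, then splice its digest with the previous link's digest."""
    record = {
        "rating": rating,          # illustrative security rating payload
        "result": result,          # e.g. "pass"
        "process": process,        # verification process description
        "timestamp": timestamp or time.strftime("%Y-%m-%d %H:%M:%S"),
        "prev": prev_hash,         # links this proof to the previous one
    }
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    digest = hashlib.sha256(payload).hexdigest()  # the "fingerprint" of this verification
    return {"record": record, "hash": digest, "chain": f"{digest}|{prev_hash}"}

def verify_link(link):
    """Recompute the hash from the stored record; a mismatch means tampering."""
    payload = json.dumps(link["record"], sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest() == link["hash"]

genesis = make_proof({"risk": 22}, "pass", "multi-factor risk assessment",
                     prev_hash="0" * 64, timestamp="2024-10-17 10:00:00")
second = make_proof({"risk": 22}, "pass", "periodic re-check",
                    prev_hash=genesis["hash"], timestamp="2024-10-18 10:00:00")
```

Because each record embeds the previous digest, altering any historical record changes its hash and breaks every later link, which is the tamper-evidence property the text attributes to the chain.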
The foregoing description is only illustrative of the present invention and is not intended to limit the scope of the invention, and all equivalent structures or equivalent processes or direct or indirect application in other related technical fields are included in the scope of the present invention.